Precision Attitude Determination System (PADS) design and analysis. Two-axis gimbal star tracker
NASA Technical Reports Server (NTRS)
1973-01-01
Development of the Precision Attitude Determination System (PADS) focused chiefly on the two-axis gimballed star tracker and electronics design improved from that of the Precision Pointing Control System (PPCS), and on application of the improved tracker for PADS at geosynchronous altitude. System design, system analysis, software design, and hardware design activities are reported. The system design encompasses the PADS configuration, system performance characteristics, component design summaries, and interface considerations. The PADS design and performance analysis includes error analysis, performance analysis via attitude determination simulation, and star tracker servo design analysis. The design of the star tracker and its electronics is discussed. Sensor electronics schematics are included. A detailed characterization of the application software algorithms and computer requirements is provided.
Precise comparisons of bottom-pressure and altimetric ocean tides
NASA Astrophysics Data System (ADS)
Ray, R. D.
2013-09-01
A new set of pelagic tide determinations is constructed from seafloor pressure measurements obtained at 151 sites in the deep ocean. To maximize precision of estimated tides, only stations with long time series are used; median time series length is 567 days. Geographical coverage is considerably improved by use of the international tsunami network, but coverage in the Indian Ocean and South Pacific is still weak. As a tool for assessing global ocean tide models, the data set is considerably more reliable than older data sets: the root-mean-square difference with a recent altimetric tide model is approximately 5 mm for the M2 constituent. Precision is sufficiently high to allow secondary effects in altimetric and bottom-pressure tide differences to be studied. The atmospheric tide in bottom pressure is clearly detected at the S1, S2, and T2 frequencies. The altimetric tide model is improved if satellite altimetry is corrected for crustal loading by the atmospheric tide. Models of the solid body tide can also be constrained. The free core-nutation effect in the K1 Love number is easily detected, but the overall estimates are not as accurate as a recent determination with very long baseline interferometry.
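The root-mean-square comparison described above can be illustrated with a short sketch. The function below is an assumption about the standard vector-difference RMS used in tidal work: each constituent is treated as a complex number (amplitude and Greenwich phase), and the factor of one half converts the vector-difference variance into the RMS of the time-domain signal difference.

```python
import numpy as np

def constituent_rms(amp1, pha1, amp2, pha2):
    """RMS of the vector difference between two sets of harmonic
    constants (amplitudes in mm, phases in degrees) for one tidal
    constituent, averaged over stations."""
    z1 = np.asarray(amp1) * np.exp(1j * np.radians(pha1))
    z2 = np.asarray(amp2) * np.exp(1j * np.radians(pha2))
    # Factor 1/2: variance of the vector difference corresponds to
    # twice the mean-square of the time-domain signal difference.
    return np.sqrt(0.5 * np.mean(np.abs(z1 - z2) ** 2))
```

Applied station by station to the M2 harmonic constants from bottom pressure and from the altimetric model, this is the kind of statistic behind the quoted ~5 mm agreement.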
Considerations for applying digital soil mapping to ecological sites
USDA-ARS's Scientific Manuscript database
Recent advancements in the spatial prediction of soil properties are not currently being fully utilized for ecological studies. Linking digital soil mapping (DSM) with ecological sites (ES) has the potential to improve land management decisions by improving spatial resolution and precision as well as...
Design Considerations for an Integrated Solar Sail Diagnostics System
NASA Technical Reports Server (NTRS)
Jenkins, Christopher H. M.; Gough, Aaron R.; Pappa, Richard S.; Carroll, Joe; Blandino, Joseph R.; Miles, Jonathan J.; Rakoczy, John
2004-01-01
Efforts are continuing under NASA support to improve the readiness level of solar sail technology. Solar sails have one of the best chances to be the next gossamer spacecraft flown in space. In the gossamer spacecraft community thus far, solar sails have always been considered a "low precision" application compared with, say, radar or optical devices. However, as this paper shows, even low precision gossamer applications put extraordinary demands on structural measurement systems if they are to be traceable to use in space.
Huang, Yang; Lowe, Henry J.; Hersh, William R.
2003-01-01
Objective: Despite the advantages of structured data entry, much of the patient record is still stored as unstructured or semistructured narrative text. The issue of representing clinical document content remains problematic. The authors' prior work using an automated UMLS document indexing system has been encouraging but has been affected by the generally low indexing precision of such systems. In an effort to improve precision, the authors have developed a context-sensitive document indexing model to calculate the optimal subset of UMLS source vocabularies used to index each document section. This pilot study was performed to evaluate the utility of this indexing approach on a set of clinical radiology reports. Design: A set of clinical radiology reports that had been indexed manually using UMLS concept descriptors was indexed automatically by the SAPHIRE indexing engine. Using the data generated by this process the authors developed a system that simulated indexing, at the document section level, of the same document set using many permutations of a subset of the UMLS constituent vocabularies. Measurements: The precision and recall scores generated by simulated indexing for each permutation of two or three UMLS constituent vocabularies were determined. Results: While there was considerable variation in precision and recall values across the different subtypes of radiology reports, the overall effect of this indexing strategy using the best combination of two or three UMLS constituent vocabularies was an improvement in precision without significant impact of recall. Conclusion: In this pilot study a contextual indexing strategy improved overall precision in a set of clinical radiology reports. PMID:12925544
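The simulated-indexing evaluation described in the Design and Measurements sections can be sketched as follows. This is a minimal illustration, not the SAPHIRE implementation: the vocabulary names and the selection rule (pick the subset with the highest precision) are assumptions for the example.

```python
from itertools import combinations

def precision_recall(gold, predicted):
    """Precision and recall of a predicted concept set against a
    manually assigned gold-standard set."""
    tp = len(gold & predicted)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall

def best_vocab_subset(gold, concepts_by_vocab, k=2):
    """Score every k-vocabulary combination and return the one with
    the highest precision (a hypothetical selection rule standing in
    for the paper's optimal-subset calculation)."""
    best = None
    for combo in combinations(concepts_by_vocab, k):
        predicted = set().union(*(concepts_by_vocab[v] for v in combo))
        p, r = precision_recall(gold, predicted)
        if best is None or p > best[1]:
            best = (combo, p, r)
    return best
```

Run per document section over permutations of two or three UMLS source vocabularies, this kind of exhaustive scoring yields the context-sensitive vocabulary subsets the study evaluates.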
Individual Differences in Neural Regions Functionally Related to Real and Imagined Stuttering
ERIC Educational Resources Information Center
Wymbs, Nicholas F.; Ingham, Roger J.; Ingham, Janis C.; Paolini, Katherine E.; Grafton, Scott T.
2013-01-01
Recent brain imaging investigations of developmental stuttering show considerable disagreement regarding which regions are related to stuttering. These divergent findings have been mainly derived from group studies. To investigate functional neurophysiology with improved precision, an individual-participant approach (N = 4) using event-related…
Expertise for upright faces improves the precision but not the capacity of visual working memory.
Lorenc, Elizabeth S; Pratte, Michael S; Angeloni, Christopher F; Tong, Frank
2014-10-01
Considerable research has focused on how basic visual features are maintained in working memory, but little is currently known about the precision or capacity of visual working memory for complex objects. How precisely can an object be remembered, and to what extent might familiarity or perceptual expertise contribute to working memory performance? To address these questions, we developed a set of computer-generated face stimuli that varied continuously along the dimensions of age and gender, and we probed participants' memories using a method-of-adjustment reporting procedure. This paradigm allowed us to separately estimate the precision and capacity of working memory for individual faces, on the basis of the assumptions of a discrete capacity model, and to assess the impact of face inversion on memory performance. We found that observers could maintain up to four to five items on average, with equally good memory capacity for upright and upside-down faces. In contrast, memory precision was significantly impaired by face inversion at every set size tested. Our results demonstrate that the precision of visual working memory for a complex stimulus is not strictly fixed but, instead, can be modified by learning and experience. We find that perceptual expertise for upright faces leads to significant improvements in visual precision, without modifying the capacity of working memory.
Emanuel Miller Lecture: Early Onset Depressions--Meanings, Mechanisms and Processes
ERIC Educational Resources Information Center
Goodyer, Ian M.
2008-01-01
Background: Depressive syndromes in children and adolescents constitute a serious group of mental disorders with considerable risk for recurrence. A more precise understanding of aetiology is necessary to improve treatment and management. Methods: Three neuroactive agents are purported to be involved in the aetiology of these disorders: serotonin,…
DOSESCREEN: a computer program to aid dose placement
Kimberly C. Smith; Jacqueline L. Robertson
1984-01-01
Careful selection of an experimental design for a bioassay substantially improves the precision of effective dose (ED) estimates. Design considerations typically include determination of sample size, dose selection, and allocation of subjects to doses. DOSESCREEN is a computer program written to help investigators select an efficient design for the estimation of an...
Łabaj, Paweł P; Leparc, Germán G; Linggi, Bryan E; Markillie, Lye Meng; Wiley, H Steven; Kreil, David P
2011-07-01
Measurement precision determines the power of any analysis to reliably identify significant signals, such as in screens for differential expression, independent of whether the experimental design incorporates replicates or not. With the compilation of large-scale RNA-Seq datasets with technical replicate samples, however, we can now, for the first time, perform a systematic analysis of the precision of expression level estimates from massively parallel sequencing technology. This in turn allows consideration of improvements by computational or experimental means. We report on a comprehensive study of target identification and measurement precision, including their dependence on transcript expression levels, read depth and other parameters. In particular, an impressive recall of 84% of the estimated true transcript population could be achieved with 331 million 50 bp reads, with diminishing returns from longer read lengths and even less gains from increased sequencing depths. Most of the measurement power (75%) is spent on only 7% of the known transcriptome, however, making less strongly expressed transcripts harder to measure. Consequently, less than 30% of all transcripts could be quantified reliably with a relative error < 20%. Based on established tools, we then introduce a new approach for mapping and analysing sequencing reads that yields substantially improved performance in gene expression profiling, increasing the number of transcripts that can reliably be quantified to over 40%. Extrapolations to higher sequencing depths highlight the need for efficient complementary steps. In discussion we outline possible experimental and computational strategies for further improvements in quantification precision.
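The "reliably quantified" statistic above can be illustrated with a short sketch. This is an assumption about one simple way to compute it from a pair of technical replicates; the paper's actual estimator may differ (e.g., pooling more replicates or modeling counting noise).

```python
import numpy as np

def fraction_reliable(rep1, rep2, max_rel_error=0.2):
    """Fraction of expressed transcripts whose abundance estimates
    from two technical replicates agree to within `max_rel_error`.
    Relative error here is |r1 - r2| / mean(r1, r2), one simple choice."""
    rep1, rep2 = np.asarray(rep1, float), np.asarray(rep2, float)
    mean = 0.5 * (rep1 + rep2)
    expressed = mean > 0          # ignore transcripts absent in both
    rel_err = np.abs(rep1 - rep2)[expressed] / mean[expressed]
    return float(np.mean(rel_err < max_rel_error))
```

Because weakly expressed transcripts have larger relative counting noise, this fraction drops as one moves down the expression distribution, which is the effect the abstract quantifies.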
NASA Astrophysics Data System (ADS)
Ren, Xia; Yang, Yuanxi; Zhu, Jun; Xu, Tianhe
2017-11-01
Intersatellite Link (ISL) technology helps to realize autonomous updating of broadcast ephemeris and clock error parameters for Global Navigation Satellite Systems (GNSS). ISL constitutes an important approach with which to both improve the observation geometry and extend the tracking coverage of China's Beidou Navigation Satellite System (BDS). However, ISL-only orbit determination might lead to constellation drift and rotation, and even to divergence of the orbit solution. Fortunately, predicted orbits with good precision can be used as a priori information with which to constrain the estimated satellite orbit parameters. Therefore, the precision of satellite autonomous orbit determination can be improved by consideration of a priori orbit information, and vice versa. However, rotation and translation errors in the a priori orbit will remain in the final result. This paper proposes a constrained precise orbit determination (POD) method for a sub-constellation of the new Beidou satellite constellation with only a few ISLs. The observation model of dual one-way measurements eliminating satellite clock errors is presented, and the orbit determination precision is analyzed with different data processing backgrounds. The conclusions are as follows. (1) With ISLs, the estimated parameters are strongly correlated, especially the positions and velocities of satellites. (2) The performance of determined BDS orbits will be improved by constraints from more precise a priori orbits. The POD precision is better than 45 m with an a priori orbit constraint of 100 m precision (e.g., predicted orbits by the telemetry tracking and control system), and is better than 6 m with a priori orbit constraints of 10 m precision (e.g., predicted orbits by the international GNSS Monitoring and Assessment System (iGMAS)). (3) The POD precision is improved by additional ISLs.
Constrained by a priori iGMAS orbits, the POD precision with two, three, and four ISLs is better than 6, 3, and 2 m, respectively. (4) The in-plane link and out-of-plane link have different contributions to observation configuration and system observability. The POD with weak observation configuration (e.g., one in-plane link and one out-of-plane link) should be tightly constrained with a priori orbits.
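The a priori constraint described above can be sketched as a weighted least-squares problem in which the predicted orbit enters as pseudo-observations. This is a generic illustration of the idea, not the paper's estimator: the weighting scheme (one scalar sigma per data type) is an assumption for the example.

```python
import numpy as np

def constrained_lsq(H, y, sigma_obs, x_prior, sigma_prior):
    """Least-squares state estimate from observations y = H x + noise,
    constrained by an a priori state x_prior.  The prior is appended
    as pseudo-observations weighted by its assumed precision, which
    keeps an under-determined ISL-only solution from drifting or
    rotating."""
    n = len(x_prior)
    H_aug = np.vstack([H / sigma_obs, np.eye(n) / sigma_prior])
    y_aug = np.concatenate([y / sigma_obs, x_prior / sigma_prior])
    x_hat, *_ = np.linalg.lstsq(H_aug, y_aug, rcond=None)
    return x_hat
```

A tight prior (small `sigma_prior`) pins the solution to the predicted orbit; a loose one lets the ISL observations dominate, mirroring the trade-off between the 100 m and 10 m constraint cases in the abstract.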
Optics measurement algorithms and error analysis for the proton energy frontier
NASA Astrophysics Data System (ADS)
Langner, A.; Tomás, R.
2015-03-01
Optics measurement algorithms have been improved in preparation for the commissioning of the LHC at higher energy, i.e., with an increased damage potential. Due to machine protection considerations the higher energy sets tighter limits on the maximum excitation amplitude and the total beam charge, reducing the signal to noise ratio of optics measurements. Furthermore the precision in 2012 (4 TeV) was insufficient to understand beam size measurements and determine interaction point (IP) β-functions (β*). A new, more sophisticated algorithm has been developed which takes into account both the statistical and systematic errors involved in this measurement. This makes it possible to combine more beam position monitor measurements for deriving the optical parameters and is shown to significantly improve the accuracy and precision. Measurements from the 2012 run have been reanalyzed; the improved algorithms yield significantly higher precision of the derived optical parameters and decrease the average error bars by a factor of three to four. This allowed the calculation of β* values and proved fundamental to understanding emittance evolution during the energy ramp.
Pan, Shuguo; Chen, Weirong; Jin, Xiaodong; Shi, Xiaofei; He, Fan
2015-07-22
Satellite orbit error and clock bias are the keys to precise point positioning (PPP). The traditional PPP algorithm requires precise satellite products based on worldwide permanent reference stations. Such an algorithm requires considerable work and hardly achieves real-time performance. However, real-time positioning service will be the dominant mode in the future. IGS is providing such an operational service (RTS) and there are also commercial systems like Trimble RTX in operation. On the basis of the regional Continuous Operational Reference System (CORS), a real-time PPP algorithm is proposed to apply the coupling estimation of clock bias and orbit error. The projection of orbit error onto the satellite-receiver range has the same effects on positioning accuracy with clock bias. Therefore, in satellite clock estimation, part of the orbit error can be absorbed by the clock bias and the effects of residual orbit error on positioning accuracy can be weakened by the evenly distributed satellite geometry. In consideration of the simple structure of pseudorange equations and the high precision of carrier-phase equations, the clock bias estimation method coupled with orbit error is also improved. Rovers obtain PPP results by receiving broadcast ephemeris and real-time satellite clock bias coupled with orbit error. By applying the proposed algorithm, the precise orbit products provided by GNSS analysis centers are rendered no longer necessary. On the basis of previous theoretical analysis, a real-time PPP system was developed. Some experiments were then designed to verify this algorithm. Experimental results show that the newly proposed approach performs better than the traditional PPP based on International GNSS Service (IGS) real-time products. The positioning accuracies of the rovers inside and outside the network are improved by 38.8% and 36.1%, respectively. The PPP convergence speeds are improved by up to 61.4% and 65.9%. 
The new approach can change the traditional PPP mode because of its advantages of independence, high positioning precision, and real-time performance. It could be an alternative solution for regional positioning service before global PPP service comes into operation.
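The core observation in the abstract, that the part of the orbit error projected onto the satellite-receiver range is indistinguishable from a clock bias, can be made concrete with a small sketch. The function below is an illustrative line-of-sight projection, not code from the paper.

```python
import numpy as np

def los_projection(orbit_error, sat_pos, rcv_pos):
    """Project a satellite orbit error vector (meters, same frame as
    the positions) onto the satellite-to-receiver line of sight.
    This component maps one-to-one into the range equation and can
    therefore be absorbed into the estimated satellite clock bias."""
    los = rcv_pos - sat_pos
    los = los / np.linalg.norm(los)
    return float(np.dot(orbit_error, los))
```

The residual (cross-track and along-track) components left after this absorption are what the evenly distributed satellite geometry then averages down in the position solution.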
Spike timing precision of neuronal circuits.
Kilinc, Deniz; Demir, Alper
2018-06-01
Spike timing is believed to be a key factor in sensory information encoding and computations performed by the neurons and neuronal circuits. However, the considerable noise and variability, arising from the inherently stochastic mechanisms that exist in the neurons and the synapses, degrade spike timing precision. Computational modeling can help decipher the mechanisms utilized by the neuronal circuits in order to regulate timing precision. In this paper, we utilize semi-analytical techniques, which were adapted from previously developed methods for electronic circuits, for the stochastic characterization of neuronal circuits. These techniques, which are orders of magnitude faster than traditional Monte Carlo type simulations, can be used to directly compute the spike timing jitter variance, power spectral densities, correlation functions, and other stochastic characterizations of neuronal circuit operation. We consider three distinct neuronal circuit motifs: Feedback inhibition, synaptic integration, and synaptic coupling. First, we show that both the spike timing precision and the energy efficiency of a spiking neuron are improved with feedback inhibition. We unveil the underlying mechanism through which this is achieved. Then, we demonstrate that a neuron can improve on the timing precision of its synaptic inputs, coming from multiple sources, via synaptic integration: The phase of the output spikes of the integrator neuron has the same variance as that of the sample average of the phases of its inputs. Finally, we reveal that weak synaptic coupling among neurons, in a fully connected network, enables them to behave like a single neuron with a larger membrane area, resulting in an improvement in the timing precision through cooperation.
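The synaptic-integration result above, that the output phase has the same variance as the sample average of the input phases, implies a 1/N reduction in jitter variance. A small Monte Carlo check (an illustration of the claim, not the paper's semi-analytical method, and with assumed Gaussian jitter) makes this concrete:

```python
import numpy as np

rng = np.random.default_rng(0)

def output_phase_variance(n_inputs, jitter_var, n_trials=200_000):
    """Variance of the sample average of n_inputs independent spike
    phases, each with variance jitter_var.  Under the integrator-
    neuron result quoted above, this equals the output phase variance,
    i.e. jitter_var / n_inputs."""
    phases = rng.normal(0.0, np.sqrt(jitter_var), size=(n_trials, n_inputs))
    return phases.mean(axis=1).var()
```

Note that the paper's semi-analytical techniques compute such quantities directly, orders of magnitude faster than sampling as done here.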
The effects of selective and divided attention on sensory precision and integration.
Odegaard, Brian; Wozny, David R; Shams, Ladan
2016-02-12
In our daily lives, our capacity to selectively attend to stimuli within or across sensory modalities enables enhanced perception of the surrounding world. While previous research on selective attention has studied this phenomenon extensively, two important questions still remain unanswered: (1) how selective attention to a single modality impacts sensory integration processes, and (2) the mechanism by which selective attention improves perception. We explored how selective attention impacts performance in both a spatial task and a temporal numerosity judgment task, and employed a Bayesian Causal Inference model to investigate the computational mechanism(s) impacted by selective attention. We report three findings: (1) in the spatial domain, selective attention improves precision of the visual sensory representations (which were relatively precise), but not the auditory sensory representations (which were fairly noisy); (2) in the temporal domain, selective attention improves the sensory precision in both modalities (both of which were fairly reliable to begin with); (3) in both tasks, selective attention did not exert a significant influence over the tendency to integrate sensory stimuli. Therefore, it may be postulated that a sensory modality must possess a certain inherent degree of encoding precision in order to benefit from selective attention. It also appears that in certain basic perceptual tasks, the tendency to integrate crossmodal signals does not depend significantly on selective attention. We conclude with a discussion of how these results relate to recent theoretical considerations of selective attention.
NASA Astrophysics Data System (ADS)
Wray, J. D.
2003-05-01
The robotic observatory telescope must point precisely on the target object, and then track autonomously to a fraction of the FWHM of the system PSF for durations of ten to twenty minutes or more. It must retain this precision while continuing to function at rates approaching thousands of observations per night for all its years of useful life. These stringent requirements raise new challenges unique to robotic telescope systems design. Critical design considerations are driven by the applicability of the above requirements to all systems of the robotic observatory, including telescope and instrument systems, telescope-dome enclosure systems, combined electrical and electronics systems, environmental (e.g. seeing) control systems and integrated computer control software systems. Traditional telescope design considerations include the effects of differential thermal strain, elastic flexure, plastic flexure and slack or backlash with respect to focal stability, optical alignment and angular pointing and tracking precision. Robotic observatory design must holistically encapsulate these traditional considerations within the overall objective of maximized long-term sustainable precision performance. This overall objective is accomplished through combining appropriate mechanical and dynamical system characteristics with a full-time real-time telescope mount model feedback computer control system. 
Important design considerations include: identifying and reducing quasi-zero-backlash; increasing size to increase precision; directly encoding axis shaft rotation; pointing and tracking operation via real-time feedback between precision mount model and axis mounted encoders; use of monolithic construction whenever appropriate for sustainable mechanical integrity; accelerating dome motion to eliminate repetitive shock; ducting internal telescope air to outside the dome; and the principal design criterion: maximizing elastic repeatability while minimizing slack, plastic deformation and hysteresis to facilitate long-term repeatably precise pointing and tracking performance.
rpe v5: an emulator for reduced floating-point precision in large numerical simulations
NASA Astrophysics Data System (ADS)
Dawson, Andrew; Düben, Peter D.
2017-06-01
This paper describes the rpe (reduced-precision emulator) library which has the capability to emulate the use of arbitrary reduced floating-point precision within large numerical models written in Fortran. The rpe software allows model developers to test how reduced floating-point precision affects the result of their simulations without having to make extensive code changes or port the model onto specialized hardware. The software can be used to identify parts of a program that are problematic for numerical precision and to guide changes to the program to allow a stronger reduction in precision. The development of rpe was motivated by the strong demand for more computing power. If numerical precision can be reduced for an application under consideration while still achieving results of acceptable quality, computational cost can be reduced, since a reduction in numerical precision may allow an increase in performance or a reduction in power consumption. For simulations with weather and climate models, savings due to a reduction in precision could be reinvested to allow model simulations at higher spatial resolution or complexity, or to increase the number of ensemble members to improve predictions. rpe was developed with a particular focus on the community of weather and climate modelling, but the software could be used with numerical simulations from other domains.
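The core emulation idea, storing values rounded to a reduced number of significand bits while computing in full precision, can be sketched in a few lines. The rpe library itself is Fortran; the following is a minimal Python analogue of the rounding step, not rpe's API.

```python
import math

def reduce_precision(x, n_bits):
    """Round a float to n_bits of explicit significand (not counting
    the implicit leading bit), emulating storage in a lower-precision
    floating-point format.  With n_bits=10 this mimics IEEE
    half-precision rounding of the significand (exponent range is
    not restricted here)."""
    if x == 0.0 or not math.isfinite(x):
        return x
    m, e = math.frexp(x)          # x = m * 2**e with 0.5 <= |m| < 1
    scale = 2.0 ** (n_bits + 1)   # +1 covers the implicit leading bit
    return math.ldexp(round(m * scale) / scale, e)
```

Wrapping assignments in a model with such a rounding step after every operation is, in essence, how an emulator reveals which parts of a program tolerate reduced precision.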
Hung, Man; Nickisch, Florian; Beals, Timothy C; Greene, Tom; Clegg, Daniel O; Saltzman, Charles L
2012-08-01
Accurately measuring, reporting and comparing outcomes is essential for improving health care delivery. Current challenges with available health status scales include patient fatigue, floor/ceiling effects and validity/reliability. This study compared the Patient Reported Outcomes Measurement Information System (PROMIS)-based Lower Extremity Physical Function Computerized Adaptive Test (LE CAT) and two legacy scales, the Foot Function Index (FFI) and the sport module from the Foot and Ankle Ability Measure (spFAAM), for 287 patients scheduled for elective foot and ankle surgery. We documented the time required by patients to complete the instrument, instrument precision, and the extent to which each instrument covered the full range of physical functioning across the patient sample. Average administration times were 66 seconds for the LE CAT, 130 seconds for the spFAAM and 239 seconds for the FFI. All three instruments were fairly precise at intermediate physical functioning levels (i.e., Standard Error of Measurement < 0.35) and relatively less precise at the higher trait levels; the LE CAT maintained precision in the lower range while the spFAAM and FFI had decreased precision. The LE CAT had less floor/ceiling effects than the FFI and the spFAAM. The LE CAT showed considerable advantage compared to legacy scales for measuring patient-reported outcomes in orthopaedic patients with foot and ankle problems. A paradigm shift to broader use of PROMIS-based CATs should be considered to improve precision and reduce patient burden in patient-reported outcome measurement for foot and ankle patients.
Utilizing the N beam position monitor method for turn-by-turn optics measurements
NASA Astrophysics Data System (ADS)
Langner, A.; Benedetti, G.; Carlà, M.; Iriso, U.; Martí, Z.; de Portugal, J. Coello; Tomás, R.
2016-09-01
The N beam position monitor method (N-BPM) which was recently developed for the LHC has significantly improved the precision of optics measurements that are based on BPM turn-by-turn data. The main improvement is due to the consideration of correlations for statistical and systematic error sources, as well as increasing the number of BPM combinations used to derive the β-function at one location. We present how this technique can be applied at light sources like ALBA, and compare the results with other methods.
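Each BPM combination in the N-BPM method contributes one estimate of the β-function via the classic three-BPM formula, which relates measured and model phase advances. The sketch below shows that single-combination building block; the weighted averaging over many combinations using the full error correlation matrix, which is the N-BPM method's actual contribution, is omitted here.

```python
import numpy as np

def beta_from_phases(beta_model, phi12, phi13, phi12_model, phi13_model):
    """Three-BPM estimate of the beta function at BPM 1 from measured
    phase advances to BPMs 2 and 3 (radians), scaled by the model
    phase advances and model beta."""
    cot = lambda x: 1.0 / np.tan(x)
    return beta_model * (cot(phi12) - cot(phi13)) / (cot(phi12_model) - cot(phi13_model))
```

When the measured phase advances equal the model values, the estimate reproduces the model β, which is a useful sanity check for any implementation.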
NASA Astrophysics Data System (ADS)
Park, E.; Jeong, J.
2017-12-01
Precise estimation of groundwater fluctuation is studied by considering delayed recharge flux (DRF) and unsaturated zone drainage (UZD). Both DRF and UZD arise from gravitational flow impeded in the unsaturated zone, which may non-negligibly affect groundwater level changes. For validation, the new model is benchmarked against a previous model that does not consider unsaturated flow, with the actual groundwater level and precipitation data divided into three periods based on climatic conditions. The estimation capability of the new model is superior to that of the benchmarked model, as indicated by a significantly improved representation of groundwater level with physically interpretable model parameters.
Precision pointing and control of flexible spacecraft
NASA Technical Reports Server (NTRS)
Bantell, M. H., Jr.
1987-01-01
The problem and long-term objectives for the precision pointing and control of flexible spacecraft are given. The four basic objectives are stated in terms of two principal tasks. Under Task 1, robust low-order controllers, improved structural modeling methods for control applications and identification methods for structural dynamics are being developed. Under Task 2, a lab test experiment for verification of control laws and system identification algorithms is being developed. For Task 1, work has focused on robust low-order controller design and some initial considerations for structural modeling in control applications. For Task 2, work has focused on experiment design and fabrication, along with sensor selection and initial digital controller implementation. Conclusions are given.
Olivares, Alberto; Górriz, J M; Ramírez, J; Olivares, G
2016-05-01
With the advent of miniaturized inertial sensors, many systems have been developed within the last decade to study and analyze human motion and posture, especially in the medical field. Data measured by the sensors are usually processed by algorithms based on Kalman filters in order to estimate the orientation of the body parts under study. These filters traditionally include fixed parameters, such as the process and observation noise variances, whose values have a large influence on overall performance. It has been demonstrated that the optimal values of these parameters differ considerably for different motion intensities. Therefore, in this work, we show that by applying frequency analysis to determine motion intensity, and varying the formerly fixed parameters accordingly, the overall precision of orientation estimation algorithms can be improved, thereby providing physicians with reliable objective data they can use in their daily practice. Copyright © 2015 Elsevier Ltd. All rights reserved.
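The adaptive idea described above can be sketched in a few lines. The following is a minimal 1-D illustration, assuming a random-walk state model, a single "motion" frequency band, and illustrative noise values; none of these specifics come from the paper:

```python
import numpy as np

def motion_intensity(window, fs, band=(0.5, 5.0)):
    """Fraction of signal energy in an assumed 'motion' frequency band."""
    spec = np.abs(np.fft.rfft(window - window.mean())) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    total = spec.sum()
    return spec[in_band].sum() / total if total > 0 else 0.0

def adaptive_kalman(z, fs, q_low=1e-4, q_high=1e-1, r=0.05, win=64):
    """1-D random-walk Kalman filter whose process noise Q tracks motion intensity."""
    x, p = z[0], 1.0
    out = np.empty_like(z)
    for k in range(len(z)):
        w = z[max(0, k - win):k + 1]
        # Scale Q between its low and high values according to motion intensity.
        q = q_low + (q_high - q_low) * (motion_intensity(w, fs) if len(w) > 8 else 0.0)
        p = p + q                  # predict (state assumed locally constant)
        gain = p / (p + r)         # update with measurement z[k]
        x = x + gain * (z[k] - x)
        p = (1.0 - gain) * p
        out[k] = x
    return out
```

With low motion intensity the filter trusts its prediction and smooths heavily; with high intensity the inflated Q lets the estimate track fast orientation changes.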
Improving face image extraction by using deep learning technique
NASA Astrophysics Data System (ADS)
Xue, Zhiyun; Antani, Sameer; Long, L. R.; Demner-Fushman, Dina; Thoma, George R.
2016-03-01
The National Library of Medicine (NLM) has made a collection of over 1.2 million research articles containing 3.2 million figure images searchable using the Open-i multimodal (text+image) search engine. Many images are visible light photographs, some of which contain faces ("face images"). Some of these face images are acquired in unconstrained settings, while others are studio photos. To extract the face regions in the images, we first applied one of the most widely used face detectors, a pre-trained Viola-Jones detector implemented in Matlab and OpenCV. The Viola-Jones detector was trained for unconstrained face image detection, but the results for the NLM database included many false positives, which resulted in very low precision. To improve this performance, we applied a deep learning technique, which reduced the number of false positives and, as a result, significantly improved the detection precision. (For example, the classification accuracy for identifying whether the face regions output by the Viola-Jones detector are true positives in a test set is about 96%.) By combining these two techniques (Viola-Jones and deep learning), we were able to increase the system precision considerably, while avoiding the need to construct a large training set by manual delineation of the face regions.
F-MAP: A Bayesian approach to infer the gene regulatory network using external hints
Shahdoust, Maryam; Mahjub, Hossein; Sadeghi, Mehdi
2017-01-01
Common topological features of related species' gene regulatory networks suggest that the network of one species can be reconstructed using additional information from the gene expression profiles of related species. We present an algorithm, named F-MAP, to reconstruct the gene regulatory network by applying knowledge about gene interactions from related species. Our algorithm sets a Bayesian framework to estimate the precision matrix of one species' microarray gene expression dataset, to infer the Gaussian graphical model of the network. The conjugate Wishart prior is used, and information from related species is applied to estimate the hyperparameters of the prior distribution using factor analysis. Applying the proposed algorithm to six related Drosophila species shows that the precision of the reconstructed networks is improved considerably compared to that of networks constructed by other Bayesian approaches. PMID:28938012
Precision wildlife monitoring using unmanned aerial vehicles
NASA Astrophysics Data System (ADS)
Hodgson, Jarrod C.; Baylis, Shane M.; Mott, Rowan; Herrod, Ashley; Clarke, Rohan H.
2016-03-01
Unmanned aerial vehicles (UAVs) represent a new frontier in environmental research. Their use has the potential to revolutionise the field if they prove capable of improving data quality or the ease with which data are collected beyond traditional methods. We apply UAV technology to wildlife monitoring in tropical and polar environments and demonstrate that UAV-derived counts of colony nesting birds are an order of magnitude more precise than traditional ground counts. The increased count precision afforded by UAVs, along with their ability to survey hard-to-reach populations and places, will likely drive many wildlife monitoring projects that rely on population counts to transition from traditional methods to UAV technology. Careful consideration will be required to ensure the coherence of historic data sets with new UAV-derived data and we propose a method for determining the number of duplicated (concurrent UAV and ground counts) sampling points needed to achieve data compatibility.
Geppert, H; Denkmayr, T; Sponar, S; Lemmel, H; Hasegawa, Y
2014-11-01
For precise measurements with polarised neutrons, highly efficient spin manipulation is required. We developed several neutron optical elements suitable for a new, sophisticated setup, i.e., DC spin turners and Larmor accelerators, which considerably diminish thermal disturbances and depolarisation. The gain in performance is exploited by demonstrating violation of a Bell-like inequality for a spin-path entangled single-neutron state. The obtained value of [Formula: see text], which is much higher than in previous measurements by neutron interferometry, is [Formula: see text] above the limit of S = 2 predicted by contextual hidden-variable theories. The new setup is more flexible with regard to state preparation and analysis; therefore new, more precise measurements can be carried out.
Toward the use of precision medicine for the treatment of head and neck squamous cell carcinoma.
Gong, Wang; Xiao, Yandi; Wei, Zihao; Yuan, Yao; Qiu, Min; Sun, Chongkui; Zeng, Xin; Liang, Xinhua; Feng, Mingye; Chen, Qianming
2017-01-10
Precision medicine is a new strategy that aims at preventing and treating human diseases by focusing on individual variations in people's genes, environment and lifestyle. Precision medicine has been used for cancer diagnosis and treatment and shows evident clinical efficacy. Rapid developments in molecular biology, genetics and sequencing technologies, as well as computational technology, have enabled the establishment of "big data" resources, such as the Human Genome Project, which provide a basis for precision medicine. Head and neck squamous cell carcinoma (HNSCC) is an aggressive cancer with a high incidence rate and low survival rate. Current therapies are often aggressive and carry considerable side effects. Much research now indicates that precision medicine can be used for HNSCC and may achieve improved results. From this perspective, we present an overview of the current status, potential strategies, and challenges of precision medicine in HNSCC. We focus on targeted therapy based on the cell-surface signaling receptors epidermal growth factor receptor (EGFR), vascular endothelial growth factor (VEGF) and human epidermal growth factor receptor-2 (HER2), and on the PI3K/AKT/mTOR, JAK/STAT3 and RAS/RAF/MEK/ERK cellular signaling pathways. Gene therapy for the treatment of HNSCC is also discussed.
Inverse probability weighting for covariate adjustment in randomized studies.
Shen, Changyu; Li, Xiaochun; Li, Lingling
2014-02-20
Covariate adjustment in randomized clinical trials has the potential benefit of precision gain. It also has the potential pitfall of reduced objectivity, as it opens the possibility of selecting a 'favorable' model that yields a strong treatment benefit estimate. Although there is a large volume of statistical literature targeting the first aspect, realistic solutions to enforce objective inference and improve precision are rare. As a typical randomized trial needs to accommodate many implementation issues beyond statistical considerations, maintaining objectivity is at least as important as precision gain, if not more so, particularly from the perspective of the regulatory agencies. In this article, we propose a two-stage estimation procedure based on inverse probability weighting to achieve better precision without compromising objectivity. The procedure is designed such that the covariate adjustment is performed before seeing the outcome, effectively reducing the possibility of selecting a 'favorable' model that yields a strong intervention effect. Both theoretical and numerical properties of the estimation procedure are presented, along with an application of the proposed method to a real data example. Copyright © 2013 John Wiley & Sons, Ltd.
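As a rough illustration of the two-stage idea, the sketch below fits a propensity model from covariates and treatment assignment only (stage one, with no outcome data involved), then forms an inverse-probability-weighted difference in means (stage two). The Newton-Raphson logistic fit and all numerical choices are illustrative assumptions, not the authors' estimator:

```python
import numpy as np

def fit_logistic(X, t, iters=25):
    """Propensity model P(treatment | covariates) via Newton-Raphson (illustrative)."""
    Xd = np.column_stack([np.ones(len(X)), X])      # add intercept column
    beta = np.zeros(Xd.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))
        W = p * (1 - p)
        H = Xd.T @ (Xd * W[:, None]) + 1e-8 * np.eye(Xd.shape[1])
        beta = beta + np.linalg.solve(H, Xd.T @ (t - p))
    return lambda Xn: 1.0 / (1.0 + np.exp(-np.column_stack([np.ones(len(Xn)), Xn]) @ beta))

def ipw_effect(X, t, y):
    """Stage 1: fit weights from (X, t) only; stage 2: weighted mean difference."""
    e = fit_logistic(X, t)(X)                       # estimated propensity scores
    w1, w0 = t / e, (1 - t) / (1 - e)
    return np.sum(w1 * y) / np.sum(w1) - np.sum(w0 * y) / np.sum(w0)
```

Because the weights depend only on covariates and treatment, the adjustment model can be fixed before any outcome is examined, which is the objectivity argument made in the abstract.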
A flat spectral Faraday filter for sodium lidar.
Yang, Yong; Cheng, Xuewu; Li, Faquan; Hu, Xiong; Lin, Xin; Gong, Shunsheng
2011-04-01
We report a flat spectral Faraday anomalous dispersion optical filter (FS-FADOF) for sodium lidar. The physical and technical considerations for obtaining an FS-FADOF with a 3.5 GHz flat spectral transmission function are presented. The effective transmission of this filter was found to be much higher (>94%) and more uniform than that of the ultranarrowband FADOF, and therefore less sensitive to laser-frequency drift. Thus, the FS-FADOF can improve lidar efficiency and precision.
Powered Descent Trajectory Guidance and Some Considerations for Human Lunar Landing
NASA Technical Reports Server (NTRS)
Sostaric, Ronald R.
2007-01-01
The Autonomous Precision Landing and Hazard Detection and Avoidance Technology development (ALHAT) will enable an accurate (better than 100m) landing on the lunar surface. This technology will also permit autonomous (independent from ground) avoidance of hazards detected in real time. A preliminary trajectory guidance algorithm capable of supporting these tasks has been developed and demonstrated in simulations. Early results suggest that with expected improvements in sensor technology and lunar mapping, mission objectives are achievable.
A supervised learning rule for classification of spatiotemporal spike patterns.
Lilin Guo; Zhenzhong Wang; Adjouadi, Malek
2016-08-01
This study introduces a novel supervised algorithm for spiking neurons that takes into consideration synaptic delays and axonal delays associated with weights. It can be utilized for both classification and association and uses several biologically influenced properties, such as axonal and synaptic delays. The algorithm also takes into consideration spike-timing-dependent plasticity, as in the Remote Supervised Method (ReSuMe). This paper focuses on the classification aspect alone. Spiking neurons trained according to the proposed learning rule are capable of classifying different categories by the associated sequences of precisely timed spikes. Simulation results have shown that the proposed learning method greatly improves classification accuracy when compared to the Spike Pattern Association Neuron (SPAN) and the Tempotron learning rule.
The Qatar genome: a population-specific tool for precision medicine in the Middle East
Fakhro, Khalid A; Staudt, Michelle R; Ramstetter, Monica Denise; Robay, Amal; Malek, Joel A; Badii, Ramin; Al-Marri, Ajayeb Al-Nabet; Khalil, Charbel Abi; Al-Shakaki, Alya; Chidiac, Omar; Stadler, Dora; Zirie, Mahmoud; Jayyousi, Amin; Salit, Jacqueline; Mezey, Jason G; Crystal, Ronald G; Rodriguez-Flores, Juan L
2016-01-01
Reaching the full potential of precision medicine depends on the quality of personalized genome interpretation. In order to facilitate precision medicine in regions of the Middle East and North Africa (MENA), a population-specific genome for the indigenous Arab population of Qatar (QTRG) was constructed by incorporating allele frequency data from sequencing of 1,161 Qataris, representing 0.4% of the population. A total of 20.9 million single nucleotide polymorphisms (SNPs) and 3.1 million indels were observed in Qatar, including an average of 1.79% novel variants per individual genome. Replacement of the GRCh37 standard reference with QTRG in a best-practices genome analysis workflow resulted in an average of 7* deeper coverage depth (an improvement of 23%) and 756,671 fewer variants on average, a reduction of 16% that is attributed to common Qatari alleles being present in QTRG. The benefit of using QTRG varies across ancestries, a factor that should be taken into consideration when selecting an appropriate reference for analysis. PMID:27408750
Deng, Xi; Schröder, Simone; Redweik, Sabine; Wätzig, Hermann
2011-06-01
Gel electrophoresis (GE) is a very common analytical technique for proteome research and protein analysis. Despite being developed decades ago, there is still considerable need to improve its precision. Using the fluorescence of Colloidal Coomassie Blue-stained proteins in the near-infrared (NIR), the major error source, caused by unpredictable background staining, is strongly reduced. This result was generalized for various types of detectors. Since GE is a multi-step procedure, standardization of every single step is required. After detailed analysis of all steps, staining and destaining were identified as the major sources of the remaining variation. By employing standardized protocols, pooled percent relative standard deviations of 1.2-3.1% for band intensities were achieved for one-dimensional separations in repetitive experiments. The analysis of variance suggests that the same batch of staining solution should be used for the gels of one experimental series, to minimize day-to-day variation and obtain high precision. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Estimation of suspended-sediment rating curves and mean suspended-sediment loads
Crawford, Charles G.
1991-01-01
A simulation study was done to evaluate: (1) the accuracy and precision of parameter estimates for the bias-corrected, transformed-linear and non-linear models obtained by the method of least squares; and (2) the accuracy of mean suspended-sediment loads calculated by the flow-duration, rating-curve method using model parameters obtained by the alternative methods. Parameter estimates obtained by least squares for the bias-corrected, transformed-linear model were considerably more precise than those obtained for the non-linear or weighted non-linear model. The accuracy of parameter estimates obtained for the bias-corrected, transformed-linear and weighted non-linear models was similar, and much greater than the accuracy obtained by non-linear least squares. The improved parameter estimates obtained by the bias-corrected, transformed-linear or weighted non-linear model yield estimates of mean suspended-sediment load, calculated by the flow-duration, rating-curve method, that are more accurate and precise than those obtained for the non-linear model.
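A minimal sketch of the bias-corrected, transformed-linear approach described above, using Duan's smearing estimator as a stand-in for the bias correction (the abstract does not specify which correction was used):

```python
import numpy as np

def fit_rating_curve(q, s):
    """Fit the rating curve s = a * q**b in log space; return a bias-corrected predictor."""
    lq, ls = np.log(q), np.log(s)
    b, ln_a = np.polyfit(lq, ls, 1)              # transformed-linear least squares
    resid = ls - (ln_a + b * lq)
    smear = np.mean(np.exp(resid))               # Duan's smearing retransformation correction
    return lambda qn: smear * np.exp(ln_a) * qn ** b

def mean_load(curve, flows):
    """Flow-duration, rating-curve method: average the predicted load over the flow record."""
    return np.mean(curve(flows))
```

Without the smearing factor, exponentiating the log-space fit systematically underestimates the conditional mean load, which is the bias the transformed-linear correction addresses.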
Image Guidance in Radiation Therapy: Techniques and Applications
Kataria, Tejinder
2014-01-01
In modern-day radiotherapy, emphasis on reducing the volume exposed to high radiotherapy doses, improving treatment precision, and reducing radiation-related normal tissue toxicity has increased, and thus greater importance is given to accurate position verification and correction before delivering radiotherapy. At present, several techniques that accomplish these goals have been developed, though all of them have their limitations. No single method is available that eliminates treatment-related uncertainties without considerably adding to the cost. However, delivering "high precision radiotherapy" without periodic image guidance would do more harm than treating large volumes to compensate for setup errors. In the present review, we discuss the concept of image guidance in radiotherapy, the current techniques available, and their expected benefits and pitfalls. PMID:25587445
Ethical considerations of neuro-oncology trial design in the era of precision medicine.
Gupta, Saksham; Smith, Timothy R; Broekman, Marike L
2017-08-01
The field of oncology is currently undergoing a paradigm shift. Advances in the understanding of tumor biology and in tumor sequencing technology have contributed to the shift towards precision medicine, the therapeutic framework of targeting the individual oncogenic changes each tumor harbors. The success of precision medicine therapies, such as targeted kinase inhibitors and immunotherapies, in other cancers has motivated studies in brain cancers. The high specificity and cost of these therapies also encourage a shift in clinical trial design away from randomized controlled trials towards smaller, more exclusive early-phase clinical trials. While these new trials advance the clinical application of increasingly precise and individualized therapies, their design brings ethical challenges. We review the pertinent ethical considerations for clinical trials of precision medicine in neuro-oncology and discuss methods to protect patients in this new era of trial design.
[Early onset scoliosis. What are the options?].
Farrington, D M; Tatay-Díaz, A
2013-01-01
The prognosis of children with progressive early onset scoliosis has improved considerably due to recent advances in surgical and non-surgical techniques and to an understanding of the importance of preserving the thoracic space. Improvements in existing techniques and the development of new methods have considerably improved the management of this condition. Derotational casting can be considered in children with documented progression of a <60° curve without previous surgical treatment. Both single and dual growing rods are effective, but the latter seem to offer better results. Hybrid constructs may be a better option in children who require a low-profile proximal anchor. The vertical expandable prosthetic titanium rib (VEPTR®) appears to be beneficial for patients with congenital scoliosis, fused ribs, and thoracic insufficiency syndrome. Children with medical comorbidities who may not tolerate repeated lengthenings should be considered for the Shilla or Luque trolley technique. Growth modulation using shape-memory alloy staples or other tethers seems promising for mild curves, although more research is required to define its precise indications. Copyright © 2013 SECOT. Published by Elsevier España. All rights reserved.
[A plane-based hand-eye calibration method for surgical robots].
Zeng, Bowei; Meng, Fanle; Ding, Hui; Liu, Wenbo; Wu, Di; Wang, Guangzhi
2017-04-01
In order to calibrate the hand-eye transformation of a surgical robot and laser range finder (LRF), a calibration algorithm based on a planar template was designed. A mathematical model of the planar template is given and the approach to solving the equations is derived. To address measurement error in a practical system, we propose a new algorithm for selecting coplanar data. This algorithm can effectively eliminate data with considerable measurement error and thereby improve the calibration accuracy. Furthermore, three orthogonal planes were used to improve the calibration accuracy, with a nonlinear optimization used for the hand-eye calibration. To verify the calibration precision, we used the LRF to measure fixed points in different directions and a cuboid's surfaces. Experimental results indicated that the precision of the single planar template method was (1.37±0.24) mm, and that of the three orthogonal planes method was (0.37±0.05) mm. Moreover, the mean FRE of three-dimensional (3D) points was 0.24 mm and the mean TRE was 0.26 mm. The maximum angle measurement error was 0.4 degrees. The experimental results show that the method presented in this paper is effective, with high accuracy, and can meet the requirements of precise surgical robot localization.
A Self Contained Method for Safe and Precise Lunar Landing
NASA Technical Reports Server (NTRS)
Paschall, Stephen C., II; Brady, Tye; Cohanim, Babak; Sostaric, Ronald
2008-01-01
The return of humans to the Moon will require increased capability beyond that of the previous Apollo missions. Longer stay times and greater flexibility with regard to landing locations are among the many improvements planned. A descent and landing system that can land the vehicle more accurately than Apollo, with a greater ability to detect and avoid hazards, is essential to the development of a Lunar Outpost, and also for increasing the number of potentially reachable Lunar Sortie locations. This descent and landing system should allow landings in more challenging terrain and provide more flexibility with regard to mission timing and lighting considerations, while maintaining safety as the top priority. The lunar landing system under development by the ALHAT (Autonomous Precision Landing and Hazard Detection and Avoidance Technology) project is addressing this by providing terrain-relative navigation measurements to enhance global-scale precision, an onboard hazard-detection system to select safe landing locations, and an autonomous GNC (Guidance, Navigation, and Control) capability to process these measurements and safely direct the vehicle to the landing location. This ALHAT landing system will enable safe and precise lunar landings without requiring lunar infrastructure in the form of navigation aids or a priori identified hazard-free landing locations. The safe landing capability provided by ALHAT uses onboard active sensing to detect hazards that are large enough to be a danger to the vehicle but too small to be detected from orbit, given currently planned orbital terrain resolution limits. Algorithms to interpret raw active sensor terrain data, generate hazard maps, identify safe sites, and recalculate new trajectories to those sites are included as part of the ALHAT system. These improvements to descent and landing will help contribute to repeated safe and precise landings for a wide variety of terrain on the Moon.
High-speed digital signal normalization for feature identification
NASA Technical Reports Server (NTRS)
Ortiz, J. A.; Meredith, B. D.
1983-01-01
A design approach for high-speed normalization of digital signals was developed. A reciprocal look-up table technique is employed, in which a digital value is mapped to its reciprocal via a high-speed memory. This reciprocal is then multiplied with an input signal to obtain the normalized result. Normalization considerably improves the accuracy of certain feature identification algorithms. By using the concept of pipelining, the multispectral sensor data processing rate is limited only by the speed of the multiplier. The breadboard system was found to operate at an execution rate of five million normalizations per second. This design features high precision, reduced hardware complexity, high flexibility, and expandability, which are very important considerations for spaceborne applications. It also achieves the high-speed normalization rate essential for real-time data processing.
Convection in Cool Stars, as Seen Through Kepler's Eyes
NASA Astrophysics Data System (ADS)
Bastien, Fabienne A.
2015-01-01
Stellar surface processes represent a fundamental limit to the detection of extrasolar planets with the currently most heavily used techniques. As such, considerable effort has gone into trying to mitigate the impact of these processes on planet detection, with most studies focusing on magnetic spots. Meanwhile, high-precision photometric planet surveys like CoRoT and Kepler have unveiled a wide variety of stellar variability at previously inaccessible levels. We demonstrate that these newly revealed variations are not solely magnetically driven but also trace surface convection through light curve "flicker." We show that "flicker" not only yields a simple measurement of surface gravity with a precision of ~0.1 dex, but it may also improve our knowledge of planet properties, enhance radial velocity planet detection and discovery, and provide new insights into stellar evolution.
Liu, Dongfei; Zhang, Hongbo; Mäkilä, Ermei; Fan, Jin; Herranz-Blanco, Bárbara; Wang, Chang-Fang; Rosa, Ricardo; Ribeiro, António J; Salonen, Jarno; Hirvonen, Jouni; Santos, Hélder A
2015-01-01
An advanced nanocomposite consisting of an encapsulated porous silicon (PSi) nanoparticle and an acid-degradable acetalated dextran (AcDX) matrix (nano-in-nano) was efficiently fabricated by a one-step microfluidic self-assembly approach. The obtained nano-in-nano PSi@AcDX composites showed improved surface smoothness, homogeneous size distribution, and considerably enhanced cytocompatibility. Furthermore, multiple drugs with different physicochemical properties were simultaneously loaded into the nanocomposites with ratiometric control. The release kinetics of all the payloads was predominantly controlled by the decomposition rate of the outer AcDX matrix. To facilitate intracellular drug delivery, a nona-arginine cell-penetrating peptide (CPP) was chemically conjugated onto the surface of the nanocomposites by oxime click chemistry. Taking advantage of the significantly improved cell uptake, the proliferation of two breast cancer cell lines was markedly inhibited by the CPP-functionalized multidrug-loaded nanocomposites. Overall, this nano-in-nano PSi@polymer composite prepared by the microfluidic self-assembly approach is a universal platform for nanoparticle encapsulation and precisely controlled combination chemotherapy. Copyright © 2014 Elsevier Ltd. All rights reserved.
Constraints from triple gauge couplings on vectorlike leptons
Bertuzzo, Enrico; Machado, Pedro A. N.; Perez-Gonzalez, Yuber F.; ...
2017-08-30
Here, we study the contributions of colorless vectorlike fermions to the triple gauge couplings W⁺W⁻γ and W⁺W⁻Z⁰. We consider models in which their coupling to the Standard Model Higgs boson is allowed or forbidden by quantum numbers. We assess the sensitivity of the future accelerators FCC-ee, ILC, and CLIC to the parameters of these models, assuming they will be able to constrain the anomalous triple gauge couplings with precision δκ_V ~ O(10⁻⁴), V = γ, Z⁰. We show that the combination of measurements at different center-of-mass energies helps to improve the sensitivity to the contribution of vectorlike fermions, in particular when they couple to the Higgs. In fact, the measurements at the FCC-ee and, especially, the ILC and the CLIC, may turn the triple gauge couplings into a new set of precision parameters able to constrain the models better than the oblique parameters or the H → γγ decay, even assuming the considerable improvement of the latter measurements achievable at the new machines.
Budget impact and cost-effectiveness: can we afford precision medicine in oncology?
Doble, Brett
2016-01-01
Over the past decade there have been remarkable advancements in the understanding of the molecular underpinnings of malignancy. Methods of testing capable of elucidating patients' molecular profiles are now readily available, and there is an increased desire to incorporate the information derived from such tests into treatment selection for cancer patients. This has led to more appropriate application of existing treatments as well as the development of a number of innovative and highly effective treatments, known collectively as precision medicine. The impact that precision medicine will have on health outcomes is uncertain, as are the costs it will incur. There is, therefore, a need to develop economic evidence and appropriate methods of evaluation to support its implementation, to ensure that the resources allocated to these approaches are affordable and offer value for money. The market for precision medicine in oncology continues to expand rapidly, placing increased pressure on reimbursement decision-makers to consider the value and opportunity cost of funding such approaches to care. The benefits of molecular testing can be complex and difficult to evaluate given currently available economic methods, potentially causing a distorted appreciation of their value. Funding decisions for precision medicine will also have far-reaching implications, requiring the consideration of both patient and public perspectives in decision-making. Recommendations to improve the value proposition of precision medicine are therefore provided, with the hope of facilitating a better understanding of its impact on outcomes and the overall health budget.
Differential absorption radar techniques: water vapor retrievals
NASA Astrophysics Data System (ADS)
Millán, Luis; Lebsock, Matthew; Livesey, Nathaniel; Tanelli, Simone
2016-06-01
Two radar pulses sent at different frequencies near the 183 GHz water vapor line can be used to determine total column water vapor and water vapor profiles (within clouds or precipitation) by exploiting the differential absorption on and off the line. We assess these water vapor measurements by applying a radar instrument simulator to CloudSat pixels and then running end-to-end retrieval simulations. These end-to-end retrievals enable us to fully characterize not only the expected precision but also the potential biases, allowing us to select radar tones that maximize the water vapor signal while minimizing potential errors due to spectral variations in the target extinction properties. A hypothetical CloudSat-like instrument with 500 m by ˜1 km vertical and horizontal resolution and a minimum detectable signal and radar precision of -30 and 0.16 dBZ, respectively, can estimate total column water vapor with an expected precision of around 0.03 cm, with potential biases smaller than 0.26 cm most of the time, even under rainy conditions. The expected precision for water vapor profiles was found to be around 89 % on average, with potential biases smaller than 77 % most of the time when the profile is retrieved close to the surface, but smaller than 38 % above 3 km. By using either horizontal or vertical averaging, the precision will improve vastly, with the measurements still retaining a considerably high vertical and/or horizontal resolution.
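The retrieval principle above, differential absorption between an on-line and an off-line tone, can be sketched in a few lines of Python. The powers and the absorption-coefficient difference below are illustrative placeholders, not values from the study:

```python
import numpy as np

def column_water_vapor(p_on, p_off, dkappa):
    """Total column water vapor from the ratio of echo powers at two
    radar tones near the 183 GHz line.

    p_on, p_off : received powers (linear units) on and off the line
    dkappa      : difference in water-vapor mass absorption coefficients
                  between the two tones (cm^2/g) -- a placeholder value
    """
    # Two-way differential attenuation: p_on / p_off = exp(-2 * dkappa * cwv)
    return np.log(p_off / p_on) / (2.0 * dkappa)

# Illustrative numbers only
cwv = column_water_vapor(p_on=0.6, p_off=1.0, dkappa=0.1)  # g/cm^2
```

Because the retrieval depends only on the power ratio, absolute radar calibration largely cancels, which is the core appeal of the differential absorption technique.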
Defining precision: The precision medicine initiative trials NCI-MPACT and NCI-MATCH.
Coyne, Geraldine O'Sullivan; Takebe, Naoko; Chen, Alice P
"Precision" trials, using rationally incorporated biomarker targets and molecularly selective anticancer agents, have become of great interest to both patients and their physicians. In the endeavor to test the cornerstone premise of precision oncotherapy, that is, determining whether modulating a specific molecular aberration in a patient's tumor with a correspondingly specific therapeutic agent improves clinical outcomes, the design of clinical trials with embedded genomic characterization platforms that guide therapy is an increasing challenge. The National Cancer Institute Precision Medicine Initiative is an unprecedented large interdisciplinary collaborative effort to conceptualize and test the feasibility of trials incorporating sequencing platforms and large-scale bioinformatics processing that are not currently uniformly available to patients. National Cancer Institute-Molecular Profiling-based Assignment of Cancer Therapy and National Cancer Institute-Molecular Analysis for Therapy Choice are 2 genomic-to-phenotypic trials under this National Cancer Institute initiative, in which treatment is selected according to predetermined genetic alterations detected using next-generation sequencing technology across a broad range of tumor types. In this article, we discuss the objectives and trial designs that have enabled the public-private partnerships required to complete the scale of both trials, as well as interim trial updates and strategic considerations that have driven data analysis and targeted therapy assignment, with the intent of further elucidating the benefits of this treatment approach for patients. Copyright © 2017. Published by Elsevier Inc.
An optical lattice clock with accuracy and stability at the 10(-18) level.
Bloom, B J; Nicholson, T L; Williams, J R; Campbell, S L; Bishof, M; Zhang, X; Zhang, W; Bromley, S L; Ye, J
2014-02-06
Progress in atomic, optical and quantum science has led to rapid improvements in atomic clocks. At the same time, atomic clock research has helped to advance the frontiers of science, affecting both fundamental and applied research. The ability to control quantum states of individual atoms and photons is central to quantum information science and precision measurement, and optical clocks based on single ions have achieved the lowest systematic uncertainty of any frequency standard. Although many-atom lattice clocks have shown advantages in measurement precision over trapped-ion clocks, their accuracy has remained 16 times worse. Here we demonstrate a many-atom system that achieves an accuracy of 6.4 × 10(-18), which is not only better than a single-ion-based clock, but also reduces the required measurement time by two orders of magnitude. By systematically evaluating all known sources of uncertainty, including in situ monitoring of the blackbody radiation environment, we improve the accuracy of optical lattice clocks by a factor of 22. This single clock has simultaneously achieved the best known performance in the key characteristics necessary for consideration as a primary standard: stability and accuracy. More stable and accurate atomic clocks will benefit a wide range of fields, such as the realization and distribution of SI units, the search for time variation of fundamental constants, clock-based geodesy and other precision tests of the fundamental laws of nature. This work also connects to the development of quantum sensors and many-body quantum state engineering (such as spin squeezing) to advance measurement precision beyond the standard quantum limit.
Single-Step BLUP with Varying Genotyping Effort in Open-Pollinated Picea glauca.
Ratcliffe, Blaise; El-Dien, Omnia Gamal; Cappa, Eduardo P; Porth, Ilga; Klápště, Jaroslav; Chen, Charles; El-Kassaby, Yousry A
2017-03-10
Maximization of genetic gain in forest tree breeding programs is contingent on the accuracy of the predicted breeding values and precision of the estimated genetic parameters. We investigated the effect of the combined use of contemporary pedigree information and genomic relatedness estimates on the accuracy of predicted breeding values and precision of estimated genetic parameters, as well as rankings of selection candidates, using single-step genomic evaluation (HBLUP). In this study, two traits with diverse heritabilities [tree height (HT) and wood density (WD)] were assessed at various levels of family genotyping efforts (0, 25, 50, 75, and 100%) from a population of white spruce ( Picea glauca ) consisting of 1694 trees from 214 open-pollinated families, representing 43 provenances in Québec, Canada. The results revealed that HBLUP bivariate analysis is effective in reducing the known bias in heritability estimates of open-pollinated populations, as it exposes hidden relatedness, potential pedigree errors, and inbreeding. The addition of genomic information in the analysis considerably improved the accuracy in breeding value estimates by accounting for both Mendelian sampling and historical coancestry that were not captured by the contemporary pedigree alone. Increasing family genotyping efforts were associated with continuous improvement in model fit, precision of genetic parameters, and breeding value accuracy. Yet, improvements were observed even at minimal genotyping effort, indicating that even modest genotyping effort is effective in improving genetic evaluation. The combined utilization of both pedigree and genomic information may be a cost-effective approach to increase the accuracy of breeding values in forest tree breeding programs where shallow pedigrees and large testing populations are the norm. Copyright © 2017 Ratcliffe et al.
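The single-step (HBLUP) combination of pedigree and genomic information described above is usually implemented through the inverse of the blended relationship matrix H. A minimal Python sketch of the standard ssGBLUP construction follows; the matrices and the genotyped subset are toy numbers, not data from the study:

```python
import numpy as np

def h_inverse(A, G, genotyped):
    """Inverse of the single-step relationship matrix H, combining the
    pedigree relationship matrix A with the genomic matrix G of the
    genotyped subset:
        H^-1 = A^-1 + [[0, 0], [0, G^-1 - A22^-1]]
    where A22 is the pedigree block for the genotyped animals."""
    Ainv = np.linalg.inv(A)
    A22 = A[np.ix_(genotyped, genotyped)]
    Hinv = Ainv.copy()
    Hinv[np.ix_(genotyped, genotyped)] += np.linalg.inv(G) - np.linalg.inv(A22)
    return Hinv

# Toy example: three animals, the last two genotyped (numbers are made up)
A = np.array([[1.0, 0.5, 0.25],
              [0.5, 1.0, 0.5],
              [0.25, 0.5, 1.0]])
G = np.array([[1.05, 0.48],
              [0.48, 0.98]])
Hinv = h_inverse(A, G, genotyped=[1, 2])
```

Varying the genotyping effort, as in the study, corresponds to growing or shrinking the `genotyped` index set while the pedigree block stays fixed.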
Combining multistate capture-recapture data with tag recoveries to estimate demographic parameters
Kendall, W.L.; Conn, P.B.; Hines, J.E.
2006-01-01
Matrix population models that allow an animal to occupy more than one state over time are important tools for population and evolutionary ecologists. Definition of state can vary, including location for metapopulation models and breeding state for life history models. For populations whose members can be marked and subsequently re-encountered, multistate mark-recapture models are available to estimate the survival and transition probabilities needed to construct population models. Multistate models have proved extremely useful in this context, but they often require a substantial amount of data and restrict estimation of transition probabilities to those areas or states subjected to formal sampling effort. At the same time, for many species, there are considerable tag recovery data provided by the public that could be modeled in order to increase precision and to extend inference to a greater number of areas or states. Here we present a statistical model for combining multistate capture-recapture data (e.g., from a breeding ground study) with multistate tag recovery data (e.g., from wintering grounds). We use this method to analyze data from a study of Canada Geese (Branta canadensis) in the Atlantic Flyway of North America. Our analysis produced marginal improvement in precision, due to relatively few recoveries, but we demonstrate how precision could be further improved with increases in the probability that a retrieved tag is reported.
The development of a virtual camera system for astronaut-rover planetary exploration.
Platt, Donald W; Boy, Guy A
2012-01-01
A virtual assistant is being developed for use by astronauts as they use rovers to explore the surface of other planets. This interactive database, called the Virtual Camera (VC), allows the user to have better situational awareness for exploration. It can be used for training, data analysis and augmentation of actual surface exploration. This paper describes the development efforts and Human-Computer Interaction considerations for implementing a first-generation VC on a tablet mobile computer device. Scenarios for use will be presented. Evaluation and success criteria such as efficiency (in terms of processing time and precision), situational awareness, learnability, usability, and robustness will also be presented. Initial testing and the impact of HCI design considerations on manipulation and improvement in situational awareness using a prototype VC will be discussed.
Dirac gauginos, R symmetry and the 125 GeV Higgs
Bertuzzo, Enrico; Frugiuele, Claudia; Gregoire, Thomas; ...
2015-04-20
We study a supersymmetric scenario with a quasi-exact R-symmetry in light of the discovery of a Higgs resonance with a mass of 125 GeV. In such a framework, the additional adjoint superfields, needed to give Dirac masses to the gauginos, contribute both to the Higgs mass and to electroweak precision observables. We then analyze the interplay between the two aspects, finding regions in parameter space in which the contributions to the precision observables are under control and a 125 GeV Higgs boson can be accommodated. Furthermore, we estimate the fine-tuning of the model, finding regions of the parameter space still unexplored by the LHC with a fine-tuning considerably improved with respect to the minimal supersymmetric scenario. In particular, sizable non-holomorphic (non-supersoft) adjoint masses are required to reduce the fine-tuning.
NASA Astrophysics Data System (ADS)
Schinhaerl, Markus; Schneider, Florian; Rascher, Rolf; Vogt, Christian; Sperber, Peter
2010-10-01
Magnetorheological finishing is a typical commercial application of a computer-controlled polishing process in the manufacturing of precision optical surfaces. Precise knowledge of the material removal characteristic of the polishing tool (influence function) is essential for controlling the material removal on the workpiece surface by the dwell time method. Results from the testing series with magnetorheological finishing have shown that a deviation of only 5% between the actual material removal characteristic of the polishing tool and that represented by the influence function caused a considerable reduction in the polishing quality. The paper discusses reasons for inaccuracies in the influence function and the effects on the polishing quality. The generic results of this research serve for the development of improved polishing strategies, and may be used in alternative applications of computer-controlled polishing processes that quantify the material removal characteristic by influence functions.
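In the dwell time method described above, the predicted material removal is the convolution of the dwell-time map with the influence function, so any scaling error in the assumed influence function propagates linearly into the predicted removal, consistent with the sensitivity the abstract reports. A one-dimensional sketch with made-up numbers:

```python
import numpy as np

def predicted_removal(dwell, influence):
    """Material removal along a 1-D scan path: the dwell-time map convolved
    with the tool's influence function (removal depth per unit dwell time)."""
    return np.convolve(dwell, influence, mode="same")

# A 5% error in the assumed influence function maps directly into a
# 5% error in the predicted removal (illustrative numbers only).
dwell = np.array([0.0, 1.0, 2.0, 2.0, 1.0, 0.0])   # seconds per zone
influence = np.array([0.2, 0.6, 0.2])              # assumed tool footprint
removal_true = predicted_removal(dwell, 1.05 * influence)  # actual tool
removal_pred = predicted_removal(dwell, influence)         # model's belief
```

Because convolution is linear, the residual error after polishing is exactly the removal predicted from the influence-function discrepancy, which is why characterizing the tool precisely matters so much.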
Morota, Gota; Ventura, Ricardo V; Silva, Fabyano F; Koyama, Masanori; Fernando, Samodha C
2018-04-14
Precision animal agriculture is poised to rise to prominence in the livestock enterprise in the domains of management, production, welfare, sustainability, health surveillance, and environmental footprint. Considerable progress has been made in the use of tools to routinely monitor and collect information from animals and farms in a less laborious manner than before. These efforts have enabled the animal sciences to embark on information technology-driven discoveries to improve animal agriculture. However, the growing amount and complexity of data generated by fully automated, high-throughput data recording or phenotyping platforms, including digital images, sensor and sound data, unmanned systems, and information obtained from real-time noninvasive computer vision, pose challenges to the successful implementation of precision animal agriculture. The emerging fields of machine learning and data mining are expected to be instrumental in helping meet the daunting challenges facing global agriculture. Yet, their impact and potential in "big data" analysis have not been adequately appreciated in the animal science community, where this recognition has remained only fragmentary. To address such knowledge gaps, this article outlines a framework for machine learning and data mining and offers a glimpse into how they can be applied to solve pressing problems in animal sciences.
NASA Technical Reports Server (NTRS)
Noll, Thomas E.
1990-01-01
The paper describes recent accomplishments and current research projects along four main thrusts in aeroservoelasticity at NASA Langley. One activity focuses on enhancing the modeling and analysis procedures to accurately predict aeroservoelastic interactions. Improvements to the minimum-state method of approximating unsteady aerodynamics are shown to provide precise low-order models for design and simulation tasks. Recent extensions in aerodynamic correction-factor methodology are also described. With respect to analysis procedures, the paper reviews novel enhancements to matched filter theory and random process theory for predicting the critical gust profile and the associated time-correlated gust loads for structural design considerations. Two research projects leading towards improved design capability are also summarized: (1) an integrated structure/control design capability and (2) procedures for obtaining low-order robust digital control laws for aeroelastic applications.
Optimal actuator placement in adaptive precision trusses
NASA Technical Reports Server (NTRS)
Baycan, C. M.; Utku, S.; Das, S. K.; Wada, B. K.
1992-01-01
Actuator placement in adaptive truss structures caters to two needs: displacement control of precision points and preloading the elements to overcome joint slackness. Due to technological and financial considerations, the number of actuators available is much less than the number of degrees of freedom of the precision points to be controlled and the degree of redundancy of the structure. An approach for optimal actuator location is outlined. Test cases to demonstrate the effectiveness of the scheme are applied to the Precision Segmented Reflector Truss.
Precision Metal Fabrication. Florida Vocational Program Guide.
ERIC Educational Resources Information Center
University of South Florida, Tampa. Dept. of Adult and Vocational Education.
This guide identifies considerations in the organization, operation, and evaluation of secondary and postsecondary vocational education programs. It contains both a vocational program guide and Career Merit Achievement Plan (Career MAP) for precision metal fabrication. The guide contains the following sections: occupational description; program…
Oxygen isotope analysis of phosphate: improved precision using TC/EA CF-IRMS.
LaPorte, D F; Holmden, C; Patterson, W P; Prokopiuk, T; Eglington, B M
2009-06-01
Oxygen isotope values of biogenic apatite have long demonstrated considerable promise for paleothermometry potential because of the abundance of material in the fossil record and the greater resistance of apatite to diagenesis compared to carbonate. Unfortunately, this promise has not been fully realized because of the relatively poor precision of isotopic measurements and the exceedingly small size of some substrates for analysis. Building on previous work, we demonstrate that it is possible to improve the precision of δ18O(PO4) measurements using a 'reverse-plumbed' thermal conversion elemental analyzer (TC/EA) coupled to a continuous-flow isotope ratio mass spectrometer (CF-IRMS) via a helium stream. This modification to the flow of helium through the TC/EA, and careful location of the packing of glassy carbon fragments relative to the hot spot in the reactor, leads to narrower, more symmetrically distributed CO elution peaks with diminished tailing. In addition, we describe our apatite purification chemistry, which uses nitric acid and cation exchange resin. The purification chemistry is optimized for processing small samples, minimizing isotopic fractionation of PO4(3-) and permitting Ca, Sr, and Nd to be eluted and purified further for the measurement of δ44Ca and 87Sr/86Sr in modern biogenic apatite and 143Nd/144Nd in fossil apatite. Our methodology yields an external precision of +/- 0.15 per thousand (1σ) for δ18O(PO4). The uncertainty is related to the preparation of the Ag3PO4 salt, conversion to CO gas in a reverse-plumbed TC/EA, analysis of oxygen isotopes using a CF-IRMS, and uncertainty in constructing calibration lines that convert raw δ18O data to the VSMOW scale. Matrix matching of samples and standards for the purpose of calibration to the VSMOW scale was determined to be unnecessary. Our method requires only slightly modified equipment that is widely available.
This fact, and the demonstrated improvement in precision, should help to make apatite paleothermometry far more accessible to paleoclimate researchers. Copyright 2009 John Wiley & Sons, Ltd.
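The delta notation used throughout reports an isotope ratio relative to the VSMOW standard in parts per thousand. A minimal sketch of the conversion; the 18O/16O reference ratio below is the commonly quoted VSMOW value, not a number from this paper:

```python
def delta18O(r_sample, r_vsmow=0.0020052):
    """Per-mil delta value of an 18O/16O ratio relative to VSMOW:
    delta = (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_vsmow - 1.0) * 1000.0

d0 = delta18O(0.0020052)           # the standard itself: 0 per mil
d1 = delta18O(0.0020052 * 1.001)   # a ratio 0.1% heavier: about +1 per mil
```

The calibration lines mentioned in the abstract are the linear maps that place raw instrument deltas onto this VSMOW scale.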
Precise Point Positioning technique for short and long baselines time transfer
NASA Astrophysics Data System (ADS)
Lejba, Pawel; Nawrocki, Jerzy; Lemanski, Dariusz; Foks-Ryznar, Anna; Nogas, Pawel; Dunst, Piotr
2013-04-01
In this work, the determination of clock parameters for several timing receivers, TTS-4 (AOS), ASHTECH Z-XII3T (OP, ORB, PTB, USNO), and SEPTENTRIO POLARX4TR (ORB, since February 11, 2012), by means of the Precise Point Positioning (PPP) technique is presented. The clock parameters were determined for several time links based on the data delivered by the time and frequency laboratories mentioned above. The computations cover the period from January 1 to December 31, 2012 and were performed in two modes, with 7-day and one-month solutions for all links. All RINEX data files, which include phase and code GPS data, were recorded at 30-second intervals. All calculations were performed by means of Natural Resources Canada's GPS Precise Point Positioning (GPS-PPP) software, based on the high-quality precise satellite coordinates and satellite clocks delivered by the IGS as final products. The independent PPP technique is a very powerful and simple method which allows for better control of antenna positions at AOS and a verification of other time transfer techniques such as GPS CV, GLONASS CV, and TWSTFT. The PPP technique is also a very good alternative for calibration of the PL-AOS glass fiber link currently realized by AOS. At present, the PPP technique is one of the main time transfer methods used at AOS, which considerably improves and strengthens the quality of the Polish time scales UTC(AOS), UTC(PL), and TA(PL). Keywords: Precise Point Positioning, time transfer, IGS products, GNSS, time scales.
Tang, Tao; Stevenson, R Jan; Infante, Dana M
2016-10-15
Regional variation in both natural environment and human disturbance can influence performance of ecological assessments. In this study we calculated 5 types of benthic diatom multimetric indices (MMIs) with 3 different approaches to account for variation in ecological assessments. We used: site groups defined by ecoregions or diatom typologies; the same or different sets of metrics among site groups; and unmodeled or modeled MMIs, where models accounted for natural variation in metrics within site groups by calculating an expected reference condition for each metric and each site. We used data from the USEPA's National Rivers and Streams Assessment to calculate the MMIs and evaluate changes in MMI performance. MMI performance was evaluated with indices of precision, bias, responsiveness, sensitivity and relevancy which were respectively measured as MMI variation among reference sites, effects of natural variables on MMIs, difference between MMIs at reference and highly disturbed sites, percent of highly disturbed sites properly classified, and relation of MMIs to human disturbance and stressors. All 5 types of MMIs showed considerable discrimination ability. Using different metrics among ecoregions sometimes reduced precision, but it consistently increased responsiveness, sensitivity, and relevancy. Site specific metric modeling reduced bias and increased responsiveness. Combined use of different metrics among site groups and site specific modeling significantly improved MMI performance irrespective of site grouping approach. Compared to ecoregion site classification, grouping sites based on diatom typologies improved precision, but did not improve overall performance of MMIs if we accounted for natural variation in metrics with site specific models. We conclude that using different metrics among ecoregions and site specific metric modeling improve MMI performance, particularly when used together. 
Applications of these MMI approaches in ecological assessments introduced a tradeoff with assessment consistency when metrics differed across site groups, but they justified the convenient and consistent use of ecoregions. Copyright © 2016 Elsevier B.V. All rights reserved.
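Site-specific metric modeling of the kind described can be sketched as regressing each metric on natural gradients at reference sites only, then scoring every site by its departure from the modeled reference expectation. In the sketch below a simple linear model and synthetic data stand in for the study's actual formulation:

```python
import numpy as np

def modeled_metric_score(metric, natural_vars, ref_mask):
    """Regress a metric on natural gradients at reference sites, then
    score every site by its departure from the modeled site-specific
    reference expectation (a linear model is assumed here)."""
    X = np.column_stack([np.ones(len(metric)), natural_vars])
    beta, *_ = np.linalg.lstsq(X[ref_mask], metric[ref_mask], rcond=None)
    return metric - X @ beta   # departure from reference condition

# Synthetic illustration: a natural gradient plus a disturbance effect
rng = np.random.default_rng(1)
nv = rng.uniform(0.0, 10.0, 200)            # e.g. a climate or basin gradient
disturbed = np.arange(200) >= 150           # last 50 sites "highly disturbed"
metric = 2.0 + 3.0 * nv - 5.0 * disturbed   # disturbance depresses the metric
score = modeled_metric_score(metric, nv, ref_mask=~disturbed)
```

Fitting on reference sites only is what removes natural variation from the score, so the remaining departure can be attributed to disturbance rather than geography.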
OPTIMIZING THE PRECISION OF TOXICITY THRESHOLD ESTIMATION USING A TWO-STAGE EXPERIMENTAL DESIGN
An important consideration for risk assessment is the existence of a threshold, i.e., the highest toxicant dose where the response is not distinguishable from background. We have developed methodology for finding an experimental design that optimizes the precision of threshold mo...
Hu, Chao; Wang, Qianxin; Wang, Zhongyuan; Hernández Moraleda, Alberto
2018-01-01
Currently, five new-generation BeiDou (BDS-3) experimental satellites are working in orbit and broadcast B1I, B3I, and other new signals. Precise satellite orbit determination of the BDS-3 is essential for the future global services of the BeiDou system. However, BDS-3 experimental satellites are mainly tracked by the international GNSS Monitoring and Assessment Service (iGMAS) network. Under the current constraints of the limited data sources and poor data quality of iGMAS, this study proposes an improved cycle-slip detection and repair algorithm, which is based on a polynomial prediction of ionospheric delays. The improved algorithm takes the correlation of ionospheric delays into consideration to accurately estimate and repair cycle slips in the iGMAS data. Moreover, two methods of BDS-3 experimental satellite orbit determination, namely, normal equation stacking (NES) and step-by-step (SS), are designed to strengthen orbit estimations and to make full use of the BeiDou observations in different tracking networks. In addition, a method to improve computational efficiency based on a matrix eigenvalue decomposition algorithm is derived in the NES. Then, one year of BDS-3 experimental satellite precise orbit determinations were conducted based on iGMAS and Multi-GNSS Experiment (MGEX) networks. Furthermore, the orbit accuracies were analyzed from the discrepancy of overlapping arcs and satellite laser ranging (SLR) residuals. The results showed that the average three-dimensional root-mean-square error (3D RMS) of one-day overlapping arcs for BDS-3 experimental satellites (C31, C32, C33, and C34) acquired by NES and SS are 31.0, 36.0, 40.3, and 50.1 cm, and 34.6, 39.4, 43.4, and 55.5 cm, respectively; the RMS of SLR residuals are 55.1, 49.6, 61.5, and 70.9 cm and 60.5, 53.6, 65.8, and 73.9 cm, respectively.
Finally, one month of observations were used in four schemes of BDS-3 experimental satellite orbit determination to further investigate the reliability and advantages of the improved methods. It was suggested that the scheme with the improved cycle-slip detection and repair algorithm based on NES was optimal, which improved the accuracy of BDS-3 experimental satellite orbits by 34.07%, 41.05%, 72.29%, and 74.33%, respectively, compared with the widely used strategy. Therefore, the improved methods for the BDS-3 experimental satellites proposed in this study are very beneficial for the determination of new-generation BeiDou satellite precise orbits.
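The polynomial-prediction approach to cycle-slip detection can be sketched as fitting the recent geometry-free (ionospheric) time series, extrapolating one epoch ahead, and flagging large prediction residuals. The window length, polynomial order, and threshold below are illustrative choices, not the paper's values:

```python
import numpy as np

def detect_cycle_slip(gf, order=2, window=10, threshold=0.15):
    """Flag epochs where the geometry-free (ionospheric) combination, in
    meters, departs from a polynomial prediction of its recent behavior.
    Window, order, and threshold are illustrative, not the paper's."""
    slips = []
    t = np.arange(window)
    for i in range(window, len(gf)):
        coeffs = np.polyfit(t, gf[i - window:i], order)  # fit the recent past
        predicted = np.polyval(coeffs, window)           # extrapolate one epoch
        if abs(gf[i] - predicted) > threshold:
            slips.append(i)
    return slips

# Simulated series: slowly varying ionosphere with a jump at epoch 20
epochs = np.arange(40)
gf = 0.001 * epochs + np.where(epochs >= 20, 1.0, 0.0)
slips = detect_cycle_slip(gf)
```

A repair step would then subtract the nearest integer-cycle equivalent of the detected jump; the paper's refinement is to model the temporal correlation of the ionospheric delay rather than treating each epoch independently.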
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; Henry, Michael J.; Burtner, IV, E. R.
The International Atomic Energy Agency (IAEA) is interested in increasing the capabilities of IAEA safeguards inspectors to access information that would improve their situational awareness on the job. A mobile information platform could potentially provide access to information, analytics, and technical and logistical support to inspectors in the field, as well as providing regular updates to analysts at IAEA Headquarters in Vienna or at satellite offices. To demonstrate the potential capability of such a system, Pacific Northwest National Laboratory (PNNL) implemented a number of example capabilities within a PNNL-developed precision information environment (PIE), using a tablet as a mobile information platform. PNNL's safeguards proof-of-concept PIE intends to demonstrate novel applications of mobile information platforms to international safeguards use cases, demonstrate proof-of-principle capability implementation, and provide "vision" for capabilities that could be implemented. This report documents the lessons learned from this two-year development activity for the Precision Information Environment for International Safeguards (PIE-IS), describing the developed capabilities, technical challenges, and considerations for future development, so that developers working to develop a similar system for the IAEA or other safeguards agencies might benefit from our work.
NASA Astrophysics Data System (ADS)
Wang, I.-Ting; Chang, Chih-Cheng; Chiu, Li-Wen; Chou, Teyuh; Hou, Tuo-Hung
2016-09-01
The implementation of highly anticipated hardware neural networks (HNNs) hinges largely on the successful development of a low-power, high-density, and reliable analog electronic synaptic array. In this study, we demonstrate a two-layer Ta/TaOx/TiO2/Ti cross-point synaptic array that emulates the high-density three-dimensional network architecture of human brains. Excellent uniformity and reproducibility among intralayer and interlayer cells were realized. Moreover, at least 50 analog synaptic weight states could be precisely controlled with minimal drifting during a cycling endurance test of 5000 training pulses at an operating voltage of 3 V. We also propose a new state-independent bipolar-pulse-training scheme to improve the linearity of weight updates. The improved linearity considerably enhances the fault tolerance of HNNs, thus improving the training accuracy.
Improved Conjugate Gradient Bundle Adjustment of Dunhuang Wall Painting Images
NASA Astrophysics Data System (ADS)
Hu, K.; Huang, X.; You, H.
2017-09-01
Bundle adjustment with additional parameters is identified as a critical step for precise orthoimage generation and 3D reconstruction of Dunhuang wall paintings. Due to the introduction of self-calibration parameters and quasi-planar constraints, the structure of the coefficient matrix of the reduced normal equation is banded-bordered, making the solving process of bundle adjustment complex. In this paper, the Conjugate Gradient Bundle Adjustment (CGBA) method is deduced by calculus of variations. A preconditioning method based on improved incomplete Cholesky factorization is adopted to reduce the condition number of the coefficient matrix, as well as to accelerate the iteration rate of CGBA. Both theoretical analysis and experimental comparison with the conventional method indicate that the proposed method can effectively conquer the ill-conditioned problem of the normal equation and considerably improve the calculation efficiency of bundle adjustment with additional parameters, while maintaining the actual accuracy.
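The preconditioned conjugate-gradient iteration at the heart of CGBA can be sketched as follows. For brevity, a simple diagonal (Jacobi) preconditioner stands in for the paper's improved incomplete-Cholesky factorization, and the system is a toy 2x2 normal equation:

```python
import numpy as np

def pcg(A, b, Minv, tol=1e-10, maxit=200):
    """Preconditioned conjugate gradients for a symmetric positive-definite
    system A x = b; `Minv` applies the inverse of the preconditioner."""
    x = np.zeros_like(b)
    r = b - A @ x                       # initial residual
    z = Minv(r)                         # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p       # new search direction
        rz = rz_new
    return x

# Small SPD system with a Jacobi (diagonal) preconditioner
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = pcg(A, b, Minv=lambda r: r / np.diag(A))
```

A better preconditioner, such as the incomplete Cholesky variant the paper develops, lowers the effective condition number and so cuts the iteration count on the large banded-bordered normal equations of self-calibrating bundle adjustment.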
Lai, Rui; Yang, Yin-tang; Zhou, Duan; Li, Yue-jin
2008-08-20
An improved scene-adaptive nonuniformity correction (NUC) algorithm for infrared focal plane arrays (IRFPAs) is proposed. This method simultaneously estimates the infrared detectors' parameters and eliminates the nonuniformity that causes fixed pattern noise (FPN) by using a neural network (NN) approach. In the learning process of neuron parameter estimation, the traditional LMS algorithm is replaced with the newly presented variable step size (VSS) normalized least-mean-square (NLMS) adaptive filtering algorithm, which yields faster convergence, smaller misadjustment, and lower computational cost. In addition, a new NN structure is designed to estimate the desired target value, which considerably improves the calibration precision. The proposed NUC method reaches high correction performance, which is validated by experimental results quantitatively tested with a simulated testing sequence and a real infrared image sequence.
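A variable-step-size NLMS update of the kind described replaces the fixed LMS step with one normalized by input power and adapted to the current error level: a large step while the error is high, a small one near convergence to reduce misadjustment. The step-size rule below is a generic illustration, not the authors' exact scheme, demonstrated here on a simple system-identification task:

```python
import numpy as np

def vss_nlms(x, d, n_taps, mu_max=1.0, mu_min=0.05, eps=1e-6):
    """Normalized LMS with a variable step size: the step shrinks as the
    smoothed error power falls, giving fast initial convergence and low
    final misadjustment. The step-size rule is a generic illustration."""
    w = np.zeros(n_taps)
    e_pow = 1.0                                # smoothed error power
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]      # taps [x[n], x[n-1], ...]
        e = d[n] - w @ u                       # a-priori error
        e_pow = 0.9 * e_pow + 0.1 * e * e
        mu = np.clip(mu_max * e_pow / (e_pow + 1.0), mu_min, mu_max)
        w += (mu / (eps + u @ u)) * e * u      # normalized update
    return w

# Identify a known 4-tap FIR channel from noise-free input/output data
rng = np.random.default_rng(0)
x = rng.standard_normal(3000)
h_true = np.array([0.5, -0.3, 0.2, 0.1])
d = np.convolve(x, h_true)[:len(x)]
w = vss_nlms(x, d, n_taps=4)
```

In the NUC setting, the "input" is the scene data per pixel and the learned weights are the detector gain and offset estimates; the same convergence/misadjustment trade-off applies.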
Precision medicine for nurses: 101.
Lemoine, Colleen
2014-05-01
To introduce the key concepts and terms associated with precision medicine and support understanding of future developments in the field by providing an overview and history of precision medicine, related ethical considerations, and nursing implications. Current nursing, medical and basic science literature. Rapid progress in understanding the oncogenic drivers associated with cancer is leading to a shift toward precision medicine, where treatment is based on targeting specific genetic and epigenetic alterations associated with a particular cancer. Nurses will need to embrace the paradigm shift to precision medicine, expend the effort necessary to learn the essential terminology, concepts and principles, and work collaboratively with physician colleagues to best position our patients to maximize the potential that precision medicine can offer. Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Schuessler, Philipp WH
2010-01-01
In August 2008, Schuessler Consulting was contracted by NASA GSFC in support of the NASA Electronic Parts and Packaging (NEPP) program to perform two separate studies on moisture-laden air in a stainless steel cylinder that had been designed to become a consensus standard for Test Method 1018. This Test Method was originally released for hybrids under Mil. Std. 883 but was quickly utilized on other microelectronic devices under the auspices of Mil. Std. 750. The cylinder had subsequently been fabricated for the 750 community. It was back-filled with moist air and subsequently analyzed over a period of time under a previous NASA contract. It had been shown that moisture in the 4000 - 5000 ppm range could be analyzed rather precisely with a mass spectrometer, commonly referred to as a Residual Gas Analyzer (RGA). The scope of this study was to ascertain whether the composition and precision varied as a function of thermal shock at sub-zero temperatures and whether there was consensus when the standard was submitted to other RGA units. It was demonstrated and published that the consensus standard would yield precise RGA data for moisture within +/- 1% when optimized for a given RGA unit. It has subsequently been shown in this study at Oneida Research Services that sub-zero storage did not affect that precision when a well-defined protocol for the analysis was followed. The consensus standard was taken to a second facility for analysis, where it was found that moisture adsorption on the transfer lines caused precision to drop to +/- 12%. The Single Sample Cylinder (SSC) is a one-liter stainless steel cylinder with associated sampling valves and has considerable weight and volume, but this considerable size allows approximately 300 gas samples of the same composition to be delivered to any RGA unit.
Lastly, a smaller cylinder, approximately 75 cc, of a second consensus standard was fabricated and tested with a different mix of fixed gases in which moisture was kept in the 100 ppm range. This second standard has the potential of providing 30 gaseous samples and can be readily shipped to any analytical facility that wishes to generate comparison RGA data. A series of comparison residual gas analyses was performed at the Honeywell Federal Manufacturing & Technologies facility in the National Nuclear Security Administration's plant in Kansas City to complete this project. It was shown that the precision of a given RGA unit can be improved by controlling the cycle time for each analysis and increasing analysis temperatures to minimize moisture adsorption. It was also found that a "one-time event" in the sub-zero storage of the large SSC did not affect the unit's ability to continuously supply precise samples of the same chemistry; however, the "event" caused a permanent +8% shift in the reported value of the moisture content. Finally, a set of SSC RGA results was plotted on a common graph with DSCC "correlation study" RGA data. The result demonstrates the ability of the SSC to remove many of the individual variances that single, individual samples introduce. The consensus standards are now in storage at Oneida Research Services, one of the DSCC-certified houses that performs RGA to military standards, where they await future studies. The analytical data and the operational parameters of the instruments used are provided in the following discussion. Limitations and suggested means for improving both precision and accuracy are provided.
A Gaussian Mixture Model for Nulling Pulsars
NASA Astrophysics Data System (ADS)
Kaplan, D. L.; Swiggum, J. K.; Fichtenbauer, T. D. J.; Vallisneri, M.
2018-03-01
The phenomenon of pulsar nulling—where pulsars occasionally turn off for one or more pulses—provides insight into pulsar-emission mechanisms and the processes by which pulsars turn off when they cross the “death line.” However, while ever more pulsars are found that exhibit nulling behavior, the statistical techniques used to measure nulling are biased, with limited utility and precision. In this paper, we introduce an improved algorithm, based on Gaussian mixture models, for measuring pulsar nulling behavior. We demonstrate this algorithm on a number of pulsars observed as part of a larger sample of nulling pulsars, and show that it performs considerably better than existing techniques, yielding better precision and no bias. We further validate our algorithm on simulated data. Our algorithm is widely applicable to a large number of pulsars, even those that do not show obvious nulls. Moreover, it can be used to derive nulling probabilities for individual pulses, which can be used for in-depth studies.
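The approach above can be sketched with an off-the-shelf Gaussian mixture fit. The data below are synthetic, and the two-component setup (one "null" mode near zero intensity, one "emission" mode) is an assumption for illustration, not the authors' exact implementation:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic per-pulse intensities: ~30% nulls centred at 0, emission centred at 5.
nulls = rng.normal(0.0, 1.0, 300)
emission = rng.normal(5.0, 1.0, 700)
intensities = np.concatenate([nulls, emission]).reshape(-1, 1)

# Fit a two-component Gaussian mixture (null mode + emission mode).
gmm = GaussianMixture(n_components=2, random_state=0).fit(intensities)

# Identify the null component as the one with the lower mean.
null_comp = int(np.argmin(gmm.means_.ravel()))

# Nulling fraction = mixture weight of the null component.
nulling_fraction = gmm.weights_[null_comp]

# Per-pulse null probabilities from the posterior responsibilities.
p_null = gmm.predict_proba(intensities)[:, null_comp]
print(f"estimated nulling fraction: {nulling_fraction:.2f}")
```

The posterior responsibilities are what give per-pulse nulling probabilities rather than a single global fraction, which is the key advantage the abstract describes.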
Construction and testing of a simple and economical soil greenhouse gas automatic sampler
Ginting, D.; Arnold, S.L.; Arnold, N.S.; Tubbs, R.S.
2007-01-01
Quantification of soil greenhouse gas emissions requires considerable sampling to account for spatial and/or temporal variation. With manual sampling, additional personnel are often not available to sample multiple sites within a narrow time interval. The objectives were to construct an automatic gas sampler and to compare the accuracy and precision of automatic versus manual sampling. The automatic sampler was tested with carbon dioxide (CO2) fluxes that mimicked the range of CO2 fluxes during a typical corn-growing season in eastern Nebraska. Gas samples were drawn from the chamber at 0, 5, and 10 min manually and with the automatic sampler. The three samples drawn with the automatic sampler were transferred to pre-vacuumed vials after 1 h; thus the samples in the syringe barrels remained connected to the chamber while its CO2 concentration increased. The automatic sampler sustains accuracy and precision in greenhouse gas sampling while improving time efficiency and reducing labor stress. Copyright © Taylor & Francis Group, LLC.
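As a rough illustration of the chamber method described above, a flux can be estimated from the slope of a linear fit to the three timed samples. All concentrations, chamber dimensions, and the molar-volume conversion below are hypothetical values chosen for the example, not figures from the study:

```python
import numpy as np

# Hypothetical CO2 concentrations (ppm) sampled at 0, 5, and 10 min,
# as in the manual/automatic protocol described above.
t_min = np.array([0.0, 5.0, 10.0])
co2_ppm = np.array([400.0, 460.0, 520.0])

# Flux estimate from the linear rise in chamber concentration.
slope_ppm_per_min, intercept = np.polyfit(t_min, co2_ppm, 1)

# Convert to a molar flux using assumed chamber geometry:
# volume 0.01 m^3, footprint 0.05 m^2, molar volume 0.0224 m^3/mol at STP.
chamber_volume_m3 = 0.01
chamber_area_m2 = 0.05
molar_volume = 0.0224  # m^3/mol
flux_umol_m2_s = (slope_ppm_per_min / 60.0) * 1e-6 * chamber_volume_m3 \
                 / molar_volume / chamber_area_m2 * 1e6
print(f"slope: {slope_ppm_per_min:.1f} ppm/min, "
      f"flux: {flux_umol_m2_s:.2f} umol m^-2 s^-1")
```

In practice, temperature and pressure corrections replace the fixed STP molar volume, but the slope-times-geometry structure of the calculation is the same.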
Rotation of an optically trapped vaterite microsphere measured using rotational Doppler effect
NASA Astrophysics Data System (ADS)
Chen, Xinlin; Xiao, Guangzong; Xiong, Wei; Yang, Kaiyong; Luo, Hui; Yao, Baoli
2018-03-01
The angular velocity of a vaterite microsphere spinning in an optical trap is measured using the rotational Doppler effect. Perfectly spherical vaterite microspheres are synthesized via coprecipitation in the presence of silk fibroin nanospheres. When trapped by a circularly polarized beam, the vaterite microsphere rotates uniformly at the trap center. Probe beams containing two Laguerre-Gaussian beams of opposite topological charge (l = ±7, ±8, and ±9) illuminate the spinning vaterite. Analysis of the backscattered light reveals a frequency shift that scales with the rotation rate of the vaterite microsphere. The multiplicative enhancement of the frequency shift, proportional to the topological charge, greatly improves the measurement precision. The reliability and practicality of this approach are verified by varying the topological charge of the probe beam and the trapping laser power. Given the excellent measurement precision of the rotation frequency, this technique may be generally applicable to studying the torsional properties of micro-objects.
The present and future role of microfluidics in biomedical research.
Sackmann, Eric K; Fulton, Anna L; Beebe, David J
2014-03-13
Microfluidics, a technology characterized by the engineered manipulation of fluids at the submillimetre scale, has shown considerable promise for improving diagnostics and biology research. Certain properties of microfluidic technologies, such as rapid sample processing and the precise control of fluids in an assay, have made them attractive candidates to replace traditional experimental approaches. Here we analyse the progress made by lab-on-a-chip microtechnologies in recent years, and discuss the clinical and research areas in which they have made the greatest impact. We also suggest directions that biologists, engineers and clinicians can take to help this technology live up to its potential.
Diagnostic imaging of child abuse
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleinman, P.K.
1987-01-01
This book provides a description of all known radiological alterations occurring in child abuse, allowing precise interpretation of findings by radiologists. It also helps eliminate confusion among the clinicians and non-medical personnel involved in the diagnosis, management, and legal issues related to child abuse. CONTENTS: Introduction; Skeletal trauma: general considerations; Extremity trauma; Bony thoracic trauma; Spinal trauma; Dating fractures; Visceral trauma; Head trauma; Miscellaneous forms of abuse and neglect; The postmortem examination; Differential diagnosis of child abuse; Legal considerations; Psychosocial considerations; Technical considerations and dosimetry.
Innovation, productivity, and pricing: Capturing value from precision medicine technology in Canada.
Emery, J C Herbert; Zwicker, Jennifer D
2017-07-01
For new technology and innovation such as precision medicine to become part of the solution for the fiscal sustainability of Canadian Medicare, decision-makers need to change how services are priced rather than trying to restrain emerging technologies like precision medicine for short-term cost savings. If provincial public payers shift their thinking to be public purchasers, value considerations would direct reform of the reimbursement system to have prices that adjust with technologically driven productivity gains. This strategic shift in thinking is necessary if Canadians are to benefit from the promised benefits of innovations like precision medicine.
Hall, William A; Bergom, Carmen; Thompson, Reid F; Baschnagel, Andrew M; Vijayakumar, Srinivasan; Willers, Henning; Li, X Allen; Schultz, Christopher J; Wilson, George D; West, Catharine M L; Capala, Jacek; Coleman, C Norman; Torres-Roca, Javier F; Weidhaas, Joanne; Feng, Felix Y
2018-06-01
To summarize important talking points from a 2016 symposium focusing on real-world challenges to advancing precision medicine in radiation oncology, and to help radiation oncologists navigate the practical challenges of precision radiation oncology. The American Society for Radiation Oncology, American Association of Physicists in Medicine, and National Cancer Institute cosponsored a meeting on precision medicine in radiation oncology. In June 2016, numerous scientists, clinicians, and physicists convened at the National Institutes of Health to discuss challenges and future directions toward personalized radiation therapy. Various breakout sessions were held to discuss particular components and approaches to the implementation of personalized radiation oncology. This article summarizes the genomically guided radiation therapy breakout session. A summary of existing genomic data enabling personalized radiation therapy, ongoing clinical trials, current challenges, and future directions was collected. The group attempted to provide both a current overview of data that radiation oncologists could use to personalize therapy, along with data that are anticipated in the coming years. It seems apparent from the provided review that a considerable opportunity exists to truly bring genomically guided radiation therapy into clinical reality. Genomically guided radiation therapy is a necessity that must be embraced in the coming years. Incorporating these data into treatment recommendations will provide radiation oncologists with a substantial opportunity to improve outcomes for numerous cancer patients. More research focused on this topic is needed to bring genomic signatures into routine standard of care. Published by Elsevier Inc.
NASA Technical Reports Server (NTRS)
Goldstein, J. I.; Williams, D. B.
1992-01-01
This paper reviews and discusses future directions in analytical electron microscopy for microchemical analysis using X-ray and Electron Energy Loss Spectroscopy (EELS). The technique of X-ray microanalysis, using the ratio method and k(sub AB) factors, is outlined. The X-ray absorption correction is the major barrier to the objective of obtaining 1% accuracy and precision in analysis. Spatial resolution and Minimum Detectability Limits (MDL) are considered, with present limitations of spatial resolution in the 2 to 3 nm range and of MDL in the 0.1 to 0.2 wt.% range when a Field Emission Gun (FEG) system is used. Future directions of X-ray analysis include improvement in X-ray spatial resolution to the 1 to 2 nm range and MDL as low as 0.01 wt.%. With these improvements the detection of single atoms in the analysis volume will be possible. Other future improvements include the use of clean-room techniques for thin-specimen preparation, quantification at the 1% accuracy and precision level, light-element quantification at better than the 10% accuracy and precision level, the incorporation of a compact wavelength-dispersive spectrometer to improve X-ray spectral resolution, light-element analysis and MDL, and instrument improvements including source stability, on-line probe current measurements, stage stability, and computerized stage control. The paper also reviews the EELS technique, recognizing that it has been slow to develop and still remains firmly in research laboratories rather than in applications laboratories. Microanalysis with core-loss edges is considered, along with a discussion of limitations such as specimen thickness. Spatial resolution and MDL are considered, recognizing that single-atom detection is already possible. Plasmon-loss analysis is discussed, as well as fine-structure analysis. New techniques for energy-loss imaging are also summarized.
Future directions in the EELS technique will be the development of new spectrometers and improvements in thin-specimen preparation. The microanalysis technique needs to be simplified and software developed so that the EELS technique approaches the relative simplicity of the X-ray technique. Finally, one can expect major improvements in EELS imaging as data storage and processing improve.
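The ratio method with k(sub AB) factors mentioned above is, for a binary thin specimen, the Cliff-Lorimer relation C_A/C_B = k_AB · I_A/I_B with C_A + C_B = 1. A minimal sketch follows; the intensities and k-factor are made-up values for illustration:

```python
def cliff_lorimer_binary(i_a, i_b, k_ab):
    """Cliff-Lorimer ratio method for a binary thin specimen:
    C_A/C_B = k_AB * I_A/I_B, with compositions summing to 1."""
    ratio = k_ab * i_a / i_b          # C_A / C_B
    c_a = ratio / (1.0 + ratio)
    return c_a, 1.0 - c_a

# Hypothetical X-ray intensities and k-factor, for illustration only.
c_a, c_b = cliff_lorimer_binary(i_a=1500.0, i_b=1000.0, k_ab=1.2)
print(f"C_A = {c_a:.3f}, C_B = {c_b:.3f}")
```

The absorption correction the paper identifies as the main barrier would enter as a multiplicative factor on the intensity ratio before solving for the compositions.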
Assessment of physical activity of the human body considering the thermodynamic system.
Hochstein, Stefan; Rauschenberger, Philipp; Weigand, Bernhard; Siebert, Tobias; Schmitt, Syn; Schlicht, Wolfgang; Převorovská, Světlana; Maršík, František
2016-01-01
Correctly dosed physical activity is the basis of a vital and healthy life, but its measurement remains largely empirical, yielding only limited individualized activity recommendations. Very accurate three-dimensional models of the cardiovascular system exist, but they require the numerical solution of the Navier-Stokes equations for the flow in blood vessels; such models are suitable for research on cardiac diseases but are computationally very expensive. Direct measurements are costly and often not applicable outside laboratories. This paper offers a new approach to assessing physical activity that treats the body as a thermodynamic system and uses its leading quantity, entropy production, as a compromise between computation time and precise prediction of pressure, volume, and flow variables in blood vessels. Based on a simplified (one-dimensional) model of the human cardiovascular system, we develop and evaluate a setup that calculates the entropy production of the heart to determine the intensity of physical activity more precisely than previous parameters, e.g. the frequently used energy-based measures. The resulting precise, real-time measure of physical activity provides the basis for an intelligent human-technology interaction that steadily adjusts the degree of physical activity to the actual individual performance level, thus improving training and activity recommendations.
Consideration of neutral beam prompt loss in the design of a tokamak helicon antenna
Pace, D. C.; Van Zeeland, M. A.; Fishler, B.; ...
2016-08-02
Neutral beam prompt losses (injected neutrals that ionize such that their first poloidal transit intersects the wall) can put appreciable power on the outer wall of tokamaks, and this power may damage the wall or other internal components. These prompt losses are simulated including a protruding helicon antenna installation in the DIII-D tokamak, and it is determined that 160 kW of power will impact the antenna during the injection of a particular neutral beam. Protective graphite tiles are designed in response to this modeling, and the wall shape of the installed antenna is precisely measured to improve the accuracy of these calculations. Initial experiments confirm that the antenna component temperature increases according to the amount of neutral beam energy injected into the plasma. Incorporating neutral beam prompt loss considerations into the design of this in-vessel component serves to ensure that adequate protection or cooling is provided.
Why Targeted Therapies are Necessary for Systemic Lupus Erythematosus
Durcan, Laura; Petri, Michelle
2016-01-01
Systemic lupus erythematosus (SLE) continues to cause important morbidity and accelerated mortality despite therapeutic advances. Targeted therapies offer the possibility of improved efficacy with fewer side effects. Current management strategies rely heavily on non-specific immunosuppressive agents; prednisone, in particular, is responsible for a considerable burden of later organ damage. There is a multitude of diverse mechanisms of disease activity, immunogenic abnormalities, and clinical manifestations to take into consideration in SLE. Many targeted agents with robust mechanistic pre-clinical data and promising early-phase studies have ultimately been disappointing in phase III randomized controlled studies. Recent efforts have focused on B-cell therapies in particular, given the success of belimumab in clinical trials, but with limited success. We remain optimistic regarding other specific therapies being evaluated, including interferon-alpha blockade. It is likely that in SLE, given the heterogeneity of the population involved, precision medicine is needed, rather than expecting that any single biologic will be universally effective. PMID:27497251
Proceedings of the Workshop on Improvements to Photometry
NASA Technical Reports Server (NTRS)
Borucki, W. J. (Editor); Young, A. T. (Editor)
1984-01-01
The purposes of the workshop were to determine what astronomical problems would benefit by increased photometric precision, determine the current level of precision, identify the processes limiting the precision, and recommend approaches to improving photometric precision. Twenty representatives of the university, industry, and government communities participated. Results and recommendations are discussed.
NASA Astrophysics Data System (ADS)
Nomade, Sebastien; Pereira, Alison; Voinchet, Pierre; Bahain, Jean-Jacques; Aureli, Daniele; Arzarello, Marta; Anzidei, Anna-Paola; Biddittu, Italo; Bulgarelli, Maria-Grazia; Falguères, Christophe; Giaccio, Biagio; Guillou, Hervé; Manzi, Giorgio; Moncel, Marie-Hélène; Nicoud, Elisa; Pagli, Maria; Parenti, Fabio; Peretto, Carlo; Piperno, Marcello; Rocca, Roxane
2017-04-01
European Middle-Pleistocene archaeological and/or paleontological sites lack a unified and precise chronological framework. Despite recent efforts focused mostly on methods such as OSL, ESR/U-series, or cosmogenic nuclides, the age of numerous sites from this period still relies fundamentally on qualitative and speculative palaeoenvironmental and/or palaeontological/palaeoanthropological considerations. The lack of robust chronologies, along with the scarcity of human fossils, prevents coherent correlations between European sites, which in turn limits our ability to understand human diffusion dynamics, trace techno-cultural evolution, or correlate archaeological sites with palaeoclimatic and environmental records. With the goal of providing an accurate and precise chronological framework based on a multi-method approach, a research network including geochronologists, archaeologists, and paleoanthropologists from various French and Italian institutions launched in 2010 a wide-ranging study of Middle-Pleistocene archaeological sites of central and southern Italy. This study, combining the 40Ar/39Ar method with palaeo-dosimetric methods applied to European sites in the age range of 700 ka to 300 ka, is unprecedented. In parallel, a large effort has been made to improve the regional Middle-Pleistocene tephrostratigraphic database through extensive application of both high-precision 40Ar/39Ar geochronological and geochemical investigations. We illustrate our approach and results for several key sites such as Notarchirico, Valle Giumentina, Ceprano-Campogrande, and La Polledrara di Cecanibbio. The accurate and precise chronological framework we built permits us to place all the investigated archaeological and palaeontological records into a coherent climatic and environmental context. Furthermore, our work provides the opportunity to compare lithic industries from a technical and evolutionary point of view within a homogeneous temporal frame.
These preliminary results probe the current limits of the 40Ar/39Ar method and will guide the advances needed to apply our approach to other European sites.
Störmer, M; Gabrisch, H; Horstmann, C; Heidorn, U; Hertlein, F; Wiesmann, J; Siewert, F; Rack, A
2016-05-01
X-ray mirrors are needed for beam shaping and monochromatization at advanced research light sources, for instance, free-electron lasers and synchrotron sources. Such mirrors consist of a substrate and a coating. The shape accuracy of the substrate and the layer precision of the coating are the crucial parameters that determine the beam properties required for various applications. In principle, the selection of the layer materials determines the mirror reflectivity. A single-layer mirror offers high reflectivity in the range of total external reflection, whereas the reflectivity is reduced considerably above the critical angle. A periodic multilayer can enhance the reflectivity at higher angles due to Bragg reflection. Here, the selection of a suitable combination of layer materials is essential to achieve a high flux at distinct photon energies, which is often required for applications such as microtomography, diffraction, or protein crystallography. This contribution presents the current development of a Ru/C multilayer mirror prepared by magnetron sputtering with a sputtering facility that was designed in-house at the Helmholtz-Zentrum Geesthacht. The deposition conditions were optimized in order to achieve ultra-high precision and high flux in future mirrors. Input for the improved deposition parameters came from investigations by transmission electron microscopy. The X-ray optical properties were investigated by means of X-ray reflectometry using Cu- and Mo-radiation. The change of the multilayer d-spacing over the mirror dimensions and the variation of the Bragg angles were determined. The results demonstrate the ability to precisely control the variation in thickness over the whole mirror length of 500 mm, thus achieving picometer precision in the meter range.
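Extracting a multilayer period from a measured Bragg angle can be sketched with the first-order Bragg relation n·λ = 2d·sin θ, neglecting refraction corrections. The peak angle below is a hypothetical value, not a measurement from the paper:

```python
import math

def multilayer_d_spacing(theta_deg, wavelength_nm, order=1):
    """Estimate a multilayer period from a measured Bragg angle via
    n * lambda = 2 * d * sin(theta); refraction corrections neglected."""
    return order * wavelength_nm / (2.0 * math.sin(math.radians(theta_deg)))

# Cu K-alpha radiation (0.15406 nm) and a hypothetical first-order peak at 1.1 deg.
d_nm = multilayer_d_spacing(theta_deg=1.1, wavelength_nm=0.15406)
print(f"d-spacing: {d_nm:.2f} nm")
```

Repeating the measurement at positions along the 500 mm mirror gives the d-spacing variation over the mirror length that the abstract describes.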
Trace element analysis by EPMA in geosciences: detection limit, precision and accuracy
NASA Astrophysics Data System (ADS)
Batanova, V. G.; Sobolev, A. V.; Magnin, V.
2018-01-01
Use of the electron probe microanalyser (EPMA) for trace element analysis has increased over the last decade, mainly because of improved stability of spectrometers and the electron column when operated at high probe current; development of new large-area crystal monochromators and ultra-high count rate spectrometers; full integration of energy-dispersive / wavelength-dispersive X-ray spectrometry (EDS/WDS) signals; and the development of powerful software packages. For phases that are stable under a dense electron beam, the detection limit and precision can be decreased to the ppm level by using high acceleration voltage and beam current combined with long counting time. Data on 10 elements (Na, Al, P, Ca, Ti, Cr, Mn, Co, Ni, Zn) in olivine obtained on a JEOL JXA-8230 microprobe with tungsten filament show that the detection limit decreases in proportion to the inverse square root of counting time and probe current. For all elements equal to or heavier than phosphorus (Z = 15), the detection limit decreases with increasing accelerating voltage. The analytical precision for minor and trace elements analysed in olivine at 25 kV accelerating voltage and 900 nA beam current is 4 - 18 ppm (2 standard deviations of repeated measurements of the olivine reference sample) and is similar to the detection limit of the corresponding elements. Analysing trace elements accurately requires careful estimation of background, and consideration of sample damage under the beam and secondary fluorescence from phase boundaries. The development and use of matrix reference samples with well-characterised trace elements of interest is important for monitoring and improving accuracy. An evaluation of the accuracy of trace element analyses in olivine has been made by comparing EPMA data for new reference samples with data obtained by different in-situ and bulk analytical methods in six different laboratories worldwide.
For all elements, the measured concentrations in the olivine reference sample were found to be identical (within internal precision) to reference values, suggesting that the achieved precision and accuracy are similar. The spatial resolution of EPMA in a silicate matrix, even at very extreme conditions (accelerating voltage 25 kV), does not exceed 7 - 8 μm and is thus still better than laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) or secondary ion mass spectrometry (SIMS) of similar precision. This makes the electron microprobe an indispensable method with applications in experimental petrology, geochemistry, and cosmochemistry.
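Assuming, as described above, that the detection limit scales as the inverse square root of (counting time × probe current), its value can be extrapolated from a reference measurement. The reference numbers below are hypothetical, chosen only to show the scaling:

```python
import math

def scaled_detection_limit(dl_ref_ppm, t_ref, i_ref, t_new, i_new):
    """Scale a detection limit assuming DL varies as the inverse square
    root of (counting time x probe current)."""
    return dl_ref_ppm * math.sqrt((t_ref * i_ref) / (t_new * i_new))

# Hypothetical reference: 20 ppm at 60 s counting time and 100 nA.
# Quadrupling the dose (240 s at 100 nA) should halve the detection limit.
dl = scaled_detection_limit(20.0, t_ref=60.0, i_ref=100.0,
                            t_new=240.0, i_new=100.0)
print(f"scaled detection limit: {dl:.1f} ppm")
```

This inverse-square-root behaviour is why reaching ppm-level limits requires the combination of very high beam current and long counting times that the abstract emphasizes.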
Managing Innovation to Maximize Value Along the Discovery-Translation-Application Continuum.
Waldman, S A; Terzic, A
2017-01-01
Success in pharmaceutical development led to a record 51 drugs approved in the past year, surpassing every previous year since 1950. Technology innovation enabled identification and exploitation of increasingly precise disease targets ensuring next generation diagnostic and therapeutic products for patient management. The expanding biopharmaceutical portfolio stands, however, in contradistinction to the unsustainable costs that reflect remarkable challenges of clinical development programs. This annual Therapeutic Innovations issue juxtaposes advances in translating molecular breakthroughs into transformative therapies with essential considerations for lowering attrition and improving the cost-effectiveness of the drug-development paradigm. Realizing the discovery-translation-application continuum mandates a congruent approval, adoption, and access triad. © 2016 ASCPT.
Gravity discharge vessel revisited: An explicit Lambert W function solution
NASA Astrophysics Data System (ADS)
Digilov, Rafael M.
2017-07-01
Based on the generalized Poiseuille equation modified by a kinetic-energy correction, an explicit solution for the time evolution of a liquid column draining under gravity through an exit capillary tube is derived in terms of the Lambert W function. In contrast to the conventional exponential behavior implied by the Poiseuille law, the new analytical solution fully accounts for the volumetric flow rate of a fluid through a capillary of any length and improves the precision of viscosity determination. The theoretical treatment may be of interest to students as an example of how implicit equations in physics can be solved analytically using the Lambert function.
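The core operation behind such an explicit solution is inverting x·e^x = y, which is exactly what the Lambert W function does (available in SciPy as `scipy.special.lambertw`). This minimal sketch shows the inversion itself; the vessel-specific constants of the paper are not reproduced here:

```python
import numpy as np
from scipy.special import lambertw

# The Lambert W function inverts x * exp(x) = y, i.e. x = W(y).
# This inversion is what turns the implicit draining-column equation
# into an explicit formula for the column height versus time.
y = 3.0
x = lambertw(y).real          # principal branch W0; result is real for y > 0
print(f"W({y}) = {x:.6f}, check: x*exp(x) = {x * np.exp(x):.6f}")
```

`lambertw` returns a complex value by convention, so the `.real` part is taken on the principal branch where the solution is physical.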
Model of bidirectional reflectance distribution function for metallic materials
NASA Astrophysics Data System (ADS)
Wang, Kai; Zhu, Jing-Ping; Liu, Hong; Hou, Xun
2016-09-01
Based on the three-component assumption that reflection is divided into specular reflection, directional diffuse reflection, and ideal diffuse reflection, a bidirectional reflectance distribution function (BRDF) model of metallic materials is presented. Compared with the two-component assumption, in which reflection is composed of specular and diffuse reflection only, the three-component assumption divides the diffuse reflection into directional diffuse and ideal diffuse parts. This model effectively resolves the problem that a constant diffuse term leads to considerable error for metallic materials. Simulation and measurement results validate that the three-component BRDF model improves the modeling accuracy significantly and describes the reflection properties over the hemisphere precisely for metallic materials.
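A generic three-component combination (narrow specular lobe + broader directional-diffuse lobe + constant Lambertian term) can be sketched as follows. The lobe shapes and all parameters are illustrative assumptions for a 1D in-plane geometry, not the paper's actual parameterization:

```python
import math

def brdf_three_component(theta_i, theta_r, ks, kdd, kd,
                         sigma=0.1):
    """Illustrative three-component BRDF (in-plane angles, radians):
    specular lobe + directional-diffuse lobe + ideal diffuse floor."""
    # Specular: narrow Gaussian lobe around the mirror direction.
    specular = ks * math.exp(-((theta_r - theta_i) ** 2) / (2.0 * sigma ** 2))
    # Directional diffuse: broader lobe around the mirror direction.
    directional = kdd * math.cos(theta_r - theta_i) ** 2
    # Ideal diffuse: angle-independent Lambertian term.
    diffuse = kd / math.pi
    return specular + directional + diffuse

# Reflectance at the mirror direction vs 30 degrees away.
peak = brdf_three_component(0.5, 0.5, ks=1.0, kdd=0.3, kd=0.1)
off = brdf_three_component(0.5, 0.5 + math.radians(30), ks=1.0, kdd=0.3, kd=0.1)
print(f"peak: {peak:.3f}, off-specular: {off:.3f}")
```

The point of the third term is visible in the numbers: away from the specular peak the response is dominated by the directional-diffuse lobe rather than a single constant, which is what reduces the error for metals.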
Search Filter Precision Can Be Improved By NOTing Out Irrelevant Content
Wilczynski, Nancy L.; McKibbon, K. Ann; Haynes, R. Brian
2011-01-01
Background: Most methodologic search filters developed for use in large electronic databases such as MEDLINE have low precision. One method that has been proposed but not tested for improving precision is NOTing out irrelevant content. Objective: To determine if search filter precision can be improved by NOTing out the text words and index terms assigned to those articles that are retrieved but are off-target. Design: Analytic survey. Methods: NOTing out unique terms in off-target articles and testing search filter performance in the Clinical Hedges Database. Main Outcome Measures: Sensitivity, specificity, precision and number needed to read (NNR). Results: For all purpose categories (diagnosis, prognosis and etiology) except treatment and for all databases (MEDLINE, EMBASE, CINAHL and PsycINFO), constructing search filters that NOTed out irrelevant content resulted in substantive improvements in NNR (over four-fold for some purpose categories and databases). Conclusion: Search filter precision can be improved by NOTing out irrelevant content. PMID:22195215
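The outcome measures above (sensitivity, specificity, precision, and NNR = 1/precision) follow directly from retrieval counts. The before/after counts below are hypothetical, illustrating how removing false positives by NOTing out off-target terms lowers NNR at a small cost in sensitivity:

```python
def filter_metrics(tp, fp, fn, tn):
    """Standard search-filter performance measures."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    nnr = 1.0 / precision          # number needed to read
    return sensitivity, specificity, precision, nnr

# Hypothetical counts: the NOTed-out filter drops many false positives (fp)
# while losing only a few true positives (tp).
before = filter_metrics(tp=200, fp=1800, fn=20, tn=8000)
after = filter_metrics(tp=190, fp=400, fn=30, tn=9400)
print(f"NNR before: {before[3]:.1f}, after: {after[3]:.1f}")
```

With these made-up counts the NNR drops roughly threefold, the same direction of improvement (over fourfold in some categories) that the study reports.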
Expression microdissection adapted to commercial laser dissection instruments
Hanson, Jeffrey C; Tangrea, Michael A; Kim, Skye; Armani, Michael D; Pohida, Thomas J; Bonner, Robert F; Rodriguez-Canales, Jaime; Emmert-Buck, Michael R
2016-01-01
Laser-based microdissection facilitates the isolation of specific cell populations from clinical or animal model tissue specimens for molecular analysis. Expression microdissection (xMD) is a second-generation technology that offers considerable advantages in dissection capabilities; however, until recently the method has not been accessible to investigators. This protocol describes the adaptation of xMD to commonly used laser microdissection instruments and to a commercially available handheld laser device in order to make the technique widely available to the biomedical research community. The method improves dissection speed for many applications by using a targeting probe for cell procurement in place of an operator-based, cell-by-cell selection process. Moreover, xMD can provide improved dissection precision because of the unique characteristics of film activation. The time to complete the protocol is highly dependent on the target cell population and the number of cells needed for subsequent molecular analysis. PMID:21412274
Phase estimation of coherent states with a noiseless linear amplifier
NASA Astrophysics Data System (ADS)
Assad, Syed M.; Bradshaw, Mark; Lam, Ping Koy
Amplification of quantum states is inevitably accompanied by the introduction of noise at the output. For protocols that are probabilistic with heralded success, noiseless linear amplification may still be possible in theory. When the protocol succeeds, the output is a noiselessly amplified copy of the input; when it fails, the output state is degraded and is usually discarded. Probabilistic protocols may improve the performance of some quantum information protocols, but not for metrology once the full statistics are taken into consideration. We calculate the precision limits on estimating the phase of coherent states using a noiseless linear amplifier by computing its quantum Fisher information, and we show that on average the noiseless linear amplifier does not improve the phase estimate. We also discuss the case where abstention from measurement can reduce the cost of estimation.
LORAN-C LATITUDE-LONGITUDE CONVERSION AT SEA: PROGRAMMING CONSIDERATIONS.
McCullough, James R.; Irwin, Barry J.; Bowles, Robert M.
1985-01-01
Comparisons are made of the precision of arc-length routines as computer precision is reduced. Overland propagation delays are discussed and illustrated with observations from offshore New England. Present practice of LORAN-C error budget modeling is then reviewed with the suggestion that additional terms be considered in future modeling. Finally, some detailed numeric examples are provided to help with new computer program checkout.
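The effect of reduced computer precision on an arc-length routine, as studied above, can be illustrated by evaluating the same great-circle (haversine) distance in 64-bit and 32-bit floating point. The coordinates below are hypothetical points offshore New England, chosen only for illustration.

```python
import numpy as np

def haversine(lat1, lon1, lat2, lon2, dtype=np.float64):
    """Great-circle arc length (km) via the haversine formula,
    evaluated at a chosen floating-point precision."""
    r = dtype(6371.0)  # mean Earth radius, km
    lat1, lon1, lat2, lon2 = (dtype(np.radians(v))
                              for v in (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (np.sin(dlat / dtype(2))**2
         + np.cos(lat1) * np.cos(lat2) * np.sin(dlon / dtype(2))**2)
    return dtype(2) * r * np.arcsin(np.sqrt(a))

# Short baseline (about 1.4 km) at hypothetical coordinates:
d64 = haversine(41.0, -70.0, 41.01, -70.01, np.float64)
d32 = haversine(41.0, -70.0, 41.01, -70.01, np.float32)
print(d64, abs(d64 - float(d32)))
```

For short baselines the subtraction of nearly equal latitudes amplifies single-precision rounding error, which is exactly the kind of degradation a navigation program checkout needs to quantify.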
Research Ethics Considerations Regarding the Cancer Moonshot Initiative.
Hammer, Marilyn J
2016-07-01
If the Precision Medicine Initiative was the launching pad, the Cancer Moonshot Initiative is the liftoff. A billion-dollar mission to "eliminate cancer as we know it," the Cancer Moonshot Initiative underscores the Precision Medicine Initiative's near-term focus in oncology research and translation. Spearheaded by Vice President Biden, the initiative aims to condense a decade of research into actionable results within five years.
McAlinden, Colm; Khadka, Jyoti; Pesudovs, Konrad
2011-07-01
The ever-expanding choice of ocular metrology and imaging equipment has driven research into the validity of their measurements. Consequently, studies of the agreement between two instruments or clinical tests have proliferated in the ophthalmic literature. It is important that researchers apply the appropriate statistical tests in agreement studies. Correlation coefficients are hazardous and should be avoided. The 'limits of agreement' method originally proposed by Altman and Bland in 1983 is the statistical procedure of choice. Its step-by-step use and practical considerations in relation to optometry and ophthalmology are detailed in addition to sample size considerations and statistical approaches to precision (repeatability or reproducibility) estimates. Ophthalmic & Physiological Optics © 2011 The College of Optometrists.
Dickie, Ben R; Banerji, Anita; Kershaw, Lucy E; McPartlin, Andrew; Choudhury, Ananya; West, Catharine M; Rose, Chris J
2016-10-01
To improve the accuracy and precision of tracer kinetic model parameter estimates for use in dynamic contrast enhanced (DCE) MRI studies of solid tumors. Quantitative DCE-MRI requires an estimate of precontrast T1, which is obtained prior to fitting a tracer kinetic model. As T1 mapping and tracer kinetic signal models are both a function of precontrast T1, it was hypothesized that its joint estimation would improve the accuracy and precision of both precontrast T1 and tracer kinetic model parameters. Accuracy and/or precision of two-compartment exchange model (2CXM) parameters were evaluated for standard and joint fitting methods in well-controlled synthetic data and for 36 bladder cancer patients. Methods were compared under a number of experimental conditions. In synthetic data, joint estimation led to statistically significant improvements in the accuracy of estimated parameters in 30 of 42 conditions (improvements between 1.8% and 49%). Reduced accuracy was observed in 7 of the remaining 12 conditions. Significant improvements in precision were observed in 35 of 42 conditions (between 4.7% and 50%). In clinical data, significant improvements in precision were observed in 18 of 21 conditions (between 4.6% and 38%). Accuracy and precision of DCE-MRI parameter estimates are improved when signal models are fit jointly rather than sequentially. Magn Reson Med 76:1270-1281, 2016. © 2015 Wiley Periodicals, Inc.
Hayes, Mark A.; Ozenberger, Katharine; Cryan, Paul M.; Wunder, Michael B.
2015-01-01
Bat specimens held in natural history museum collections can provide insights into the distribution of species. However, there are several important sources of spatial error associated with natural history specimens that may influence the analysis and mapping of bat species distributions. We analyzed the importance of geographic referencing and error correction in species distribution modeling (SDM) using occurrence records of hoary bats (Lasiurus cinereus). This species is known to migrate long distances and is a species of increasing concern due to fatalities documented at wind energy facilities in North America. We used 3,215 museum occurrence records collected from 1950–2000 for hoary bats in North America. We compared SDM performance using five approaches: generalized linear models, multivariate adaptive regression splines, boosted regression trees, random forest, and maximum entropy models. We evaluated results using three SDM performance metrics (AUC, sensitivity, and specificity) and two data sets: one comprised of the original occurrence data, and a second data set consisting of these same records after the locations were adjusted to correct for identifiable spatial errors. Error correction reduced the mean estimated spatial error associated with hoary bat records from 5.11 km to 1.58 km, and this reduction resulted in a slight increase in all three SDM performance metrics. These results provide insights into the importance of geographic referencing and the value of correcting spatial errors in modeling the distribution of a wide-ranging bat species. We conclude that the considerable time and effort invested in carefully increasing the precision of the occurrence locations in this data set was not worth the marginal gains in improved SDM performance, and it seems likely that gains would be similar for other bat species that range across large areas of the continent, migrate, and are habitat generalists.
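The three SDM performance metrics named above (AUC, sensitivity, specificity) have simple definitions that can be computed directly. The presence/absence labels and suitability scores below are made up for illustration; real SDM evaluation would use held-out occurrence data.

```python
# Hedged sketch of the three SDM performance metrics on toy data.

def sens_spec(y_true, y_pred):
    """Sensitivity (true-positive rate) and specificity (true-negative rate)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

def auc(y_true, scores):
    """Probability a random presence outscores a random absence
    (rank-sum formulation of the area under the ROC curve)."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

y = [1, 1, 1, 0, 0, 0]                         # presence / absence
scores = [0.9, 0.8, 0.4, 0.6, 0.3, 0.1]        # model suitability scores
pred = [1 if s >= 0.5 else 0 for s in scores]  # 0.5 threshold
print(sens_spec(y, pred), auc(y, scores))
```

Note that AUC is threshold-free while sensitivity and specificity depend on the chosen cutoff, which is why studies typically report all three.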
Bias in Diet Determination: Incorporating Traditional Methods in Bayesian Mixing Models
Franco-Trecu, Valentina; Drago, Massimiliano; Riet-Sapriza, Federico G.; Parnell, Andrew; Frau, Rosina; Inchausti, Pablo
2013-01-01
There are no “universal methods” to determine the diet composition of predators. Most traditional methods are biased because of their reliance on differential digestibility and the recovery of hard items. By relying on assimilated food, stable isotope and Bayesian mixing models (SIMMs) resolve many biases of traditional methods. SIMMs can incorporate prior information (i.e., proportional diet composition) that may improve the precision of the estimated dietary composition. However, few studies have assessed the performance of traditional methods and SIMMs with and without informative priors in studying predators’ diets. Here we compare the diet compositions of the South American fur seal and sea lion obtained by scat analysis and by SIMMs-UP (uninformative priors), and assess whether informative priors (SIMMs-IP) from the scat analysis improved the estimated diet composition compared to SIMMs-UP. According to the SIMM-UP, pelagic species dominated the fur seal’s diet, while the sea lion’s diet showed no clear dominance of any prey. In contrast, SIMM-IP diet compositions were dominated by the same prey as in the scat analyses. When prior information influenced SIMM estimates, incorporating informative priors improved the precision of the estimated diet composition at the risk of inducing biases in the estimates. If prey isotopic data allow discrimination of prey contributions to diets, informative priors should lead to more precise yet unbiased estimates of diet composition. Just as estimates of diet composition obtained from traditional methods are interpreted critically because of their biases, care must be exercised when interpreting diet compositions obtained by SIMMs-IP.
The best approach to obtaining a near-complete view of a predator’s diet composition should involve the simultaneous consideration of different sources of partial evidence (traditional methods, SIMM-UP and SIMM-IP) in the light of the natural history of the predator species, so as to reliably ascertain and weight the information yielded by each method. PMID:24224031
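A toy, one-isotope, two-prey analogue of the SIMM-UP versus SIMM-IP comparison above can be sketched numerically: the posterior over the diet proportion is computed on a grid, once with a flat prior and once with a scat-informed prior. All isotope values, the noise level, and the Beta-shaped prior are invented for illustration; real SIMMs use multiple isotopes, many prey, and MCMC sampling.

```python
import numpy as np

# Toy mixing model: predator value = p*prey1 + (1-p)*prey2 + noise.
prey1, prey2, sd = -18.0, -12.0, 1.0     # prey isotope means, mixing noise
obs = np.array([-16.5, -16.0, -17.1])    # hypothetical predator tissue values

p = np.linspace(0.0, 1.0, 1001)          # grid over diet proportion of prey1
mix = p[:, None] * prey1 + (1 - p[:, None]) * prey2
loglik = -0.5 * ((obs[None, :] - mix) / sd) ** 2
lik = np.exp(loglik.sum(axis=1))

def posterior(prior):
    w = lik * prior
    return w / w.sum()

flat = posterior(np.ones_like(p))        # SIMM-UP analogue
scat = posterior(p**20 * (1 - p)**8)     # SIMM-IP analogue: Beta(21,9) shape

def sd_of(post):
    """Posterior standard deviation of p on the grid."""
    m = (p * post).sum()
    return float(np.sqrt(((p - m) ** 2 * post).sum()))

print(sd_of(flat), sd_of(scat))
```

The informative prior tightens the posterior (smaller standard deviation), which is the precision gain the abstract describes, while a badly centered prior would shift the posterior mean, which is the bias risk.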
State-of-the-art and emerging technologies for atrial fibrillation ablation.
Dewire, Jane; Calkins, Hugh
2010-03-01
Catheter ablation is an important treatment modality for patients with atrial fibrillation (AF). Although the superiority of catheter ablation over antiarrhythmic drug therapy has been demonstrated in middle-aged patients with paroxysmal AF, the role of the procedure in other patient subgroups, particularly those with long-standing persistent AF, has not been well defined. Furthermore, although AF ablation can be performed with reasonable efficacy and safety by experienced operators, long-term success rates for single procedures are suboptimal. Fortunately, extensive ongoing research will improve our understanding of the mechanisms of AF, and considerable funds are being invested in developing new ablation technologies to improve patient outcomes. These technologies include ablation catheters designed to electrically isolate the pulmonary veins with improved safety, efficacy, and speed; catheters designed to deliver radiofrequency energy with improved precision; robotic systems to address the technological demands of the procedure; improved imaging and electrical mapping systems; and MRI-guided ablation strategies. The tools, technologies, and techniques that will ultimately stand the test of time and become the standard approach to AF ablation remain unclear. However, technological advances are sure to yield the necessary improvements in the safety and efficacy of AF ablation procedures.
NASA Astrophysics Data System (ADS)
Guo, Liyan; Xia, Changliang; Wang, Huimin; Wang, Zhiqiang; Shi, Tingna
2018-05-01
As is well known, the armature current leads the back electromotive force (back-EMF) under load conditions in an interior permanent magnet (PM) machine. This advanced armature current produces a demagnetizing field, which can easily cause irreversible demagnetization of the PMs. To estimate the working points of the PMs more accurately and take demagnetization into consideration in the early design stage of a machine, an improved equivalent magnetic network model is established in this paper. Each PM under each magnetic pole is segmented, and the networks in the rotor pole shoe are refined, which enables a more precise model of the flux path in the rotor pole shoe. The working point of each PM under each magnetic pole can be calculated accurately by the established improved equivalent magnetic network model. The calculated results are compared with those obtained by FEM, and the effects of the d-axis and q-axis components of the armature current, the air-gap length, and the flux barrier size on the working points of the PMs are analyzed with the improved equivalent magnetic network model.
Rubinsten, Orly
2015-01-01
In recent years, cognitive neuroscience research has identified several biological and cognitive features of number processing deficits that may now make it possible to diagnose mental or educational impairments in arithmetic earlier and more precisely than is possible using traditional assessment tools. We provide two sets of recommendations for improving cognitive assessment tools, using the important case of mathematics as an example. (1) Neurocognitive tests would benefit substantially from incorporating assessments (based on findings from cognitive neuroscience) that entail systematic manipulation of fundamental aspects of number processing. Tests that focus on evaluating networks of core neurocognitive deficits have considerable potential to lead to more precise diagnosis and to provide the basis for designing specific intervention programs tailored to the deficits exhibited by the individual child. (2) Implicit knowledge, derived from inspection of variables that are irrelevant to the task at hand, can also provide a useful assessment tool. Implicit knowledge is powerful and plays an important role in human development, especially in cases of psychiatric or neurological deficiencies (such as math learning disabilities or math anxiety).
Population density estimated from locations of individuals on a passive detector array
Efford, Murray G.; Dawson, Deanna K.; Borchers, David L.
2009-01-01
The density of a closed population of animals occupying stable home ranges may be estimated from detections of individuals on an array of detectors, using newly developed methods for spatially explicit capture–recapture. Likelihood-based methods provide estimates for data from multi-catch traps or from devices that record presence without restricting animal movement ("proximity" detectors such as camera traps and hair snags). As originally proposed, these methods require multiple sampling intervals. We show that equally precise and unbiased estimates may be obtained from a single sampling interval, using only the spatial pattern of detections. This considerably extends the range of possible applications, and we illustrate the potential by estimating density from simulated detections of bird vocalizations on a microphone array. Acoustic detection can be defined as occurring when received signal strength exceeds a threshold. We suggest detection models for binary acoustic data, and for continuous data comprising measurements of all signals above the threshold. While binary data are often sufficient for density estimation, modeling signal strength improves precision when the microphone array is small.
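The binary acoustic-detection model mentioned above, where a call is "detected" at a microphone when received signal strength exceeds a threshold, can be sketched as follows. The source level, spherical-spreading loss law, threshold, and array geometry are all hypothetical values for illustration, not parameters from the study.

```python
import math

def received_level(source_db, dist_m):
    """Received level under spherical spreading (-20*log10 r), no noise term."""
    return source_db - 20.0 * math.log10(max(dist_m, 1.0))

def detections(mics, call_xy, source_db=90.0, threshold_db=50.0):
    """Binary detection vector across a microphone array for one call."""
    out = []
    for mx, my in mics:
        d = math.hypot(call_xy[0] - mx, call_xy[1] - my)
        out.append(1 if received_level(source_db, d) >= threshold_db else 0)
    return out

# Hypothetical array (meters) and a call near the array's corner:
mics = [(0, 0), (50, 0), (0, 50), (200, 200)]
print(detections(mics, (10, 10)))
```

The spatial pattern of ones and zeros across the array is exactly the single-interval information that spatially explicit capture-recapture exploits; modeling the continuous received levels instead of the binary pattern adds the extra precision the abstract notes.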
NASA Astrophysics Data System (ADS)
Wang, Y. P.; Lu, Z. P.; Sun, D. S.; Wang, N.
2016-01-01
In order to better express the characteristics of satellite clock bias (SCB) and improve SCB prediction precision, this paper proposes a new SCB prediction model that takes into consideration the physical characteristics of the space-borne atomic clock and the cyclic and random parts of the SCB. First, the new model employs a quadratic polynomial model with periodic terms to fit and extract the trend and cyclic terms of the SCB; then, based on the characteristics of the fitting residuals, a time series ARIMA (Auto-Regressive Integrated Moving Average) model is used to model the residuals; eventually, the results from the two models are combined to obtain the final SCB prediction values. Finally, this paper uses precise SCB data from the IGS (International GNSS Service) to conduct prediction tests, and the results show that the proposed model is effective and has better prediction performance than the quadratic polynomial model, the grey model, and the ARIMA model. In addition, the new method also overcomes the insufficiency of the ARIMA model in model recognition and order determination.
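The two-stage structure described above (trend/cyclic fit, then a time-series model on the residuals) can be sketched with plain numpy. This is a minimal illustration on synthetic data: the coefficients, period, and noise level are invented, and a simple AR(1) stands in for the full ARIMA stage.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(200, dtype=float)
period = 43.0  # hypothetical cyclic term, in epochs
truth = 1e-3 * t**2 + 0.5 * t + 3.0 + 2.0 * np.sin(2 * np.pi * t / period)
scb = truth + rng.normal(0, 0.1, t.size)   # synthetic clock-bias series

# Stage 1: quadratic polynomial + periodic terms by linear least squares.
A = np.column_stack([t**2, t, np.ones_like(t),
                     np.sin(2 * np.pi * t / period),
                     np.cos(2 * np.pi * t / period)])
coef, *_ = np.linalg.lstsq(A, scb, rcond=None)
resid = scb - A @ coef

# Stage 2: AR(1) on the fitting residuals (ARIMA stand-in): r[k] ~ phi*r[k-1].
phi = float(resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1]))

# Combined one-step-ahead prediction = extrapolated trend/cycle + AR residual.
t1 = 200.0
basis = np.array([t1**2, t1, 1.0,
                  np.sin(2 * np.pi * t1 / period),
                  np.cos(2 * np.pi * t1 / period)])
pred = coef @ basis + phi * resid[-1]
err = abs(pred - (1e-3 * t1**2 + 0.5 * t1 + 3.0
                  + 2.0 * np.sin(2 * np.pi * t1 / period)))
print(float(err))
```

Combining the two stages this way avoids asking the time-series model to represent the deterministic trend, which is the motivation the abstract gives for the hybrid design.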
Tánczos, Tímea; Zádori, Dénes; Jakab, Katalin; Hnyilicza, Zsuzsanna; Klivényi, Péter; Keresztes, László; Engelhardt, József; Németh, Dezső; Vécsei, László
2014-01-01
Lightning-related injuries most often involve impairment of the functions of the central and peripheral nervous systems, usually including cognitive dysfunctions. We evaluated the cognitive deficit of a patient who had survived a lightning strike and measured the improvement after her cognitive training. This therapeutic method appears to be a powerful tool in neurorehabilitation treatment. The aim of this case study was to demonstrate the beneficial effects of cognitive training as part of the neurorehabilitation after a lightning strike. Six neuropsychological functions were examined in order to test the cognitive status of the patient before and after the 2-month cognitive training: phonological short-term memory (digit span test and word repetition test), visuo-spatial short-term memory (Corsi Block Tapping Test), working memory (backward digit span test and listening span test), executive functions (letter and semantic fluencies), language functions (non-word repetition test, Pléh-Palotás-Lörik (PPL) test and sentence repetition test) and episodic memory (Rivermead Behavioral Memory Test and Mini Mental State Examination). We also utilized these tests in age-matched healthy individuals so as to be able to characterize the domains of the observed improvements more precisely. The patient exhibited a considerable improvement in the backward digit span, semantic fluency, non-word repetition, PPL, sentence repetition and Rivermead Behavioral Memory tests. The cognitive training played an important role in the neurorehabilitation treatment of this lightning injury patient. It considerably improved her quality of life through the functional recovery.
NASA Astrophysics Data System (ADS)
Various papers on the mechanical technology of inertial devices are presented. The topics addressed include: development of a directional gyroscope for remotely piloted vehicles and similar applications; a two-degree-of-freedom gyroscope with frictionless inner and outer gimbal pickoffs; oscillogyro design, manufacture, and performance; development of miniature two-axis rate gyroscope; mechanical design aspects of the electrostatically suspended gyroscope; role of gas-lubricated bearings in current and future sensors; development of a new microporous retainer material for precision ball bearings; design study for a high-stability, large-centrifuge test bed; evaluation of a two-axis rate gyro; operating principles of a two-axis angular rate transducer; and nutation frequency analysis. Also considered are: triaxial laser gyro; mechanical design considerations for a ring laser gyro dither mechanism; environmental considerations in the design of fiberoptic gyroscopes; manufacturing aspects of some critical high-precision mechanical components of inertial devices; dynamics and control of a gyroscopic force measurement system; high precision and high performance motion systems; use of multiple acceleration references to obtain high precision centrifuge data at low cost; gyro testing and evaluation at the Communications Research Centre; review of the mechanical design and development of a high-performance accelerometer; and silicon microengineering for accelerometers.
Electrotactile EMG feedback improves the control of prosthesis grasping force
NASA Astrophysics Data System (ADS)
Schweisfurth, Meike A.; Markovic, Marko; Dosen, Strahinja; Teich, Florian; Graimann, Bernhard; Farina, Dario
2016-10-01
Objective. A drawback of active prostheses is that they detach the subject from the produced forces, thereby preventing direct mechanical feedback. This can be compensated by providing somatosensory feedback to the user through mechanical or electrical stimulation, which in turn may improve the utility and sense of embodiment, and thereby increase the acceptance rate. Approach. In this study, we compared a novel approach to closing the loop, namely EMG feedback (emgFB), to classic force feedback (forceFB), using an electrotactile interface in a realistic task setup. Eleven intact-bodied subjects and one transradial amputee performed a routine grasping task while receiving emgFB or forceFB. The two feedback types were delivered through the same electrotactile interface, using a mixed spatial/frequency coding to transmit 8 discrete levels of the feedback variable. In emgFB, the stimulation transmitted the amplitude of the processed myoelectric signal generated by the subject (prosthesis input), and in forceFB the generated grasping force (prosthesis output). The task comprised 150 trials of routine grasping at six forces, randomly presented in blocks of five trials (same force). Interquartile range and changes in the absolute error (AE) distribution (magnitude and dispersion) with respect to the target level were used to assess precision and overall performance, respectively. Main results. Relative to forceFB, emgFB significantly improved the precision of myoelectric commands (min/max of the significant levels) by 23%/36% as well as the precision of force control by 12%/32% in intact-bodied subjects. Also, the magnitude and dispersion of the AE distribution were reduced. The results were similar in the amputee, showing considerable improvements. Significance. Using emgFB, the subjects therefore decreased the uncertainty of the forward pathway.
Since there is a correspondence between the EMG and force, where the former anticipates the latter, the emgFB allowed for predictive control, as the subjects used the feedback to adjust the desired force even before the prosthesis contacted the object. In conclusion, the online emgFB was superior to the classic forceFB in realistic conditions that included electrotactile stimulation, limited feedback resolution (8 levels), cognitive processing delay, and time constraints (fast grasping).
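The 8-level feedback coding described above can be sketched as a quantizer plus a level-to-stimulus map. The 4-site x 2-frequency layout and the specific frequencies below are assumptions for illustration; the study's actual electrode layout and stimulation parameters may differ.

```python
# Sketch: a normalized EMG (or force) amplitude is quantized to one of
# 8 discrete levels, each mapped to a (stimulation site, frequency) pair.

SITES = ["pad1", "pad2", "pad3", "pad4"]  # hypothetical electrode pads
FREQS_HZ = [25, 100]                      # hypothetical low/high frequencies

def quantize(amplitude, levels=8):
    """Map a normalized amplitude in [0, 1] to a level 0..levels-1."""
    a = min(max(amplitude, 0.0), 1.0)
    return min(int(a * levels), levels - 1)

def encode(level):
    """Mixed spatial/frequency code: level -> (site, stimulation frequency)."""
    return SITES[level // len(FREQS_HZ)], FREQS_HZ[level % len(FREQS_HZ)]

for amp in (0.05, 0.4, 0.99):
    lvl = quantize(amp)
    print(amp, lvl, encode(lvl))
```

In emgFB the quantized amplitude would be the processed myoelectric signal (prosthesis input); in forceFB it would be the measured grasp force (prosthesis output), with the same 8-level interface in both cases.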
Duchêne, David; Duchêne, Sebastian; Ho, Simon Y W
2015-07-01
Phylogenetic estimation of evolutionary timescales has become routine in biology, forming the basis of a wide range of evolutionary and ecological studies. However, there are various sources of bias that can affect these estimates. We investigated whether tree imbalance, a property that is commonly observed in phylogenetic trees, can lead to reduced accuracy or precision of phylogenetic timescale estimates. We analysed simulated data sets with calibrations at internal nodes and at the tips, taking into consideration different calibration schemes and levels of tree imbalance. We also investigated the effect of tree imbalance on two empirical data sets: mitogenomes from primates and serial samples of the African swine fever virus. In analyses calibrated using dated, heterochronous tips, we found that tree imbalance had a detrimental impact on precision and produced a bias in which the overall timescale was underestimated. A pronounced effect was observed in analyses with shallow calibrations. The greatest decreases in accuracy usually occurred in the age estimates for medium and deep nodes of the tree. In contrast, analyses calibrated at internal nodes did not display a reduction in estimation accuracy or precision due to tree imbalance. Our results suggest that molecular-clock analyses can be improved by increasing taxon sampling, with the specific aims of including deeper calibrations, breaking up long branches and reducing tree imbalance. © 2014 John Wiley & Sons Ltd.
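The tree imbalance examined above is commonly quantified with the Colless index: the sum over internal nodes of the absolute difference between the tip counts of the two daughter clades. A minimal sketch, using nested tuples as a stand-in tree representation (the study itself does not specify this encoding):

```python
# Colless imbalance for a strictly bifurcating tree given as nested tuples;
# a leaf is any string label.

def colless(tree):
    """Return (number of tips, Colless imbalance) for a binary tree."""
    if isinstance(tree, str):  # leaf
        return 1, 0
    (nl, il), (nr, ir) = colless(tree[0]), colless(tree[1])
    return nl + nr, il + ir + abs(nl - nr)

balanced = (("a", "b"), ("c", "d"))
caterpillar = ((("a", "b"), "c"), "d")  # fully imbalanced 4-tip tree
print(colless(balanced)[1], colless(caterpillar)[1])
```

For four tips the index ranges from 0 (fully balanced) to 3 (caterpillar), giving a concrete scale for the "levels of tree imbalance" varied in the simulations.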
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu Wei; Low, Daniel A.; Parikh, Parag J.
2005-07-15
An important consideration in four-dimensional CT scanning is the selection of a breathing metric for sorting the CT data and modeling internal motion. This study compared two noninvasive breathing metrics, spirometry and abdominal height, against internal air content, used as a surrogate for internal motion. Both metrics were shown to be accurate, but the spirometry showed a stronger and more reproducible relationship than the abdominal height in the lung. The abdominal height was known to be affected by sensor placement and patient positioning while the spirometer exhibited signal drift. By combining these two, a normalization of the drift-free metric to tidal volume may be generated and the overall metric precision may be improved.
NASA Technical Reports Server (NTRS)
Crisp, David
2008-01-01
The Orbiting Carbon Observatory (OCO) and the Greenhouse Gases Observing Satellite (GOSAT) are the first two satellites designed to make global measurements of atmospheric carbon dioxide (CO2) with the precision and sampling needed to identify and monitor surface sources and sinks of this important greenhouse gas. Because the operational phases of the OCO and GOSAT missions overlap in time, there are numerous opportunities for comparing and combining the data from these two satellites to improve our understanding of the natural processes and human activities that control atmospheric CO2 and its variability over time. Opportunities for cross-calibration, cross-validation, and coordinated observations that are currently under consideration are summarized here.
ESA's satellite communications programme
NASA Astrophysics Data System (ADS)
Bartholome, P.
1985-02-01
The developmental history, current status, and future plans of the ESA satellite-communications programs are discussed in a general survey and illustrated with network diagrams and maps. Consideration is given to the parallel development of national and European direct-broadcast systems and telecommunications networks, the position of the European space and electronics industries in the growing world market, the impact of technological improvements (both in satellite systems and in ground-based networks), and the technological and commercial advantages of integrated space-terrestrial networks. The needs for a European definition of the precise national and international roles of satellite communications, for maximum speed in implementing such decisions (before the technology becomes obsolete), and for increased cooperation and standardization to assure European equipment manufacturers a reasonable share of the market are stressed.
NASA Astrophysics Data System (ADS)
Gehrmann-De Ridder, A.; Gehrmann, T.; Glover, E. W. N.; Huss, A.; Walker, D. M.
2018-03-01
The transverse momentum spectra of weak gauge bosons and their ratios probe the underlying dynamics and are crucial in testing our understanding of the standard model. They are an essential ingredient in precision measurements, such as the W boson mass extraction. To fully exploit the potential of the LHC data, we compute the second-order [next-to-next-to-leading-order (NNLO)] QCD corrections to the inclusive pT(W) spectrum as well as to the ratios of spectra for W-/W+ and Z/W. We find that the inclusion of NNLO QCD corrections considerably improves the theoretical description of the experimental CMS data and results in a substantial reduction of the residual scale uncertainties.
NASA Astrophysics Data System (ADS)
Kirillin, M. Yu; Priezzhev, A. V.; Hast, J.; Myllylä, Risto
2006-02-01
Signals of an optical coherence tomograph from paper samples are calculated by the Monte Carlo method before and after the application of different immersion liquids such as ethanol, glycerol, benzyl alcohol, and 1-pentanol. It is shown within the framework of the model used that all these liquids reduce the contrast of inhomogeneity images in the upper layers of the samples while considerably improving the visibility of the lower layers, allowing localisation of the rear boundary of the probed medium, which is important for precise contactless measurement of paper sheet thickness, for example during the manufacturing process. The calculated results are in good agreement with experimental data.
Genome editing for crop improvement: Challenges and opportunities
Abdallah, Naglaa A; Prakash, Channapatna S; McHughen, Alan G
2015-01-01
ABSTRACT Genome or gene editing comprises several new techniques that help scientists precisely modify genome sequences. These techniques also enable us to alter the regulation of gene expression patterns in a pre-determined region and facilitate novel insights into the functional genomics of an organism. The emergence of genome editing has brought considerable excitement, especially among agricultural scientists, because of its simplicity, precision and power, as it offers new opportunities to develop improved crop varieties with clear-cut addition of valuable traits or removal of undesirable traits. Research is underway to improve crop varieties with higher yields, stronger stress tolerance, disease and pest resistance, decreased input costs, and increased nutritional value. Genome editing encompasses a wide variety of tools using either a site-specific recombinase (SSR) or a site-specific nuclease (SSN) system. Both systems require recognition of a known sequence. The SSN system generates single- or double-strand DNA breaks and activates endogenous DNA repair pathways. SSR technology, such as the Cre/loxP and Flp/FRT mediated systems, can knock down or knock in genes in the genome of eukaryotes, depending on the orientation of the specific sites (loxP, FRT, etc.) flanking the target site. There are four main classes of SSN developed to cleave genomic sequences: mega-nucleases (homing endonucleases), zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and the CRISPR/Cas nuclease system (clustered regularly interspaced short palindromic repeat/CRISPR-associated protein). Recombinase-mediated genome engineering depends on the recombinase (sub-)family and target site and induces high frequencies of homologous recombination.
Improving crops with gene editing provides a range of options: altering only a few nucleotides among the billions found in the genomes of living cells, replacing a full allele, or inserting a new gene in a targeted region of the genome. Gene editing is more precise than either conventional crop breeding or standard genetic engineering methods, making it a powerful tool for securing the world's food supply. In addition to improving the nutritional value of crops, it is an effective way to produce crops that can resist pests and thrive in harsh climates. Three types of modifications are produced by genome editing: Type I alters a few nucleotides, Type II replaces an allele with a pre-existing one, and Type III inserts new gene(s) in predetermined regions of the genome. Because most genome-editing techniques leave behind traces of DNA alteration evident in only a small number of nucleotides, crops created through gene editing could avoid the stringent regulatory procedures commonly associated with GM crop development. For this reason, many scientists believe plants improved with the more precise gene-editing techniques will be more acceptable to the public than transgenic plants. Genome editing promises new crops developed more rapidly, with a very low risk of off-target effects. It can be performed in any laboratory with any crop, even those that have complex genomes and are not easily bred using conventional methods. PMID:26930114
Precise GPS ephemerides from DMA and NGS tested by time transfer
NASA Technical Reports Server (NTRS)
Lewandowski, Wlodzimierz W.; Petit, Gerard; Thomas, Claudine
1992-01-01
It was shown that the use of the Defense Mapping Agency's (DMA) precise ephemerides brings a significant improvement to the accuracy of GPS time transfer. At present a new set of precise ephemerides produced by the National Geodetic Survey (NGS) has been made available to the timing community. This study demonstrates that both types of precise ephemerides improve long-distance GPS time transfer and remove the effects of Selective Availability (SA) degradation of broadcast ephemerides. The issue of overcoming SA is also discussed in terms of the routine availability of precise ephemerides.
Genetics Home Reference: pseudohypoaldosteronism type 1
A Review of Issues Related to Data Acquisition and Analysis in EEG/MEG Studies.
Puce, Aina; Hämäläinen, Matti S
2017-05-31
Electroencephalography (EEG) and magnetoencephalography (MEG) are non-invasive electrophysiological methods, which record electric potentials and magnetic fields due to electric currents in synchronously-active neurons. With MEG being more sensitive to neural activity from tangential currents and EEG being able to detect both radial and tangential sources, the two methods are complementary. Over the years, neurophysiological studies have changed considerably: high-density recordings are becoming de rigueur; there is interest in both spontaneous and evoked activity; and sophisticated artifact detection and removal methods are available. Improved head models for source estimation have also increased the precision of the current estimates, particularly for EEG and combined EEG/MEG. Because of their complementarity, more investigators are beginning to perform simultaneous EEG/MEG studies to gain more complete information about neural activity. Given the increase in methodological complexity in EEG/MEG, it is important to gather data that are of high quality and that are as artifact free as possible. Here, we discuss some issues in data acquisition and analysis of EEG and MEG data. Practical considerations for different types of EEG and MEG studies are also discussed.
Gobinath, Aarthi R; Choleris, Elena; Galea, Liisa A M
2017-01-02
Sex differences exist in the vulnerability, incidence, manifestation, and treatment of numerous neurological and psychiatric diseases. Despite this observation prominent in the literature, little consideration has been given to possible sex differences in outcome in both preclinical and clinical research. This Mini-Review highlights evidence supporting why studying sex differences matter for advances in brain health as well as improving treatment for neurological and psychiatric disease. Additionally, we discuss some statistical and methodological considerations in evaluating sex differences as well as how differences in the physiology of the sexes can contribute to sex difference in disease incidence and manifestation. Furthermore, we review literature demonstrating that the reproductive experience in the female can render the female brain differentially vulnerable to disease across age. Finally, we discuss how genes interact with sex to influence disease risk and treatment and argue that sex must be considered in precision medicine. Together the evidence reviewed here supports the inclusion of males and females at all levels of neuroscience research. © 2016 Wiley Periodicals, Inc. © 2016 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Monna, F.; Loizeau, J.-L.; Thomas, B. A.; Guéguen, C.; Favarger, P.-Y.
1998-08-01
One of the factors limiting the precision of inductively coupled plasma mass spectrometry is counting statistics, which depend upon acquisition time and ion fluxes. In the present study, the precision of isotopic measurements of Pb and Sr is examined. The measurement time is shared optimally among the isotopes, using a mathematical simulation, to provide the lowest theoretical analytical error. Different mass-bias correction algorithms are also taken into account and evaluated in terms of overall precision improvement. Several experiments allow a comparison of real conditions with theory. The present method significantly improves precision regardless of the instrument used, although the benefit is greatest for instruments whose precision is already close to that predicted by counting statistics. Additionally, the procedure is flexible enough to be easily adapted to other problems, such as isotope dilution.
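The optimal time-sharing idea can be sketched directly from Poisson counting statistics: the relative variance of an isotope ratio is a sum of 1/(count rate × dwell time) terms, and minimizing it under a fixed total acquisition time gives dwell times proportional to the inverse square root of each count rate. A minimal sketch (the count rates and total time below are hypothetical, not values from the study):

```python
import numpy as np

def optimal_dwell_times(rates, total_time):
    """Share acquisition time among isotopes to minimize the relative
    variance of their ratio under Poisson counting statistics.

    Minimizing sum_i 1/(rate_i * t_i) subject to sum_i t_i = T
    (Lagrange multipliers) gives t_i proportional to 1/sqrt(rate_i)."""
    w = 1.0 / np.sqrt(np.asarray(rates, dtype=float))
    return total_time * w / w.sum()

def ratio_rel_sd(rates, times):
    # Relative standard deviation of the isotope ratio for given dwell times.
    r = np.asarray(rates, dtype=float)
    t = np.asarray(times, dtype=float)
    return float(np.sqrt(np.sum(1.0 / (r * t))))

rates = [5e4, 2e3]   # hypothetical count rates (counts/s) for two isotopes
T = 60.0             # hypothetical total acquisition time (s)
t_opt = optimal_dwell_times(rates, T)
t_eq = np.full(2, T / 2)
print(t_opt, ratio_rel_sd(rates, t_opt), ratio_rel_sd(rates, t_eq))
```

Under this model the slower-counting isotope receives the larger share of the acquisition time, which is where the gain over equal time-sharing comes from.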
Tigges, Jan; Lakes, Tobia
2017-10-04
Urban forests reduce greenhouse gas emissions by storing and sequestering considerable amounts of carbon. However, few studies have considered the local scale of urban forests to effectively evaluate their potential long-term carbon offset. The lack of precise, consistent and up-to-date forest details is challenging for long-term prognoses. Therefore, this review aims to identify uncertainties in urban forest carbon offset assessment and discuss the extent to which such uncertainties can be reduced by recent progress in high resolution remote sensing. We do this by performing an extensive literature review and a case study combining remote sensing and life cycle assessment of urban forest carbon offset in Berlin, Germany. Recent progress in high resolution remote sensing and methods is adequate for delivering more precise details on the urban tree canopy, individual tree metrics, species, and age structures compared to conventional land use/cover class approaches. These area-wide consistent details can update life cycle inventories for more precise future prognoses. Additional improvements in classification accuracy can be achieved by a higher number of features derived from remote sensing data of increasing resolution, but first studies on this subject indicated that a smart selection of features already provides sufficient data that avoids redundancies and enables more efficient data processing. Our case study from Berlin could use remotely sensed individual tree species as consistent inventory of a life cycle assessment. However, a lack of growth, mortality and planting data forced us to make assumptions, therefore creating uncertainty in the long-term prognoses. Regarding temporal changes and reliable long-term estimates, more attention is required to detect changes of gradual growth, pruning and abrupt changes in tree planting and mortality. 
As such, precise long-term urban ecological monitoring using high resolution remote sensing should be intensified, especially due to increasing climate change effects. This is important for calibrating and validating recent prognoses of urban forest carbon offset, which have so far scarcely addressed longer timeframes. Additionally, higher resolution remote sensing of urban forest carbon estimates can improve upscaling approaches, which should be extended to reach a more precise global estimate for the first time. Urban forest carbon offset can be made more relevant by making more standardized assessments available for science and professional practitioners, and the increasing availability of high resolution remote sensing data and the progress in data processing allows for precisely that.
Scent Lure Effect on Camera-Trap Based Leopard Density Estimates
Braczkowski, Alexander Richard; Balme, Guy Andrew; Dickman, Amy; Fattebert, Julien; Johnson, Paul; Dickerson, Tristan; Macdonald, David Whyte; Hunter, Luke
2016-01-01
Density estimates for large carnivores derived from camera surveys often have wide confidence intervals due to low detection rates. Such estimates are of limited value to authorities, which require precise population estimates to inform conservation strategies. Using lures can potentially increase detection, improving the precision of estimates. However, by altering the spatio-temporal patterning of individuals across the camera array, lures may violate closure, a fundamental assumption of capture-recapture. Here, we test the effect of scent lures on the precision and veracity of density estimates derived from camera-trap surveys of a protected African leopard population. We undertook two surveys (a ‘control’ and a ‘treatment’ survey) on Phinda Game Reserve, South Africa. Survey design remained consistent except that a scent lure was applied at camera-trap stations during the treatment survey. Lures did not affect the maximum movement distances (p = 0.96) or temporal activity of female (p = 0.12) or male leopards (p = 0.79), and the assumption of geographic closure was met for both surveys (p > 0.05). The numbers of photographic captures were also similar for control and treatment surveys (p = 0.90). Accordingly, density estimates were comparable between surveys, although estimates derived using non-spatial methods (7.28–9.28 leopards/100 km2) were considerably higher than estimates from spatially explicit methods (3.40–3.65 leopards/100 km2). The precision of estimates from the control and treatment surveys was also comparable, and this applied to both non-spatial and spatial methods of estimation. Our findings suggest that, at least in the context of leopard research in productive habitats, the use of lures is not warranted. PMID:27050816
A field ornithologist’s guide to genomics: Practical considerations for ecology and conservation
Oyler-McCance, Sara J.; Oh, Kevin; Langin, Kathryn; Aldridge, Cameron L.
2016-01-01
Vast improvements in sequencing technology have made it practical to simultaneously sequence millions of nucleotides distributed across the genome, opening the door for genomic studies in virtually any species. Ornithological research stands to benefit in three substantial ways. First, genomic methods enhance our ability to parse and simultaneously analyze both neutral and non-neutral genomic regions, thus providing insight into adaptive evolution and divergence. Second, the sheer quantity of sequence data generated by current sequencing platforms allows increased precision and resolution in analyses. Third, high-throughput sequencing can benefit applications that focus on a small number of loci that are otherwise prohibitively expensive, time-consuming, and technically difficult using traditional sequencing methods. These advances have improved our ability to understand evolutionary processes like speciation and local adaptation, but they also offer many practical applications in the fields of population ecology, migration tracking, conservation planning, diet analyses, and disease ecology. This review provides a guide for field ornithologists interested in incorporating genomic approaches into their research program, with an emphasis on techniques related to ecology and conservation. We present a general overview of contemporary genomic approaches and methods, as well as important considerations when selecting a genomic technique. We also discuss research questions that are likely to benefit from utilizing high-throughput sequencing instruments, highlighting select examples from recent avian studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Majhi, S.K., E-mail: tpskm@iacs.res.in; Mukhopadhyay, A., E-mail: aditi_mukhopadhyay@baylor.edu; Ward, B.F.L., E-mail: bfl_ward@baylor.edu
2014-11-15
We present a phenomenological study of the current status of the application of our approach of exact amplitude-based resummation in quantum field theory to precision QCD calculations, by realistic MC event generator methods, as needed for precision LHC physics. We discuss recent results as they relate to the interplay of the attendant IR-improved DGLAP-CS theory of one of us and the precision of exact NLO matrix-element matched parton shower MCs in the Herwig6.5 environment, as determined by comparison to recent LHC experimental observations on single heavy gauge boson production and decay. The level of agreement between the new theory and the data continues to be a reason for optimism. In the spirit of completeness, we discuss as well other approaches to the same theoretical predictions that we make here from the standpoint of physical precision, with an eye toward the (sub-)1% QCD⊗EW total theoretical precision regime for LHC physics. - Highlights: • Using LHC data, we show that IR-improved DGLAP-CS kernels with exact NLO Shower/ME matching improve MC precision. • We discuss other possible approaches in comparison with ours. • We propose experimental tests to discriminate between competing approaches.
Graves, Tabitha A.; Royle, J. Andrew; Kendall, Katherine C.; Beier, Paul; Stetz, Jeffrey B.; Macleod, Amy C.
2012-01-01
Using multiple detection methods can increase the number, kind, and distribution of individuals sampled, which may increase accuracy and precision and reduce cost of population abundance estimates. However, when variables influencing abundance are of interest, if individuals detected via different methods are influenced by the landscape differently, separate analysis of multiple detection methods may be more appropriate. We evaluated the effects of combining two detection methods on the identification of variables important to local abundance using detections of grizzly bears with hair traps (systematic) and bear rubs (opportunistic). We used hierarchical abundance models (N-mixture models) with separate model components for each detection method. If both methods sample the same population, the use of either data set alone should (1) lead to the selection of the same variables as important and (2) provide similar estimates of relative local abundance. We hypothesized that the inclusion of 2 detection methods versus either method alone should (3) yield more support for variables identified in single method analyses (i.e. fewer variables and models with greater weight), and (4) improve precision of covariate estimates for variables selected in both separate and combined analyses because sample size is larger. As expected, joint analysis of both methods increased precision as well as certainty in variable and model selection. However, the single-method analyses identified different variables and the resulting predicted abundances had different spatial distributions. We recommend comparing single-method and jointly modeled results to identify the presence of individual heterogeneity between detection methods in N-mixture models, along with consideration of detection probabilities, correlations among variables, and tolerance to risk of failing to identify variables important to a subset of the population. 
The benefits of increased precision should be weighed against those risks. The analysis framework presented here will be useful for other species exhibiting heterogeneity by detection method.
Software designs of image processing tasks with incremental refinement of computation.
Anastasia, Davide; Andreopoulos, Yiannis
2010-08-01
Software realizations of computationally-demanding image processing tasks (e.g., image transforms and convolution) do not currently provide graceful degradation when their clock-cycle budgets are reduced, e.g., when delay deadlines are imposed in a multitasking environment to meet throughput requirements. This is an important obstacle in the quest for full utilization of modern programmable platforms' capabilities, since worst-case considerations must be in place for reasonable quality of results. In this paper, we propose (and make available online) platform-independent software designs performing bitplane-based computation combined with an incremental packing framework in order to realize block transforms, 2-D convolution and frame-by-frame block matching. The proposed framework realizes incremental computation: progressive processing of input-source increments improves the output quality monotonically. Comparisons with the equivalent non-incremental software realization of each algorithm reveal that, for the same precision of the result, the proposed approach can lead to comparable or faster execution, while it can be arbitrarily terminated and provide the result up to the computed precision. Application examples with region-of-interest based incremental computation, task scheduling per frame, and energy-distortion scalability verify that our proposal provides significant performance scalability with graceful degradation.
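Incremental refinement of a linear image transform follows from linearity: processing the input's bitplanes most-significant-first and adding each plane's transformed contribution makes the output converge monotonically to the exact result, and the computation can stop at any plane. A minimal NumPy sketch (the 8×8 block and orthonormal transform are stand-ins, not the paper's released software):

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(8, 8)).astype(np.int64)   # 8-bit image block
A = np.linalg.qr(rng.standard_normal((8, 8)))[0]           # stand-in orthonormal block transform

exact = A @ img @ A.T            # full-precision result, for reference
partial = np.zeros_like(exact)
errors = []
for b in range(7, -1, -1):                   # bitplanes, most significant first
    plane = ((img >> b) & 1) * (1 << b)      # contribution of bitplane b
    partial += A @ plane @ A.T               # linearity: add its transform
    errors.append(np.linalg.norm(exact - partial))

# Each increment tightens the approximation; after all 8 planes the result is
# exact up to floating-point rounding, yet the loop may be stopped at any plane.
print(errors)
```

Because the transform is orthonormal, the output error equals the norm of the not-yet-processed low-order bitplanes, which shrinks with every increment.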
[Research on fast implementation method of image Gaussian RBF interpolation based on CUDA].
Chen, Hao; Yu, Haizhong
2014-04-01
Image interpolation is often required during medical image processing and analysis. Although interpolation based on the Gaussian radial basis function (GRBF) has high precision, its long calculation time limits its application in image interpolation. To overcome this problem, a method of two-dimensional and three-dimensional medical image GRBF interpolation based on the compute unified device architecture (CUDA) is proposed in this paper. Following the single instruction multiple threads (SIMT) execution model of CUDA, various optimization measures such as coalesced access and shared memory are adopted in this study. To eliminate the edge distortion of image interpolation, a natural suture algorithm is utilized in overlapping regions while adopting a data-space strategy of separating 2D images into blocks or dividing 3D images into sub-volumes. While keeping a high interpolation precision, the 2D and 3D medical image GRBF interpolation achieved great acceleration in each basic computing step. The experiments showed that the efficiency of image GRBF interpolation on the CUDA platform was markedly improved compared with CPU calculation. The present method is of considerable reference value in the application field of image interpolation.
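The underlying interpolation scheme, independent of the CUDA acceleration, solves a linear system so that the RBF expansion passes exactly through the known samples and then evaluates the expansion at query points. A 1-D NumPy sketch (function names and the shape parameter are illustrative, not from the paper):

```python
import numpy as np

def grbf_interpolate(x_known, f_known, x_query, sigma=1.0):
    # Gaussian RBF interpolation (1-D sketch): solve for weights so the
    # expansion passes through the samples, then evaluate at query points.
    def phi(a, b):
        d = a[:, None] - b[None, :]
        return np.exp(-(d ** 2) / (2.0 * sigma ** 2))
    weights = np.linalg.solve(phi(x_known, x_known), f_known)
    return phi(x_query, x_known) @ weights

x = np.linspace(0.0, 1.0, 9)          # known samples
f = np.sin(2 * np.pi * x)
xq = np.linspace(0.0, 1.0, 33)        # finer grid to interpolate onto
fq = grbf_interpolate(x, f, xq, sigma=0.2)
print(float(np.max(np.abs(fq - np.sin(2 * np.pi * xq)))))
```

In 2D/3D the same dense solve applies per block or sub-volume, which is what makes the GPU data-space strategy in the paper attractive.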
Regenbogen, Sam; Wilkins, Angela D; Lichtarge, Olivier
2016-01-01
Biomedicine produces copious information it cannot fully exploit. Specifically, there is considerable need to integrate knowledge from disparate studies to discover connections across domains. Here, we used a Collaborative Filtering approach, inspired by online recommendation algorithms, in which non-negative matrix factorization (NMF) predicts interactions among chemicals, genes, and diseases only from pairwise information about their interactions. Our approach, applied to matrices derived from the Comparative Toxicogenomics Database, successfully recovered Chemical-Disease, Chemical-Gene, and Disease-Gene networks in 10-fold cross-validation experiments. Additionally, we could predict each of these interaction matrices from the other two. Integrating all three CTD interaction matrices with NMF led to good predictions of STRING, an independent, external network of protein-protein interactions. Finally, this approach could integrate the CTD and STRING interaction data to improve Chemical-Gene cross-validation performance significantly, and, in a time-stamped study, it predicted information added to CTD after a given date, using only data prior to that date. We conclude that collaborative filtering can integrate information across multiple types of biological entities, and that as a first step towards precision medicine it can compute drug repurposing hypotheses.
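The core mechanism, a low-rank non-negative factorization recovering held-out links from pairwise interaction data, can be sketched with classic multiplicative updates (a toy matrix stands in for the CTD data; this is not the authors' implementation):

```python
import numpy as np

def nmf(X, rank, iters=500, seed=0):
    # Minimal multiplicative-update NMF: X ≈ W @ H with W, H >= 0.
    rng = np.random.default_rng(seed)
    W = rng.random((X.shape[0], rank)) + 1e-3
    H = rng.random((rank, X.shape[1])) + 1e-3
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-9)
        W *= (X @ H.T) / (W @ (H @ H.T) + 1e-9)
    return W, H

# Toy chemical-gene interaction matrix; one known link is hidden and the
# low-rank reconstruction is asked to recover it.
X = np.array([[1.0, 1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0, 1.0]])
X_train = X.copy()
X_train[0, 1] = 0.0                     # hold out one interaction
W, H = nmf(X_train, rank=2)
scores = W @ H
print(scores[0, 1], scores[0, 2])       # held-out link vs. a true non-link
```

The reconstruction scores the hidden chemical-gene pair above the true zeros, which is the behavior the cross-validation experiments in the abstract measure at scale.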
Hatfield, Laura A.; Gutreuter, Steve; Boogaard, Michael A.; Carlin, Bradley P.
2011-01-01
Estimation of extreme quantal-response statistics, such as the concentration required to kill 99.9% of test subjects (LC99.9), remains a challenge in the presence of multiple covariates and complex study designs. Accurate and precise estimates of the LC99.9 for mixtures of toxicants are critical to ongoing control of a parasitic invasive species, the sea lamprey, in the Laurentian Great Lakes of North America. The toxicity of those chemicals is affected by local and temporal variations in water chemistry, which must be incorporated into the modeling. We develop multilevel empirical Bayes models for data from multiple laboratory studies. Our approach yields more accurate and precise estimation of the LC99.9 compared to alternative models considered. This study demonstrates that properly incorporating hierarchical structure in laboratory data yields better estimates of LC99.9 stream treatment values that are critical to larvae control in the field. In addition, out-of-sample prediction of the results of in situ tests reveals the presence of a latent seasonal effect not manifest in the laboratory studies, suggesting avenues for future study and illustrating the importance of dual consideration of both experimental and observational data.
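For a simple quantal-response model without covariates, the LC99.9 follows by inverting a fitted logit dose-response curve at p = 0.999. A sketch with hypothetical coefficients (the multilevel empirical Bayes machinery and water-chemistry covariates of the study are not reproduced here):

```python
import math

def lc_from_logit(b0, b1, p=0.999):
    # Quantal dose-response on log10 concentration:
    #     logit(p) = b0 + b1 * log10(c)
    # Inverting at target mortality p yields the LCp estimate.
    logit = math.log(p / (1.0 - p))
    return 10.0 ** ((logit - b0) / b1)

# Hypothetical fitted coefficients (illustration only, not from the study)
b0, b1 = -6.0, 4.0
lc999 = lc_from_logit(b0, b1)
# Round trip: predicted mortality at the estimated LC99.9 equals 0.999
p_back = 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log10(lc999))))
print(lc999, p_back)
```

The difficulty the paper addresses is not this inversion but obtaining accurate, precise (b0, b1)-like quantities when they vary across labs, streams, and water chemistry.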
Synthetic Vision Systems - Operational Considerations Simulation Experiment
NASA Technical Reports Server (NTRS)
Kramer, Lynda J.; Williams, Steven P.; Bailey, Randall E.; Glaab, Louis J.
2007-01-01
Synthetic vision is a computer-generated image of the external scene topography that is generated from aircraft attitude, high-precision navigation information, and data of the terrain, obstacles, cultural features, and other required flight information. A synthetic vision system (SVS) enhances this basic functionality with real-time integrity to ensure the validity of the databases, perform obstacle detection and independent navigation accuracy verification, and provide traffic surveillance. Over the last five years, NASA and its industry partners have developed and deployed SVS technologies for commercial, business, and general aviation aircraft which have been shown to provide significant improvements in terrain awareness and reductions in the potential for Controlled-Flight-Into-Terrain incidents/accidents compared to current generation cockpit technologies. It has been hypothesized that SVS displays can greatly improve the safety and operational flexibility of flight in Instrument Meteorological Conditions (IMC) to a level comparable to clear-day Visual Meteorological Conditions (VMC), regardless of actual weather conditions or time of day. An experiment was conducted to evaluate SVS and SVS-related technologies as well as the influence of where the information is provided to the pilot (e.g., on a Head-Up or Head-Down Display) for consideration in defining landing minima based upon aircraft and airport equipage. The "operational considerations" evaluated under this effort included reduced visibility, decision altitudes, and airport equipage requirements, such as approach lighting systems, for SVS-equipped aircraft. Subjective results from the present study suggest that synthetic vision imagery on both head-up and head-down displays may offer benefits in situation awareness; workload; and approach and landing performance in the visibility levels, approach lighting systems, and decision altitudes tested.
Comprehensive benchmarking of SNV callers for highly admixed tumor data
Bohnert, Regina; Vivas, Sonia
2017-01-01
Precision medicine attempts to individualize cancer therapy by matching tumor-specific genetic changes with effective targeted therapies. A crucial first step in this process is the reliable identification of cancer-relevant variants, which is considerably complicated by the impurity and heterogeneity of clinical tumor samples. We compared the impact of admixture of non-cancerous cells and low somatic allele frequencies on the sensitivity and precision of 19 state-of-the-art SNV callers. We studied both whole exome and targeted gene panel data and up to 13 distinct parameter configurations for each tool. We found vast differences among callers. Based on our comprehensive analyses, we recommend joint tumor-normal calling with MuTect, EBCall or Strelka for whole exome somatic variant calling, and HaplotypeCaller or FreeBayes for whole exome germline calling. For targeted gene panel data on a single tumor sample, LoFreqStar performed best. We further found that tumor impurity and admixture had a negative impact on precision and, in particular, on sensitivity in whole exome experiments. At admixture levels of 60% to 90%, sometimes seen in pathological biopsies, sensitivity dropped significantly, even when variants were originally present in the tumor at 100% allele frequency. Sensitivity to low-frequency SNVs improved with targeted panel data, but whole exome data allowed more efficient identification of germline variants. Effective somatic variant calling requires high-quality pathological samples with minimal admixture, a consciously selected sequencing strategy, and the appropriate variant calling tool with settings optimized for the chosen type of data. PMID:29020110
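The dilution effect of admixture can be sketched arithmetically: reads from non-cancerous cells carry the reference allele, so under a simple equal-ploidy model the observed variant allele fraction shrinks linearly with the admixture level (the detection floor below is an assumed, illustrative threshold, not one from the paper):

```python
def observed_vaf(tumor_allele_freq, admixture):
    # Equal-ploidy sketch: reads from admixed non-cancerous cells carry
    # the reference allele, diluting the variant allele fraction linearly.
    return tumor_allele_freq * (1.0 - admixture)

DETECTION_FLOOR = 0.05  # assumed, illustrative caller sensitivity limit
# A variant at 100% allele frequency in the tumor drops to 10% observed
# VAF at 90% admixture; a heterozygous variant (0.5) falls to the floor.
for admix in (0.0, 0.6, 0.9):
    v = observed_vaf(0.5, admix)
    print(admix, v, v > DETECTION_FLOOR)
```

This is why the abstract reports sensitivity losses at 60-90% admixture even for fully clonal variants.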
NASA Astrophysics Data System (ADS)
Lu, Hong; Gargesha, Madhusudhana; Wang, Zhao; Chamie, Daniel; Attizani, Guilherme F.; Kanaya, Tomoaki; Ray, Soumya; Costa, Marco A.; Rollins, Andrew M.; Bezerra, Hiram G.; Wilson, David L.
2013-02-01
Intravascular OCT (iOCT) is an imaging modality with the resolution and contrast to provide accurate in vivo assessments of tissue healing following stent implantation. Our Cardiovascular Imaging Core Laboratory has served >20 international stent clinical trials with >2000 stents analyzed. Each stent requires 6-16 hrs of manual analysis time, and we are developing highly automated software to reduce this extreme effort. Using classification techniques, physically meaningful image features, forward feature selection to limit overtraining, and leave-one-stent-out cross validation, we detected stent struts. To determine tissue coverage areas, we estimated stent "contours" by fitting detected struts and interpolation points from linearly interpolated tissue depths to a periodic cubic spline. Tissue coverage area was obtained by subtracting the lumen area from the stent area. Detection was compared against manual analysis of 40 pullbacks. We obtained recall = 90+/-3% and precision = 89+/-6%. When struts deemed not bright enough for manual analysis were taken into consideration, precision improved to 94+/-6%. This approached inter-observer variability (recall = 93%, precision = 96%). Differences in stent and tissue coverage areas were 0.12 +/- 0.41 mm2 and 0.09 +/- 0.42 mm2, respectively. We are developing software which will enable visualization, review, and editing of automated results, so as to provide a comprehensive stent analysis package. This should enable better and cheaper stent clinical trials, so that manufacturers can optimize the myriad of parameters (drug, coverage, bioresorbable versus metal, etc.) for stent design.
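Evaluating strut detection against manual analysis reduces to matching detections to ground-truth positions within a tolerance and counting true positives. A greedy 1-D sketch (positions, tolerance, and the matching rule are illustrative, not the paper's exact protocol):

```python
def precision_recall(detected, truth, tol=2.0):
    # Greedy 1-D matching: each detected strut position claims the first
    # unmatched ground-truth position within `tol`; leftovers are false
    # positives (detections) or false negatives (truths).
    truth = sorted(truth)
    used = [False] * len(truth)
    tp = 0
    for d in sorted(detected):
        for i, t in enumerate(truth):
            if not used[i] and abs(d - t) <= tol:
                used[i] = True
                tp += 1
                break
    precision = tp / len(detected) if detected else 0.0
    recall = tp / len(truth) if truth else 0.0
    return precision, recall

# Illustrative frame: 4 detections against 3 manually marked struts
p, r = precision_recall([10, 20, 31, 55], [10, 21, 30], tol=2.0)
print(p, r)  # 0.75 1.0
```

The same counts directly explain the reported effect of reclassifying dim struts: moving them from false positives to true positives raises precision without changing recall.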
An Ultra-Precise Method for the Nano Thin-Film Removal
NASA Astrophysics Data System (ADS)
Pa, P. S.
In this research, an electrode-set is used to investigate an ultra-precise method for removing the Indium Tin Oxide (ITO) thin-film microstructure from defective display panels, addressing the well-known low yield rate in display panel production caused by imperfect ITO layer deposition. This process, which removes the ITO layer substructure by electrochemical removal (ECMR), is of major interest to the optoelectronics semiconductor industry. In this electro-machining process, a high current flow and a high feed rate of the display (color filter) achieve complete and efficient removal of the ITO layer. The ITO thin-film can be removed completely by a proper combination of feed rate and electric power. A small gap between the cathode and anode virtual rotation circle diameters also corresponds to a higher removal rate. A small anode edge radius combined with a small cathode edge radius effectively improves dreg discharge and is an advantage at high workpiece feed rates. This precision method for recycling defective display-screen color filters is presented as an effective tool for use in the screen manufacturing process. The defective ITO thin-film can be removed easily and cleanly in a short time, making it possible to return these panels to the production line for reuse, with a considerable reduction in both waste and production cost.
NASA Technical Reports Server (NTRS)
Gerdes, R. M.
1980-01-01
A series of simulation and flight investigations were undertaken to evaluate helicopter flying qualities and the effects of control system augmentation for nap-of-the-Earth (NOE) agility and instrument flying tasks. Handling quality factors common to both tasks were identified. Precise attitude control was determined to be a key requirement for successful accomplishment of both tasks. Factors that degraded attitude controllability were improper levels of control sensitivity and damping, and rotor system cross coupling due to helicopter angular rate and collective pitch input. Application of rate command, attitude command, and control input decouple augmentation schemes enhanced attitude control and significantly improved handling qualities for both tasks. The NOE agility and instrument flying handling quality considerations, pilot rating philosophy, and supplemental flight evaluations are also discussed.
NASA Astrophysics Data System (ADS)
Anisovich, A. V.; Beck, R.; Döring, M.; Gottschall, M.; Hartmann, J.; Kashevarov, V.; Klempt, E.; Meißner, Ulf-G.; Nikonov, V.; Ostrick, M.; Rönchen, D.; Sarantsev, A.; Strakovsky, I.; Thiel, A.; Tiator, L.; Thoma, U.; Workman, R.; Wunderlich, Y.
2016-09-01
New data on pion photoproduction off the proton have been included in the partial-wave analyses Bonn-Gatchina and SAID and in the dynamical coupled-channel approach Jülich-Bonn. All reproduce the recent new data well: the double-polarization data for E, G, H, P and T in γ p→ π0p from ELSA, the beam asymmetry Σ for γ p→ π0p and π+n from Jefferson Laboratory, and the precise new differential cross-section and beam-asymmetry data Σ for γ p→ π0p from MAMI. The new fit results for the multipoles are compared with predictions that do not take the new data into account. The mutual agreement is improved considerably but is still far from perfect.
Approaches, field considerations and problems associated with radio tracking carnivores
Sargeant, A.B.; Amlaner, C. J.; MacDonald, D.W.
1979-01-01
The adaptation of radio tracking to ecological studies was a major technological advance affecting field investigations of animal movements and behavior. Carnivores have been the recipients of much attention with this new technology and study approaches have varied from simple to complex. Equipment performance has much improved over the years, but users still face many difficulties. The beginning of all radio tracking studies should be a precise definition of objectives. Study objectives dictate type of gear required and field procedures. Field conditions affect equipment performance and investigator ability to gather data. Radio tracking carnivores is demanding and generally requires greater time than anticipated. Problems should be expected and planned for in study design. Radio tracking can be an asset in carnivore studies but caution is needed in its application.
Lunar surface structural concepts and construction studies
NASA Technical Reports Server (NTRS)
Mikulas, Martin
1991-01-01
The topics are presented in viewgraph form and include the following: lunar surface structures construction research areas; lunar crane related disciplines; shortcomings of typical mobile crane in lunar base applications; candidate crane cable suspension systems; NIST six-cable suspension crane; numerical example of natural frequency; the incorporation of two new features for improved performance of the counter-balanced actively-controlled lunar crane; lunar crane pendulum mechanics; simulation results; 1/6 scale lunar crane testbed using GE robot for global manipulation; basic deployable truss approaches; bi-pantograph elevator platform; comparison of elevator platforms; perspective of bi-pantograph beam; bi-pantograph synchronously deployable tower/beam; lunar module off-loading concept; module off-loader concept packaged; starburst deployable precision reflector; 3-ring reflector deployment scheme; cross-section of packaged starburst reflector; and focal point and thickness packaging considerations.
Floating point arithmetic in future supercomputers
NASA Technical Reports Server (NTRS)
Bailey, David H.; Barton, John T.; Simon, Horst D.; Fouts, Martin J.
1989-01-01
Considerations in the floating-point design of a supercomputer are discussed. Particular attention is given to word size, hardware support for extended precision, format, and accuracy characteristics. These issues are discussed from the perspective of the Numerical Aerodynamic Simulation Systems Division at NASA Ames. The features believed to be most important for a future supercomputer floating-point design include: (1) a 64-bit IEEE floating-point format with 11 exponent bits, 52 mantissa bits, and one sign bit and (2) hardware support for reasonably fast double-precision arithmetic.
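The recommended word layout above can be inspected directly. A small sketch (not from the report) that splits a Python float into the sign, 11-bit exponent, and 52-bit mantissa fields of the 64-bit IEEE format:

```python
import struct

def decompose_double(x: float) -> tuple[int, int, int]:
    """Split a 64-bit IEEE 754 double into (sign, exponent, mantissa) fields."""
    bits = struct.unpack(">Q", struct.pack(">d", x))[0]
    sign = bits >> 63                    # 1 sign bit
    exponent = (bits >> 52) & 0x7FF      # 11 exponent bits (biased by 1023)
    mantissa = bits & ((1 << 52) - 1)    # 52 mantissa bits (implicit leading 1)
    return sign, exponent, mantissa

# -1.5 is -1.1 (binary) * 2^0: sign 1, biased exponent 1023, top mantissa bit set
print(decompose_double(-1.5))
```

The biased exponent of 1023 for values in [1, 2) reflects the 2^10 - 1 bias of the 11-bit exponent field the authors recommend.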
Improving Weather Forecasts Through Reduced Precision Data Assimilation
NASA Astrophysics Data System (ADS)
Hatfield, Samuel; Düben, Peter; Palmer, Tim
2017-04-01
We present a new approach for improving the efficiency of data assimilation, by trading numerical precision for computational speed. Future supercomputers will allow a greater choice of precision, so that models can use a level of precision that is commensurate with the model uncertainty. Previous studies have already indicated that the quality of climate and weather forecasts is not significantly degraded when using a precision less than double precision [1,2], but so far these studies have not considered data assimilation. Data assimilation is inherently uncertain due to the use of relatively long assimilation windows, noisy observations and imperfect models. Thus, the larger rounding errors incurred from reducing precision may be within the tolerance of the system. Lower precision arithmetic is cheaper, and so by reducing precision in ensemble data assimilation, we can redistribute computational resources towards, for example, a larger ensemble size. Because larger ensembles provide a better estimate of the underlying distribution and are less reliant on covariance inflation and localisation, lowering precision could actually allow us to improve the accuracy of weather forecasts. We will present results on how lowering numerical precision affects the performance of an ensemble data assimilation system, consisting of the Lorenz '96 toy atmospheric model and the ensemble square root filter. We run the system at half precision (using an emulation tool), and compare the results with simulations at single and double precision. We estimate that half precision assimilation with a larger ensemble can reduce assimilation error by 30%, with respect to double precision assimilation with a smaller ensemble, for no extra computational cost. This results in around half a day extra of skillful weather forecasts, if the error-doubling characteristics of the Lorenz '96 model are mapped to those of the real atmosphere. 
Additionally, we investigate the sensitivity of these results to observational error and assimilation window length. Half precision hardware will become available very shortly, with the introduction of Nvidia's Pascal GPU architecture and the Intel Knights Mill coprocessor. We hope that the results presented here will encourage the uptake of this hardware. References [1] Peter D. Düben and T. N. Palmer, 2014: Benchmark Tests for Numerical Weather Forecasts on Inexact Hardware, Mon. Weather Rev., 142, 3809-3829 [2] Peter D. Düben, Hugh McNamara and T. N. Palmer, 2014: The use of imprecise processing to improve accuracy in weather & climate prediction, J. Comput. Phys., 271, 2-18
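The precision-emulation idea above can be sketched minimally, assuming forward-Euler time stepping of the Lorenz '96 model and using NumPy's float16 as the half-precision emulator (the study used a dedicated emulation tool and an ensemble square root filter, neither of which is reproduced here):

```python
import numpy as np

F = 8.0  # standard Lorenz '96 forcing

def lorenz96_tendency(x):
    """dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F, with cyclic indices."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def step(x, dt, dtype):
    """One forward-Euler step with every operand rounded to `dtype`,
    emulating reduced-precision hardware."""
    x = x.astype(dtype)
    return (x + dtype(dt) * lorenz96_tendency(x).astype(dtype)).astype(dtype)

x0 = np.linspace(0.1, 1.0, 40)          # 40-variable Lorenz '96 state
x_double = step(x0, 0.01, np.float64)
x_half = step(x0, 0.01, np.float16)

# Rounding error introduced by a single half-precision step
print(np.max(np.abs(x_double - x_half.astype(np.float64))))
```

The per-step rounding error is small relative to typical assimilation increments, which is the premise for trading the saved arithmetic cost for a larger ensemble.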
Bio-inspired photo-electronic material based on photosynthetic proteins
NASA Astrophysics Data System (ADS)
Lebedev, Nikolai; Trammell, Scott A.; Tsoi, Stanislav; Spano, Anthony; Kim, Jin Ho; Xu, Jimmy; Twigg, Mark E.; Schnur, Joel M.
2009-08-01
The construction of efficient light-energy-converting (photovoltaic and photo-electronic) devices is a major current challenge in science and technology, and one that will have important economic consequences. Several innovative nanoelectronic materials have been proposed to achieve this goal; semiconductor quantum dots, metallic nanowires and carbon nanotubes (CNT) are among them. As a charge-separating unit for light energy conversion, we propose the utilization of the most advanced photoelectronic material developed by nature: photosynthetic reaction center proteins. As a first step in this direction, we constructed a novel bioinorganic nanophotoelectronic material with photoactive photosynthetic reaction center (RC) proteins encapsulated inside a multiwall CNT arrayed electrode. The material consists of photosynthetic RC-cytochrome complexes acting as charge-separating units bound to the inner walls of a CNT electrode and ubiquinone-10 (Q2) serving as a soluble electron-transfer mediator to the counter electrode. The proteins were immobilized inside carbon nanotubes by a Ni(NTA)-alkane-pyrene linker, forming a self-assembled monolayer (SAM) on the surface of the inner CNT walls and allowing for unidirectional protein orientation. The material demonstrates an enhanced photoinduced electron transfer rate and shows substantial improvement in photocurrent density compared to that obtained with the same proteins immobilized on a planar graphite (HOPG) electrode. The results suggest that protein encapsulation in a precisely organized arrayed tubular electrode architecture can considerably improve the performance of photovoltaic, photoelectronic, or biofuel cell devices, and demonstrate the potential for substantial advantages of precisely organized nanoelectrode tubular arrayed architectures for a variety of biotechnological applications.
Development of an Indicator to Monitor Mediterranean Wetlands
Sanchez, Antonio; Abdul Malak, Dania; Guelmami, Anis; Perennou, Christian
2015-01-01
Wetlands are sensitive ecosystems that are increasingly subjected to threats from anthropogenic factors. In the last decades, coastal Mediterranean wetlands have been suffering considerable pressures from land use change, intensification of urban growth, increasing tourism infrastructure and intensification of agricultural practices. Remote sensing (RS) and Geographic Information Systems (GIS) techniques are efficient tools that can support monitoring Mediterranean coastal wetlands on large scales and over long periods of time. The study aims at developing a wetland indicator to support monitoring Mediterranean coastal wetlands using these techniques. The indicator makes use of multi-temporal Landsat images, land use reference layers, a 50m numerical model of the territory (NMT) and Corine Land Cover (CLC) for the identification and mapping of wetlands. The approach combines supervised image classification techniques making use of vegetation indices and decision tree analysis to identify the surface covered by wetlands at a given date. A validation process is put in place to compare outcomes with existing local wetland inventories to check the reliability of the results. The indicator's results demonstrate an improvement in the precision of change detection over that achieved by traditional tools, providing reliability of up to 95% in the main wetland areas. The results confirm that the use of RS techniques improves the precision of wetland detection compared to the use of CLC for wetland monitoring, and stress the strong relation between the level of wetland detection and both the nature of the wetland areas and the monitoring scale considered. PMID:25826210
NASA Technical Reports Server (NTRS)
Yamada, Toshishige; Saini, Subhash (Technical Monitor)
1998-01-01
Adatom chains, precise structures artificially created on an atomically regulated surface, are the smallest possible candidates for future nanoelectronics. Since all such devices are created by combining adatom chains prepared with atomic precision, device characteristics are predictable and free from deviations due to accidental structural defects. At this atomic dimension, however, an analogy to current semiconductor devices may not hold. For example, Si structures are not always semiconducting. Adatom states do not always localize at the substrate surface when adatoms form chemical bonds to the substrate atoms. Transport properties are often determined by the entire system of chain and electrodes, not by the chains alone. These fundamental issues are discussed, which will be useful for future device considerations.
Design considerations for a space-borne ocean surface laser altimeter
NASA Technical Reports Server (NTRS)
Plotkin, H. H.
1972-01-01
Design procedures for using laser ranging systems in spacecraft to reflect ocean surface pulses vertically and measure spacecraft altitude with high precision are examined. Operating principles and performance experience of a prototype system are given.
TEACHING PHYSICS: A computer-based revitalization of Atwood's machine
NASA Astrophysics Data System (ADS)
Trumper, Ricardo; Gelbman, Moshe
2000-09-01
Atwood's machine is used in a microcomputer-based experiment to demonstrate Newton's second law with considerable precision. The friction force on the masses and the moment of inertia of the pulley can also be estimated.
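The relation the experiment demonstrates is Newton's second law applied to the ideal machine. A small sketch (not from the article) of the frictionless, inertia-free case; the article's point is precisely that friction and pulley inertia can then be estimated as deviations from this ideal:

```python
def atwood_ideal(m1: float, m2: float, g: float = 9.81):
    """Ideal Atwood's machine (massless string, frictionless and inertia-free
    pulley): acceleration a = (m1 - m2) g / (m1 + m2) and string tension
    T = 2 m1 m2 g / (m1 + m2)."""
    total = m1 + m2
    a = (m1 - m2) * g / total
    tension = 2.0 * m1 * m2 * g / total
    return a, tension

# Nearly balanced masses give a small, easily timed acceleration
a, tension = atwood_ideal(0.105, 0.100)
print(a, tension)
```

In a real measurement, a systematically below this ideal value signals friction, and a mass-dependent shortfall reflects the pulley's moment of inertia.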
Aquacells — Flagellates under long-term microgravity and potential usage for life support systems
NASA Astrophysics Data System (ADS)
Häder, Donat-P.; Richter, Peter R.; Strauch, S. M.; Schuster, M.
2006-09-01
The motile behavior of the unicellular photosynthetic flagellate Euglena gracilis was studied during a two-week mission on the Russian satellite Foton M2. The precision of gravitactic orientation was high before launch and, as expected, the cells were unoriented during microgravity. While after previous short-term TEXUS flights the precision of orientation was as high as before launch, after this long-term flight it took several hours for the organisms to regain their gravitaxis. The percentage of motile cells and the swimming velocity of the remaining motile cells were also considerably lower than in the ground control. In preparatory experiments the flagellate Euglena was shown to produce considerable amounts of photosynthetically generated oxygen. In a coupling experiment in a prototype for a planned space mission on Foton M3, the photosynthetic producers were shown to supply sufficient oxygen to a fish compartment holding 35 larval cichlids, Oreochromis mossambicus.
Functionalization of carbon nanotubes: Characterization, modeling and composite applications
NASA Astrophysics Data System (ADS)
Wang, Shiren
Carbon nanotubes have demonstrated exceptional mechanical, thermal and electrical properties, and are regarded as one of the most promising reinforcement materials for the next generation of high-performance structural and multifunctional composites. However, to date, most application attempts have been hindered by several technical roadblocks, such as poor dispersion and weak interfacial bonding. In this dissertation, several innovative functionalization methods were proposed and studied to overcome these technical issues and realize the full potential of nanotubes as reinforcement. These functionalization methods included precision sectioning of nanotubes using an ultra-microtome, electron-beam irradiation, and amino and epoxide group grafting. Characterization results from atomic force microscopy, transmission electron microscopy and Raman spectroscopy suggested that aligned carbon nanotubes can be precisely sectioned with controlled length and minimal sidewall damage. This study also designed and demonstrated new covalent functionalization approaches through unique epoxy-grafting and one-step amino-grafting, which have the potential to scale up for composite applications. In addition, the dissertation successfully tailored the structure and properties of thin nanotube films through electron-beam irradiation, achieving significant improvement in both the mechanical and electrical conducting properties of the irradiated nanotube films, or buckypapers. All these methods demonstrated effectiveness in improving dispersion and interfacial bonding in the epoxy resin, resulting in considerable improvements in composite mechanical properties. Modeling of the functionalization methods provided further understanding and offered reasonable explanations of the SWNT length distribution as well as the carbon nanostructure transformation upon electron-beam irradiation.
Both experimental and modeling results provide important foundations for further comprehensive investigation of nanotube functionalization, and hence facilitate realization of the full potential of nanotube-reinforced nanocomposites.
NASA Astrophysics Data System (ADS)
Hahn, Gitte Holst; Christensen, Karl Bang; Leung, Terence S.; Greisen, Gorm
2010-05-01
Coherence between spontaneous fluctuations in arterial blood pressure (ABP) and the cerebral near-infrared spectroscopy signal can detect cerebral autoregulation. Because reliable measurement depends on signals with high signal-to-noise ratio, we hypothesized that coherence is more precisely determined when fluctuations in ABP are large rather than small. Therefore, we investigated whether adjusting for variability in ABP (variabilityABP) improves precision. We examined the impact of variabilityABP within the power spectrum in each measurement and between repeated measurements in preterm infants. We also examined total monitoring time required to discriminate among infants with a simulation study. We studied 22 preterm infants (GA<30) yielding 215 10-min measurements. Surprisingly, adjusting for variabilityABP within the power spectrum did not improve the precision. However, adjusting for the variabilityABP among repeated measurements (i.e., weighting measurements with high variabilityABP in favor of those with low) improved the precision. The evidence of drift in individual infants was weak. Minimum monitoring time needed to discriminate among infants was 1.3-3.7 h. Coherence analysis in low frequencies (0.04-0.1 Hz) had higher precision and statistically more power than in very low frequencies (0.003-0.04 Hz). In conclusion, a reliable detection of cerebral autoregulation takes hours and the precision is improved by adjusting for variabilityABP between repeated measurements.
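A toy sketch of the coherence computation on synthetic signals, assuming `scipy.signal.coherence` (Welch-style estimation) and a hypothetical 2 Hz sampling rate; this is not the study's pipeline, but it shows the low-frequency band selection described above:

```python
import numpy as np
from scipy.signal import coherence

fs = 2.0                             # Hz, hypothetical sampling rate
t = np.arange(int(fs * 600)) / fs    # one 10-min measurement

rng = np.random.default_rng(1)
slow = np.sin(2 * np.pi * 0.07 * t)                    # shared slow fluctuation
abp = slow + 0.5 * rng.standard_normal(t.size)         # arterial blood pressure
nirs = 0.8 * slow + 0.5 * rng.standard_normal(t.size)  # passively coupled NIRS

# Magnitude-squared coherence between ABP and the NIRS signal
f, coh = coherence(abp, nirs, fs=fs, nperseg=256)

lf_band = (f >= 0.04) & (f <= 0.1)   # low-frequency band from the study
print(coh[lf_band].max())
```

Here a pressure-passive (impaired-autoregulation) case is simulated, so coherence is high at the shared 0.07 Hz fluctuation; larger fluctuations in ABP raise the signal-to-noise ratio and hence the precision of this estimate.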
An Improved Method of AGM for High Precision Geolocation of SAR Images
NASA Astrophysics Data System (ADS)
Zhou, G.; He, C.; Yue, T.; Huang, W.; Huang, Y.; Li, X.; Chen, Y.
2018-05-01
In order to take full advantage of SAR images, it is necessary to obtain high-precision locations for the imagery: during geometric correction, precise image geolocation is important both to ensure the accuracy of the correction and to extract effective mapping information from the images. This paper presents an improved analytical geolocation method (IAGM) that determines the high-precision geolocation of each pixel in a digital SAR image. The method builds on the analytical geolocation method (AGM) proposed by X. K. Yuan, which solves the RD (Range-Doppler) model. Tests were conducted using a RADARSAT-2 SAR image. Comparing the predicted feature geolocations with positions determined from a high-precision orthophoto indicates that an accuracy of 50 m is attainable with this method. Error sources are analyzed and some recommendations for improving image location accuracy in future spaceborne SARs are given.
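The abstract does not spell out the details of AGM/IAGM, so the following is only a generic sketch of solving the RD model it refers to: a Newton iteration on the range equation, the Doppler equation, and a spherical-Earth model, with all geometry (orbit, target, radii) hypothetical:

```python
import numpy as np

def solve_rd(sat_pos, sat_vel, slant_range, doppler_rhs, earth_radius,
             guess, iters=25):
    """Newton iteration on the Range-Doppler equations with a spherical Earth:
         |T - S|^2   = R^2    (range equation)
         V . (T - S) = d      (Doppler equation, d derived from the centroid)
         |T|^2       = Re^2   (Earth model)
    """
    T = np.asarray(guess, dtype=float)
    for _ in range(iters):
        dS = T - sat_pos
        F = np.array([dS @ dS - slant_range**2,
                      sat_vel @ dS - doppler_rhs,
                      T @ T - earth_radius**2])
        J = np.vstack([2.0 * dS, sat_vel, 2.0 * T])   # analytic Jacobian
        T = T - np.linalg.solve(J, F)
    return T

Re = 6371e3
S = np.array([Re + 700e3, 0.0, 0.0])       # hypothetical satellite position (m)
V = np.array([0.0, 7500.0, 0.0])           # hypothetical satellite velocity (m/s)

lat, lon = np.radians(2.0), np.radians(1.0)
T_true = Re * np.array([np.cos(lat) * np.cos(lon),
                        np.cos(lat) * np.sin(lon),
                        np.sin(lat)])

R = np.linalg.norm(T_true - S)             # synthesized "measured" slant range
dop = V @ (T_true - S)                     # synthesized Doppler term

guess = Re * np.array([0.99, 0.1, 0.1])    # coarse guess on the correct side
T_est = solve_rd(S, V, R, dop, Re, guess)
print(np.linalg.norm(T_est - T_true))
```

The initial guess selects between the two left/right-of-track intersections; real processors also replace the sphere with an ellipsoid plus terrain height, which is where methods like IAGM refine the solution.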
Laser technology for high precision satellite tracking
NASA Technical Reports Server (NTRS)
Plotkin, H. H.
1974-01-01
Fixed and mobile laser ranging stations have been developed to track satellites equipped with retro-reflector arrays. These have operated consistently at data rates of once per second with range precision better than 50 cm, using Q-switched ruby lasers with pulse durations of 20 to 40 nanoseconds. Improvements are being incorporated to sharpen the precision to 10 cm and to permit ranging to more distant satellites; these include improved reflector array designs, processing and analysis of the received reflection pulses, and use of lasers with sub-nanosecond pulse durations.
Safety and Certification Considerations for Expanding the Use of UAS in Precision Agriculture
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Maddalon, Jeffrey M.; Neogi, Natasha A.; Vertstynen, Harry A.
2016-01-01
The agricultural community is actively engaged in adopting new technologies such as unmanned aircraft systems (UAS) to help assess the condition of crops and develop appropriate treatment plans. In the United States, agricultural use of UAS has largely been limited to small UAS, generally weighing less than 55 lb and operating within the line of sight of a remote pilot. A variety of small UAS are being used to monitor and map crops, while only a few are being used to apply agricultural inputs based on the results of remote sensing. Larger UAS with substantial payload capacity could provide an option for site-specific application of agricultural inputs in a timely fashion, without substantive damage to the crops or soil. A recent study by the National Aeronautics and Space Administration (NASA) investigated certification requirements needed to enable the use of larger UAS to support the precision agriculture industry. This paper provides a brief introduction to aircraft certification relevant to agricultural UAS, an overview of and results from the NASA study, and a discussion of how those results might affect the precision agriculture community. Specific topics of interest include business model considerations for unmanned aerial applicators and a comparison with current means of variable rate application. The intent of the paper is to inform the precision agriculture community of evolving technologies that will enable broader use of unmanned vehicles to reduce costs, reduce environmental impacts, and enhance yield, especially for specialty crops that are grown on small to medium size farms.
The compression–error trade-off for large gridded data sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silver, Jeremy D.; Zender, Charles S.
The netCDF-4 format is widely used for large gridded scientific data sets and includes several compression methods: lossy linear scaling and the non-lossy deflate and shuffle algorithms. Many multidimensional geoscientific data sets exhibit considerable variation over one or several spatial dimensions (e.g., vertically) with less variation in the remaining dimensions (e.g., horizontally). On such data sets, linear scaling with a single pair of scale and offset parameters often entails considerable loss of precision. We introduce an alternative compression method called "layer-packing" that simultaneously exploits lossy linear scaling and lossless compression. Layer-packing stores arrays (instead of a scalar pair) of scale and offset parameters. An implementation of this method is compared with lossless compression, storing data at fixed relative precision (bit-grooming) and scalar linear packing in terms of compression ratio, accuracy and speed. When viewed as a trade-off between compression and error, layer-packing yields similar results to bit-grooming (storing between 3 and 4 significant figures). Bit-grooming and layer-packing offer significantly better control of precision than scalar linear packing. Relative performance, in terms of compression and errors, of bit-groomed and layer-packed data were strongly predicted by the entropy of the exponent array, and lossless compression was well predicted by entropy of the original data array. Layer-packed data files must be "unpacked" to be readily usable. The compression and precision characteristics make layer-packing a competitive archive format for many scientific data sets.
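A minimal illustration of the layer-packing idea, assuming a NumPy array whose first axis is the strongly varying (e.g., vertical) dimension; this is not the authors' implementation, just the per-layer linear scaling it describes:

```python
import numpy as np

def layer_pack(data, axis=0, nbits=16):
    """Lossy linear scaling with one (scale, offset) pair per layer along
    `axis`, rather than a single scalar pair for the whole array
    (uint16 storage assumes nbits <= 16)."""
    levels = 2**nbits - 1
    moved = np.moveaxis(data, axis, 0)
    flat = moved.reshape(moved.shape[0], -1)
    offset = flat.min(axis=1)
    span = flat.max(axis=1) - offset
    scale = np.where(span > 0, span / levels, 1.0)
    shape = (-1,) + (1,) * (moved.ndim - 1)
    packed = np.round((moved - offset.reshape(shape)) / scale.reshape(shape))
    return packed.astype(np.uint16), scale, offset

def layer_unpack(packed, scale, offset):
    shape = (-1,) + (1,) * (packed.ndim - 1)
    return packed * scale.reshape(shape) + offset.reshape(shape)

# A field that varies over orders of magnitude across vertical levels but
# only mildly within each level: the case where scalar packing loses badly
rng = np.random.default_rng(0)
field = np.exp(np.linspace(0, 11, 8))[:, None] * (1 + 0.01 * rng.random((8, 100)))

packed, scale, offset = layer_pack(field)
rel_err = np.abs(layer_unpack(packed, scale, offset) - field) / field
print(rel_err.max())
```

Because each layer is scaled over its own narrow range, the relative round-trip error stays tiny here, whereas a single scalar (scale, offset) pair across all eight levels would quantize the smallest layers to a handful of distinct values.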
Recent Advances in Measurement and Dietary Mitigation of Enteric Methane Emissions in Ruminants
Patra, Amlan K.
2016-01-01
Methane (CH4) emission, which is mainly produced during normal fermentation of feeds by the rumen microorganisms, represents a major contributor to greenhouse gas (GHG) emissions. Several enteric CH4 mitigation technologies have been explored recently. A number of new techniques have also been developed, and existing techniques improved, in order to evaluate CH4 mitigation technologies and prepare an inventory of GHG emissions precisely. The aim of this review is to discuss different CH4 measuring and mitigation technologies that have been developed recently. The respiration chamber technique is still considered the gold standard due to its greater precision and reproducibility in CH4 measurements. With the adoption of recent recommendations for improving the technique, the SF6 method can be used with a level of precision similar to the chamber technique. Short-term measurement techniques generally involve considerable within- and between-animal variation. Among the short-term measuring techniques, GreenFeed and methane hood systems are likely more suitable for evaluation of CH4 mitigation studies, if measurements can be obtained at different times of the day relative to the diurnal cycle of CH4 production. Carbon dioxide to CH4 ratio, sniffer, and other short-term breath analysis techniques are more suitable for on-farm screening of large numbers of animals to generate data on low CH4-producing animals for genetic selection purposes. Different indirect measuring techniques have also been investigated in recent years. Several new dietary CH4 mitigation technologies have been explored, but only a few of them are practical and cost-effective. Future research should be directed toward both medium- and long-term mitigation strategies, which could be utilized on farms to accomplish substantial reductions of CH4 emissions and to profitably reduce the carbon footprint of livestock production systems.
This review presents recent developments and critical analysis on different measurements and dietary mitigation of enteric CH4 emissions technologies. PMID:27243027
PRECISE: PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare
Chen, Feng; Wang, Shuang; Mohammed, Noman; Cheng, Samuel; Jiang, Xiaoqian
2015-01-01
Quality improvement (QI) requires systematic and continuous efforts to enhance healthcare services. A healthcare provider might wish to compare local statistics with those from other institutions in order to identify problems and develop interventions to improve the quality of care. However, the sharing of institutional information may be deterred by privacy concerns, as publicizing such statistics could lead to embarrassment and even financial damage. In this article, we propose a PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare (PRECISE), which aims at enabling cross-institution comparison of healthcare statistics while protecting privacy. The proposed framework relies on a set of state-of-the-art cryptographic protocols, including homomorphic encryption and Yao's garbled circuit schemes. By securely pooling data from different institutions, PRECISE can rank the encrypted statistics to facilitate QI among participating institutions. We conducted experiments using the MIMIC II database and demonstrated the feasibility of the proposed PRECISE framework.
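The abstract above leans on additively homomorphic encryption to pool statistics without revealing them. A minimal sketch of that property, using a toy Paillier cryptosystem with tiny hard-coded primes (illustrative only, not secure; all key values are assumptions, not from the paper):

```python
from math import gcd

# Toy Paillier keypair with tiny primes (for illustration only, NOT secure).
p, q = 17, 19
n = p * q                                      # public modulus
n2 = n * n
g = n + 1                                      # standard generator choice
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
mu = pow(lam, -1, n)                           # since g = n+1, L(g^lam mod n^2) = lam

def encrypt(m, r):
    # c = g^m * r^n mod n^2, with blinding factor r coprime to n
    assert gcd(r, n) == 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # m = L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) / n
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts,
# so a cloud can aggregate per-institution counts without decrypting them.
c_sum = (encrypt(5, 23) * encrypt(7, 29)) % n2
print(decrypt(c_sum))  # 12
```

In a PRECISE-style deployment the cloud would combine such ciphertext aggregation with garbled-circuit comparisons for the ranking step; the sketch covers only the aggregation half.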
Improved DORIS accuracy for precise orbit determination and geodesy
NASA Technical Reports Server (NTRS)
Willis, Pascal; Jayles, Christian; Tavernier, Gilles
2004-01-01
In 2001 and 2002, three more DORIS satellites were launched. Since then, all DORIS results have improved significantly. For precise orbit determination, 20 cm accuracy is now available in real time with DIODE, and 1.5 to 2 cm in post-processing. For geodesy, 1 cm precision can now be achieved regularly every week, making DORIS an active part of a Global Observing System for Geodesy through the IDS.
NASA Astrophysics Data System (ADS)
Hishikawa, Yoshihiro; Doi, Takuya; Higa, Michiya; Ohshima, Hironori; Takenouchi, Takakazu; Yamagoe, Kengo
2017-08-01
Precise outdoor measurement of the current-voltage (I-V) curves of photovoltaic (PV) modules is desired for many applications, such as low-cost onsite performance measurement, monitoring, and diagnosis. Conventional outdoor measurement technologies suffer from low precision when the solar irradiance is unstable, limiting the opportunity for precise measurement to clear sunny days. The purpose of this study is to investigate an outdoor measurement procedure that can improve both the measurement opportunity and the precision. Fast I-V curve measurements within 0.2 s and synchronous measurement of irradiance using a PV module irradiance sensor very effectively improved the precision. A small standard deviation (σ) of the module's maximum output power (Pmax) in the range of 0.7-0.9% is demonstrated on the basis of a 6-month experiment that mainly included partly sunny and cloudy days, during which the solar irradiance was unstable. The σ was further improved to 0.3-0.5% by correcting the curves for small variations in irradiance. This indicates that the procedure of this study enables much more reproducible I-V curve measurements than a conventional procedure under various climatic conditions. Factors that affect the measurement results are discussed with a view to further improving the precision.
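The irradiance correction mentioned above can be sketched with a minimal, assumed linear model (the paper's exact correction procedure may differ): for small irradiance deviations, module current scales roughly linearly with irradiance, so each I-V point measured at irradiance G is scaled to a reference level.

```python
# Minimal sketch: correct an I-V point for irradiance variation by scaling
# the current to a reference irradiance (assumed linear model; the study's
# actual correction may be more elaborate).
G_REF = 1000.0  # W/m^2, standard test condition

def correct_point(current_a, voltage_v, g_meas):
    """Scale the measured current to G_REF; voltage is left unchanged."""
    return current_a * (G_REF / g_meas), voltage_v

# Example: a point measured while irradiance dipped to 950 W/m^2
i_corr, v = correct_point(8.55, 30.0, 950.0)
print(round(i_corr, 2), v)  # 9.0 30.0
```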
Holman, Benjamin W B; Collins, Damian; Kilgannon, Ashleigh K; Hopkins, David L
2018-01-01
The Nix Pro Colour Sensor™ (NIX) can potentially be used to measure meat colour, but procedural guidelines that assure measurement reproducibility and repeatability (precision) must first be established. Technical replicate number (r) will minimise response variation, measurable as the standard error of the predicted mean (SEM), and contribute to improved precision. Consequently, we aimed to explore the effects of r on NIX precision when measuring aged beef colour (colorimetrics: L*, a*, b*, hue and chroma values). Each colorimetric SEM declined with increasing r, indicating improved precision, and followed a diminishing rate of improvement that allowed us to recommend r = 7 for meat colour studies using the NIX. This recommendation was based on practical limitations and a* variability, as additional r would be required if other colorimetrics or advanced levels of precision are necessary. Beef ageing and display period, holding temperature, loin, and sampled portion were also found to contribute to colorimetric variation, but were incorporated within our definition of r.
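The diminishing return from extra replicates follows directly from SEM = s / sqrt(r). A small illustration (the standard deviation below is an assumed example, not the paper's data):

```python
# Illustration of why SEM falls with replicate number r at a diminishing
# rate: SEM = s / sqrt(r). The value of s is an assumed example.
s = 1.2  # assumed between-replicate standard deviation of a* values

def sem(std_dev, r):
    return std_dev / r ** 0.5

# Precision gained by each additional replicate shrinks monotonically.
gains = [sem(s, r) - sem(s, r + 1) for r in range(1, 10)]
assert all(g1 > g2 for g1, g2 in zip(gains, gains[1:]))
print(round(sem(s, 7), 3))  # SEM at the recommended r = 7
```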
Research: increasing value, reducing waste 2
Ioannidis, John P A; Greenland, Sander; Hlatky, Mark A; Khoury, Muin J; Macleod, Malcolm R; Moher, David; Schulz, Kenneth F; Tibshirani, Robert
2015-01-01
Correctable weaknesses in the design, conduct, and analysis of biomedical and public health research studies can produce misleading results and waste valuable resources. Small effects can be difficult to distinguish from bias introduced by study design and analyses. An absence of detailed written protocols and poor documentation of research is common. Information obtained might not be useful or important, and statistical precision or power is often too low or used in a misleading way. Insufficient consideration might be given to both previous and continuing studies. Arbitrary choice of analyses and an overemphasis on random extremes might affect the reported findings. Several problems relate to the research workforce, including failure to involve experienced statisticians and methodologists, failure to train clinical researchers and laboratory scientists in research methods and design, and the involvement of stakeholders with conflicts of interest. Inadequate emphasis is placed on recording of research decisions and on reproducibility of research. Finally, reward systems incentivise quantity more than quality, and novelty more than reliability. We propose potential solutions for these problems, including improvements in protocols and documentation, consideration of evidence from studies in progress, standardisation of research efforts, optimisation and training of an experienced and non-conflicted scientific workforce, and reconsideration of scientific reward systems.
A Requirements-Driven Optimization Method for Acoustic Liners Using Analytic Derivatives
NASA Technical Reports Server (NTRS)
Berton, Jeffrey J.; Lopes, Leonard V.
2017-01-01
More than ever, there is flexibility and freedom in acoustic liner design. Subject to practical considerations, liner design variables may be manipulated to achieve a target attenuation spectrum. But the characteristics of the ideal attenuation spectrum can be difficult to know. Many multidisciplinary system effects govern how engine noise sources contribute to community noise. Given a hardwall fan noise source to be suppressed, and using an analytical certification noise model to compute a community noise measure of merit, the optimal attenuation spectrum can be derived using multidisciplinary systems analysis methods. In a previous paper on this subject, a method for deriving the ideal target attenuation spectrum that minimizes the noise perceived by observers on the ground was described. A simple code-wrapping approach was used to evaluate a community noise objective function for an external optimizer. Gradients were evaluated using a finite difference formula. The subject of this paper is an application of analytic derivatives that supply precise gradients to an optimization process. Analytic derivatives improve the efficiency and accuracy of gradient-based optimization methods and allow consideration of more design variables. In addition, the benefit of variable impedance liners is explored using a multi-objective optimization.
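The accuracy advantage of analytic over finite-difference gradients can be seen on a toy objective (our illustration, not the paper's acoustic-liner model): a central difference carries O(h^2) truncation error plus round-off, while the analytic derivative is exact.

```python
import math

# Toy objective and its exact derivative (assumed example function,
# standing in for a community-noise objective).
def f(x):
    return x * x * math.sin(x)

def df_analytic(x):
    return 2 * x * math.sin(x) + x * x * math.cos(x)

def df_central(x, h):
    # Central finite difference: truncation error ~ h^2 * f'''(x) / 6
    return (f(x + h) - f(x - h)) / (2 * h)

x0 = 1.3
for h in (1e-2, 1e-4, 1e-6):
    err = abs(df_central(x0, h) - df_analytic(x0))
    print(f"h={h:.0e}  finite-difference error={err:.2e}")
```

Shrinking h reduces truncation error only until round-off dominates, which is why an optimizer fed analytic gradients converges more reliably.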
A Review of Issues Related to Data Acquisition and Analysis in EEG/MEG Studies
Puce, Aina; Hämäläinen, Matti S.
2017-01-01
Electroencephalography (EEG) and magnetoencephalography (MEG) are non-invasive electrophysiological methods, which record electric potentials and magnetic fields due to electric currents in synchronously active neurons. With MEG being more sensitive to neural activity from tangential currents and EEG being able to detect both radial and tangential sources, the two methods are complementary. Over the years, neurophysiological studies have changed considerably: high-density recordings are becoming de rigueur; there is interest in both spontaneous and evoked activity; and sophisticated artifact detection and removal methods are available. Improved head models for source estimation have also increased the precision of the current estimates, particularly for EEG and combined EEG/MEG. Because of their complementarity, more investigators are beginning to perform simultaneous EEG/MEG studies to gain more complete information about neural activity. Given the increase in methodological complexity in EEG/MEG, it is important to gather data that are of high quality and that are as artifact free as possible. Here, we discuss some issues in data acquisition and analysis of EEG and MEG data. Practical considerations for different types of EEG and MEG studies are also discussed.
Note: Precise radial distribution of charged particles in a magnetic guiding field
DOE Office of Scientific and Technical Information (OSTI.GOV)
Backe, H., E-mail: backe@kph.uni-mainz.de
2015-07-15
Current high-precision beta decay experiments on polarized neutrons, employing magnetic guiding fields in combination with position-sensitive and energy-dispersive detectors, have resulted in a detailed study of the mono-energetic point spread function (PSF) for a homogeneous magnetic field. A PSF describes the radial probability distribution of mono-energetic electrons at the detector plane emitted from a point-like source. With regard to accuracy considerations, unwanted singularities occur as a function of the radial detector coordinate; these have recently been investigated by subdividing the radial coordinate into small bins or by employing analytical approximations. In this note, a series expansion of the PSF is presented which can be numerically evaluated with arbitrary precision.
Preliminary design considerations for 10 to 40 meter-diameter precision truss reflectors
NASA Technical Reports Server (NTRS)
Mikulas, Martin M., Jr.; Collins, Timothy J.; Hedgepeth, John M.
1990-01-01
A simplified preliminary design capability for erectable precision segmented reflectors is presented. This design capability permits a rapid assessment of a wide range of reflector parameters as well as new structural concepts and materials. The preliminary design approach was applied to a range of precision reflectors from 10 meters to 100 meters in diameter while considering standard design drivers. The design drivers considered were: weight, fundamental frequency, launch packaging volume, part count, and on-orbit assembly time. For the range of parameters considered, on-orbit assembly time was identified as the major design driver. A family of modular panels is introduced which can significantly reduce the number of reflector parts and the on-orbit assembly time.
Haiyang, Yu; Tian, Luo
2016-06-01
Target restoration space (TRS) is the precise space required for designing an optimal prosthesis. TRS consists of an internal or external tooth space that determines the esthetics and function of the final restoration. Therefore, assisted by quantitative analysis and transfer, TRS quantitative analysis is a significant improvement for minimally invasive tooth preparation. This article presents TRS quantity-related measurement, analysis, transfer, and the internal relevance of three TRS classifications. The results reveal the close bond between precision and minimally invasive treatment. This study can be used to improve the comprehension and execution of precise esthetic prosthodontics.
Using Precision in STEM Language: A Qualitative Look
ERIC Educational Resources Information Center
Capraro, Mary M.; Bicer, Ali; Grant, Melva R.; Lincoln, Yvonna S.
2017-01-01
Teachers need to develop a variety of pedagogical strategies that can encourage precise and accurate communication--an extremely important 21st century skill. Precision with STEM oral language is essential. Emphasizing oral communication with precise language in combination with increased spatial skills with modeling can improve the chances of…
United States Air Force Graduate Student Summer Support Program (1985). Technical Report. Volume 2.
1985-12-01
"A Thermodynamic and Continuum Approach to the Design and Control of Precision Forging Processes," Master's Thesis, Wright State University, Aug... On mobile platforms, space will usually be a design consideration, which will limit the size of the laser used with the... "Dichromated Gelatin Emulsions for Recording Phase Holograms," Master's Thesis, USAF Institute of Technology, December 1975, AD-A019320. Graube, A
Network Design in Close-Range Photogrammetry with Short Baseline Images
NASA Astrophysics Data System (ADS)
Barazzetti, L.
2017-08-01
The availability of automated software for image-based 3D modelling has changed the way people acquire images for photogrammetric applications. Short baseline images are required to match image points with SIFT-like algorithms, yielding more images than necessary for "old fashioned" photogrammetric projects based on manual measurements. This paper describes some considerations on network design for short baseline image sequences, especially regarding the precision and reliability of bundle adjustment. Simulated results reveal that the large number of 3D points used for image orientation has very limited impact on network precision.
Tunable laser techniques for improving the precision of observational astronomy
NASA Astrophysics Data System (ADS)
Cramer, Claire E.; Brown, Steven W.; Lykke, Keith R.; Woodward, John T.; Bailey, Stephen; Schlegel, David J.; Bolton, Adam S.; Brownstein, Joel; Doherty, Peter E.; Stubbs, Christopher W.; Vaz, Amali; Szentgyorgyi, Andrew
2012-09-01
Improving the precision of observational astronomy requires not only new telescopes and instrumentation, but also advances in observing protocols, calibrations and data analysis. The Laser Applications Group at the National Institute of Standards and Technology in Gaithersburg, Maryland has been applying advances in detector metrology and tunable laser calibrations to problems in astronomy since 2007. Using similar measurement techniques, we have addressed a number of seemingly disparate issues: precision flux calibration for broad-band imaging, precision wavelength calibration for high-resolution spectroscopy, and precision PSF mapping for fiber spectrographs of any resolution. In each case, we rely on robust, commercially-available laboratory technology that is readily adapted to use at an observatory. In this paper, we give an overview of these techniques.
NASA Technical Reports Server (NTRS)
Gerdes, R. M.
1980-01-01
Results from a series of simulation and flight investigations undertaken to evaluate helicopter flying qualities and the effects of control system augmentation for nap-of-the-earth (NOE) agility and instrument flying tasks were analyzed to assess handling-quality factors common to both tasks. Precise attitude control was determined to be a key requirement for successful accomplishment of both tasks. Factors that degraded attitude controllability were improper levels of control sensitivity and damping and rotor-system cross-coupling due to helicopter angular rate and collective pitch input. Application of rate-command, attitude-command, and control-input decouple augmentation schemes enhanced attitude control and significantly improved handling qualities for both tasks. NOE agility and instrument flying handling-quality considerations, pilot rating philosophy, and supplemental flight evaluations are also discussed.
Leidinger, F; Jörgens, V; Chantelau, E; Berchtold, P; Berger, M
1980-07-26
Home blood glucose monitoring by diabetic patients has recently been advocated as an effective means to improve metabolic control. The Glucocheck apparatus, a pocket-size battery-driven reflectance-meter (in Germany commercially available under the name Glucose-meter), has been evaluated for accuracy and practicability. In 450 blood glucose measurements, the variance between the values obtained using the Glucocheck apparatus and routine clinical laboratory procedures was +/- 11.7%. Especially in the low range of blood glucose concentrations, the Glucocheck method was very reliable. The quantitative precision of the Glucocheck method depends, however, quite considerably on the ability of the patient to use the apparatus correctly. In order to profit from Glucocheck in clinical practice, particular efforts to educate the patients in its use are necessary.
Ectopic breast cancer: case report and review of the literature.
Francone, Elisa; Nathan, Marco J; Murelli, Federica; Bruno, Maria Santina; Traverso, Enrico; Friedman, Daniele
2013-08-01
Ectopic breast tissue comes in two forms: supernumerary and aberrant. Despite morphologic differences, ectopic breast tissue presents characteristics analogous to orthotopic breast tissue in terms of function and, most importantly, pathologic degeneration. Data in the literature concerning its precise incidence, the probability of malignant degeneration, and its standardized management are scarce and controversial. This study reviewed more than 100 years of literature, and this report discusses a case of ectopic breast cancer treatment, suggesting novel therapeutic advice that could bring considerable clinical advantages, improve cosmetic results, and reduce the psychological impact on patients.
Anisovich, A. V.; Beck, R.; Döring, M.; ...
2016-09-16
New data on pion photoproduction off the proton have been included in the Bonn-Gatchina and SAID partial wave analyses and in the Jülich-Bonn dynamical coupled-channel approach. All reproduce the recent new data well: the double polarization data for E, G, H, P, and T in $\gamma p \to \pi^0 p$ from ELSA; the beam asymmetry $\Sigma$ for $\gamma p \to \pi^0 p$ and $\gamma p \to \pi^+ n$ from Jefferson Laboratory; and the precise new differential cross section and beam asymmetry data $\Sigma$ for $\gamma p \to \pi^0 p$ from MAMI. The new fit results for the multipoles are compared with predictions that do not take the new data into account. The mutual agreement is considerably improved but still far from perfect.
Churnside, Allison B; Sullan, Ruby May A; Nguyen, Duc M; Case, Sara O; Bull, Matthew S; King, Gavin M; Perkins, Thomas T
2012-07-11
Force drift is a significant, yet unresolved, problem in atomic force microscopy (AFM). We show that the primary source of force drift for a popular class of cantilevers is their gold coating, even though they are coated on both sides to minimize drift. Drift of the zero-force position of the cantilever was reduced from 900 nm for gold-coated cantilevers to 70 nm (N = 10; rms) for uncoated cantilevers over the first 2 h after wetting the tip; a majority of these uncoated cantilevers (60%) showed significantly less drift (12 nm, rms). Removing the gold also led to ∼10-fold reduction in reflected light, yet short-term (0.1-10 s) force precision improved. Moreover, improved force precision did not require extended settling; most of the cantilevers tested (9 out of 15) achieved sub-pN force precision (0.54 ± 0.02 pN) over a broad bandwidth (0.01-10 Hz) just 30 min after loading. Finally, this precision was maintained while stretching DNA. Hence, removing gold enables both routine and timely access to sub-pN force precision in liquid over extended periods (100 s). We expect that many current and future applications of AFM can immediately benefit from these improvements in force stability and precision.
Orbit Determination for the Lunar Reconnaissance Orbiter Using an Extended Kalman Filter
NASA Technical Reports Server (NTRS)
Slojkowski, Steven; Lowe, Jonathan; Woodburn, James
2015-01-01
Orbit determination (OD) analysis results are presented for the Lunar Reconnaissance Orbiter (LRO) using a commercially available Extended Kalman Filter, Analytical Graphics' Orbit Determination Tool Kit (ODTK). Process noise models for lunar gravity and solar radiation pressure (SRP) are described and OD results employing the models are presented. Definitive accuracy using ODTK meets mission requirements and is better than that achieved using the operational LRO OD tool, the Goddard Trajectory Determination System (GTDS). Results demonstrate that a Vasicek stochastic model produces better estimates of the coefficient of solar radiation pressure than a Gauss-Markov model, and prediction accuracy using a Vasicek model meets mission requirements over the analysis span. Modeling the effect of antenna motion on range-rate tracking considerably improves residuals and filter-smoother consistency. Inclusion of off-axis SRP process noise and generalized process noise improves filter performance for both definitive and predicted accuracy. Definitive accuracy from the smoother is better than achieved using GTDS and is close to that achieved by precision OD methods used to generate definitive science orbits. Use of a multi-plate dynamic spacecraft area model with ODTK's force model plugin capability provides additional improvements in predicted accuracy.
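The Vasicek versus Gauss-Markov distinction above comes down to the mean the process reverts to. A minimal discrete-time sketch (our illustration, not ODTK's internals; all numbers are assumed example values):

```python
# First-order Gauss-Markov reverts to zero; Vasicek reverts to an
# estimated non-zero mean b, which suits a solar radiation pressure
# coefficient Cr whose true value is near but not at its a priori.
def step(x, a, b, dt, w):
    # dx = a*(b - x)*dt + noise; Gauss-Markov is the b = 0 special case
    return x + a * (b - x) * dt + w

x_gm, x_vas = 1.4, 1.4    # initial Cr estimate (assumed value)
a, b, dt = 0.5, 1.25, 1.0
for _ in range(50):       # noise-free steps show the deterministic pull
    x_gm = step(x_gm, a, 0.0, dt, 0.0)
    x_vas = step(x_vas, a, b, dt, 0.0)
print(round(x_gm, 6), round(x_vas, 6))  # pulled toward 0 and toward 1.25
```

With process noise added, the Gauss-Markov state keeps decaying toward zero between measurements, whereas the Vasicek state decays toward the estimated mean, which is why the latter predicts Cr better over long gaps.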
Athens, Jessica K.; Remington, Patrick L.; Gangnon, Ronald E.
2015-01-01
Objectives: The University of Wisconsin Population Health Institute has published the County Health Rankings since 2010. These rankings use population-based data to highlight health outcomes and the multiple determinants of these outcomes and to encourage in-depth health assessment for all United States counties. A significant methodological limitation, however, is the uncertainty of rank estimates, particularly for small counties. To address this challenge, we explore the use of longitudinal and pooled outcome data in hierarchical Bayesian models to generate county ranks with greater precision. Methods: In our models we used pooled outcome data for three measure groups: (1) poor physical and poor mental health days; (2) percent of births with low birth weight and fair or poor health prevalence; and (3) age-specific mortality rates for nine age groups. We used the fixed and random effects components of these models to generate posterior samples of rates for each measure. We also used time-series data in longitudinal random effects models for age-specific mortality. Based on the posterior samples from these models, we estimate ranks and rank quartiles for each measure, as well as the probability of a county ranking in its assigned quartile. Rank quartile probabilities for univariate, joint outcome, and/or longitudinal models were compared to assess improvements in rank precision. Results: The joint outcome model for poor physical and poor mental health days resulted in improved rank precision, as did the longitudinal model for age-specific mortality rates. Rank precision for low birth weight births and fair/poor health prevalence based on the univariate and joint outcome models was equivalent. Conclusion: Incorporating longitudinal or pooled outcome data may improve rank certainty, depending on characteristics of the measures selected. For measures with different determinants, joint modeling neither improved nor degraded rank precision. This approach suggests a simple way to use existing information to improve the precision of small-area measures of population health.
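The rank-quartile machinery described above can be sketched from posterior samples alone (synthetic draws below, not the paper's data): rank the counties within each joint posterior draw, then tally how often each county lands in each quartile.

```python
# Sketch: estimate rank-quartile probabilities from posterior samples.
# The draws are synthetic example values, not County Health Rankings data.
samples = {                      # county -> posterior draws of an outcome rate
    "A": [2.1, 2.3, 2.2, 2.4],
    "B": [3.0, 2.9, 3.1, 3.2],
    "C": [2.6, 2.5, 2.8, 2.7],
    "D": [3.5, 3.4, 3.6, 3.3],
}
counties = sorted(samples)
n_draws = len(next(iter(samples.values())))

def quartile(rank, n):           # map rank 1..n to quartile 1..4
    return (4 * (rank - 1)) // n + 1

counts = {c: [0] * 4 for c in counties}
for i in range(n_draws):         # rank counties within each joint draw
    order = sorted(counties, key=lambda c: samples[c][i])
    for rank, c in enumerate(order, start=1):
        counts[c][quartile(rank, len(counties)) - 1] += 1

for c in counties:               # probability of each quartile per county
    print(c, [k / n_draws for k in counts[c]])
```

A county whose probability mass concentrates in one quartile has a precise rank; mass spread across quartiles is exactly the small-county uncertainty the paper targets.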
Supertitrations: High-Precision Methods.
ERIC Educational Resources Information Center
Guenther, W. B.
1988-01-01
Offers challenging work at a higher level of technique than most students meet in elementary laboratory work. Uses a combined weight and volumetric sequence not shown in textbooks. Notes modern rapid balances help lower evaporation loss during weighings. Discusses the balance, weights, and buoyancy considerations. (MVL)
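The buoyancy consideration mentioned above is the standard air-buoyancy correction for weighings: the balance compares forces, so the reading must be scaled by the densities of air, the reference weights, and the sample. A minimal sketch with assumed example densities (not values from the article):

```python
# Standard air-buoyancy correction for a weighing:
#   m_true = m_apparent * (1 - rho_air/rho_weights) / (1 - rho_air/rho_sample)
# Densities below are assumed example values.
RHO_AIR = 0.0012      # g/cm^3, typical laboratory air
RHO_WEIGHTS = 8.0     # g/cm^3, stainless-steel reference weights

def true_mass(apparent_g, rho_sample):
    """Correct a balance reading for air buoyancy."""
    return apparent_g * (1 - RHO_AIR / RHO_WEIGHTS) / (1 - RHO_AIR / rho_sample)

# Weighing 10 g of aqueous titrant (density ~1 g/cm^3): the correction is
# about +0.1%, which matters at high-precision titration levels.
print(round(true_mass(10.0, 1.0), 4))
```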
Inci, Fatih; Filippini, Chiara; Baday, Murat; Ozen, Mehmet Ozgun; Calamak, Semih; Durmus, Naside Gozde; Wang, ShuQi; Hanhauser, Emily; Hobbs, Kristen S; Juillard, Franceline; Kuang, Ping Ping; Vetter, Michael L; Carocci, Margot; Yamamoto, Hidemi S; Takagi, Yuko; Yildiz, Umit Hakan; Akin, Demir; Wesemann, Duane R; Singhal, Amit; Yang, Priscilla L; Nibert, Max L; Fichorova, Raina N; Lau, Daryl T-Y; Henrich, Timothy J; Kaye, Kenneth M; Schachter, Steven C; Kuritzkes, Daniel R; Steinmetz, Lars M; Gambhir, Sanjiv S; Davis, Ronald W; Demirci, Utkan
2015-08-11
Recent advances in biosensing technologies present great potential for medical diagnostics, thus improving clinical decisions. However, creating a label-free general sensing platform capable of detecting multiple biotargets in various clinical specimens over a wide dynamic range, without lengthy sample-processing steps, remains a considerable challenge. In practice, these barriers prevent broad applications in clinics and at patients' homes. Here, we demonstrate the nanoplasmonic electrical field-enhanced resonating device (NE(2)RD), which addresses all these impediments on a single platform. The NE(2)RD employs an immunodetection assay to capture biotargets, and precisely measures spectral color changes by their wavelength and extinction intensity shifts in nanoparticles without prior sample labeling or preprocessing. We present through multiple examples, a label-free, quantitative, portable, multitarget platform by rapidly detecting various protein biomarkers, drugs, protein allergens, bacteria, eukaryotic cells, and distinct viruses. The linear dynamic range of NE(2)RD is five orders of magnitude broader than ELISA, with a sensitivity down to 400 fg/mL. This range and sensitivity are achieved by self-assembling gold nanoparticles to generate hot spots on a 3D-oriented substrate for ultrasensitive measurements. We demonstrate that this precise platform handles multiple clinical samples such as whole blood, serum, and saliva without sample preprocessing under diverse conditions of temperature, pH, and ionic strength. The NE(2)RD's broad dynamic range, detection limit, and portability integrated with a disposable fluidic chip have broad applications, potentially enabling the transition toward precision medicine at the point-of-care or primary care settings and at patients' homes.
Design Considerations for the Installation of an Iodine (I2) Cell onto TRES
NASA Astrophysics Data System (ADS)
Garcia-Mejia, Juliana
2017-01-01
The radial velocity (RV) method utilizes the reflex motion of a target star to infer the presence of one or multiple exoplanets. However, the disparity in mass between planet and host star often results in RV oscillations below the precision of most modern spectrographs. Such is the case for TRES, the Tillinghast Reflector Echelle Spectrograph located at the Fred Lawrence Whipple Observatory on Mt. Hopkins, Arizona, with an RV precision of ~20 m s-1, dominated by instrumental effects. Since 1992, the iodine cell technique presented in Butler et al. (1992) has become widely used for the reduction of RV measurement errors. Here, we describe the beginning stages of the installation of one such cell onto TRES. After traveling to the telescope site to perform the first fitting of the iodine stage, I designed, built, and fitted the first prototype of an improved thermal insulation system for the front end of the spectrograph, where the cell will be mounted. I present this design, as well as a detailed description of the current state of the project. We expect the iodine cell to be fully functional in approximately one year. Once the cell is installed, we expect errors in RV measurements to decrease by an order of magnitude from the aforementioned 20 m s-1. This increase in precision will come with an increase in the stability of RV measurements, allowing TRES to perform in-house spectroscopy of more nearby bright targets and high-cadence exoplanet follow-up.
Parameter estimation in plasmonic QED
NASA Astrophysics Data System (ADS)
Jahromi, H. Rangani
2018-03-01
We address the problem of parameter estimation in the presence of plasmonic modes manipulating emitted light via localized surface plasmons in a plasmonic waveguide at the nanoscale. The emitter that we discuss is the nitrogen vacancy centre (NVC) in diamond, modelled as a qubit. Our goal is to estimate the β factor, which measures the fraction of emitted energy captured by waveguide surface plasmons. The best strategy to obtain the most accurate estimation of the parameter, in terms of the initial state of the probes and different control parameters, is investigated. In particular, for two-qubit estimation, it is found that although we may achieve the best estimation at initial instants by using maximally entangled initial states, at long times the optimal estimation occurs when the initial state of the probes is a product state. We also find that decreasing the interqubit distance or increasing the propagation length of the plasmons improves the precision of the estimation. Moreover, a decrease in the spontaneous emission rate of the NVCs retards the reduction of the quantum Fisher information (QFI), and therefore the vanishing of the QFI, which measures the precision of the estimation, is delayed. In addition, if the phase parameter of the initial state of the two NVCs is equal to π rad, the best estimation with the two-qubit system is achieved when the NVCs are initially maximally entangled. The one-qubit estimation has also been analysed in detail. In particular, we show that using a two-qubit probe, at any arbitrary time, considerably enhances the precision of estimation in comparison with one-qubit estimation.
Inci, Fatih; Filippini, Chiara; Ozen, Mehmet Ozgun; Calamak, Semih; Durmus, Naside Gozde; Wang, ShuQi; Hanhauser, Emily; Hobbs, Kristen S.; Juillard, Franceline; Kuang, Ping Ping; Vetter, Michael L.; Carocci, Margot; Yamamoto, Hidemi S.; Takagi, Yuko; Yildiz, Umit Hakan; Akin, Demir; Wesemann, Duane R.; Singhal, Amit; Yang, Priscilla L.; Nibert, Max L.; Fichorova, Raina N.; Lau, Daryl T.-Y.; Henrich, Timothy J.; Kaye, Kenneth M.; Schachter, Steven C.; Kuritzkes, Daniel R.; Steinmetz, Lars M.; Gambhir, Sanjiv S.; Davis, Ronald W.; Demirci, Utkan
2015-01-01
Recent advances in biosensing technologies present great potential for medical diagnostics, thus improving clinical decisions. However, creating a label-free general sensing platform capable of detecting multiple biotargets in various clinical specimens over a wide dynamic range, without lengthy sample-processing steps, remains a considerable challenge. In practice, these barriers prevent broad applications in clinics and at patients’ homes. Here, we demonstrate the nanoplasmonic electrical field-enhanced resonating device (NE2RD), which addresses all these impediments on a single platform. The NE2RD employs an immunodetection assay to capture biotargets, and precisely measures spectral color changes by their wavelength and extinction intensity shifts in nanoparticles without prior sample labeling or preprocessing. We present, through multiple examples, a label-free, quantitative, portable, multitarget platform, rapidly detecting various protein biomarkers, drugs, protein allergens, bacteria, eukaryotic cells, and distinct viruses. The linear dynamic range of NE2RD is five orders of magnitude broader than ELISA, with a sensitivity down to 400 fg/mL. This range and sensitivity are achieved by self-assembling gold nanoparticles to generate hot spots on a 3D-oriented substrate for ultrasensitive measurements. We demonstrate that this precise platform handles multiple clinical samples such as whole blood, serum, and saliva without sample preprocessing under diverse conditions of temperature, pH, and ionic strength. The NE2RD’s broad dynamic range, detection limit, and portability integrated with a disposable fluidic chip have broad applications, potentially enabling the transition toward precision medicine at the point-of-care or primary care settings and at patients’ homes. PMID:26195743
Feedback attitude sliding mode regulation control of spacecraft using arm motion
NASA Astrophysics Data System (ADS)
Shi, Ye; Liang, Bin; Xu, Dong; Wang, Xueqian; Xu, Wenfu
2013-09-01
The problem of spacecraft attitude regulation based on the reaction of arm motion has attracted extensive attention from both engineering and academic fields. Most solutions of the manipulator's motion-tracking problem achieve only asymptotic stabilization, so such controllers cannot realize precise attitude regulation in the presence of non-holonomic constraints. Thus, sliding mode control algorithms are adopted to stabilize the tracking error with a finite-time reaching phase. Due to the switching effects of the variable structure controller, once the tracking error reaches the designed hyper-plane, it is restricted to this plane permanently, even in the presence of external disturbances; precise attitude regulation can thereby be achieved. Furthermore, taking non-zero initial tracking errors and the chattering phenomenon into consideration, saturation functions are used in place of sign functions to smooth the control torques. The relations between the upper bounds of the tracking errors and the controller parameters are derived to reveal the physical characteristics of the controller. Mathematical models of a free-floating space manipulator are established and simulations are conducted. The results show that the spacecraft's attitude can be regulated to the desired position by the proposed algorithm, with a steady-state error of 0.0002 rad. In addition, the joint tracking trajectory is smooth, and the joint tracking errors converge to zero quickly with a satisfactory continuous joint control input. The proposed research provides a feasible solution for spacecraft attitude regulation using arm motion, and improves the precision of spacecraft attitude regulation.
Quantifying the Precision of Single-Molecule Torque and Twist Measurements Using Allan Variance.
van Oene, Maarten M; Ha, Seungkyu; Jager, Tessa; Lee, Mina; Pedaci, Francesco; Lipfert, Jan; Dekker, Nynke H
2018-04-24
Single-molecule manipulation techniques have provided unprecedented insights into the structure, function, interactions, and mechanical properties of biological macromolecules. Recently, the single-molecule toolbox has been expanded by techniques that enable measurements of rotation and torque, such as the optical torque wrench (OTW) and several different implementations of magnetic (torque) tweezers. Although systematic analyses of the position and force precision of single-molecule techniques have attracted considerable attention, their angle and torque precision have been treated in much less detail. Here, we propose Allan deviation as a tool to systematically quantitate angle and torque precision in single-molecule measurements. We apply the Allan variance method to experimental data from our implementations of (electro)magnetic torque tweezers and an OTW and find that both approaches can achieve a torque precision better than 1 pN · nm. The OTW, capable of measuring torque on (sub)millisecond timescales, provides the best torque precision for measurement times ≲10 s, after which drift becomes a limiting factor. For longer measurement times, magnetic torque tweezers with their superior stability provide the best torque precision. Use of the Allan deviation enables critical assessments of the torque precision as a function of measurement time across different measurement modalities and provides a tool to optimize measurement protocols for a given instrument and application. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
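The overlapping Allan deviation the authors apply can be computed directly from an angle or torque time series. Below is a minimal NumPy sketch of that estimator; the function name, sampling-rate convention, and choice of averaging times are ours, not the paper's.

```python
import numpy as np

def allan_deviation(x, rate, taus):
    """Overlapping Allan deviation of a time series x sampled at `rate` Hz.

    For averaging time tau = m/rate, the Allan variance is
    0.5 * <(ybar_{k+m} - ybar_k)^2> over overlapping window averages ybar.
    """
    x = np.asarray(x, dtype=float)
    out = []
    for tau in taus:
        m = int(round(tau * rate))            # samples per averaging window
        if m < 1 or 2 * m >= len(x):
            out.append(np.nan)                # tau not resolvable from the data
            continue
        # overlapping window averages via a cumulative sum
        c = np.cumsum(np.insert(x, 0, 0.0))
        ybar = (c[m:] - c[:-m]) / m
        d = ybar[m:] - ybar[:-m]              # differences of adjacent averages
        out.append(np.sqrt(0.5 * np.mean(d ** 2)))
    return np.array(out)
```

For white measurement noise the Allan deviation falls as 1/sqrt(tau), while drift makes it rise again at long tau; the minimum locates the optimal measurement time, which is how the paper compares the OTW and magnetic torque tweezers regimes.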
Azarmehr, Iman; Stokbro, Kasper; Bell, R Bryan; Thygesen, Torben
2017-09-01
This systematic review investigates the most common indications, treatments, and outcomes of surgical navigation (SN) published from 2010 to 2015. The evolution of SN and its application in oral and maxillofacial surgery have rapidly developed over recent years, and therapeutic indications are discussed. A systematic search in relevant electronic databases, journals, and bibliographies of the included articles was carried out. Clinical studies with 5 or more patients published between 2010 and 2015 were included. Traumatology, orthognathic surgery, cancer and reconstruction surgery, skull-base surgery, and foreign body removal were the areas of interests. The search generated 13 articles dealing with traumatology; 5, 6, 2, and 0 studies were found that dealt with the topics of orthognathic surgery, cancer and reconstruction surgery, skull-base surgery, and foreign body removal, respectively. The average technical system accuracy and intraoperative precision reported were less than 1 mm and 1 to 2 mm, respectively. In general, SN is reported to be a useful tool for surgical planning, execution, evaluation, and research. The largest numbers of studies and patients were identified in the field of traumatology. Treatment of complex orbital fractures was considerably improved by the use of SN compared with traditionally treated control groups. SN seems to be a very promising addition to the surgical toolkit. Planning details of the surgical procedure in a 3-dimensional virtual environment and execution with real-time guidance can significantly improve precision. Among factors to be considered are the financial investments necessary and the learning curve. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. All rights reserved.
NASA Astrophysics Data System (ADS)
Hönicke, Philipp; Krämer, Markus; Lühl, Lars; Andrianov, Konstantin; Beckhoff, Burkhard; Dietsch, Rainer; Holz, Thomas; Kanngießer, Birgit; Weißbach, Danny; Wilhein, Thomas
2018-07-01
With the advent of modern X-ray fluorescence (XRF) methods and increased analytical reliability requirements, the demand for suitable reference samples has grown. Quantification becomes considerably more difficult in nanotechnology in particular, where areal mass depositions are very low. However, the availability of suitable reference samples is drastically lower than the demand. Physical vapor deposition techniques have been enhanced significantly in the last decade, driven by the need for extremely precise film parameters in multilayer production. We have applied those techniques to develop layer-like reference samples with mass depositions in the ng range and well below for Ca, Cu, Mo, Pd, Pb, La, Fe, and Ni; numerous other elements would also be possible. Several types of reference samples were fabricated: multi-elemental layer samples and extremely low (sub-monolayer) samples for various applications in XRF and total-reflection XRF analysis. The samples were characterized and compared at three different synchrotron radiation beamlines at the BESSY II electron storage ring, employing the reference-free XRF approach based on physically calibrated instrumentation. In addition, the homogeneity of the multi-elemental coatings was checked at the P04 beamline at DESY. The measurements demonstrate the high precision achieved in the manufacturing process as well as the versatility of the application fields for the presented reference samples.
Judah, Gaby; de Witt Huberts, Jessie; Drassal, Allan; Aunger, Robert
2017-01-01
The accurate measurement of behaviour is vitally important to many disciplines and practitioners of various kinds. While different methods have been used (such as observation, diaries, and questionnaires), none are able to accurately monitor behaviour over the long term in the natural context of people's own lives. The aim of this work was therefore to develop and test a reliable system for unobtrusively monitoring various behaviours of multiple individuals within the same household over a period of several months. A commercial Real Time Location System was adapted to meet these requirements and subsequently validated in three households by monitoring various bathroom behaviours. The results indicate that the system is robust, can monitor behaviours over the long term in different households, and can reliably distinguish between individuals. Precision rates were high and consistent. Recall rates were less consistent across households and behaviours, although recall improved considerably with practice at set-up of the system. The achieved precision and recall rates were comparable to the rates observed in more controlled environments using more valid methods of ground truthing. These initial findings indicate that the system is a valuable, flexible and robust system for monitoring behaviour in its natural environment that would allow new research questions to be addressed.
Toward robot-assisted neurosurgical lasers.
Motkoski, Jason W; Yang, Fang Wei; Lwu, Shelly H H; Sutherland, Garnette R
2013-04-01
Despite the potential increase in precision and accuracy, laser technology is not widely used in neurological surgery. This in part relates to challenges associated with the early introduction of lasers into neurosurgery. Considerable advances in laser technology have occurred, which together with robotic technology could create an ideal platform for neurosurgical application. In this study, a 980-nm contact diode laser was integrated with neuroArm. Preclinical evaluation involved partial hepatectomy, bilateral nephrectomy, splenectomy, and bilateral submandibular gland excision in a Sprague-Dawley rat model (n = 50). Total surgical time, blood loss as weight of surgical gauze before and after the procedure, and the incidence of thermal, vascular, or lethal injury were recorded and converted to an overall performance score. Thermal damage was evaluated in the liver using tissue samples stained with hematoxylin and eosin. Clinical studies involved step-wise integration of the 980-nm laser system into four neurosurgical cases. Results demonstrate the successful integration of contact laser technology into microsurgery, with and without robotic assistance. In preclinical studies, the laser improved microsurgical performance and reduced thermal damage, while neuroArm decreased intra- and intersurgeon variability. Clinical studies demonstrated utility in meningioma resection (n = 4). Together, laser and robotic technology offered a more consistent, expedient, and precise tool for microsurgery.
Mombo, S; Dumat, C; Shahid, M; Schreck, E
2017-02-01
Due to its high adaptability, cassava (Manihot esculenta Crantz) is one of the world's most cultivated and consumed plants after maize and rice. However, there are relatively few scientific studies on this important crop. The objective of this review was therefore to summarize and discuss the available information on cassava cropping in order to promote sustainable practices in terms of production and consumption. Cassava cultivation has been expanding recently at the global scale and is widely consumed in most regions of South America, Africa, and Asia. However, it is also characterized by the presence in its roots of potentially toxic hydrocyanic acid. Furthermore, cassava can also absorb pollutants as it is currently cultivated near roads or factories and generally without consideration for potential sources of soil, water, or atmospheric pollution. Careful washing, peeling, and adequate preparation before eating are therefore crucial steps for reducing human exposure to both environmental pollutants and natural hydrocyanic acid. At present, there is not enough precise data available on this staple food crop. To improve our knowledge on the nutritive benefits versus health risks associated with cassava consumption, further research is necessary to compare cassava cultivars and precisely study the influence of preparation methods.
Teleportation of quantum resources and quantum Fisher information under Unruh effect
NASA Astrophysics Data System (ADS)
Jafarzadeh, M.; Rangani Jahromi, H.; Amniat-Talab, M.
2018-07-01
Considering a pair of Unruh-DeWitt detectors, one kept inertial and the other accelerated and coupled to a scalar field, we address the teleportation of a two-qubit entangled state |ψ_in⟩ = cos(θ/2)|10⟩ + e^{iφ} sin(θ/2)|01⟩ through the quantum channel created by this system, and investigate how the thermal noise induced by the Unruh effect affects the teleportation of quantum resources and quantum Fisher information (QFI). Our results show that while the teleported quantum resources and the QFI with respect to the phase parameter φ, F_out(φ), decrease with increasing acceleration and effective coupling, the QFI with respect to the weight parameter θ, F_out(θ), interestingly increases beyond a certain value of acceleration and effective coupling. We also find that the teleported quantum resources and the precision of estimating the phase parameter φ can be improved by a more entangled input state and a more entangled channel. Moreover, the precision of estimating the weight parameter θ increases for a maximally entangled input state only in the large-acceleration regime, while it does not change considerably for either maximally or partially entangled states of the channel. The influence of the Unruh effect on the fidelity of teleportation is also investigated. We show that for small effective coupling the average fidelity is always larger than 2/3.
Hatfield, L.A.; Gutreuter, S.; Boogaard, M.A.; Carlin, B.P.
2011-01-01
Estimation of extreme quantal-response statistics, such as the concentration required to kill 99.9% of test subjects (LC99.9), remains a challenge in the presence of multiple covariates and complex study designs. Accurate and precise estimates of the LC99.9 for mixtures of toxicants are critical to ongoing control of a parasitic invasive species, the sea lamprey, in the Laurentian Great Lakes of North America. The toxicity of those chemicals is affected by local and temporal variations in water chemistry, which must be incorporated into the modeling. We develop multilevel empirical Bayes models for data from multiple laboratory studies. Our approach yields more accurate and precise estimation of the LC99.9 compared to alternative models considered. This study demonstrates that properly incorporating hierarchical structure in laboratory data yields better estimates of LC99.9 stream treatment values that are critical to larvae control in the field. In addition, out-of-sample prediction of the results of in situ tests reveals the presence of a latent seasonal effect not manifest in the laboratory studies, suggesting avenues for future study and illustrating the importance of dual consideration of both experimental and observational data. © 2011, The International Biometric Society.
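As a simplified illustration of the underlying quantal-response problem (not the authors' multilevel empirical Bayes model), an extreme quantile such as the LC99.9 can be read off a fitted logit dose-response line. The sketch below uses plain logit regression on grouped data; the function name and the continuity correction are our own choices.

```python
import numpy as np

def lc_quantile(conc, n, killed, q=0.999):
    """Estimate the LCq from grouped dose-response data via logit regression.

    Empirical logits (with a 0.5 continuity correction) are regressed on
    log10(concentration); the fitted line is then inverted at logit(q).
    """
    conc = np.asarray(conc, float)
    n = np.asarray(n, float)
    k = np.asarray(killed, float)
    p = (k + 0.5) / (n + 1.0)          # continuity-corrected proportions
    y = np.log(p / (1 - p))            # empirical logits
    x = np.log10(conc)
    b, a = np.polyfit(x, y, 1)         # fitted line: y ≈ a + b*x
    target = np.log(q / (1 - q))       # logit of the desired quantile
    return 10 ** ((target - a) / b)
```

Because logit(0.999) ≈ 6.9 sits far out on the fitted line, small errors in the slope are amplified, which is exactly why the paper's pooling of information across studies pays off for extreme quantiles.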
NASA Astrophysics Data System (ADS)
Ciofu, C.; Stan, G.
2016-08-01
The paper examines the positioning precision of an elephant's-trunk robotic arm whose joints are driven by wires of variable length during operation. The considered 5-degree-of-freedom robotic arm has a particular joint structure that makes inner actuation with a wire-driven mechanism possible. We analyse solely the change in wire length caused by inner winding and unwinding on the joints for given values of the rotation angles. Variations in wire length entail joint angular displacements. We analyse positioning precision using equations from the inverse kinematics of the elephant's-trunk robotic arm. The joint angular displacements enter the computation after partial differentiation of the positioning equations. We obtain wire-length variations of about tenths of micrometers. These variations produce angular displacements on the order of minutes of sexagesimal degree, which define the positioning precision of elephant's-trunk robotic arms. The analytical method can be used, once the structure is designed, to determine the effect of inner wire actuation on the positioning precision of an elephant's-trunk robotic arm, so that designers can make suitable decisions on the accuracy specification limits of the arm.
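The error-propagation step described above, partial derivatives of the positioning equations mapping wire-length variations to joint and tip errors, can be sketched numerically. The planar-arm geometry, pulley radii, and function names below are illustrative assumptions, not the paper's actual 5-DOF model.

```python
import numpy as np

# Hypothetical geometry: each joint i is wound by a wire on a pulley of
# radius r[i], so a wire-length error dl_i produces a joint error
# dtheta_i = dl_i / r[i]. A planar serial arm then maps joint angles to
# the tip position.

def tip_position(theta, L):
    """Forward kinematics of a planar serial arm (angles in rad)."""
    L = np.asarray(L, float)
    phi = np.cumsum(theta)                 # absolute link orientations
    return np.array([np.sum(L * np.cos(phi)), np.sum(L * np.sin(phi))])

def tip_error_from_wire_error(theta, L, r, dl):
    """First-order tip positioning error caused by wire-length errors dl."""
    dtheta = np.asarray(dl, float) / np.asarray(r, float)
    # numerical Jacobian d(tip)/d(theta), i.e. the partial derivatives
    eps = 1e-7
    J = np.zeros((2, len(theta)))
    for i in range(len(theta)):
        tp = np.array(theta, float)
        tp[i] += eps
        J[:, i] = (tip_position(tp, L) - tip_position(theta, L)) / eps
    return J @ dtheta
```

With wire errors of tenths of micrometers and pulley radii of a few millimetres, the resulting joint errors indeed land in the arcminute range the paper reports.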
Problems, challenges and promises: perspectives on precision medicine.
Duffy, David J
2016-05-01
The 'precision medicine (systems medicine)' concept promises to achieve a shift to future healthcare systems with a more proactive and predictive approach to medicine, where the emphasis is on disease prevention rather than the treatment of symptoms. The individualization of treatment for each patient will be at the centre of this approach, with all of a patient's medical data being computationally integrated and accessible. Precision medicine is being rapidly embraced by biomedical researchers, pioneering clinicians and scientific funding programmes in both the European Union (EU) and USA. Precision medicine is a key component of both Horizon 2020 (the EU Framework Programme for Research and Innovation) and the White House's Precision Medicine Initiative. Precision medicine promises to revolutionize patient care and treatment decisions. However, the participants in precision medicine are faced with a considerable central challenge. Greater volumes of data from a wider variety of sources are being generated and analysed than ever before; yet, this heterogeneous information must be integrated and incorporated into personalized predictive models, the output of which must be intelligible to non-computationally trained clinicians. Drawing primarily from the field of 'oncology', this article will introduce key concepts and challenges of precision medicine and some of the approaches currently being implemented to overcome these challenges. Finally, this article also covers the criticisms of precision medicine overpromising on its potential to transform patient care. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
Overcoming gaps and bottlenecks to advance precision agriculture
USDA-ARS?s Scientific Manuscript database
Maintaining a clear understanding of the technology gaps, knowledge needs, and training bottlenecks is required for improving adoption of precision agriculture. As an industry, precision agriculture embraces tools, methods, and practices that are constantly changing, requiring industry, education, a...
Assembling Precise Truss Structures With Minimal Stresses
NASA Technical Reports Server (NTRS)
Sword, Lee F.
1996-01-01
Improved method of assembling precise truss structures involves use of simple devices. Tapered pins that fit in tapered holes indicate deviations from prescribed lengths. Method both helps to ensure precision of finished structures and minimizes residual stresses within structures.
Multi-objective optimization in quantum parameter estimation
NASA Astrophysics Data System (ADS)
Gong, BeiLi; Cui, Wei
2018-04-01
We investigate quantum parameter estimation based on linear and Kerr-type nonlinear controls in an open quantum system, treating the dissipation rate as an unknown parameter. We show that while the precision of parameter estimation is improved, this usually introduces a significant deformation of the system state. We therefore propose a multi-objective model to optimize the two conflicting objectives: (1) maximizing the Fisher information, which improves the parameter estimation precision, and (2) minimizing the deformation of the system state, which maintains its fidelity. Finally, simulations of a simplified ɛ-constrained model demonstrate the feasibility of Hamiltonian control in improving the precision of quantum parameter estimation.
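For intuition on the first objective, the classical Fisher information about a dissipation rate obtained from a simple survival measurement has a closed form. This toy example, with our own names and a plain exponential-decay model rather than the paper's controlled open-system dynamics, shows why the measurement time itself is a tunable control.

```python
import numpy as np

def fisher_decay(gamma, t):
    """Per-shot classical Fisher information about a decay rate gamma when
    measuring survival (a Bernoulli outcome with p = exp(-gamma*t)) at time t.

    For a Bernoulli trial, F = (dp/dgamma)^2 / (p * (1 - p)).
    """
    p = np.exp(-gamma * t)
    dp = -t * p                        # sensitivity of p to gamma
    return dp ** 2 / (p * (1 - p))
```

The information vanishes at both very short times (nothing has decayed) and very long times (everything has), so there is an interior optimum; choosing controls to sit near such an optimum while limiting state deformation is the trade-off the multi-objective model formalizes.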
Mechanisms of Interaction in Speech Production
ERIC Educational Resources Information Center
Baese-Berk, Melissa; Goldrick, Matthew
2009-01-01
Many theories predict the presence of interactive effects involving information represented by distinct cognitive processes in speech production. There is considerably less agreement regarding the precise cognitive mechanisms that underlie these interactive effects. For example, are they driven by purely production-internal mechanisms (e.g., Dell,…
NASA Technical Reports Server (NTRS)
Patankar, Kunal; Fitz-Coy, Norman; Roithmayr, Carlos M.
2014-01-01
This paper presents the design as well as characterization of a practical control moment gyroscope (CMG) based attitude control system (ACS) for small satellites in the 15-20 kilogram mass range performing rapid retargeting and precision pointing maneuvers. The paper focuses on the approach taken in the design of miniaturized CMGs while considering the constraints imposed by the use of commercial off-the-shelf (COTS) components as well as the size of the satellite. It is shown that a hybrid mode is more suitable for COTS based moment exchange actuators; a mode that uses the torque amplification of CMGs for rapid retargeting and direct torque capabilities of the flywheel motors for precision pointing. A simulation is provided to demonstrate on-orbit slew and pointing performance.
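The torque amplification that motivates the hybrid mode follows from gimballing a constant-magnitude flywheel momentum: the output torque is the gimbal rate times the stored momentum, which can far exceed the direct torque of a small flywheel motor. The numbers below are illustrative assumptions, not the paper's hardware values.

```python
def cmg_output_torque(h, gimbal_rate):
    """Torque produced by gimballing a constant-momentum flywheel:
    |tau| = |dh/dt| = h * gimbal_rate (the CMG torque-amplification effect)."""
    return h * gimbal_rate

# Hypothetical small-satellite numbers:
h = 0.1            # N*m*s of stored flywheel momentum
gimbal_rate = 1.0  # rad/s commanded during a rapid slew
rw_torque = 0.005  # N*m, a plausible direct COTS flywheel-motor torque

tau_cmg = cmg_output_torque(h, gimbal_rate)   # 0.1 N*m for rapid retargeting
# tau_cmg >> rw_torque: use CMG mode to slew, then the flywheel motors'
# small direct torque for the fine precision-pointing phase.
```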
Research on particle swarm optimization algorithm based on optimal movement probability
NASA Astrophysics Data System (ADS)
Ma, Jianhong; Zhang, Han; He, Baofeng
2017-01-01
The particle swarm optimization (PSO) algorithm can improve control precision and has great application value in fields such as neural network training and fuzzy system control. When the traditional particle swarm algorithm is used to train feed-forward neural networks, search efficiency is low and the search easily falls into local convergence. An improved particle swarm optimization algorithm is therefore proposed, based on error back-propagation gradient descent. Particles are ranked by fitness and the optimization problem is considered as a whole; a BP neural network is trained by error back-propagation gradient descent, and each particle updates its velocity and position according to its individual best and the global best. Biasing the particles toward social (global-best) learning and away from individual-best learning helps prevent them from falling into local optima, while the gradient information accelerates the local search ability of PSO and improves search efficiency, realizing the improved particle swarm optimization algorithm. Simulation results show that the algorithm converges rapidly toward the global optimal solution in the initial stage and then remains close to it; in the same running time it exhibits faster convergence and better search performance, improving the convergence speed of the algorithm and especially the efficiency of the later search stage.
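A generic sketch of the hybrid the abstract describes, standard PSO velocity updates plus a small gradient-descent step per particle, is given below; parameter values and names are ours, and this is not the authors' exact algorithm.

```python
import numpy as np

def pso_grad(f, grad, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5,
             eta=0.01, bounds=(-5.0, 5.0), seed=0):
    """Minimize f with PSO plus a gradient step accelerating local search.

    Velocity update: inertia + cognitive pull toward each particle's best
    + social pull toward the global best; the position update subtracts a
    small gradient-descent step eta * grad(x).
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n, dim))
    v = np.zeros((n, dim))
    pbest = x.copy()
    pval = np.array([f(p) for p in x])
    g = pbest[np.argmin(pval)].copy()          # global best position
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        # gradient-assisted position update (the back-propagation role)
        x = np.clip(x + v - eta * np.array([grad(p) for p in x]), lo, hi)
        val = np.array([f(p) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[np.argmin(pval)].copy()
    return g, pval.min()
```

On a smooth test function the gradient term sharpens the late-stage local search, which is the behaviour the abstract attributes to the improved algorithm.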
Effect of the Level of Coordinated Motor Abilities on Performance in Junior Judokas
Lech, Grzegorz; Jaworski, Janusz; Lyakh, Vladimir; Krawczyk, Robert
2011-01-01
The main focus of this study was to identify coordinated motor abilities that affect fighting methods and performance in junior judokas. Subjects were selected for the study in consideration of their age, competition experience, body mass and prior sports level. Subjects’ competition history was taken into consideration when analysing the effectiveness of current fight actions, and individual sports level was determined with consideration to rank in the analysed competitions. The study sought to determine the level of coordinated motor abilities of competitors. The scope of this analysis covered the following aspects: kinaesthetic differentiation, movement frequency, simple and selective reaction time (evoked by a visual or auditory stimulus), spatial orientation, visual-motor coordination, rhythmization, speed, accuracy and precision of movements and the ability to adapt movements and balance. A set of computer tests was employed for the analysis of all of the coordination abilities, while balance examinations were based on the Flamingo Balance Test. Finally, all relationships were determined based on the Spearman’s rank correlation coefficient. It was observed that the activity of the contestants during the fight correlated with the ability to differentiate movements and speed, accuracy and precision of movement, whereas the achievement level during competition was connected with reaction time. PMID:23486723
Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.
1989-01-01
The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.
Rock, Jack P; Ryu, Samuel; Yin, Fang-Fang; Schreiber, Faye; Abdulhak, Muwaffak
2004-01-01
Traditional management strategies for patients with spinal tumors have undergone considerable changes during the last 15 years. Significant improvements in digital imaging, computer processing, and treatment planning have provided the basis for the application of stereotactic techniques, now the standard of care for intracranial pathology, to spinal pathology. In addition, certain of these improvements have also allowed us to progress from frame-based to frameless systems which now act to accurately assure the delivery of high doses of radiation to a precisely defined target volume while sparing injury to adjacent normal tissues. In this article we will describe the evolution from yesterday's standards for radiation therapy to the current state of the art for the treatment of patients with spinal tumors. This presentation will include a discussion of radiation dosing and toxicity, the overall process of extracranial radiation delivery, and the current state of the art regarding Cyberknife, Novalis, and tomotherapy. Additional discussion relating current research protocols and future directions for the management of benign tumors of the spine will also be presented.
Basic principles of stability.
Egan, William; Schofield, Timothy
2009-11-01
An understanding of the principles of degradation, as well as the statistical tools for measuring product stability, is essential to the management of product quality. Key to this is management of vaccine potency. Vaccine shelf life is best managed through determination of a minimum potency release requirement, which helps assure adequate potency through expiry. Statistical tools such as least squares regression analysis should be employed to model potency decay. The use of such tools provides an incentive to properly design vaccine stability studies, whereas holding stability measurements to specification presents a disincentive for collecting valuable data. The laws of kinetics, such as Arrhenius behavior, help practitioners design effective accelerated stability programs, which can be utilized to manage stability after a process change. The design of stability studies should be carefully considered, with an eye to minimizing the variability of the stability parameter. In the case of measuring the degradation rate, testing at the beginning and the end of the study improves the precision of this estimate. Additional design considerations such as bracketing and matrixing improve the efficiency of stability evaluation of vaccines.
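The design point about testing at the beginning and end of a study can be made quantitative: for a straight-line potency model with independent measurement errors, the standard error of the least-squares slope depends only on the spread of the time points. A small sketch with our own illustrative assay schedules:

```python
import numpy as np

def slope_se(times, sigma=1.0):
    """Standard error of the least-squares slope (degradation rate) for
    assays at the given times, assuming independent errors with sd sigma:
    SE(slope) = sigma / sqrt(sum((t - tbar)^2))."""
    t = np.asarray(times, float)
    return sigma / np.sqrt(np.sum((t - t.mean()) ** 2))

# Same number of assays (6) over a 24-month study, two candidate schedules:
uniform   = [0.0, 4.8, 9.6, 14.4, 19.2, 24.0]   # evenly spaced
endpoints = [0.0, 0.0, 0.0, 24.0, 24.0, 24.0]   # concentrated at the ends
```

Concentrating assays at the endpoints maximizes the spread of the time points and hence minimizes the slope's standard error, though in practice some interior points are retained to check linearity.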
Positioning challenges in reconfigurable semi-autonomous robotic NDE inspection
NASA Astrophysics Data System (ADS)
Pierce, S. Gareth; Dobie, Gordon; Summan, Rahul; Mackenzie, Liam; Hensman, James; Worden, Keith; Hayward, Gordon
2010-03-01
This paper describes work conducted into mobile, wireless, semi-autonomous NDE inspection robots developed at The University of Strathclyde as part of the UK Research Centre for Non Destructive Evaluation (RCNDE). The inspection vehicles can incorporate a number of different NDE payloads, including ultrasonic, eddy-current, visual, and magnetic payloads, and have been developed to improve NDE inspection in challenging areas (for example oil, gas, and nuclear structures). A significant research challenge remains in the accurate positioning and guidance of such vehicles for real inspection tasks. Employing both relative and absolute position measurements, we discuss a number of approaches to position estimation, including Kalman and particle filtering. Using probabilistic approaches enables a common mathematical framework to be employed for both positioning and data fusion from different NDE sensors. In this fashion the uncertainties in both position and defect identification and classification can be handled using a consistent approach. A number of practical constraints and considerations for different precision positioning techniques are discussed, along with NDE applications and the potential for improved inspection capabilities obtained by utilising the inherent reconfigurability of the inspection vehicles.
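As a minimal illustration of the Kalman filtering mentioned above (not the vehicles' actual estimator), a 1D constant-velocity filter fusing noisy absolute position fixes might look like the following; the noise parameters are illustrative assumptions.

```python
import numpy as np

def kalman_1d(z, q=1e-3, r=0.25, dt=1.0):
    """1D constant-velocity Kalman filter over noisy position fixes z.

    State is [position, velocity]; only position is observed. q scales the
    process noise (model uncertainty), r is the measurement variance.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])                    # state transition
    H = np.array([[1.0, 0.0]])                               # observe position
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])                      # process noise
    R = np.array([[r]])                                      # measurement noise
    x = np.array([[z[0]], [0.0]])                            # initial state
    P = np.eye(2)                                            # initial covariance
    est = []
    for zk in z:
        # predict with the constant-velocity model
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the absolute position fix
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[zk]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        est.append(x[0, 0])
    return np.array(est)
```

The same predict/update structure extends to fusing odometry with NDE-sensor landmarks, which is where the common probabilistic framework the paper describes pays off.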
Piezoelectric technology in otolaryngology, and head and neck surgery: a review.
Meller, C; Havas, T E
2017-07-01
Piezoelectric technology has existed for many years as a surgical tool for precise removal of soft tissue and bone. The existing literature regarding its use specifically for otolaryngology, and head and neck surgery was reviewed. The databases Medline, the Cochrane Central Register of Controlled Trials, PubMed, Embase and Cambridge Scientific Abstracts were searched. Studies were selected and reviewed based on relevance. Sixty studies were identified and examined for evidence of benefits and disadvantages of piezoelectric surgery and its application in otolaryngology. The technique was compared with traditional surgical methods, in terms of intra-operative bleeding, histology, learning curve, operative time and post-operative pain. Piezoelectric technology has been successfully employed, particularly in otology and skull base surgery, where its specific advantages versus traditional drills include a lack of 'blunting' and tissue selectivity. Technical advantages include ease of use, a short learning curve and improved visibility. Its higher cost warrants consideration given that clinically significant improvements in operative time and morbidity have not yet been proven. Further studies may define the evolving role of piezoelectric surgery in otolaryngology, and head and neck surgery.
NASA Astrophysics Data System (ADS)
Mills, Cameron; Tiwari, Vaibhav; Fairhurst, Stephen
2018-05-01
The observation of gravitational wave signals from binary black hole and binary neutron star mergers has established the field of gravitational wave astronomy. It is expected that future networks of gravitational wave detectors will possess great potential in probing various aspects of astronomy. An important consideration for successive improvement of current detectors or establishment of new sites is knowledge of the minimum number of detectors required to perform precision astronomy. We attempt to answer this question by assessing the ability of future detector networks to detect and localize binary neutron star mergers on the sky. Good localization ability is crucial for many of the scientific goals of gravitational wave astronomy, such as electromagnetic follow-up, measuring the properties of compact binaries throughout cosmic history, and cosmology. We find that although two detectors at improved sensitivity are sufficient to obtain a substantial increase in the number of observed signals, at least three detectors of comparable sensitivity are required to localize the majority of the signals, typically to within around 10 deg², adequate for follow-up with most wide field of view optical telescopes.
A Combined Hazard Index Fire Test Methodology for Aircraft Cabin Materials. Volume II.
1982-04-01
Technical Center. The report was divided into two parts: Part I described the improved technology investigated to upgrade existing methods for testing...proper implementation of the computerized data acquisition and reduction programs will improve materials hazards measurement precision. Thus, other...the hold chamber before and after injection of a sample, will improve precision and repeatability of measurement. The listed data acquisition and
Cognition-Based Approaches for High-Precision Text Mining
ERIC Educational Resources Information Center
Shannon, George John
2017-01-01
This research improves the precision of information extraction from free-form text via the use of cognitive-based approaches to natural language processing (NLP). Cognitive-based approaches are an important, and relatively new, area of research in NLP and search, as well as linguistics. Cognitive approaches enable significant improvements in both…
PV cells electrical parameters measurement
NASA Astrophysics Data System (ADS)
Cibira, Gabriel
2017-12-01
When the optical parameters of a photovoltaic silicon cell are measured precisely, good estimates of the electrical parameters can be obtained by applying well-known physical-mathematical models. Nevertheless, considerable recombination phenomena might occur in both surface and intrinsic thin layers within novel materials. Moreover, rear contact surface parameters may influence close-area recombination phenomena, too. Therefore, the only precise approach is to verify the assumed cell electrical parameters by direct electrical measurement. Based on a theoretical approach supported by experiments, this paper analyses, as a case study, problems within the measurement procedures and equipment used for acquiring the electrical parameters of a photovoltaic silicon cell. A statistical appraisal of measurement quality is also contributed.
Quadratic electroweak corrections for polarized Moller scattering
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. Aleksejevs, S. Barkanova, Y. Kolomensky, E. Kuraev, V. Zykunov
2012-01-01
The paper discusses the two-loop (NNLO) electroweak radiative corrections to the parity violating electron-electron scattering asymmetry induced by squaring one-loop diagrams. The calculations are relevant for the ultra-precise 11 GeV MOLLER experiment planned at Jefferson Laboratory and experiments at high-energy future electron colliders. The imaginary parts of the amplitudes are taken into consideration consistently in both the infrared-finite and divergent terms. The size of the obtained partial correction is significant, which indicates a need for a complete study of the two-loop electroweak radiative corrections in order to meet the precision goals of future experiments.
Vicini, P; Fields, O; Lai, E; Litwack, E D; Martin, A-M; Morgan, T M; Pacanowski, M A; Papaluca, M; Perez, O D; Ringel, M S; Robson, M; Sakul, H; Vockley, J; Zaks, T; Dolsten, M; Søgaard, M
2016-02-01
High throughput molecular and functional profiling of patients is a key driver of precision medicine. DNA and RNA characterization has been enabled at unprecedented cost and scale through rapid, disruptive progress in sequencing technology, but challenges persist in data management and interpretation. We analyze the state of the art of large-scale unbiased sequencing (LUS) in drug discovery and development, including technology, application, ethical, regulatory, policy and commercial considerations, and discuss issues of LUS implementation in clinical and regulatory practice. © 2015 American Society for Clinical Pharmacology and Therapeutics.
3D display considerations for rugged airborne environments
NASA Astrophysics Data System (ADS)
Barnidge, Tracy J.; Tchon, Joseph L.
2015-05-01
The KC-46 is the next generation, multi-role, aerial refueling tanker aircraft being developed by Boeing for the United States Air Force. Rockwell Collins has developed the Remote Vision System (RVS) that supports aerial refueling operations under a variety of conditions. The system utilizes large-area, high-resolution 3D displays linked with remote sensors to enhance the operator's visual acuity for precise aerial refueling control. This paper reviews the design considerations, trade-offs, and other factors related to the selection and ruggedization of the 3D display technology for this military application.
Wang, Qiang; Liu, Yuefei; Chen, Yiqiang; Ma, Jing; Tan, Liying; Yu, Siyuan
2017-03-01
Accurate location computation for a beacon is an important factor in the reliability of satellite optical communications. However, location precision is generally limited by the resolution of the CCD, so improving the location precision of a beacon is an important and urgent issue. In this paper, we present two precise centroid computation methods for locating a beacon in satellite optical communications. First, the beacon is divided into several parts according to its gray gradients. Afterward, different numbers of interpolation points and different interpolation methods are applied in the interpolation area; we calculate the centroid position after interpolation and choose the best strategy according to the algorithm. This method is called the gradient segmentation interpolation (GSI) algorithm. To take full advantage of the pixels of the beacon's central portion, we also present an improved segmentation square weighting (SSW) algorithm, whose effectiveness is verified by a simulation experiment. Finally, an experiment is established to verify the GSI and SSW algorithms. The results indicate that both algorithms improve locating accuracy over that of the traditional gray centroid method. These approaches help to greatly improve the location precision for a beacon in satellite optical communications.
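The traditional gray centroid method that serves as the paper's baseline is simply an intensity-weighted mean of pixel coordinates. A minimal sketch, with a made-up symmetric spot in place of real CCD data (the GSI and SSW refinements themselves are not reproduced here):

```python
import numpy as np

# Gray (intensity-weighted) centroid of a beacon spot on a CCD image.
def gray_centroid(img):
    img = np.asarray(img, dtype=float)
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (xs * img).sum() / total, (ys * img).sum() / total

# Symmetric spot centred at pixel (2, 2) on a 5x5 grid (invented data)
spot = np.array([[0, 1, 2, 1, 0],
                 [1, 4, 8, 4, 1],
                 [2, 8, 16, 8, 2],
                 [1, 4, 8, 4, 1],
                 [0, 1, 2, 1, 0]])
cx, cy = gray_centroid(spot)
print(cx, cy)  # → 2.0 2.0 by symmetry
```

Because the weighted mean can resolve fractions of a pixel, this already gives sub-pixel location; the interpolation and central-pixel weighting schemes described above aim to push precision further for asymmetric, noisy spots.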
Use of controlled vocabularies to improve biomedical information retrieval tasks.
Pasche, Emilie; Gobeill, Julien; Vishnyakova, Dina; Ruch, Patrick; Lovis, Christian
2013-01-01
The high heterogeneity of biomedical vocabulary is a major obstacle for information retrieval in large biomedical collections. Therefore, using biomedical controlled vocabularies is crucial for managing these contents. We investigate the impact of query expansion based on controlled vocabularies to improve the effectiveness of two search engines. Our strategy relies on the enrichment of users' queries with additional terms, directly derived from such vocabularies applied to infectious diseases and chemical patents. We observed that query expansion based on pathogen names resulted in improvements of the top-precision of our first search engine, while the normalization of diseases degraded the top-precision. The expansion of chemical entities, which was performed on the second search engine, positively affected the mean average precision. We have shown that query expansion of some types of biomedical entities has great potential to improve search effectiveness; therefore, fine-tuning query expansion strategies could help improve the performance of search engines.
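The enrichment strategy described, adding vocabulary-derived synonyms to the user's query before searching, can be sketched in a few lines. The vocabulary entries below are invented for illustration and are not from the paper's actual controlled vocabularies:

```python
# Toy query expansion with a controlled vocabulary: each recognized
# token is replaced by its set of synonym terms (vocabulary is made up).
vocabulary = {
    "mrsa": ["methicillin-resistant staphylococcus aureus", "mrsa"],
    "flu": ["influenza", "flu"],
}

def expand(query):
    terms = []
    for tok in query.lower().split():
        terms.extend(vocabulary.get(tok, [tok]))
    # dict.fromkeys deduplicates while preserving order
    return " OR ".join(dict.fromkeys(terms))

print(expand("mrsa treatment"))
# → methicillin-resistant staphylococcus aureus OR mrsa OR treatment
```

Whether such expansion helps or hurts depends on the entity type, as the abstract notes: pathogen synonyms improved top-precision while disease normalization degraded it, so expansion rules need per-category tuning.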
Leonardi, Maria Cristina; Ricotti, Rosalinda; Dicuonzo, Samantha; Cattani, Federica; Morra, Anna; Dell'Acqua, Veronica; Orecchia, Roberto; Jereczek-Fossa, Barbara Alicja
2016-10-01
Radiotherapy improves local control in breast cancer (BC) patients, which increases overall survival in the long term. Improvements in treatment planning and delivery and a greater understanding of BC behaviour have laid the groundwork for high-precision radiotherapy, which is bound to further improve the therapeutic index. Precise identification of target volumes, better coverage and dose homogeneity have had a positive impact on toxicity and local control. The conformity of the treatment dose due to three-dimensional radiotherapy and new techniques such as intensity modulated radiotherapy makes it possible to spare surrounding normal tissue. The widespread use of dose-volume constraints and histograms has increased awareness of toxicity. Real-time image guidance has improved geometric precision and accuracy, together with the implementation of quality assurance programs. Advances in the precision of radiotherapy are also based on the choice of the appropriate fractionation and approach. Adaptive radiotherapy is not only a technical concept but also a biological one, based on the knowledge that different types of BC have distinctive patterns of locoregional spread. A greater understanding of cancer biology helps in choosing the treatment best suited to a particular situation. Biomarkers predictive of response play a crucial role. The combination of radiotherapy with molecular targeted therapies may enhance radiosensitivity, thus increasing the cytotoxic effects and improving treatment response. The appropriateness of alternative fractionation, partial breast irradiation, dose escalating/de-escalating approaches and the extent of nodal irradiation have been examined for all the BC subtypes. The broadened concept of adaptive radiotherapy is vital to high-precision treatments. Copyright © 2016 Elsevier Ltd. All rights reserved.
Venne, Gabriel; Rasquinha, Brian J; Pichora, David; Ellis, Randy E; Bicknell, Ryan
2015-07-01
Preoperative planning and intraoperative navigation technologies have each been shown separately to be beneficial for optimizing screw and baseplate positioning in reverse shoulder arthroplasty (RSA) but to date have not been combined. This study describes the development of a system for performing computer-assisted RSA glenoid baseplate and screw placement, including preoperative planning, intraoperative navigation, and postoperative evaluation, and compares this system with a conventional approach. We used a custom-designed system allowing computed tomography (CT)-based preoperative planning, intraoperative navigation, and postoperative evaluation. Five orthopedic surgeons defined common preoperative plans on 3-dimensional CT-reconstructed cadaveric shoulders. Each surgeon performed 3 computer-assisted and 3 conventional simulated procedures. The 3-dimensional CT-reconstructed postoperative units were digitally matched to the preoperative model for evaluation of entry points, end points, and angulations of screws and baseplate. Values were used to find the accuracy and precision of the 2 groups with respect to the defined placement. Statistical analysis was performed by t tests (α = .05). Comparison of the groups revealed no difference in accuracy or precision of screw or baseplate entry points (P > .05). Accuracy and precision were improved with use of navigation for end points and angulations of 3 screws (P < .05). Accuracy of the inferior screw showed a trend of improvement with navigation (P > .05). Navigated baseplate end point precision was improved (P < .05), with a trend toward improved accuracy (P > .05). We conclude that CT-based preoperative planning and intraoperative navigation allow improved accuracy and precision for screw placement, and improved precision for baseplate positioning, with respect to a predefined placement compared with conventional techniques in RSA. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
Defense Acquisitions: Assessments of Selected Weapon Programs
2017-03-01
PAC-3 MSE) 81 Warfighter Information Network-Tactical (WIN-T) Increment 2 83 Improved Turbine Engine Program (ITEP) 85 Long Range Precision Fires...Unmanned Air System 05/2018 —- O Joint Surveillance Target Attack Radar System Recapitalization 10/2017 —- O Improved Turbine Engine Program TBD...Network-Tactical (WIN-T) Increment 2 83 1-page assessments Improved Turbine Engine Program (ITEP) 85 Long Range Precision Fires (LRPF) 86
High precision locating control system based on VCM for Talbot lithography
NASA Astrophysics Data System (ADS)
Yao, Jingwei; Zhao, Lixin; Deng, Qian; Hu, Song
2016-10-01
Aiming at the high precision and efficiency requirements of Z-direction locating in Talbot lithography, a control system based on a voice coil motor (VCM) was designed. In this paper, we build a mathematical model of the VCM and analyze its motion characteristics. A double closed-loop control strategy, comprising a position loop and a current loop, was implemented. The current loop was realized in the driver, in order to achieve rapid tracking of the system current. The position loop was closed by a digital signal processor (DSP), with position feedback provided by high precision linear scales. Feed-forward control and position-feedback proportional-integral-derivative (PID) control were applied to compensate for dynamic lag and improve the response speed of the system. The high precision and efficiency of the system were verified by simulation and experiments. The results demonstrate that the performance of the Z-direction gantry is markedly improved: the system achieves high precision, quick response and strong real-time performance, and can readily be extended to higher precision.
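The position-loop structure described, PID feedback with a feed-forward input, can be sketched as follows. The gains, sample time, and first-order plant are invented for the illustration and are not the paper's values:

```python
# Illustrative discrete PID position loop with a feed-forward input.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured, feedforward=0.0):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        # Feed-forward acts alongside the PID terms to reduce dynamic lag
        return (feedforward + self.kp * err
                + self.ki * self.integral + self.kd * deriv)

pid = PID(kp=2.0, ki=0.5, kd=0.001, dt=0.001)
pos = 0.0                      # stage position (arbitrary units)
for _ in range(5000):
    u = pid.update(1.0, pos)   # command toward a 1.0-unit setpoint
    pos += 0.01 * u            # crude integrating plant response
print(round(pos, 3))           # settles near the 1.0 setpoint
```

In the real system the inner current loop runs in the motor driver at a much higher rate than this outer position loop, which is why the sketch can treat the commanded current as applied instantaneously.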
Akamatsu, Tadashi; Hanai, Ushio; Nakajima, Serina; Kobayashi, Megumi; Miyasaka, Muneo; Matsuda, Shinichi; Ikegami, Mariko
2015-06-20
We report a case of lip repair surgery performed for bilateral cleft lip and palate in a patient diagnosed with trisomy 13 and holoprosencephaly. At the age of 2 years and 7 months, the surgery was performed using a modified De Hann design under general anesthesia. The operation was completed in 1 h and 21 min without large fluctuations in the child's general condition. The precise measurement of the intraoperative design was omitted, and the operation was completed using minimal skin sutures. It is possible to perform less-invasive and short surgical procedures after careful consideration during the preoperative planning. Considering the recent improvements in the life expectancy of patients with trisomy 13, we conclude that surgical treatments for non-life threatening malformations such as cleft lip and palate should be performed for such patients.
Considerations for setting the specifications of vaccines.
Minor, Philip
2012-05-01
The specifications of vaccines are determined by the particular product and its method of manufacture, which raise issues unique to the vaccine in question. However, the general principles are shared, including the need to have sufficient active material to immunize a very high proportion of recipients, an acceptable level of safety, which may require specific testing or may come from the production process, and an acceptable low level of contamination with unwanted materials, which may include infectious agents or materials used in production. These principles apply to the earliest smallpox vaccines and the most recent recombinant vaccines, such as those against HPV. Manufacturing development includes more precise definitions of the product through improved tests and tighter control of the process parameters. Good manufacturing practice plays a major role, which is likely to increase in importance in assuring product quality almost independent of end-product specifications.
Engineering half-Heusler thermoelectric materials using Zintl chemistry
NASA Astrophysics Data System (ADS)
Zeier, Wolfgang G.; Schmitt, Jennifer; Hautier, Geoffroy; Aydemir, Umut; Gibbs, Zachary M.; Felser, Claudia; Snyder, G. Jeffrey
2016-06-01
Half-Heusler compounds based on XNiSn and XCoSb (X = Ti, Zr or Hf) have rapidly become important thermoelectric materials for converting waste heat into electricity. In this Review, we provide an overview on the electronic properties of half-Heusler compounds in an attempt to understand their basic structural chemistry and physical properties, and to guide their further development. Half-Heusler compounds can exhibit semiconducting transport behaviour even though they are described as ‘intermetallic’ compounds. Therefore, it is most useful to consider these systems as rigid-band semiconductors within the framework of Zintl (or valence-precise) compounds. These considerations aid our understanding of their properties, such as the bandgap and low hole mobility because of interstitial Ni defects in XNiSn. Understanding the structural and bonding characteristics, including the presence of defects, will help to develop different strategies to improve and design better half-Heusler thermoelectric materials.
Accuracy optimization with wavelength tunability in overlay imaging technology
NASA Astrophysics Data System (ADS)
Lee, Honggoo; Kang, Yoonshik; Han, Sangjoon; Shim, Kyuchan; Hong, Minhyung; Kim, Seungyoung; Lee, Jieun; Lee, Dongyoung; Oh, Eungryong; Choi, Ahlin; Kim, Youngsik; Marciano, Tal; Klein, Dana; Hajaj, Eitan M.; Aharon, Sharon; Ben-Dov, Guy; Lilach, Saltoun; Serero, Dan; Golotsvan, Anna
2018-03-01
As semiconductor manufacturing technology progresses and the dimensions of integrated circuit elements shrink, overlay budget is accordingly being reduced. Overlay budget closely approaches the scale of measurement inaccuracies due to both optical imperfections of the measurement system and the interaction of light with geometrical asymmetries of the measured targets. Measurement inaccuracies can no longer be ignored due to their significant effect on the resulting device yield. In this paper we investigate a new approach for imaging based overlay (IBO) measurements by optimizing accuracy rather than contrast precision, including its effect over the total target performance, using wavelength tunable overlay imaging metrology. We present new accuracy metrics based on theoretical development and present their quality in identifying the measurement accuracy when compared to CD-SEM overlay measurements. The paper presents the theoretical considerations and simulation work, as well as measurement data, for which tunability combined with the new accuracy metrics is shown to improve accuracy performance.
A knowledge network for a dynamic taxonomy of psychiatric disease.
Krishnan, Ranga R
2015-03-01
Current taxonomic approaches in medicine and psychiatry are limited in validity and utility. They do serve simple communication purposes for medical coding, teaching, and reimbursement, but they are not suited for the modern era with its rapid explosion of knowledge from the "omics" revolution. The National Academy of Sciences published a report entitled Toward Precision Medicine: Building a Knowledge Network for Biomedical Research and a New Taxonomy of Disease. The authors advocate a new taxonomy that would integrate molecular data, clinical data, and health outcomes in a dynamic, iterative fashion, bringing together research, public health, and health-care delivery with the interlinked goals of advancing our understanding of disease pathogenesis and thereby improving health. As the need for an information hub and a knowledge network with a dynamic taxonomy based on integration of clinical and research data is vital, and timely, this proposal merits consideration.
3D sensor placement strategy using the full-range pheromone ant colony system
NASA Astrophysics Data System (ADS)
Shuo, Feng; Jingqing, Jia
2016-07-01
An optimized sensor placement strategy is extremely beneficial for ensuring the safety and reducing the cost of structural health monitoring (SHM) systems. The sensors must be placed such that important dynamic information is obtained while the number of sensors is minimized. Common practice is to select individual sensor directions by 1D sensor placement methods and then install triaxial sensors along those directions for monitoring. However, this may lead to non-optimal placement of many triaxial sensors. In this paper, a new method called FRPACS is proposed, based on the ant colony system (ACS), to solve the optimal placement of triaxial sensors. The triaxial sensors are placed as single units in an optimal fashion. The new method is then compared with other algorithms using the Dalian North Bridge. The computational precision and iteration efficiency of FRPACS are greatly improved compared with the original ACS and the EFI method.
Evaluation of spatial, radiometric and spectral thematic mapper performance for coastal studies
NASA Technical Reports Server (NTRS)
Klemas, V.
1985-01-01
The main emphasis of the research was to determine what effect different wetland plant canopies would have upon observed reflectance in Thematic Mapper bands. The three major vegetation canopy types (broadleaf, gramineous and leafless) produce unique spectral responses for a similar quantity of live biomass. Biomass estimates computed from spectral data were most similar to biomass estimates determined from harvest data when models developed for a specific canopy were used. In other words, the spectral biomass estimate of a broadleaf canopy was most similar to the harvest biomass estimate when a broadleaf canopy radiance model was used. Work is continuing to more precisely determine regression coefficients for each canopy type and to model the change in the coefficients with various combinations of canopy types. Researchers suspect that textural and spatial considerations can be used to identify canopy types and improve biomass estimates from Thematic Mapper data.
Estimating monthly streamflow values by cokriging
Solow, A.R.; Gorelick, S.M.
1986-01-01
Cokriging is applied to estimation of missing monthly streamflow values in three records from gaging stations in west central Virginia. Missing values are estimated from optimal consideration of the pattern of auto- and cross-correlation among standardized residual log-flow records. Investigation of the sensitivity of estimation to data configuration showed that when observations are available within two months of a missing value, estimation is improved by accounting for correlation. Concurrent and lag-one observations tend to screen the influence of other available observations. Three models of covariance structure in residual log-flow records are compared using cross-validation. Models differ in how much monthly variation they allow in covariance. Precision of estimation, reflected in mean squared error (MSE), proved to be insensitive to this choice. Cross-validation is suggested as a tool for choosing an inverse transformation when an initial nonlinear transformation is applied to flow values. © 1986 Plenum Publishing Corporation.
NASA Astrophysics Data System (ADS)
Donkov, N.; Zykova, A.; Safonov, V.; Kolesnikov, D.; Goncharov, I.; Yakovin, S.; Georgieva, V.
2014-05-01
Hydroxyapatite Ca10(PO4)6(OH)2 (HAp) is the material that forms the structural matrix of the mineral phase of bone, dentin and enamel. HAp ceramic materials and coatings are widely applied in medicine and dentistry because of their ability to increase the tissue response to the implant surface and promote bone ingrowth and osseoconduction. The deposition conditions considerably affect the structure and bio-functionality of HAp coatings. We focused our research on developing deposition methods that allow precise control of the structure and stoichiometric composition of HAp thin films. We found that the use of O2 as a reactive gas improves the quality of the sputtered hydroxyapatite coatings, resulting in films of better stoichiometry with a fine crystalline structure.
Bujarski, Spencer; Ray, Lara A.
2016-01-01
In spite of high prevalence and disease burden, scientific consensus on the etiology and treatment of Alcohol Use Disorder (AUD) has yet to be reached. The development and utilization of experimental psychopathology paradigms in the human laboratory represents a cornerstone of AUD research. In this review, we describe and critically evaluate the major experimental psychopathology paradigms developed for AUD, with an emphasis on their implications, strengths, weaknesses, and methodological considerations. Specifically we review alcohol administration, self-administration, cue-reactivity, and stress-reactivity paradigms. We also provide an introduction to the application of experimental psychopathology methods to translational research including genetics, neuroimaging, pharmacological and behavioral treatment development, and translational science. Through refining and manipulating key phenotypes of interest, these experimental paradigms have the potential to elucidate AUD etiological factors, improve the efficiency of treatment developments, and refine treatment targets thus advancing precision medicine. PMID:27266992
Yousefi-Nooraie, Reza; Irani, Shirin; Mortaz-Hedjri, Soroush; Shakiba, Behnam
2013-10-01
The aim of this study was to compare the performance of three search methods in the retrieval of relevant clinical trials from PubMed to answer specific clinical questions. The included studies of a sample of 100 Cochrane reviews recorded in PubMed were considered the reference standard. The search queries were formulated based on the systematic review titles. Precision, recall and the number of retrieved records were compared for limiting the results to the clinical trial publication type and for using the sensitive and specific clinical queries filters. The number of keywords and the presence of specific names of the intervention and syndrome among the search keywords were used in a model to predict the recalls and precisions. The clinical queries-sensitive search strategy retrieved the largest number of records (33) and had the highest recall (41.6%) and lowest precision (4.8%). The presence of a specific intervention name was the only significant predictor of all recalls and precisions (P = 0.016). The recall and precision of combinations of simple clinical search queries and methodological search filters to find clinical trials on various subjects were considerably low. The limit-field strategy yielded higher precision, fewer retrieved records and approximately similar recall compared with the clinical queries-sensitive strategy. The presence of a specific intervention name in the search keywords increased both recall and precision. © 2010 John Wiley & Sons Ltd.
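The recall and precision figures reported above are straightforward set operations against the reference standard. A toy computation, with made-up record identifiers in place of the actual Cochrane-included trials:

```python
# Recall and precision of a search strategy against a reference
# standard of relevant records (identifiers are invented).
relevant = {"111", "222", "333", "444", "555"}          # reference standard
retrieved = {"111", "222", "666", "777", "888", "999"}  # strategy output

true_positives = len(relevant & retrieved)
recall = true_positives / len(relevant)        # share of relevant found
precision = true_positives / len(retrieved)    # share of retrieved relevant
print(recall, precision)  # → 0.4 0.3333333333333333
```

The trade-off in the abstract follows directly from these definitions: a sensitive filter enlarges the retrieved set, which can only raise recall while it typically dilutes precision.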
ERIC Educational Resources Information Center
Bloom, Howard S.; Richburg-Hayes, Lashawn; Black, Alison Rebeck
2007-01-01
This article examines how controlling statistically for baseline covariates, especially pretests, improves the precision of studies that randomize schools to measure the impacts of educational interventions on student achievement. Empirical findings from five urban school districts indicate that (1) pretests can reduce the number of randomized…
A path to precision in the ICU.
Maslove, David M; Lamontagne, Francois; Marshall, John C; Heyland, Daren K
2017-04-03
Precision medicine is increasingly touted as a groundbreaking new paradigm in biomedicine. In the ICU, the complexity and ambiguity of critical illness syndromes have been identified as fundamental justifications for the adoption of a precision approach to research and practice. Inherently protean disease states such as sepsis and acute respiratory distress syndrome have manifestations that are physiologically and anatomically diffuse, and that fluctuate over short periods of time. This leads to considerable heterogeneity among patients, and conditions in which a "one size fits all" approach to therapy can lead to widely divergent results. Current ICU therapy can thus be seen as imprecise, with the potential to realize substantial gains from the adoption of precision medicine approaches. A number of challenges still face the development and adoption of precision critical care, a transition that may occur incrementally rather than wholesale. This article describes a few concrete approaches to addressing these challenges. First, novel clinical trial designs, including registry randomized controlled trials and platform trials, suggest ways in which conventional trials can be adapted to better accommodate the physiologic heterogeneity of critical illness. Second, beyond the "omics" technologies already synonymous with precision medicine, the data-rich environment of the ICU can generate complex physiologic signatures that could fuel precision-minded research and practice. Third, the role of computing infrastructure and modern informatics methods will be central to the pursuit of precision medicine in the ICU, necessitating close collaboration with data scientists. As work toward precision critical care continues, small proof-of-concept studies may prove useful in highlighting the potential of this approach.
Raymond, Mark R; Clauser, Brian E; Furman, Gail E
2010-10-01
The use of standardized patients to assess communication skills is now an essential part of assessing a physician's readiness for practice. To improve the reliability of communication scores, it has become increasingly common in recent years to use statistical models to adjust ratings provided by standardized patients. This study employed ordinary least squares regression to adjust ratings, and then used generalizability theory to evaluate the impact of these adjustments on score reliability and the overall standard error of measurement. In addition, conditional standard errors of measurement were computed for both observed and adjusted scores to determine whether the improvements in measurement precision were uniform across the score distribution. Results indicated that measurement was generally less precise for communication ratings toward the lower end of the score distribution; and the improvement in measurement precision afforded by statistical modeling varied slightly across the score distribution such that the most improvement occurred in the upper-middle range of the score scale. Possible reasons for these patterns in measurement precision are discussed, as are the limitations of the statistical models used for adjusting performance ratings.
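The adjustment step described above can be sketched as a one-predictor ordinary least squares correction: regress observed scores on a rater-leniency covariate and keep the residuals re-centered on the grand mean. The covariate and scores below are hypothetical, and the study's actual models were richer than this minimal sketch.

```python
# Minimal sketch (not the study's actual model): adjust examinee ratings
# for rater leniency with one-predictor ordinary least squares.
# adjusted = observed - b * (leniency - mean leniency), preserving the grand mean.

def ols_adjust(scores, leniency):
    n = len(scores)
    my = sum(scores) / n
    mx = sum(leniency) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(leniency, scores))
    sxx = sum((x - mx) ** 2 for x in leniency)
    b = sxy / sxx  # slope: how much a unit of rater leniency inflates a rating
    return [y - b * (x - mx) for x, y in zip(leniency, scores)]

raw = [70, 75, 80, 85]          # hypothetical communication ratings
lenient = [0, 0, 1, 1]          # hypothetical rater-leniency covariate
adj = ols_adjust(raw, lenient)  # leniency effect removed, grand mean kept
```

Note that the adjustment changes individual scores but leaves the overall mean untouched, which is the property that makes it usable for rank-ordering examinees.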
Design considerations for ultra-precision magnetic bearing supported slides
NASA Technical Reports Server (NTRS)
Slocum, Alexander H.; Eisenhaure, David B.
1993-01-01
Development plans for a prototype servocontrolled machine with 1 angstrom resolution of linear motion and 50 mm range of travel are described. Two such devices could then be combined to produce a two dimensional machine for probing large planar objects with atomic resolution, the Angstrom Resolution Measuring Machine (ARMM).
Phonemic Characteristics of Apraxia of Speech Resulting from Subcortical Hemorrhage
ERIC Educational Resources Information Center
Peach, Richard K.; Tonkovich, John D.
2004-01-01
Reports describing subcortical apraxia of speech (AOS) have received little consideration in the development of recent speech processing models because the speech characteristics of patients with this diagnosis have not been described precisely. We describe a case of AOS with aphasia secondary to basal ganglia hemorrhage. Speech-language symptoms…
Contextual Fear Conditioning in Zebrafish
ERIC Educational Resources Information Center
Kenney, Justin W.; Scott, Ian C.; Josselyn, Sheena A.; Frankland, Paul W.
2017-01-01
Zebrafish are a genetically tractable vertebrate that hold considerable promise for elucidating the molecular basis of behavior. Although numerous recent advances have been made in the ability to precisely manipulate the zebrafish genome, much less is known about many aspects of learning and memory in adult fish. Here, we describe the development…
USSR Report, Political and Sociological Affairs, No. 1438.
1983-08-02
organized manner in spite of a considerable increase in the volume of work. Animal productivity was increased. The plan for procurement of meat, milk ...possible to talk about evidence under conditions of blatant falsification of the facts—has been the culture, or more precisely, the literature, of Central
Thinking with Theory in an Era of Trump
ERIC Educational Resources Information Center
Strom, Kathryn J.; Martin, Adrian D.
2017-01-01
This introduction to this special issue on "Thinking with Theory in Teacher Education" dedicates considerable space to broadly discussing the current U.S. political context to emphasize why, at this precise moment in history, educators, teacher educators, and educational researchers are in dire need of different ways to understand the…
Covariate Imbalance and Precision in Measuring Treatment Effects
ERIC Educational Resources Information Center
Liu, Xiaofeng Steven
2011-01-01
Covariate adjustment can increase the precision of estimates by removing unexplained variance from the error in randomized experiments, although chance covariate imbalance tends to counteract the improvement in precision. The author develops an easy measure to examine chance covariate imbalance in randomization by standardizing the average…
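One common way to quantify chance covariate imbalance, consistent with the standardizing idea the abstract mentions, is a standardized mean difference of the covariate between arms. The pretest values below are illustrative, and this is a generic sketch rather than the author's exact measure.

```python
# Hedged sketch: measure chance covariate imbalance after randomization as a
# standardized mean difference between treatment and control arms.
# (Illustrative data; not the author's specific statistic.)

def standardized_imbalance(treat_cov, control_cov):
    mt = sum(treat_cov) / len(treat_cov)
    mc = sum(control_cov) / len(control_cov)
    def var(xs, m):                       # unbiased sample variance
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    pooled_sd = ((var(treat_cov, mt) + var(control_cov, mc)) / 2) ** 0.5
    return (mt - mc) / pooled_sd

pretest_t = [52.0, 55.0, 49.0, 58.0]      # hypothetical pretest scores, treatment
pretest_c = [50.0, 53.0, 47.0, 56.0]      # hypothetical pretest scores, control
d = standardized_imbalance(pretest_t, pretest_c)
```

Values of d near zero indicate the randomization happened to balance the covariate; larger |d| signals the chance imbalance that can counteract precision gains from adjustment.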
Analysis of de-noising methods to improve the precision of the ILSF BPM electronic readout system
NASA Astrophysics Data System (ADS)
Shafiee, M.; Feghhi, S. A. H.; Rahighi, J.
2016-12-01
Optimum operation and precise control of particle accelerators require beam-position measurements with sub-μm precision. We developed a BPM electronic readout system at the Iranian Light Source Facility, and it has been experimentally tested at the ALBA accelerator facility. The results show a precision of 0.54 μm in beam position measurements. To improve the precision of this beam position monitoring system to the sub-μm level, we have studied different de-noising methods such as principal component analysis, wavelet transforms, FIR filtering, and direct averaging. An evaluation of the noise reduction was performed to verify the ability of these methods. The results show that noise reduction based on the Daubechies wavelet transform outperforms the other algorithms, and that the method is suitable for signal noise reduction in beam position monitoring systems.
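As a rough illustration of wavelet-shrinkage de-noising (the study favored Daubechies wavelets; a one-level Haar transform is used here only because it is the simplest case), the recipe is: transform, soft-threshold the detail coefficients, invert.

```python
# Illustrative sketch only: one-level Haar wavelet de-noising with soft
# thresholding. The paper found Daubechies wavelets best; Haar is used here
# because its transform fits in a few lines. Data and threshold are made up.
import math

def haar_denoise(signal, threshold):
    assert len(signal) % 2 == 0
    s2 = math.sqrt(2.0)
    approx = [(a + b) / s2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / s2 for a, b in zip(signal[::2], signal[1::2])]
    # soft-threshold the detail (noise-dominated) coefficients
    detail = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, detail):          # inverse Haar transform
        out.extend([(a + d) / s2, (a - d) / s2])
    return out

noisy = [1.0, 1.1, 1.0, 0.9, 5.0, 5.1, 5.0, 4.9]   # step signal + small jitter
clean = haar_denoise(noisy, threshold=0.2)          # jitter shrunk to zero
```

Small pairwise fluctuations end up in the detail coefficients and are shrunk away, while the step between the two levels survives in the approximation coefficients.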
Tarone, Aaron M; Foran, David R
2011-01-01
Forensic entomologists use size and developmental stage to estimate blow fly age, and from those, a postmortem interval. Since such estimates are generally accurate but often lack precision, particularly in the older developmental stages, alternative aging methods would be advantageous. Presented here is a means of incorporating developmentally regulated gene expression levels into traditional stage and size data, with a goal of more precisely estimating developmental age of immature Lucilia sericata. Generalized additive models of development showed improved statistical support compared to models that did not include gene expression data, resulting in an increase in estimate precision, especially for postfeeding third instars and pupae. The models were then used to make blind estimates of development for 86 immature L. sericata raised on rat carcasses. Overall, inclusion of gene expression data resulted in increased precision in aging blow flies. © 2010 American Academy of Forensic Sciences.
Kestens, Yan; Wasfi, Rania; Naud, Alexandre; Chaix, Basile
2017-03-01
The aim of this paper is to review the recent advances in health and place research and discuss concepts useful to explore how context affects health. More specifically, it reviews measures and tools used to account for place; concepts relating to daily mobility and multiple exposure to places, and further points to the intertwining between social and spatial networks to help further our understanding of how context translates into health profiles. Significant advances in environmental or neighborhood effects have been made in the last decades. Specifically, conceptual and methodological developments have improved our consideration of spatial processes, shifting from a residential-based view of context to a more dynamic activity space and daily mobility paradigm. Yet, such advances have led to overlooking other potentially important aspects related to social networks and decision-making processes. With an increasing capacity to collect high-precision data on daily mobility and behavior, new possibilities in understanding how environments relate to behavior and health inequalities arise. Two overlooked aspects need to be addressed: the questions of "with or for whom", and "why". While the former calls for a better consideration of social networks and social interactions, the latter calls for refining our understanding of place preference and decision-making leading to daily mobility and multiple exposures.
Increasing value and reducing waste in research design, conduct, and analysis.
Ioannidis, John P A; Greenland, Sander; Hlatky, Mark A; Khoury, Muin J; Macleod, Malcolm R; Moher, David; Schulz, Kenneth F; Tibshirani, Robert
2014-01-11
Correctable weaknesses in the design, conduct, and analysis of biomedical and public health research studies can produce misleading results and waste valuable resources. Small effects can be difficult to distinguish from bias introduced by study design and analyses. An absence of detailed written protocols and poor documentation of research is common. Information obtained might not be useful or important, and statistical precision or power is often too low or used in a misleading way. Insufficient consideration might be given to both previous and continuing studies. Arbitrary choice of analyses and an overemphasis on random extremes might affect the reported findings. Several problems relate to the research workforce, including failure to involve experienced statisticians and methodologists, failure to train clinical researchers and laboratory scientists in research methods and design, and the involvement of stakeholders with conflicts of interest. Inadequate emphasis is placed on recording of research decisions and on reproducibility of research. Finally, reward systems incentivise quantity more than quality, and novelty more than reliability. We propose potential solutions for these problems, including improvements in protocols and documentation, consideration of evidence from studies in progress, standardisation of research efforts, optimisation and training of an experienced and non-conflicted scientific workforce, and reconsideration of scientific reward systems. Copyright © 2014 Elsevier Ltd. All rights reserved.
A phase match based frequency estimation method for sinusoidal signals
NASA Astrophysics Data System (ADS)
Shen, Yan-Lin; Tu, Ya-Qing; Chen, Lin-Jun; Shen, Ting-Ao
2015-04-01
Accurate frequency estimation significantly affects the ranging precision of linear frequency modulated continuous wave (LFMCW) radars. To improve the ranging precision of LFMCW radars, a phase match based frequency estimation method is proposed. To obtain frequency estimates, the linear prediction property, autocorrelation, and cross correlation of sinusoidal signals are utilized. Analysis of the computational complexity shows that the computational load of the proposed method is smaller than those of two-stage autocorrelation (TSA) and maximum likelihood. Simulations and field experiments are performed to validate the proposed method, and the results demonstrate that it achieves better frequency estimation precision than the Pisarenko harmonic decomposition, modified covariance, and TSA methods, which contributes to effectively improving the ranging precision of LFMCW radars.
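The linear prediction property of sinusoids mentioned in the abstract can be illustrated with a minimal estimator: a sinusoid obeys x[n] = 2*cos(w)*x[n-1] - x[n-2] exactly, so cos(w) can be recovered by least squares. This sketch is not the proposed phase-match algorithm itself, only the classical building block it draws on.

```python
# Sketch of the linear-prediction identity for sinusoids (not the paper's
# phase-match method): x[n] = 2*cos(w)*x[n-1] - x[n-2], so cos(w) follows
# from a one-parameter least-squares fit over the samples.
import math

def estimate_frequency(x, fs):
    num = sum(x[n - 1] * (x[n] + x[n - 2]) for n in range(2, len(x)))
    den = sum(2.0 * x[n - 1] ** 2 for n in range(2, len(x)))
    w = math.acos(max(-1.0, min(1.0, num / den)))  # clamp for numerical safety
    return w * fs / (2.0 * math.pi)               # rad/sample -> Hz

fs = 1000.0                                        # sample rate, Hz
f_true = 50.0
x = [math.cos(2 * math.pi * f_true * n / fs) for n in range(200)]
f_hat = estimate_frequency(x, fs)                  # exact for a noiseless tone
```

With noise added, this plain estimator degrades, which is exactly the regime where the autocorrelation-based refinements compared in the abstract matter.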
NASA Astrophysics Data System (ADS)
Wang, C.; Rubin, Y.
2014-12-01
The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to adequate assessment of its mechanical effects on the differential settlement of large continuous structure foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a silty clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precisions and sources of uncertainty. Single CPT soundings were modeled as rational probability density curves by maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions were built from borehole experiments and the potential value of the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated within the Bayesian reverse interpolation framework. The results were compared between Gaussian sequential stochastic simulation and the Bayesian method. The differences between single CPT soundings modeled as normal distributions and as simulated probability density curves based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum and identifies limitations associated with inadequate geostatistical interpolation techniques.
These characterization results provide a multi-precision information assimilation method for other geotechnical parameters.
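The Bayesian fusion step described above, combining a borehole-based prior with a CPT-based likelihood, reduces on a discretized grid to a normalized pointwise product. The Es grid and the PDFs below are illustrative numbers, not site data, and the real method builds these densities from maximum entropy theory rather than assuming them.

```python
# Highly simplified sketch of grid-based Bayesian fusion: posterior ~ prior
# times likelihood, renormalized over a discretized Es grid.
# All numbers are illustrative, not measurements from the Shanghai site.

def posterior(grid, prior, likelihood):
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    width = grid[1] - grid[0]          # uniform grid spacing assumed
    z = sum(unnorm) * width            # normalizing constant (Riemann sum)
    return [u / z for u in unnorm]

es_grid = [4.0, 5.0, 6.0, 7.0, 8.0]             # compression modulus Es, MPa
prior = [0.1, 0.3, 0.3, 0.2, 0.1]               # borehole-based prior PDF
likelihood = [0.05, 0.2, 0.4, 0.25, 0.1]        # CPT-based likelihood
post = posterior(es_grid, prior, likelihood)    # sharper than either input
```

The posterior concentrates where prior and likelihood agree, which is the sense in which combining multi-precision data sources yields more informative estimates than either alone.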
High-speed laser microsurgery of alert fruit flies for fluorescence imaging of neural activity
Sinha, Supriyo; Liang, Liang; Ho, Eric T. W.; Urbanek, Karel E.; Luo, Liqun; Baer, Thomas M.; Schnitzer, Mark J.
2013-01-01
Intravital microscopy is a key means of monitoring cellular function in live organisms, but surgical preparation of a live animal for microscopy often is time-consuming, requires considerable skill, and limits experimental throughput. Here we introduce a spatially precise (<1-µm edge precision), high-speed (<1 s), largely automated, and economical protocol for microsurgical preparation of live animals for optical imaging. Using a 193-nm pulsed excimer laser and the fruit fly as a model, we created observation windows (12- to 350-µm diameters) in the exoskeleton. Through these windows we used two-photon microscopy to image odor-evoked Ca2+ signaling in projection neuron dendrites of the antennal lobe and Kenyon cells of the mushroom body. The impact of a laser-cut window on fly health appears to be substantially less than that of conventional manual dissection, for our imaging durations of up to 18 h were ∼5–20 times longer than prior in vivo microscopy studies of hand-dissected flies. This improvement will facilitate studies of numerous questions in neuroscience, such as those regarding neuronal plasticity or learning and memory. As a control, we used phototaxis as an exemplary complex behavior in flies and found that laser microsurgery is sufficiently gentle to leave it intact. To demonstrate that our techniques are applicable to other species, we created microsurgical openings in nematodes, ants, and the mouse cranium. In conjunction with emerging robotic methods for handling and mounting flies or other small organisms, our rapid, precisely controllable, and highly repeatable microsurgical techniques should enable automated, high-throughput preparation of live animals for optical experimentation. PMID:24167298
Cost Implications of Value-Based Pricing for Companion Diagnostic Tests in Precision Medicine.
Zaric, Gregory S
2016-07-01
Many interpretations of personalized medicine, also referred to as precision medicine, include discussions of companion diagnostic tests that allow drugs to be targeted to those individuals who are most likely to benefit or that allow treatment to be designed in a way such that individuals who are unlikely to benefit do not receive treatment. Many authors have commented on the clinical and competitive implications of companion diagnostics, but there has been relatively little formal analysis of the cost implications of companion diagnostics, although cost reduction is often cited as a significant benefit of precision medicine. We investigate the potential impact on costs of precision medicine implemented through the use of companion diagnostics. We develop a framework in which the costs of companion diagnostic tests are determined by considerations of profit maximization and cost effectiveness. We analyze four scenarios that are defined by the incremental cost-effectiveness ratio of the new drug in the absence of a companion diagnostic test. We find that, in most scenarios, precision medicine strategies based on companion diagnostics should be expected to lead to increases in costs in the short term and that costs would fall only in a limited number of situations.
What Friends Are For: Collaborative Intelligence Analysis and Search
2014-06-01
Subject terms: Intelligence Community, information retrieval, recommender systems, search engines, social networks, user profiling, Lucene... improvements over existing search systems. The improvements are shown to be robust to high levels of human error and low similarity between users... Abbreviations: NOLH, nearly orthogonal Latin hypercubes; P@, precision at documents; RS, recommender systems; TREC, Text REtrieval Conference; USM, user
Cagliani, Alberto; Østerberg, Frederik W; Hansen, Ole; Shiv, Lior; Nielsen, Peter F; Petersen, Dirch H
2017-09-01
We present a breakthrough in micro-four-point probe (M4PP) metrology to substantially improve precision of transmission line (transfer length) type measurements by application of advanced electrode position correction. In particular, we demonstrate this methodology for the M4PP current-in-plane tunneling (CIPT) technique. The CIPT method has been a crucial tool in the development of magnetic tunnel junction (MTJ) stacks suitable for magnetic random-access memories for more than a decade. On two MTJ stacks, the measurement precision of resistance-area product and tunneling magnetoresistance was improved by up to a factor of 3.5 and the measurement reproducibility by up to a factor of 17, thanks to our improved position correction technique.
Estimating maneuvers for precise relative orbit determination using GPS
NASA Astrophysics Data System (ADS)
Allende-Alba, Gerardo; Montenbruck, Oliver; Ardaens, Jean-Sébastien; Wermuth, Martin; Hugentobler, Urs
2017-01-01
Precise relative orbit determination is an essential element for the generation of science products from distributed instrumentation of formation flying satellites in low Earth orbit. According to the mission profile, the required formation is typically maintained and/or controlled by executing maneuvers. In order to generate consistent and precise orbit products, a strategy for maneuver handling is mandatory in order to avoid discontinuities or precision degradation before, after and during maneuver execution. Precise orbit determination offers the possibility of maneuver estimation in an adjustment of single-satellite trajectories using GPS measurements. However, a consistent formulation of a precise relative orbit determination scheme requires the implementation of a maneuver estimation strategy which can be used, in addition, to improve the precision of maneuver estimates by drawing upon the use of differential GPS measurements. The present study introduces a method for precise relative orbit determination based on a reduced-dynamic batch processing of differential GPS pseudorange and carrier phase measurements, which includes maneuver estimation as part of the relative orbit adjustment. The proposed method has been validated using flight data from space missions with different rates of maneuvering activity, including the GRACE, TanDEM-X and PRISMA missions. The results show the feasibility of obtaining precise relative orbits without degradation in the vicinity of maneuvers as well as improved maneuver estimates that can be used for better maneuver planning in flight dynamics operations.
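The idea of estimating a maneuver as an extra parameter inside a batch least-squares orbit adjustment can be shown with a deliberately tiny 1-D toy model: constant velocity plus an impulsive delta-v at a known epoch. The real method uses reduced-dynamic modeling of differential GPS pseudorange and carrier phase data, which this sketch does not attempt.

```python
# Toy illustration (not the paper's reduced-dynamic GPS processor):
# estimate an along-track velocity v and an impulsive maneuver dv at a known
# epoch t_m from position measurements, via batch least squares.
# Model: p(t) = v*t + dv*max(0, t - t_m); unknowns are (v, dv).

def estimate_maneuver(times, positions, t_m):
    c1 = times                                   # design column for v
    c2 = [max(0.0, t - t_m) for t in times]      # design column for dv
    # 2x2 normal equations, solved in closed form
    a11 = sum(a * a for a in c1)
    a12 = sum(a * b for a, b in zip(c1, c2))
    a22 = sum(b * b for b in c2)
    b1 = sum(a * p for a, p in zip(c1, positions))
    b2 = sum(b * p for b, p in zip(c2, positions))
    det = a11 * a22 - a12 * a12
    v = (a22 * b1 - a12 * b2) / det
    dv = (a11 * b2 - a12 * b1) / det
    return v, dv

times = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
t_m = 2.5                                        # known maneuver epoch
v_true, dv_true = 7.5, 0.4                       # hypothetical truth values
pos = [v_true * t + dv_true * max(0.0, t - t_m) for t in times]
v_hat, dv_hat = estimate_maneuver(times, pos, t_m)
```

Because the maneuver is a column of the same design matrix, measurements on both sides of the burn constrain it jointly, which is the mechanism that avoids discontinuities around maneuvers in the full-scale problem.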
Patterned retinal coagulation with a scanning laser
NASA Astrophysics Data System (ADS)
Palanker, Daniel; Jain, ATul; Paulus, Yannis; Andersen, Dan; Blumenkranz, Mark S.
2007-02-01
Pan-retinal photocoagulation in patients with diabetic retinopathy typically involves application of more than 1000 laser spots, often resulting in physician fatigue and patient discomfort. We present a semi-automated patterned scanning laser photocoagulator that rapidly applies predetermined patterns of lesions, thus greatly improving the comfort, efficiency and precision of the treatment. Patterns selected from a graphical user interface are displayed on the retina with an aiming beam, and treatment can be initiated and interrupted by depressing a foot pedal. To deliver a significant number of burns during the eye's fixation time, each pulse should be considerably shorter than the conventional 100ms pulse duration. We measured coagulation thresholds and studied clinical and histological outcomes of the application of laser pulses in the range of 1-200ms in pigmented rabbits. Laser power required for producing ophthalmoscopically visible lesions with a laser spot of 132μm decreased from 360 to 37mW as pulse durations increased from 1 to 100ms. In the range of 10-100ms, clinically and histologically equivalent light burns could be produced. The safe therapeutic range of coagulation (ratio of the laser power required to produce a rupture to that for a light burn) decreased with decreasing pulse duration: from 3.8 at 100ms, to 3.0 at 20ms, to 2.5 at 10ms, and to 1.1 at 1ms. Histology demonstrated increased confinement of the thermal damage with shorter pulses, with the coagulation zone limited to the photoreceptor layer at pulses shorter than 10ms. Durations of 10-20ms appear to be a good compromise between the speed and safety of retinal coagulation. Rapid application of multiple lesions greatly improves the speed and precision of retinal photocoagulation and reduces pain.
Second Iteration of Photogrammetric Pipeline to Enhance the Accuracy of Image Pose Estimation
NASA Astrophysics Data System (ADS)
Nguyen, T. G.; Pierrot-Deseilligny, M.; Muller, J.-M.; Thom, C.
2017-05-01
In the classical photogrammetric processing pipeline, automatic tie point extraction plays a key role in the quality of the achieved results. The image tie points are crucial to pose estimation and have a significant influence on the precision of the calculated orientation parameters. Therefore, both relative and absolute orientations of the 3D model can be affected. By improving the precision of image tie point measurement, one can enhance the quality of image orientation. The quality of image tie points is influenced by several factors such as their multiplicity, their measurement precision, and their distribution in 2D images as well as in 3D scenes. In complex acquisition scenarios such as indoor applications and oblique aerial images, tie point extraction is limited because only image information can be exploited. Hence, we propose here a method which improves the precision of pose estimation in complex scenarios by adding a second iteration to the classical processing pipeline. The result of a first iteration is used as a priori information to guide the extraction of new tie points with better quality. Evaluated with multiple case studies, the proposed method shows its validity and its high potential for precision improvement.
2014-01-01
Background Genome-wide association studies (GWAS) in most cattle breeds result in large genomic intervals of significant associations, making it difficult to identify causal mutations. This is due to the extensive, low-level linkage disequilibrium within a cattle breed. As there is less linkage disequilibrium across breeds, multibreed GWAS may improve the precision of causal variant mapping. Here we test this hypothesis in a Holstein and Jersey cattle data set with 17,925 individuals with records for production and functional traits and 632,003 SNP markers. Results By using a cross validation strategy within the Holstein and Jersey data sets, we were able to identify and confirm a large number of QTL. As expected, the precision of mapping these QTL within the breeds was limited. In the multibreed analysis, we found that many loci were not segregating in both breeds. This was partly an artefact of the power of the experiments, with the number of QTL shared between the breeds generally increasing with trait heritability. False discovery rates suggest that the multibreed analysis was less powerful than the between-breed analyses, in terms of how much genetic variance was explained by the detected QTL. However, the multibreed analysis could more accurately pinpoint the location of well-described mutations affecting milk production such as DGAT1. Further, the significant SNPs in the multibreed analysis were significantly enriched in gene regions, to a considerably greater extent than was observed in the single breed analyses. In addition, we have refined QTL on BTA5 and BTA19 to very small intervals and identified a small number of potential candidate genes in these, as well as in a number of other regions. Conclusion Where QTL are segregating across breeds, multibreed GWAS can refine these to reasonably small genomic intervals. However, such QTL appear to represent only a fraction of the genetic variation.
Our results suggest a significant proportion of QTL affecting milk production segregate within rather than across breeds, at least for Holstein and Jersey cattle. PMID:24456127
Acceleration Noise Considerations for Drag-free Satellite Geodesy Missions
NASA Astrophysics Data System (ADS)
Hong, S. H.; Conklin, J. W.
2016-12-01
The GRACE mission, which launched in 2002, opened a new era of satellite geodesy by providing monthly mass variation solutions with spatial resolution of less than 200 km. GRACE proved the usefulness of a low-low satellite-to-satellite tracking formation. Analysis of the GRACE data showed that the K-Band ranging system, which is used to measure the range between the two satellites, is the limiting factor for the precision of the solution. Consequently, the GRACE-FO mission, scheduled for launch in 2017, will continue the work of GRACE, but will also test a new, higher precision laser ranging interferometer compared with the K-Band ranging system. Beyond GRACE-FO, drag-free systems are being considered for satellite geodesy missions. GOCE tested a drag-free attitude control system with a gravity gradiometer and showed improvements in the acceleration noise compensation compared to the electrostatic accelerometers used in GRACE. However, a full drag-free control system with a gravitational reference sensor has not yet been applied to satellite geodesy missions. More recently, this type of drag-free system was used in LISA Pathfinder, launched in 2016, with an acceleration noise performance two orders of magnitude better than that of GOCE. We explore the effects of drag-free performance in satellite geodesy missions similar to GRACE-FO by applying three different residual acceleration noises from actual space missions: GRACE, GOCE and LISA Pathfinder. Our solutions are limited to degree 60 spherical harmonic coefficients with biweekly time resolution. Our analysis shows that a drag-free system with acceleration noise performance comparable to GOCE and LISA Pathfinder would greatly improve the accuracy of gravity solutions. In addition to these results, we also present the covariance shaping process used in the estimation. In the future, we plan to use actual acceleration noise data measured using the UF torsion pendulum.
This apparatus is a ground facility at University of Florida used to test the performance of precision inertial sensors. We also plan to evaluate the importance of acceleration noise when a second inclined pair of satellites is included in the analysis, following the work of Weise in 2012, which showed that two satellite pairs decreased aliasing errors.
The use of secondary ion mass spectrometry in forensic analyses of ultra-small samples
NASA Astrophysics Data System (ADS)
Cliff, John
2010-05-01
It is becoming increasingly important in forensic science to perform chemical and isotopic analyses on very small sample sizes. Moreover, in some instances the signature of interest may be incorporated in a vast background, making analyses impossible by bulk methods. Recent advances in instrumentation make secondary ion mass spectrometry (SIMS) a powerful tool to apply to these problems. As an introduction, we present three types of forensic analyses in which SIMS may be useful. The causal organism of anthrax (Bacillus anthracis) chelates Ca and other metals during spore formation. Thus, the spores contain a trace element signature related to the growth medium that produced the organisms. Although other techniques have been shown to be useful in analyzing these signatures, the sample size requirements are generally relatively large. We have shown that time of flight SIMS (TOF-SIMS), combined with multivariate analysis, can clearly separate Bacillus sp. cultures prepared in different growth media using analytical spot sizes containing approximately one nanogram of spores. An important emerging field in forensic analysis is that of provenance of fecal pollution. The strategy of choice for these analyses, developing host-specific nucleic acid probes, has met with considerable difficulty due to lack of specificity of the probes. One potentially fruitful strategy is to combine in situ nucleic acid probing with high precision isotopic analyses. Bulk analyses of human and bovine fecal bacteria, for example, indicate a relative difference in δ13C content of about 4 per mil. We have shown that sample sizes of several nanograms can be analyzed with the IMS 1280 with precisions capable of separating two per mil differences in δ13C. The NanoSIMS 50 is capable of much better spatial resolution than the IMS 1280, albeit at a cost of analytical precision.
Nevertheless, we have documented precision capable of separating five per mil differences in δ13C using analytical spots containing less than 300 picograms of bacteria. Perhaps the most successful application of SIMS for forensic purposes to date is in the field of nuclear forensics. An example that has been used by laboratories associated with the International Atomic Energy Agency is the examination of environmental samples for enriched uranium particles indicative of clandestine weapons production activities. The analytical challenge in these types of measurements is to search complex environmental matrices for U-bearing particles, which must then be analyzed for 234U, 235U, and 236U content with high precision and accuracy. Older-generation SIMS instruments were hampered by small geometries that made resolution of significant interferences problematic. In addition, automated particle search software was proprietary and difficult to obtain. With the development of new search software, the IMS 1280 is capable of searching a sample in a matter of hours, flagging U-bearing particles for later analyses, and providing a rough 235U content. Particles of interest can be revisited for high precision analyses, and all U-isotopes can be measured simultaneously in multicollector mode, dramatically improving analysis time and internal precision. Further, the large geometry of the instrument allows complete resolution of isobaric interferences that have traditionally limited SIMS analyses of difficult samples. Examples of analyses of micron-sized standard particles indicate that estimates of 235U enrichment can be obtained with an external relative precision of 0.1%, and 234U and 236U contents can be obtained with a relative precision of less than 1%.
Analyses of 'real' samples show a dramatic improvement in the data quality obtained compared with small-geometry SIMS instruments making SIMS the method of choice for these high-profile samples when accurate, precise, and rapid results are required.
Kempka, Martin; Sjödahl, Johan; Björk, Anders; Roeraade, Johan
2004-01-01
A method for peak picking for matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOFMS) is described. The method is based on the assumption that two sets of ions are formed during the ionization stage, which have Gaussian distributions but different velocity profiles. This gives rise to a certain degree of peak skewness. Our algorithm deconvolutes the peak and utilizes the fast velocity, bulk ion distribution for peak picking. Evaluation of the performance of the new method was conducted using peptide peaks from a bovine serum albumin (BSA) digest, and compared with the commercial peak-picking algorithms Centroid and SNAP. When using the new two-Gaussian algorithm, for strong signals the mass accuracy was equal to or marginally better than the results obtained from the commercial algorithms. However, for weak, distorted peaks, considerable improvement in both mass accuracy and precision was obtained. This improvement should be particularly useful in proteomics, where a lack of signal strength is often encountered when dealing with weakly expressed proteins. Finally, since the new peak-picking method uses information from the entire signal, no adjustments of parameters related to peak height have to be made, which simplifies its practical use. Copyright 2004 John Wiley & Sons, Ltd.
Magnan, Morris A; Maklebust, Joann
2008-01-01
To evaluate the effect of Web-based Braden Scale training on the reliability and precision of pressure ulcer risk assessments made by registered nurses (RNs) working in acute care settings. Pretest-posttest, 2-group, quasi-experimental design. Five hundred Braden Scale risk assessments were made on 102 acute care patients deemed to be at various levels of risk for pressure ulceration. Assessments were made by RNs working in acute care hospitals at 3 different medical centers where the Braden Scale was in regular daily use (2 medical centers) or new to the setting (1 medical center). The Braden Scale for Predicting Pressure Sore Risk was used to guide pressure ulcer risk assessments. A Web-based version of the Detroit Medical Center Braden Scale Computerized Training Module was used to teach nurses correct use of the Braden Scale and selection of risk-based pressure ulcer prevention interventions. In the aggregate, RNs generated reliable Braden Scale pressure ulcer risk assessments 65% of the time after training. The effect of Web-based Braden Scale training on reliability and precision of assessments varied according to familiarity with the scale. With training, new users of the scale made reliable assessments 84% of the time and significantly improved the precision of their assessments. The reliability and precision of Braden Scale risk assessments made by its regular users was unaffected by training. Technology-assisted Braden Scale training improved both reliability and precision of risk assessments made by new users of the scale, but had virtually no effect on the reliability or precision of risk assessments made by regular users of the instrument. Further research is needed to determine best approaches for improving reliability and precision of Braden Scale assessments made by its regular users.
Composite adaptive control of belt polishing force for aero-engine blade
NASA Astrophysics Data System (ADS)
Zhao, Pengbing; Shi, Yaoyao
2013-09-01
The existing methods for blade polishing mainly focus on robot polishing and manual grinding. Due to the difficulty of high-precision control of the polishing force, the blade surface precision is very low in robot polishing; in particular, the quality of the inlet and exhaust edges cannot satisfy the processing requirements. Manual grinding has low efficiency, high labor intensity and unstable processing quality; moreover, the polished surface is vulnerable to burn, and the surface precision and integrity are difficult to ensure. In order to further improve the profile accuracy and surface quality, a pneumatic flexible polishing force-exerting mechanism is designed and a dual-mode switching composite adaptive control (DSCAC) strategy is proposed, which combines Bang-Bang control and model reference adaptive control based on a fuzzy neural network (MRACFNN). Through the mode decision-making mechanism, Bang-Bang control is used to track the command signal quickly when the actual polishing force is far from the target value, and MRACFNN is utilized in smaller error ranges to improve the system robustness and control precision. Based on the mathematical model of the force-exerting mechanism, simulation analysis of DSCAC is carried out. Simulation results show that the output polishing force tracks the given signal well. Finally, blade polishing experiments are carried out on the designed polishing equipment. Experimental results show that DSCAC effectively mitigates the influence of gas compressibility, valve dead-time effect, valve nonlinear flow, cylinder friction, measurement noise and other disturbances on the control precision of the polishing force, offering higher control precision, stronger robustness and better disturbance rejection than MRACFNN alone.
The proposed research achieves high-precision control of the polishing force, effectively improves the blade machining precision and surface consistency, and significantly reduces the surface roughness.
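The dual-mode switching idea above can be illustrated with a toy controller. This is a minimal sketch, not the authors' DSCAC: the fuzzy-neural adaptive controller is replaced by a plain proportional term, and the plant model, threshold, and gains are invented for illustration only.

```python
def dual_mode_step(error, threshold, u_max, k):
    """One step of a simplified dual-mode switching scheme: Bang-Bang
    output when |error| is large, a smooth proportional correction
    (standing in for the adaptive controller) when it is small."""
    if abs(error) > threshold:
        return u_max if error > 0 else -u_max   # Bang-Bang mode
    return k * error                            # fine-control mode

# Drive a crude first-order plant toward a 20 N polishing-force setpoint
setpoint, force = 20.0, 0.0
for _ in range(200):
    u = dual_mode_step(setpoint - force, threshold=2.0, u_max=5.0, k=1.5)
    force += 0.1 * (u - 0.05 * force)   # toy plant dynamics

print(round(force, 2))
```

The Bang-Bang branch closes large errors quickly, while the smooth branch avoids chattering near the setpoint, which is the same division of labor the abstract describes between Bang-Bang control and MRACFNN.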
Precision of FLEET Velocimetry Using High-speed CMOS Camera Systems
NASA Technical Reports Server (NTRS)
Peters, Christopher J.; Danehy, Paul M.; Bathel, Brett F.; Jiang, Naibo; Calvert, Nathan D.; Miles, Richard B.
2015-01-01
Femtosecond laser electronic excitation tagging (FLEET) is an optical measurement technique that permits quantitative velocimetry of unseeded air or nitrogen using a single laser and a single camera. In this paper, we seek to determine the fundamental precision of the FLEET technique using high-speed complementary metal-oxide semiconductor (CMOS) cameras. Also, we compare the performance of several different high-speed CMOS camera systems for acquiring FLEET velocimetry data in air and nitrogen free-jet flows. The precision was defined as the standard deviation of a set of several hundred single-shot velocity measurements. Methods of enhancing the precision of the measurement were explored, such as row-wise digital binning of the signal in adjacent pixels (similar in concept to on-sensor binning, but performed in post-processing) and increasing the time delay between successive exposures. These techniques generally improved precision; however, binning provided the greatest improvement for the un-intensified camera systems, which had low signal-to-noise ratio. When binning row-wise by 8 pixels (about the thickness of the tagged region) and using an inter-frame delay of 65 microseconds, precisions of 0.5 m/s in air and 0.2 m/s in nitrogen were achieved. The camera comparison included a pco.dimax HD, a LaVision Imager scientific CMOS (sCMOS) and a Photron FASTCAM SA-X2, along with a two-stage LaVision High Speed IRO intensifier. Excluding the LaVision Imager sCMOS, the cameras were tested with and without intensification and with both short and long inter-frame delays. Use of intensification and longer inter-frame delay generally improved precision. Overall, the Photron FASTCAM SA-X2 exhibited the best performance in terms of greatest precision and highest signal-to-noise ratio, primarily because it had the largest pixels.
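The effect of row-wise binning on precision can be illustrated with synthetic data. This is a hedged sketch with invented numbers (a 50 m/s flow, 4 m/s per-row noise), not the paper's processing pipeline; it only shows why averaging N noisy rows shrinks the single-shot standard deviation by roughly sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated per-row velocity estimates for one FLEET line: 8 rows span
# the tagged region; each row's estimate carries independent noise.
true_velocity = 50.0            # m/s (hypothetical free-jet value)
n_shots, n_rows = 500, 8
per_row = true_velocity + rng.normal(0, 4.0, size=(n_shots, n_rows))

# Precision defined, as in the paper, as the standard deviation of a
# set of single-shot velocity measurements
unbinned_precision = per_row[:, 0].std()    # a single row per shot
binned = per_row.mean(axis=1)               # bin all 8 rows together
binned_precision = binned.std()

# Averaging N independent rows shrinks the std by about sqrt(N)
print(unbinned_precision / binned_precision)   # ~ sqrt(8), about 2.8
```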
Precision medicine in cardiology.
Antman, Elliott M; Loscalzo, Joseph
2016-10-01
The cardiovascular research and clinical communities are ideally positioned to address the epidemic of noncommunicable causes of death, as well as advance our understanding of human health and disease, through the development and implementation of precision medicine. New tools will be needed for describing the cardiovascular health status of individuals and populations, including 'omic' data, exposome and social determinants of health, the microbiome, behaviours and motivations, patient-generated data, and the array of data in electronic medical records. Cardiovascular specialists can build on their experience and use precision medicine to facilitate discovery science and improve the efficiency of clinical research, with the goal of providing more precise information to improve the health of individuals and populations. Overcoming the barriers to implementing precision medicine will require addressing a range of technical and sociopolitical issues. Health care under precision medicine will become a more integrated, dynamic system, in which patients are no longer a passive entity on whom measurements are made, but instead are central stakeholders who contribute data and participate actively in shared decision-making. Many traditionally defined diseases have common mechanisms; therefore, elimination of a siloed approach to medicine will ultimately pave the path to the creation of a universal precision medicine environment.
Andritzky, Juliane; Rossol, Melanie; Lischer, Christoph; Auer, Joerg A
2005-01-01
To compare the precision obtained with computer-assisted screw insertion for treatment of mid-sagittal articular fractures of the distal phalanx (P3) with results achieved with a conventional technique. In vitro experimental study. Thirty-two cadaveric equine limbs. Four groups of 8 limbs were studied. Either 1 or 2 screws were inserted perpendicular to an imaginary axial fracture of P3 using computer-assisted surgery (CAS) or conventional technique. Screw insertion time, predetermined screw length, inserted screw length, fit of the screw, and errors in placement were recorded. The CAS technique took 15-20 minutes longer but resulted in greater precision of screw length and placement compared with the conventional technique. Improved precision in screw insertion with CAS makes insertion of 2 screws possible for repair of mid-sagittal P3 fractures. CAS, although expensive, improves precision in screw insertion into P3 and consequently should yield improved clinical outcomes.
Safe teleoperation based on flexible intraoperative planning for robot-assisted laser microsurgery.
Mattos, Leonardo S; Caldwell, Darwin G
2012-01-01
This paper describes a new intraoperative planning system created to improve precision and safety in teleoperated laser microsurgeries. It addresses major safety issues in the real-time control of a surgical laser during teleoperated procedures, which stem from the reliability and robustness of the telecommunication channels. Here, a safe solution is presented, consisting of a new planning system architecture that maintains the flexibility and benefits of real-time teleoperation and keeps the surgeon in control of all surgical actions. The developed system is based on our virtual scalpel system for robot-assisted laser microsurgery and allows the intuitive use of a stylus to create surgical plans directly over live video of the surgical field. In this case, surgical plans are defined as graphic objects overlaid on the live video, which can be easily modified or replaced as needed, and which are transmitted to the main surgical system controller for subsequent safe execution. In the process of improving safety, this new planning system also yielded improved laser aiming precision and the capability for higher-quality laser procedures, both due to the new surgical plan execution module, which allows very fast and precise laser aiming control. Experimental results presented herein show that, in addition to the safety improvements, the new planning system resulted in a 48% improvement in laser aiming precision compared to the previous virtual scalpel system.
Experimental considerations for testing antimatter antigravity using positronium 1S-2S spectroscopy
NASA Astrophysics Data System (ADS)
Crivelli, P.; Cooke, D. A.; Friedreich, S.
2014-05-01
In this contribution to the WAG 2013 workshop we report on the status of our measurement of the 1S-2S transition frequency of positronium. The aim of this experiment is to reach a precision of 0.5 ppb in order to cross check the QED calculations. After reviewing the current available sources of Ps, we consider laser cooling as a route to push the precision in the measurement down to 0.1 ppb. If such an uncertainty could be achieved, this would be sensitive to the gravitational redshift and therefore be able to assess the sign of gravity for antimatter.
Fiber-optic two-photon optogenetic stimulation.
Dhakal, K; Gu, L; Black, B; Mohanty, S K
2013-06-01
Optogenetic stimulation of genetically targeted cells is proving to be a powerful tool in the study of cellular systems, both in vitro and in vivo. However, most opsins are activated in the visible spectrum, where significant absorption and scattering of stimulating light occurs, leading to low penetration depth and less precise stimulation. Since we first (to the best of our knowledge) demonstrated two-photon optogenetic stimulation (TPOS), it has gained considerable interest in the probing of cellular circuitry by precise spatial modulation. However, all existing methods use microscope objectives and complex scanning beam geometries. Here, we report a nonscanning method based on multimode fiber to accomplish fiber-optic TPOS of cells.
NASA three-laser airborne differential absorption lidar system electronics
NASA Technical Reports Server (NTRS)
Allen, R. J.; Copeland, G. D.
1984-01-01
The system control and signal conditioning electronics of the NASA three-laser airborne differential absorption lidar (DIAL) system are described. The multipurpose DIAL system was developed for the remote measurement of gas and aerosol profiles in the troposphere and lower stratosphere. A brief description and photographs of the majority of electronics units developed under this contract are presented. The precision control system, which includes a master control unit, three combined NASA laser control interface/Quantel control units, and three noise pulse discriminator/Pockels cell pulser units, is described in detail. The need and design considerations for precision timing and control are discussed. Calibration procedures are included.
Lombaert, Herve; Grady, Leo; Polimeni, Jonathan R.; Cheriet, Farida
2013-01-01
Existing methods for surface matching are limited by the trade-off between precision and computational efficiency. Here we present an improved algorithm for dense vertex-to-vertex correspondence that combines direct matching of features defined on a surface with spectral correspondence used as a regularization. This algorithm has the speed of both feature matching and spectral matching while exhibiting greatly improved precision (distance errors of 1.4%). The method, FOCUSR, implicitly incorporates such additional features to calculate the correspondence and relies on the smoothness of the lowest-frequency harmonics of a graph Laplacian to spatially regularize the features. In its simplest form, FOCUSR is an improved spectral correspondence method that nonrigidly deforms spectral embeddings. We provide here a full realization of spectral correspondence in which virtually any feature can be used as additional information, via weights on graph edges, weights on graph nodes, and extra embedded coordinates. As an example, the full power of FOCUSR is demonstrated in a real-case scenario with the challenging task of brain surface matching across several individuals. Our results show that combining features and regularizing them in a spectral embedding greatly improves matching precision (to a sub-millimeter level) while performing at much greater speed than existing methods. PMID:23868776
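The regularization idea, using the lowest-frequency harmonics of a graph Laplacian as smooth spectral coordinates, can be sketched on a toy graph. This is a minimal illustration, not the FOCUSR algorithm itself: it matches two copies of a small path graph by nearest neighbours in the spectral embedding.

```python
import numpy as np

def laplacian_embedding(adjacency, k):
    """Embed graph nodes using the k lowest-frequency non-trivial
    eigenvectors of the graph Laplacian L = D - W (spectral coordinates)."""
    degree = np.diag(adjacency.sum(axis=1))
    lap = degree - adjacency
    vals, vecs = np.linalg.eigh(lap)
    return vecs[:, 1:k + 1]      # skip the constant (zero-frequency) mode

# Two copies of the same 6-node path graph: spectral coordinates of
# corresponding nodes coincide, so nearest-neighbour matching in the
# embedding recovers the vertex-to-vertex correspondence.
n = 6
adj = np.zeros((n, n))
for i in range(n - 1):
    adj[i, i + 1] = adj[i + 1, i] = 1.0

emb_a = laplacian_embedding(adj, k=2)
emb_b = laplacian_embedding(adj, k=2)
match = [int(np.argmin(np.linalg.norm(emb_b - row, axis=1))) for row in emb_a]
print(match)   # identity correspondence: [0, 1, 2, 3, 4, 5]
```

On real surfaces the two embeddings differ, and FOCUSR additionally handles sign and ordering ambiguities of the eigenvectors; the sketch only conveys why low-frequency harmonics give spatially smooth coordinates to regularize feature matching.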
Automated semantic indexing of figure captions to improve radiology image retrieval.
Kahn, Charles E; Rubin, Daniel L
2009-01-01
We explored automated concept-based indexing of unstructured figure captions to improve retrieval of images from radiology journals. The MetaMap Transfer program (MMTx) was used to map the text of 84,846 figure captions from 9,004 peer-reviewed, English-language articles to concepts in three controlled vocabularies from the UMLS Metathesaurus, version 2006AA. Sampling procedures were used to estimate the standard information-retrieval metrics of precision and recall, and to evaluate the degree to which concept-based retrieval improved image retrieval. Precision was estimated based on a sample of 250 concepts. Recall was estimated based on a sample of 40 concepts. The authors measured the impact of concept-based retrieval to improve upon keyword-based retrieval in a random sample of 10,000 search queries issued by users of a radiology image search engine. Estimated precision was 0.897 (95% confidence interval, 0.857-0.937). Estimated recall was 0.930 (95% confidence interval, 0.838-1.000). In 5,535 of 10,000 search queries (55%), concept-based retrieval found results not identified by simple keyword matching; in 2,086 searches (21%), more than 75% of the results were found by concept-based search alone. Concept-based indexing of radiology journal figure captions achieved very high precision and recall, and significantly improved image retrieval.
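The sampled precision estimate with its confidence interval can be reproduced in outline. The sketch below uses a normal-approximation interval and invented sample counts (224 correct of 250 sampled), chosen only to be of the same order as the figures above; it is not the authors' exact procedure.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and normal-approximation 95% CI for a proportion,
    as used for sampled precision/recall estimates."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, (p - half, p + half)

# Hypothetical sample: 224 of 250 sampled concept mappings were correct
precision, (lo, hi) = proportion_ci(224, 250)
print(round(precision, 3), round(lo, 3), round(hi, 3))
```

With only 40 sampled concepts for recall, the same formula yields a much wider interval, which matches the pattern of the reported estimates.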
Influence in Action in "Catch Me if You Can"
ERIC Educational Resources Information Center
Meyer, Gary; Roberto, Anthony J.
2005-01-01
For decades, scholars have worked to understand the precise manner in which messages affect attitudes and ultimately behaviors. The dominant paradigm suggests that there are two methods or routes to attitude change, one based on careful consideration of the messages and the other based on simple decision rules, often referred to as heuristics…
Precision of Curriculum-Based Measurement Reading Data: Considerations for Multiple-Baseline Designs
ERIC Educational Resources Information Center
Klingbeil, David A.; Van Norman, Ethan R.; Nelson, Peter M.
2017-01-01
Single-case designs provide an established technology for evaluating the effects of academic interventions. Researchers interested in studying the long-term effects of reading interventions often use curriculum-based measures of reading (CBM-R) as they possess many of the desirable characteristics for use in a time-series design. The reliability…
Simulation of Aphasic Naming Performance in Non-Brain-Damaged Adults
ERIC Educational Resources Information Center
Silkes, JoAnn P.; McNeil, Malcolm R.; Drton, Mathias
2004-01-01
Discussion abounds in the literature as to whether aphasia is a deficit of linguistic competence or linguistic performance and, if it is a performance deficit, what are its precise mechanisms. Considerable evidence suggests that alteration of nonlinguistic factors can affect language performance in aphasia, a finding that raises questions about…
Supporting Students: The Foundation of Guidance in the Classroom
ERIC Educational Resources Information Center
Bilac, Sanja
2012-01-01
Why is it important to provide support to students in the classroom? It is an empirical fact that precisely this part of the "internal school life" considerably influences the positive development and satisfaction of the school. Some anecdotes from school remain merely memories, while others become values that are carefully protected one's whole…
33 CFR 238.7 - Decision criteria for participation.
Code of Federal Regulations, 2013 CFR
2013-07-01
... upstream of the precise point where Federal flood control authorities become applicable. (b) Storm sewer... will be considered to be a part of local storm drainage to be addressed as part of the consideration of an adequate storm sewer system. The purpose of this system is to collect and convey to a natural...
33 CFR 238.7 - Decision criteria for participation.
Code of Federal Regulations, 2014 CFR
2014-07-01
... upstream of the precise point where Federal flood control authorities become applicable. (b) Storm sewer... will be considered to be a part of local storm drainage to be addressed as part of the consideration of an adequate storm sewer system. The purpose of this system is to collect and convey to a natural...
33 CFR 238.7 - Decision criteria for participation.
Code of Federal Regulations, 2012 CFR
2012-07-01
... upstream of the precise point where Federal flood control authorities become applicable. (b) Storm sewer... will be considered to be a part of local storm drainage to be addressed as part of the consideration of an adequate storm sewer system. The purpose of this system is to collect and convey to a natural...
The Oil Drop Experiment: Do Physical Chemistry Textbooks Refer to Its Controversial Nature?
ERIC Educational Resources Information Center
Niaz, Mansoor; Rodriguez, Maria A.
2005-01-01
Most general chemistry textbooks consider the oil drop experiment as a classic experiment, characterized by its simplicity and precise results. A review of the history and philosophy of science literature shows that the experiment is difficult to perform (even today!) and generated a considerable amount of controversy. Acceptance of the…
Precise FIA plot registration using field and dense LIDAR data
Demetrios Gatziolis
2009-01-01
Precise registration of forest inventory and analysis (FIA) plots is a prerequisite for an effective fusion of field data with ancillary spatial information, which is an approach commonly employed in the mapping of various forest parameters. Although the adoption of Global Positioning System technology has improved the precision of plot coordinates obtained during...
Refining FIA plot locations using LiDAR point clouds
Charlie Schrader-Patton; Greg C. Liknes; Demetrios Gatziolis; Brian M. Wing; Mark D. Nelson; Patrick D. Miles; Josh Bixby; Daniel G. Wendt; Dennis Kepler; Abbey Schaaf
2015-01-01
Forest Inventory and Analysis (FIA) plot location coordinate precision is often insufficient for use with high resolution remotely sensed data, thereby limiting the use of these plots for geospatial applications and reducing the validity of models that assume the locations are precise. A practical and efficient method is needed to improve coordinate precision. To...
Peer Assessment with Online Tools to Improve Student Modeling
ERIC Educational Resources Information Center
Atkins, Leslie J.
2012-01-01
Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to…
Application and principles of photon-doppler velocimetry for explosives testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Briggs, Matthew Ellsworth; Hill, Larry; Hull, Larry
2010-01-01
The velocimetry technique PDV is easier to field than its predecessors VISAR and Fabry-Perot, works on a broader variety of experiments, and is more accurate and simpler to analyze. Experiments and analysis have now demonstrated the accuracy, precision and interpretation of what PDV does and does not measure, and the successful application of PDV to basic and applied detonation problems. We present a selection of results intended to help workers assess the capabilities of PDV. First, we present general considerations about the technique: various PDV configurations (single-signal, multisignal (e.g., triature) and frequency-shifted PDV); what types of motion are sensed and missed by PDV; analysis schemes for velocity and position extraction; accuracy and precision of the results; and experimental considerations for probe selection and positioning. We then present the status of various applications: detonation speeds and wall motion in cylinder tests, breakout velocity distributions from bare HE, ejecta, measurements from fibers embedded in HE, projectile velocity, and resolving 2- and 3-D velocity vectors. This paper is an overview of work done by many groups around the world.
Moye, Jennifer; Azar, Annin R.; Karel, Michele J.; Gurrera, Ronald J.
2016-01-01
Does instrument-based evaluation of consent capacity increase the precision and validity of competency assessment, or does ostensible precision provide a false sense of confidence without in fact improving validity? In this paper we critically examine the evidence for construct validity of three instruments for measuring four functional abilities important in consent capacity: understanding, appreciation, reasoning, and expressing a choice. Instrument-based assessment of these abilities is compared through investigation of a multi-trait multi-method matrix in 88 older adults with mild to moderate dementia. Results find variable support for validity. There appears to be strong evidence of good hetero-method validity for the measurement of understanding, mixed evidence for validity in the measurement of reasoning, and strong evidence of poor hetero-method validity for the concepts of appreciation and expressing a choice, although the latter is likely due to extreme range restrictions. The development of empirically based tools for use in capacity evaluation should ultimately enhance the reliability and validity of assessment, yet clearly more research is needed to define and measure the constructs of decisional capacity. We would also emphasize that instrument-based assessment of capacity is only one part of a comprehensive evaluation of competency, which includes consideration of diagnosis, psychiatric and/or cognitive symptomatology, risk involved in the situation, and individual and cultural differences. PMID:27330455
Keller, Martina; Gutjahr, Christoph; Möhring, Jens; Weis, Martin; Sökefeld, Markus; Gerhards, Roland
2014-02-01
Precision experimental design uses the natural heterogeneity of agricultural fields and combines sensor technology with linear mixed models to estimate the effect of weeds, soil properties and herbicide on yield. These estimates can be used to derive economic thresholds. Three field trials are presented using the precision experimental design in winter wheat. Weed densities were determined by manual sampling and bi-spectral cameras, yield and soil properties were mapped. Galium aparine, other broad-leaved weeds and Alopecurus myosuroides reduced yield by 17.5, 1.2 and 12.4 kg ha⁻¹ plant⁻¹ m² in one trial. The determined thresholds for site-specific weed control with independently applied herbicides were 4, 48 and 12 plants m⁻², respectively. Spring drought reduced yield effects of weeds considerably in one trial, since water became yield limiting. A negative herbicide effect on the crop was negligible, except in one trial, in which the herbicide mixture tended to reduce yield by 0.6 t ha⁻¹. Bi-spectral cameras for weed counting were of limited use and still need improvement. Nevertheless, large weed patches were correctly identified. The current paper presents a new approach to conducting field trials and deriving decision rules for weed control in farmers' fields. © 2013 Society of Chemical Industry.
Philip, Benjamin A; Frey, Scott H
2016-07-01
Chronic forced use of the non-dominant left hand yields substantial improvements in the precision and quality of writing and drawing. These changes may arise from increased access by the non-dominant (right) hemisphere to dominant (left) hemisphere mechanisms specialized for end-point precision control. To evaluate this prediction, 22 healthy right-handed adults underwent resting state functional connectivity (FC) MRI scans before and after 10 days of training on a left hand precision drawing task. 89% of participants significantly improved left hand speed, accuracy, and smoothness. Smoothness gains were specific to the trained left hand and persistent: 6 months after training, 71% of participants exhibited above-baseline movement smoothness. Contrary to expectations, we found no evidence of increased FC between right and left hemisphere hand areas. Instead, training-related improvements in left hand movement smoothness were associated with increased FC between both sensorimotor hand areas and a left-lateralized parieto-prefrontal network implicated in manual praxis. By contrast, skill retention at 6 months was predicted by changes including decreased FC between the representation of the trained left hand and bilateral sensorimotor, parietal, and premotor cortices, possibly reflecting consolidation and a disengagement of early learning processes. These data indicate that modest amounts of training (<200 min total) can induce substantial, persistent improvements in the precision and quality of non-dominant hand control in healthy adults, supported by strengthened connectivity between bilateral sensorimotor hand areas and a left-lateralized parieto-prefrontal praxis network. Copyright © 2016 Elsevier Ltd. All rights reserved.
Mohan, Shalini V; Chang, Anne Lynn S
2014-06-01
Precision medicine and precision therapeutics is currently in its infancy with tremendous potential to improve patient care by better identifying individuals at risk for skin cancer and predict tumor responses to treatment. This review focuses on the Hedgehog signaling pathway, its critical role in the pathogenesis of basal cell carcinoma, and the emergence of targeted treatments for advanced basal cell carcinoma. Opportunities to utilize precision medicine are outlined, such as molecular profiling to predict basal cell carcinoma response to targeted therapy and to inform therapeutic decisions.
Attaining the Photometric Precision Required by Future Dark Energy Projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stubbs, Christopher
2013-01-21
This report outlines our progress towards achieving the high-precision astronomical measurements needed to derive improved constraints on the nature of the Dark Energy. Our approach to obtaining higher-precision flux measurements has three basic components: 1) determination of the optical transmission of the atmosphere, 2) mapping out the instrumental photon sensitivity function vs. wavelength, calibrated by referencing the measurements to the known sensitivity curve of a high-precision silicon photodiode, and 3) using the self-consistency of the spectra of stars to achieve precise color calibrations.
Evaluation of the EGNOS service for topographic profiling in field geosciences
NASA Astrophysics Data System (ADS)
Kromuszczyńska, Olga; Mège, Daniel; Castaldo, Luigi; Gurgurewicz, Joanna; Makowska, Magdalena; Dębniak, Krzysztof; Jelínek, Róbert
2016-09-01
Consumer-grade Global Positioning System (GPS) receivers are commonly used as a tool for data collection in many fields, including geosciences. One of the methods for improving the GPS signal is provided by the Wide Area Differential GPS (WADGPS), which uses geostationary satellites to correct errors affecting the signal in real time. This study presents the results of three experiments aimed at determining whether the precision of field measurements made by such a receiver (Garmin GPSMAP 62s), operating in either the non-differential or the WADGPS differential mode, is suitable for characterizing geomorphological objects or landforms. It assumes a typical field-work situation, in which time cannot be devoted in the field to long periods of stationary GPS measurements and the precision of the topographic profile is at least as important as, if not more than, the positioning of individual points. The results show that, provided some rules are followed, the expected precision may meet the nominal precision. The repeatability (coherence) of topographic profiles conducted at low speed (0.5 m s⁻¹) in mountain terrain is good, and vertical precision is improved in the WADGPS mode. Horizontal precision is equivalent in both modes. The GPS receiver should be operating at least 30 min prior to measuring and should not be turned off between measurements that the user wishes to compare. If the GPS receiver needs to be reset between profiles to be compared, the measurement precision is higher in the non-differential GPS mode. Following these rules may improve measurement quality by 20% to 80%.
Wu, Jun; Hu, Xie-he; Chen, Sheng; Chu, Jian
2003-01-01
The closed-loop stability issue of finite-precision realizations was investigated for digital controllers implemented in block-floating-point format. The controller coefficient perturbation was analyzed resulting from using finite word length (FWL) block-floating-point representation scheme. A block-floating-point FWL closed-loop stability measure was derived which considers both the dynamic range and precision. To facilitate the design of optimal finite-precision controller realizations, a computationally tractable block-floating-point FWL closed-loop stability measure was then introduced and the method of computing the value of this measure for a given controller realization was developed. The optimal controller realization is defined as the solution that maximizes the corresponding measure, and a numerical optimization approach was adopted to solve the resulting optimal realization problem. A numerical example was used to illustrate the design procedure and to compare the optimal controller realization with the initial realization.
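The block-floating-point representation underlying the FWL analysis can be illustrated directly: a block of controller coefficients shares one exponent, and each mantissa is rounded to a finite word length, which perturbs the coefficients. The coefficient values and word lengths below are arbitrary examples, not taken from the paper.

```python
import math

def block_floating_point(values, mantissa_bits):
    """Quantize a block of controller coefficients to block-floating-point:
    one shared exponent for the whole block, fixed-point mantissas of the
    given word length. Returns the perturbed values and the exponent."""
    exp = max(math.frexp(v)[1] for v in values)       # shared exponent
    scale = 2 ** (mantissa_bits - 1)
    quantized = []
    for v in values:
        mant = round(v / 2 ** exp * scale) / scale     # FWL mantissa
        quantized.append(mant * 2 ** exp)
    return quantized, exp

coeffs = [0.8125, -0.25, 0.05]
q8, _ = block_floating_point(coeffs, 8)
q4, _ = block_floating_point(coeffs, 4)

# Coefficient perturbation shrinks as the word length grows
err8 = max(abs(a - b) for a, b in zip(coeffs, q8))
err4 = max(abs(a - b) for a, b in zip(coeffs, q4))
print(err8 <= err4)   # True
```

The stability measure discussed in the abstract bounds how large such perturbations may become before closed-loop poles leave the unit circle, which is why it must account for both the shared dynamic range (the exponent) and the mantissa precision.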
Spatial attention improves the quality of population codes in human visual cortex.
Saproo, Sameer; Serences, John T
2010-08-01
Selective attention enables sensory input from behaviorally relevant stimuli to be processed in greater detail, so that these stimuli can more accurately influence thoughts, actions, and future goals. Attention has been shown to modulate the spiking activity of single feature-selective neurons that encode basic stimulus properties (color, orientation, etc.). However, the combined output from many such neurons is required to form stable representations of relevant objects and little empirical work has formally investigated the relationship between attentional modulations on population responses and improvements in encoding precision. Here, we used functional MRI and voxel-based feature tuning functions to show that spatial attention induces a multiplicative scaling in orientation-selective population response profiles in early visual cortex. In turn, this multiplicative scaling correlates with an improvement in encoding precision, as evidenced by a concurrent increase in the mutual information between population responses and the orientation of attended stimuli. These data therefore demonstrate how multiplicative scaling of neural responses provides at least one mechanism by which spatial attention may improve the encoding precision of population codes. Increased encoding precision in early visual areas may then enhance the speed and accuracy of perceptual decisions computed by higher-order neural mechanisms.
Automation of Precise Time Reference Stations (PTRS)
NASA Astrophysics Data System (ADS)
Wheeler, P. J.
1985-04-01
The U.S. Naval Observatory is presently engaged in a program of automating precise time stations (PTS) and precise time reference stations (PTRS) by using a versatile minicomputer-controlled data acquisition system (DAS). The data acquisition system is configured to monitor locally available PTTI signals such as LORAN-C, OMEGA, and/or the Global Positioning System. In addition, the DAS performs local standard intercomparison. Computer telephone communications provide automatic data transfer to the Naval Observatory. Subsequently, after analysis of the data, results and information can be sent back to the precise time reference station to provide automatic control of remote station timing. The DAS configuration is designed around state-of-the-art standard industrial high-reliability modules. The system integration and software are standardized but allow considerable flexibility to satisfy special local requirements such as stability measurements, performance evaluation, and printing of messages and certificates. The DAS operates completely independently and may be queried or controlled at any time with a computer or terminal device (control is protected for use by authorized personnel only). Such DAS-equipped PTS are operational in Hawaii, California, Texas and Florida.
Singlet-catalyzed electroweak phase transitions and precision Higgs boson studies
NASA Astrophysics Data System (ADS)
Profumo, Stefano; Ramsey-Musolf, Michael J.; Wainwright, Carroll L.; Winslow, Peter
2015-02-01
We update the phenomenology of gauge-singlet extensions of the Standard Model scalar sector and their implications for the electroweak phase transition. Considering the introduction of one real scalar singlet to the scalar potential, we analyze present constraints on the potential parameters from Higgs coupling measurements at the Large Hadron Collider (LHC) and electroweak precision observables for the kinematic regime in which no new scalar decay modes arise. We then show how future precision measurements of Higgs boson signal strengths and the Higgs self-coupling could probe the scalar potential parameter space associated with a strong first-order electroweak phase transition. We illustrate using benchmark precision for several future collider options, including the high-luminosity LHC, the International Linear Collider, Triple-Large Electron-Positron collider, the China Electron-Positron Collider, and a 100 TeV proton-proton collider, such as the Very High Energy LHC or the Super Proton-Proton Collider. For the regions of parameter space leading to a strong first-order electroweak phase transition, we find that there exists considerable potential for observable deviations from purely Standard Model Higgs properties at these prospective future colliders.
Improving the frequency precision of oscillators by synchronization.
Cross, M C
2012-04-01
Improving the frequency precision by synchronizing a lattice of N oscillators with disparate frequencies is studied in the phase reduction limit. In the general case where the coupling is not purely dissipative the synchronized state consists of targetlike waves radiating from a local source, which is a region of higher-frequency oscillators. In this state the improvement of the frequency precision is shown to be independent of N for large N, but instead depends on the disorder and reflects the dependence of the frequency of the synchronized state on just those oscillators in the source region of the waves. These results are obtained by a mapping of the nonlinear phase dynamics onto the linear Anderson problem of the quantum mechanics of electrons on a random lattice in the tight-binding approximation.
Independence of motor unit recruitment and rate modulation during precision force control.
Kamen, G; Du, D C
1999-01-01
The vertebrate motor system chiefly employs motor unit recruitment and rate coding to modulate muscle force output. In this paper, we studied how the recruitment of new motor units altered the firing rate of already-active motor units during precision force production in the first dorsal interosseous muscle. Six healthy adults performed linearly increasing isometric voluntary contractions while motor unit activity and force output were recorded. After motor unit discharges were identified, motor unit firing rates were calculated before and after the instances of new motor unit recruitment. Three procedures were applied to compute motor unit firing rate, including the mean of a fixed number of inter-spike intervals and the constant width weighted Hanning window filter method, as well as a modified boxcar technique. In contrast to previous reports, the analysis of the firing rates of over 200 motor units revealed that reduction of the active firing rates was not a common mechanism used to accommodate the twitch force produced by the recruitment of a new motor unit. Similarly, during de-recruitment there was no tendency for motor unit firing rates to increase immediately following the cessation of activity in other motor units. Considerable consistency in recruitment behavior was observed during repeated contractions. However, firing rates during repeated contractions demonstrated considerably more fluctuation. It is concluded that the neuromuscular system does not use short-term preferential motor unit disfacilitation to effect precise regulation of muscular force output.
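Of the three firing-rate estimators named above, the simplest is the mean of a fixed number of inter-spike intervals preceding a time of interest (for example, the moment a new unit is recruited). A minimal sketch follows; the function name and the default window of five intervals are assumptions for illustration, not values from the paper.

```python
import numpy as np

def firing_rate_fixed_isi(spike_times, t, n_isi=5):
    """Instantaneous firing rate at time t, estimated as the
    reciprocal of the mean of the last n_isi inter-spike intervals
    preceding t (illustrative; window length is assumed)."""
    spikes = np.asarray(spike_times, dtype=float)
    past = spikes[spikes <= t]
    if len(past) < n_isi + 1:
        raise ValueError("not enough spikes before t")
    isis = np.diff(past[-(n_isi + 1):])
    return 1.0 / isis.mean()
```

Comparing this estimate just before and just after a recruitment instant is the kind of before/after contrast the study performed.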
NASA Astrophysics Data System (ADS)
Chen, Liang; Zhao, Qile; Hu, Zhigang; Jiang, Xinyuan; Geng, Changjiang; Ge, Maorong; Shi, Chuang
2018-01-01
The large number of ambiguities in the un-differenced (UD) model lowers computational efficiency, which makes the model ill-suited to high-frequency real-time GNSS clock estimation, such as at 1 Hz. A mixed differenced model fusing UD pseudo-range and epoch-differenced (ED) phase observations has been introduced into real-time clock estimation. In this contribution, we extend the mixed differenced model to multi-GNSS real-time high-frequency clock updating, and a rigorous comparison and analysis under identical conditions is performed to achieve the best real-time clock estimation performance, taking efficiency, accuracy, consistency and reliability into consideration. Based on the multi-GNSS real-time data streams provided by the Multi-GNSS Experiment (MGEX) and Wuhan University, a GPS + BeiDou + Galileo global real-time augmentation positioning prototype system is designed and constructed, including real-time precise orbit determination, real-time precise clock estimation, real-time Precise Point Positioning (RT-PPP) and real-time Standard Point Positioning (RT-SPP). The statistical analysis of the 6 h predicted real-time orbits shows that the root mean square (RMS) error in the radial direction is about 1-5 cm for GPS, BeiDou MEO and Galileo satellites and about 10 cm for BeiDou GEO and IGSO satellites. Using the mixed differenced estimation model, the prototype system can realize highly efficient real-time satellite absolute clock estimation with no constant clock bias and can be used for high-frequency augmentation message updating (such as at 1 Hz). The real-time augmentation message signal-in-space ranging error (SISRE), a comprehensive measure of orbit and clock accuracy that affects the users' actual positioning performance, is introduced to evaluate and analyze the performance of the GPS + BeiDou + Galileo global real-time augmentation positioning system.
The statistical analysis of the real-time augmentation message SISRE gives about 4-7 cm for GPS, about 10 cm for BeiDou IGSO/MEO and Galileo, and about 30 cm for BeiDou GEO satellites. The real-time positioning results prove that GPS + BeiDou + Galileo RT-PPP, compared to GPS-only, can effectively accelerate convergence time by about 60% and improve positioning accuracy by about 30%, obtaining an average RMS of 4 cm in the horizontal and 6 cm in the vertical; additionally, RT-SPP in the prototype system can achieve a positioning accuracy of about 1 m average RMS in the horizontal and 1.5-2 m in the vertical, improvements of 60% and 70%, respectively, over SPP based on the broadcast ephemeris.
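SISRE combines orbit and clock errors into a single ranging-error figure by projecting the radial orbit error against the clock error and down-weighting the tangential components. A minimal sketch is below; the weights `alpha` and `beta` are constellation-dependent projection factors, and the defaults are values commonly quoted for GPS-like MEO altitudes, assumed here rather than taken from this abstract.

```python
import math

def sisre(dr, da, dc, dclk, alpha=0.98, beta=49.0):
    """Signal-in-space ranging error from radial (dr), along-track
    (da), cross-track (dc) orbit errors and clock error (dclk), all
    in metres. alpha and beta are constellation-dependent weights;
    the defaults are commonly quoted GPS MEO values (assumed)."""
    return math.sqrt((alpha * dr - dclk) ** 2 + (da ** 2 + dc ** 2) / beta)
```

Note that a radial orbit error can partially cancel a clock error of matching sign, which is why SISRE is a better predictor of user positioning performance than orbit and clock RMS taken separately.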
Melching, C.S.; Coupe, R.H.
1995-01-01
During water years 1985-91, the U.S. Geological Survey (USGS) and the Illinois Environmental Protection Agency (IEPA) cooperated in the collection and analysis of concurrent and split stream-water samples from selected sites in Illinois. Concurrent samples were collected independently by field personnel from each agency at the same time and sent to the IEPA laboratory, whereas the split samples were collected by USGS field personnel and divided into aliquots that were sent to each agency's laboratory for analysis. The water-quality data from these programs were examined by means of the Wilcoxon signed ranks test to identify statistically significant differences between results of the USGS and IEPA analyses. The data sets for constituents and properties identified by the Wilcoxon test as having significant differences were further examined by use of the paired t-test, mean relative percentage difference, and scattergrams to determine if the differences were important. Of the 63 constituents and properties in the concurrent-sample analysis, differences in only 2 (pH and ammonia) were statistically significant and large enough to concern water-quality engineers and planners. Of the 27 constituents and properties in the split-sample analysis, differences in 9 (turbidity, dissolved potassium, ammonia, total phosphorus, dissolved aluminum, dissolved barium, dissolved iron, dissolved manganese, and dissolved nickel) were statistically significant and large enough to concern water-quality engineers and planners. The differences in concentration between pairs of the concurrent samples were compared to the precision of the laboratory or field method used.
The differences in concentration between pairs of split samples were compared to the precision of the laboratory method used and the interlaboratory precision of measuring a given concentration or property. Consideration of method precision indicated that differences between concurrent samples were insignificant for all concentrations and properties except pH, and that differences between split samples were significant for all concentrations and properties. Consideration of interlaboratory precision indicated that the differences between the split samples were not unusually large. The results for the split samples illustrate the difficulty in obtaining comparable and accurate water-quality data.
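The two-stage screening described above, a Wilcoxon signed-rank test for statistical significance followed by a practical-importance check such as the mean relative percentage difference (MRPD), can be sketched as follows. The function name, the significance threshold and the return format are illustrative, not the report's actual code.

```python
import numpy as np
from scipy.stats import wilcoxon

def screen_paired_results(usgs, iepa, alpha=0.05):
    """Screen paired analyses of one constituent from two agencies:
    Wilcoxon signed-rank test for statistical significance, then the
    mean relative percentage difference as a practical-importance
    measure (illustrative sketch)."""
    usgs = np.asarray(usgs, dtype=float)
    iepa = np.asarray(iepa, dtype=float)
    stat, p = wilcoxon(usgs, iepa)
    mrpd = 100.0 * np.mean((usgs - iepa) / ((usgs + iepa) / 2.0))
    return {"p_value": p, "mrpd_percent": mrpd, "significant": p < alpha}
```

A constituent would merit concern only if it is flagged significant here and the follow-up measures (MRPD, paired t-test, scattergrams) show the difference is also large in practical terms.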
Dal Pra, Alan; Locke, Jennifer A.; Borst, Gerben; Supiot, Stephane; Bristow, Robert G.
2016-01-01
Radiation therapy (RT) is one of the mainstay treatments for prostate cancer (PCa). The potentially curative approaches can provide satisfactory results for many patients with non-metastatic PCa; however, a considerable number of individuals may experience disease recurrence and die from the disease. Exploiting the rich molecular biology of PCa will provide insights into how the most resistant tumor cells can be eradicated to improve treatment outcomes. Important for this biology-driven individualized treatment is a robust selection procedure. The development of predictive biomarkers for RT efficacy is therefore of utmost importance for a clinically exploitable strategy to achieve tumor-specific radiosensitization. This review highlights the current status and possible opportunities in the modulation of four key processes to enhance radiation response in PCa by targeting: (1) the androgen signaling pathway; (2) hypoxic tumor cells and regions; (3) the DNA damage response (DDR) pathway; and (4) abnormal extra-/intracellular signaling pathways. In addition, we discuss how and which patients should be selected for biomarker-based clinical trials exploiting and validating these targeted treatment strategies with precision RT to improve cure rates in non-indolent, localized PCa. PMID:26909338
Khan, Tahira; Stewart, Mark; Blackman, Samuel; Rousseau, Raphaël; Donoghue, Martha; Cohen, Kenneth; Seibel, Nita; Fleury, Mark; Benettaib, Bouchra; Malik, Raleigh; Vassal, Gilles; Reaman, Gregory
2018-01-01
Although outcomes for children with cancer have significantly improved over the past 40 years, there has been little progress in the treatment of some pediatric cancers, particularly when advanced. Additionally, clinical trial options and availability are often insufficient. Improved genomic and immunologic understanding of pediatric cancers, combined with innovative clinical trial designs, may provide an enhanced opportunity to study childhood cancers. Master protocols, which incorporate the use of precision medicine approaches coupled with the ability to quickly assess the safety and effectiveness of new therapies, have the potential to accelerate early-phase clinical testing of novel therapeutics and may result in more rapid approval of new drugs for children with cancer. Designing and conducting master protocols for children requires addressing similar principles and requirements as traditional adult oncology trials, but there are also unique considerations for master protocols conducted in children with cancer. The purpose of this paper is to define the key challenges and opportunities associated with this approach in order to ensure that master protocols can be adapted to benefit children and adolescents and that adequate data are captured to advance, in parallel, the clinical development of investigational agents for children with cancer.
NASA Astrophysics Data System (ADS)
Shim, Hackjoon; Lee, Soochan; Kim, Bohyeong; Tao, Cheng; Chang, Samuel; Yun, Il Dong; Lee, Sang Uk; Kwoh, Kent; Bae, Kyongtae
2008-03-01
Knee osteoarthritis is the most common debilitating health condition affecting the elderly population. MR imaging of the knee is highly sensitive for diagnosis and for evaluating the extent of knee osteoarthritis. Quantitative analysis of the progression of osteoarthritis is commonly based on segmentation and measurement of articular cartilage from knee MR images. Segmentation of the knee articular cartilage, however, is extremely laborious and technically demanding, because the cartilage has a complex geometry and is thin and small. To improve the precision and efficiency of cartilage segmentation, we applied a semi-automated segmentation method based on an s/t graph cut algorithm. The cost function was defined by integrating regional and boundary cues. While regional cues can encode any intensity distributions of the two regions, "object" (cartilage) and "background" (the rest), boundary cues are based on the intensity differences between neighboring pixels. For three-dimensional (3-D) segmentation, hard constraints are also specified in a 3-D way, facilitating user interaction. When our proposed semi-automated method was tested on clinical patients' MR images (160 slices, 0.7 mm slice thickness), a considerable amount of segmentation time was saved, with improved efficiency compared to a manual segmentation approach.
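The boundary cue mentioned above is typically implemented as an edge weight that is large between pixels of similar intensity and small across intensity edges, so that the minimum cut prefers to pass along cartilage boundaries. A minimal sketch of the standard exponential form used in s/t graph cuts follows; the noise-scale parameter `sigma` is an assumed value, not one reported in the paper.

```python
import numpy as np

def boundary_weight(ip, iq, sigma=10.0):
    """Boundary cue between neighboring pixels with intensities ip
    and iq: near 1 for similar intensities, near 0 across strong
    edges. Standard exponential form for s/t graph cuts; sigma is an
    assumed noise scale."""
    return np.exp(-((float(ip) - float(iq)) ** 2) / (2.0 * sigma ** 2))
```

In the full cost function this term is summed over all neighboring pixel pairs separated by the cut and combined with the regional (intensity-distribution) term.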
Pützer, Manfred; Barry, William J; Moringlane, Jean Richard
2008-12-01
The effect of deep brain stimulation on the two speech-production subsystems, articulation and phonation, of nine Parkinsonian patients is examined. Production parameters (stop closure voicing; stop closure, VOT, vowel) in fast syllable-repetitions were defined and measured and quantitative, objective metrics of vocal fold function were obtained during vowel production. Speech material was recorded for patients (with and without stimulation) and for a reference group of healthy control speakers. With stimulation, precision of the glottal and supraglottal articulation as well as the phonatory function is reduced for some individuals, whereas for other individuals an improvement is observed. Importantly, the improvement or deterioration is determined not only on the basis of the direction of parameter change but also on the individuals' position relative to the healthy control data. This study also notes differences within an individual in the effects of stimulation on the two speech subsystems. These findings qualify the value of global statements about the effect of neurostimulatory operations on Parkinsonian patients. They also underline the importance of careful consideration of individual differences in the effect of deep brain stimulation on different speech subsystems.
Brozek-Pluska, Beata; Jarota, Arkadiusz; Jablonska-Gajewicz, Joanna; Kordek, Radzislaw; Czajkowski, Wojciech; Abramczyk, Halina
2012-08-01
There is considerable interest in developing new diagnostic techniques that allow noninvasive tracking of the progress of cancer therapies. Raman imaging of the distribution of phthalocyanine photosensitizers may open new possibilities for Photodynamic Therapy (PDT) to treat a wide range of neoplastic lesions with improved effectiveness through precise identification of malignant areas. We have employed Raman imaging and Raman spectroscopy to analyze human breast cancer tissue that interacts with photosensitizers used in the photodynamic therapy of cancer. PCA (Principal Component Analysis) has been employed to analyze various areas of the noncancerous and cancerous breast tissues. The results show that the emission spectra combined with the Raman images are very sensitive indicators of the aggregation state and the distribution of phthalocyanines in the cancerous and noncancerous breast tissues. Our results provide experimental evidence on the role of aggregation of phthalocyanines as a factor of particular significance in differentiating normal and tumourous (cancerous or benign pathology) breast tissues. We conclude that the Raman imaging reported here has the potential to support a novel and effective photodynamic therapeutic method with improved selectivity for the treatment of breast cancer.
Blynn, Emily; Ahmed, Saifuddin; Gibson, Dustin; Pariyo, George; Hyder, Adnan A
2017-01-01
In low- and middle-income countries (LMICs), historically, household surveys have been carried out by face-to-face interviews to collect survey data related to risk factors for noncommunicable diseases. The proliferation of mobile phone ownership and the access it provides in these countries offers a new opportunity to remotely conduct surveys with increased efficiency and reduced cost. However, the near-ubiquitous ownership of phones, high population mobility, and low cost require a re-examination of statistical recommendations for mobile phone surveys (MPS), especially when surveys are automated. As with landline surveys, random digit dialing remains the most appropriate approach to develop an ideal survey-sampling frame. Once the survey is complete, poststratification weights are generally applied to reduce estimate bias and to adjust for selectivity due to mobile ownership. Since weights increase design effects and reduce sampling efficiency, we introduce the concept of automated active strata monitoring to improve representativeness of the sample distribution to that of the source population. Although some statistical challenges remain, MPS represent a promising emerging means for population-level data collection in LMICs. PMID:28476726
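The poststratification weighting step described above assigns each stratum a weight equal to its population share divided by its share of completed interviews, pulling the weighted sample distribution toward the source population. The sketch below is a minimal illustration; the stratum labels and figures are invented for the example.

```python
def poststratification_weights(sample_counts, population_shares):
    """Poststratification weight per stratum: population share
    divided by the stratum's share of completed interviews.
    sample_counts maps stratum -> number of completes;
    population_shares maps stratum -> population proportion
    (illustrative helper)."""
    n = sum(sample_counts.values())
    return {s: population_shares[s] / (sample_counts[s] / n)
            for s in sample_counts}
```

Weights far from 1 inflate the design effect, which is exactly why the authors propose active strata monitoring during fieldwork: balancing completes across strata keeps the weights, and hence the variance penalty, small.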
Improved Controller for a Three-Axis Piezoelectric Stage
NASA Technical Reports Server (NTRS)
Rao, Shanti; Palmer, Dean
2009-01-01
An improved closed-loop controller has been built for a three-axis piezoelectric positioning stage. The stage can be any of a number of commercially available or custom-made units that are used for precise three-axis positioning of optics in astronomical instruments and could be used for precise positioning in diverse fields of endeavor that include adaptive optics, fabrication of semiconductors, and nanotechnology.
Watanabe, Jun; Oki, Tomoyuki; Takebayashi, Jun; Yamasaki, Koji; Takano-Ishikawa, Yuko; Hino, Akihiro; Yasui, Akemi
2013-01-01
We improved the procedure for lipophilic-oxygen radical absorbance capacity (L-ORAC) measurement for better repeatability and intermediate precision. A sealing film was placed on the assay plate, and glass vials and microdispensers equipped with glass capillaries were used. The antioxidant capacities of food extracts can be evaluated by this method with nearly the same precision as antioxidant solutions.
Experimental Study on the Precise Orbit Determination of the BeiDou Navigation Satellite System
He, Lina; Ge, Maorong; Wang, Jiexian; Wickert, Jens; Schuh, Harald
2013-01-01
The regional service of the Chinese BeiDou satellite navigation system is now in operation with a constellation including five Geostationary Earth Orbit satellites (GEO), five Inclined Geosynchronous Orbit (IGSO) satellites and four Medium Earth Orbit (MEO) satellites. Besides the standard positioning service with positioning accuracy of about 10 m, both precise relative positioning and precise point positioning are already demonstrated. As is well known, precise orbit and clock determination is essential in enhancing precise positioning services. To improve the satellite orbits of the BeiDou regional system, we concentrate on the impact of the tracking geometry and the involvement of MEOs, and on the effect of integer ambiguity resolution as well. About seven weeks of data collected at the BeiDou Experimental Test Service (BETS) network is employed in this experimental study. Several tracking scenarios are defined, various processing schemata are designed and carried out; and then, the estimates are compared and analyzed in detail. The results show that GEO orbits, especially the along-track component, can be significantly improved by extending the tracking network in China along longitude direction, whereas IGSOs gain more improvement if the tracking network extends in latitude. The involvement of MEOs and ambiguity-fixing also make the orbits better. PMID:23529116
Active transport improves the precision of linear long distance molecular signalling
NASA Astrophysics Data System (ADS)
Godec, Aljaž; Metzler, Ralf
2016-09-01
Molecular signalling in living cells occurs at low copy numbers and is thereby inherently limited by the noise imposed by thermal diffusion. The precision at which biochemical receptors can count signalling molecules is intimately related to the noise correlation time. In addition to passive thermal diffusion, messenger RNA and vesicle-engulfed signalling molecules can transiently bind to molecular motors and are actively transported across biological cells. Active transport is most beneficial when trafficking occurs over large distances, for instance up to the order of 1 metre in neurons. Here we explain how intermittent active transport allows for faster equilibration upon a change in concentration triggered by biochemical stimuli. Moreover, we show how intermittent active excursions induce qualitative changes in the noise in effectively one-dimensional systems such as dendrites. Thereby they allow for significantly improved signalling precision in the sense of a smaller relative deviation in the concentration read-out by the receptor. On the basis of linear response theory we derive the exact mean field precision limit for counting actively transported molecules. We explain how intermittent active excursions disrupt the recurrence in the molecular motion, thereby facilitating improved signalling accuracy. Our results provide a deeper understanding of how recurrence affects molecular signalling precision in biological cells and novel medical-diagnostic devices.
Toward 1-mm depth precision with a solid state full-field range imaging system
NASA Astrophysics Data System (ADS)
Dorrington, Adrian A.; Carnegie, Dale A.; Cree, Michael J.
2006-02-01
Previously, we demonstrated a novel heterodyne based solid-state full-field range-finding imaging system. This system is comprised of modulated LED illumination, a modulated image intensifier, and a digital video camera. A 10 MHz drive is provided with 1 Hz difference between the LEDs and image intensifier. A sequence of images of the resulting beating intensifier output are captured and processed to determine phase and hence distance to the object for each pixel. In a previous publication, we detailed results showing a one-sigma precision of 15 mm to 30 mm (depending on signal strength). Furthermore, we identified the limitations of the system and potential improvements that were expected to result in a range precision in the order of 1 mm. These primarily include increasing the operating frequency and improving optical coupling and sensitivity. In this paper, we report on the implementation of these improvements and the new system characteristics. We also comment on the factors that are important for high precision image ranging and present configuration strategies for best performance. Ranging with sub-millimeter precision is demonstrated by imaging a planar surface and calculating the deviations from a planar fit. The results are also illustrated graphically by imaging a garden gnome.
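The sub-millimeter precision claim above rests on imaging a planar target, fitting a plane to the per-pixel ranges, and taking the one-sigma spread of the residuals. A minimal least-squares version of that check is sketched below (an illustrative reimplementation, not the authors' code).

```python
import numpy as np

def plane_fit_precision(x, y, z):
    """Fit z = a*x + b*y + c to ranged points on a planar target by
    least squares and return the one-sigma deviation of the
    residuals, i.e. the ranging precision measure described above
    (illustrative)."""
    A = np.column_stack([x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    residuals = z - A @ coeffs
    return residuals.std(ddof=3)  # 3 fitted plane parameters
```

Applied to the range image of a flat surface, a returned value below 1 mm corresponds to the sub-millimeter precision reported.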
This illustration represents the National Cancer Institute’s support of research to improve precision medicine in cancer treatment, in which unique therapies treat an individual’s cancer based on specific genetic abnormalities of that person’s tumor.
The advancement of the high precision stress polishing
NASA Astrophysics Data System (ADS)
Li, Chaoqiang; Lei, Baiping; Han, Yu
2016-10-01
Stress polishing is a highly efficient machining technology for large-diameter aspheric optics. This paper reviews the principle of stress polishing, its application to the processing of large aspheric mirrors, and the state of domestic and foreign research, addressing two problems: the insufficient precision of the mirror-surface deformation calculated by some traditional theories, and the inability of the support device's output precision and stability to meet requirements. Improvements are proposed in three aspects: the characterization of the mirror's elastic deformation during stress polishing, the deformation theory of the influence function together with the calculation of correction forces, and the mechanical design of the actuator. These improvements raise the precision of stress polishing and provide a theoretical basis for its further application in large-diameter aspheric machining.
-Omic and Electronic Health Record Big Data Analytics for Precision Medicine.
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D; Venugopalan, Janani; Hoffman, Ryan; Wang, May D
2017-02-01
Rapid advances in high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous, complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of healthcare. In this paper, we present -omic and EHR data characteristics, associated challenges, and data analytics including data preprocessing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies: identifying disease biomarkers from multi-omic data and incorporating -omic information into EHRs. Big data analytics is able to address -omic and EHR data challenges for a paradigm shift toward precision medicine: it makes sense of -omic and EHR data to improve healthcare outcomes, with long-lasting societal impact.
Principles of Precision Prevention Science for Improving Recruitment and Retention of Participants.
Supplee, Lauren H; Parekh, Jenita; Johnson, Makedah
2018-03-12
Precision medicine and precision public health focus on identifying and providing the right intervention to the right population at the right time. Expanding on this concept, precision prevention science could allow the field to examine prevention programs to identify ways to make them more efficient and effective at scale, including addressing issues related to engagement and retention of participants. Research to date on engagement and retention has often focused on demographics and risk factors. The current paper proposes using McCurdy and Daro's (Family Relations, 50, 113-121, 2001) model, which posits that a complex mixture of individual, provider, program, and community-level factors synergistically affects enrollment, engagement, and retention. The paper concludes by recommending the use of research-practice partnerships and innovative, rapid-cycle methods to design and improve prevention programs with respect to participant engagement and retention at scale.
Negri, Lucas; Nied, Ademir; Kalinowski, Hypolito; Paterno, Aleksander
2011-01-01
This paper presents a benchmark for peak detection algorithms employed in fiber Bragg grating spectrometric interrogation systems. The accuracy, precision, and computational performance of currently used algorithms and those of a newly proposed artificial neural network algorithm are compared. Centroid and Gaussian fitting algorithms are shown to have the highest precision but produce systematic errors that depend on the FBG refractive index modulation profile. The proposed neural network displays relatively good precision with reduced systematic errors and improved computational performance when compared to other networks. Additionally, the general guidelines presented can aid in choosing a suitable algorithm. PMID:22163806
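The centroid algorithm benchmarked above estimates the Bragg wavelength as the intensity-weighted mean of the spectral samples above a threshold; its systematic error arises when the reflection profile is asymmetric. A minimal sketch follows; the threshold fraction is a common choice assumed for illustration.

```python
import numpy as np

def centroid_peak(wavelengths, intensities, threshold_frac=0.5):
    """Estimate the Bragg wavelength as the intensity-weighted
    centroid of samples above a fractional threshold: the classic
    centroid algorithm (threshold choice is illustrative)."""
    w = np.asarray(wavelengths, dtype=float)
    s = np.asarray(intensities, dtype=float)
    mask = s >= threshold_frac * s.max()
    return np.sum(w[mask] * s[mask]) / np.sum(s[mask])
```

For a symmetric reflection peak the centroid is unbiased; skew in the FBG refractive index modulation profile shifts it, which is the systematic error the benchmark measures.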
Gómez-Ordóñez, Eva; Jiménez-Escrig, Antonio; Rupérez, Pilar
2012-05-15
Biological properties of polysaccharides from seaweeds are related to their composition and structure. Many factors such as the kind of sugar, type of linkage or sulfate content of algal biopolymers exert an influence on the relationship between structure and function. Besides, the molecular weight (MW) also plays an important role. Thus, a simple, reliable and fast HPSEC method with refractive index detection was developed and optimized for the MW estimation of soluble algal polysaccharides. Chromatogram shape and repeatability of retention time were considerably improved when sodium nitrate was used instead of ultrapure water as the mobile phase. Pullulan and dextran standards of different MW were used for method calibration and validation. Also, main polysaccharide standards from brown (alginate, fucoidan, laminaran) and red seaweeds (kappa- and iota-carrageenan) were used for quantification and for assessing method precision and accuracy. Relative standard deviation (RSD) of repeatability for retention time, peak areas and inter-day precision was below 0.7%, 2.5% and 2.6%, respectively, which indicated good repeatability and precision. Recoveries (96.3-109.8%) also showed fairly good accuracy. Regarding linearity, main polysaccharide standards from brown or red seaweeds showed a highly satisfactory correlation coefficient (r>0.999). Moreover, good sensitivity was shown, with corresponding limits of detection and quantitation of 0.05-0.21 and 0.16-0.31 mg/mL, respectively. The method was applied to the MW estimation of standard algal polysaccharides, as well as to the soluble polysaccharide fractions from the brown seaweed Saccharina latissima and the red Mastocarpus stellatus, respectively. Although the distribution of molecular weight was broad, the good repeatability for retention time provided good precision in MW estimation of polysaccharides. Water- and alkali-soluble fractions from S. latissima ranged from very high (>2400 kDa) to low MW compounds (<6 kDa); this high heterogeneity could be attributable to the complex polysaccharide composition of brown algae. Regarding M. stellatus, sulfated galactans followed a descending order of MW (>1400 kDa to <10 kDa), related to the different solubility of carrageenans in red seaweeds. In summary, the method developed allows for the molecular weight analysis of seaweed polysaccharides with very good precision, accuracy, linearity and sensitivity within a short time. Copyright © 2012 Elsevier B.V. All rights reserved.
Automated Semantic Indexing of Figure Captions to Improve Radiology Image Retrieval
Kahn, Charles E.; Rubin, Daniel L.
2009-01-01
Objective: We explored automated concept-based indexing of unstructured figure captions to improve retrieval of images from radiology journals. Design: The MetaMap Transfer program (MMTx) was used to map the text of 84,846 figure captions from 9,004 peer-reviewed, English-language articles to concepts in three controlled vocabularies from the UMLS Metathesaurus, version 2006AA. Sampling procedures were used to estimate the standard information-retrieval metrics of precision and recall, and to evaluate the degree to which concept-based retrieval improved image retrieval. Measurements: Precision was estimated based on a sample of 250 concepts. Recall was estimated based on a sample of 40 concepts. The authors measured the impact of concept-based retrieval to improve upon keyword-based retrieval in a random sample of 10,000 search queries issued by users of a radiology image search engine. Results: Estimated precision was 0.897 (95% confidence interval, 0.857–0.937). Estimated recall was 0.930 (95% confidence interval, 0.838–1.000). In 5,535 of 10,000 search queries (55%), concept-based retrieval found results not identified by simple keyword matching; in 2,086 searches (21%), more than 75% of the results were found by concept-based search alone. Conclusion: Concept-based indexing of radiology journal figure captions achieved very high precision and recall, and significantly improved image retrieval. PMID:19261938
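The sampling-based confidence intervals reported for precision and recall can be reproduced with a standard normal approximation for a proportion. The sketch below is a generic formula, not the authors' code; applied to the point estimate 0.897 with n = 250, it recovers an interval close to the reported 0.857-0.937.

```python
import math

def proportion_ci(p_hat, n, z=1.96):
    """Normal-approximation 95% confidence interval for a proportion
    estimated from a sample of size n (clipped to [0, 1])."""
    half = z * math.sqrt(p_hat * (1.0 - p_hat) / n)
    return max(0.0, p_hat - half), min(1.0, p_hat + half)

lo, hi = proportion_ci(0.897, 250)
print(round(lo, 3), round(hi, 3))   # 0.859 0.935
```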
NASA Technical Reports Server (NTRS)
Dutra, Jayne E.; Smith, Lisa
2006-01-01
The goal of this plan is to briefly describe new technologies available in the arena of information discovery, discuss their strategic value for the NASA enterprise, and offer considerations and suggestions for near-term implementations using the NASA Engineering Network (NEN) as a delivery venue.
New trends in metal forming in the USA
NASA Astrophysics Data System (ADS)
1982-05-01
The use of lasers in sheet metal stamping, hydraulic presses, cold pressing, and deformation of titanium alloy to produce components which do not require subsequent machining are discussed. Superplastic deformation techniques could lead to cost savings of 90% in the aerospace industry. Precision forging and welding technologies can considerably reduce raw material costs, but investment costs are high.
Recovery in Context: Bereavement, Culture, and the Transformation of the Therapeutic Self
ERIC Educational Resources Information Center
Paletti, Robin
2008-01-01
The concept of recovery is integral to the work of bereavement professionals, though there remains considerable debate as to the precise definition and applicability of the term. A number of factors have contributed to the controversy; competing bereavement models, for instance, have given rise to fundamental differences in the way that recovery…
Al-Chokhachy, R.; Budy, P.; Conner, M.
2009-01-01
Using empirical field data for bull trout (Salvelinus confluentus), we evaluated the trade-off between power and sampling effort-cost using Monte Carlo simulations of commonly collected mark-recapture-resight and count data, and we estimated the power to detect changes in abundance across different time intervals. We also evaluated the effects of monitoring different components of a population and stratification methods on the precision of each method. Our results illustrate substantial variability in the relative precision, cost, and information gained from each approach. While grouping estimates by age or stage class substantially increased the precision of estimates, spatial stratification of sampling units resulted in limited increases in precision. Although mark-resight methods allowed for estimates of abundance versus indices of abundance, our results suggest snorkel surveys may be a more affordable monitoring approach across large spatial scales. Detecting a 25% decline in abundance after 5 years was not possible, regardless of technique (power = 0.80), without high sampling effort (48% of study site). Detecting a 25% decline was possible after 15 years, but still required high sampling efforts. Our results suggest detecting moderate changes in abundance of freshwater salmonids requires considerable resource and temporal commitments and highlight the difficulties of using abundance measures for monitoring bull trout populations.
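A minimal sketch of the kind of Monte Carlo power analysis the abstract describes: simulate noisy log-abundance series under an assumed decline and count how often a one-sided test on the trend slope detects it. The noise model, critical value (t = 2.132 for one-sided 5% with df = 4), and simulation settings are illustrative assumptions, not the study's bull trout data.

```python
import numpy as np

rng = np.random.default_rng(1)

def decline_power(n_years, total_decline, cv, t_crit, n_sims=2000):
    """Monte Carlo power: fraction of simulated log-abundance series in
    which a declining log-linear trend is detected by a one-sided t-test
    on the OLS slope (t_crit is the critical value for df = n_years - 1)."""
    years = np.arange(n_years + 1, dtype=float)
    slope_true = np.log(1.0 - total_decline) / n_years
    x_c = years - years.mean()
    sxx = np.sum(x_c ** 2)
    detected = 0
    for _ in range(n_sims):
        y = slope_true * years + rng.normal(0.0, cv, size=years.size)
        slope = np.sum(x_c * y) / sxx                 # OLS slope
        resid = y - y.mean() - slope * x_c            # residuals
        se = np.sqrt(np.sum(resid ** 2) / (years.size - 2) / sxx)
        if slope / se < -t_crit:                      # one-sided test
            detected += 1
    return detected / n_sims

# 25% decline over 5 years with 30% observation CV: power is well below 0.80,
# consistent with the difficulty the abstract reports
print(decline_power(5, 0.25, 0.3, t_crit=2.132))
```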
Rasmussen, Sebastian R; Konge, Lars; Mikkelsen, Peter T; Sørensen, Mads S; Andersen, Steven A W
2016-03-01
Cognitive load (CL) theory suggests that working memory can be overloaded in complex learning tasks such as surgical technical skills training, which can impair learning. Valid and feasible methods for estimating the CL in specific learning contexts are necessary before the efficacy of CL-lowering instructional interventions can be established. This study aims to explore secondary task precision for the estimation of CL in virtual reality (VR) surgical simulation and also investigate the effects of CL-modifying factors such as simulator-integrated tutoring and repeated practice. Twenty-four participants were randomized for visual assistance by a simulator-integrated tutor function during the first 5 of 12 repeated mastoidectomy procedures on a VR temporal bone simulator. Secondary task precision was found to be significantly lower during simulation compared with nonsimulation baseline, p < .001. Contrary to expectations, simulator-integrated tutoring and repeated practice did not have an impact on secondary task precision. This finding suggests that even though considerable changes in CL are reflected in secondary task precision, it lacks sensitivity. In contrast, secondary task reaction time could be more sensitive, but requires substantial postprocessing of data. Therefore, future studies on the effect of CL modifying interventions should weigh the pros and cons of the various secondary task measurements. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Hou, Yanqing; Verhagen, Sandra; Wu, Jie
2016-12-01
Ambiguity Resolution (AR) is a key technique in GNSS precise positioning. In the case of weak models (i.e., low precision of data), however, the success rate of AR may be low, which may consequently introduce large errors into the baseline solution in cases of wrong fixing. Partial Ambiguity Resolution (PAR) is therefore proposed such that the baseline precision can be improved by fixing only a subset of ambiguities with high success rate. This contribution proposes a new PAR strategy that selects the subset such that the expected precision gain is maximized among a set of pre-selected subsets, while at the same time the failure rate is controlled. These pre-selected subsets are chosen to obtain the highest success rate among those with the same subset size. The strategy is called the Two-step Success Rate Criterion (TSRC), as it will first try to fix a relatively large subset and use the fixed failure rate ratio test (FFRT) to decide on acceptance or rejection. In case of rejection, a smaller subset will be fixed and validated by the ratio test so as to fulfill the overall failure rate criterion. It is shown how the method can be used in practice without introducing a large additional computational effort and, more importantly, how it can improve (or at least not deteriorate) the availability in terms of baseline precision compared to the classical Success Rate Criterion (SRC) PAR strategy, based on a simulation validation. In the simulation validation, significant improvements are obtained for single-GNSS on short baselines with dual-frequency observations. For dual-constellation GNSS, the improvement for single-frequency observations on short baselines is very significant, on average 68%. For medium to long baselines with dual-constellation GNSS, the average improvement is around 20-30%.
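The ratio test used to validate a fixed subset in such PAR strategies can be sketched as follows: accept the integer solution only when the second-best candidate is sufficiently worse than the best. The fixed threshold here is an assumption for illustration; the FFRT the paper uses would instead derive the threshold from the tolerated failure rate.

```python
def ratio_test(sq_norm_best, sq_norm_second, critical=2.0):
    """Classical ratio test for ambiguity validation: accept the fixed
    solution when the squared-norm ratio of the second-best to the best
    integer candidate meets the critical value. The threshold 2.0 is an
    illustrative assumption; FFRT tables would supply it as a function
    of the tolerated failure rate and model strength."""
    return sq_norm_second / sq_norm_best >= critical

print(ratio_test(0.8, 2.4))   # True  (second-best clearly worse: accept)
print(ratio_test(0.8, 1.2))   # False (candidates too close: reject)
```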
NASA Astrophysics Data System (ADS)
Malina, L.; Coello de Portugal, J.; Persson, T.; Skowroński, P. K.; Tomás, R.; Franchi, A.; Liuzzo, S.
2017-08-01
Beam optics control is of critical importance for machine performance and protection. Nowadays, turn-by-turn (TbT) beam position monitor (BPM) data are increasingly exploited as they allow for fast and simultaneous measurement of various optics quantities. Nevertheless, so far the best documented uncertainty of measured β -functions is of about 10‰ rms. In this paper we compare the β -functions of the ESRF storage ring measured from two different TbT techniques—the N-BPM and the Amplitude methods—with the ones inferred from a measurement of the orbit response matrix (ORM). We show how to improve the precision of TbT techniques by refining the Fourier transform of TbT data with properly chosen excitation amplitude. The precision of the N-BPM method is further improved by refining the phase advance measurement. This represents a step forward compared to standard TbT measurements. First experimental results showing the precision of β -functions pushed down to 4‰ both in TbT and ORM techniques are reported and commented.
Precision Medicine and the Changing Landscape of Research Ethics.
Hammer, Marilyn J
2016-03-01
President Barack Obama announced the launch of the National Institutes of Health Precision Medicine Initiative® (PMI) in January 2015. Precision medicine includes the concept of individualized or personalized medicine at a more exact level through advances in science and technology, such as genetics and genomics sequencing. Although many disease processes will be investigated through the precision medicine lens for greater understanding and improved treatment responses, oncology research and translation to practice is leading the initiative's debut, referred to as the near-term focus.
Improved phase-ellipse method for in-situ geophone calibration.
Liu, Huaibao P.; Peselnick, L.
1986-01-01
For amplitude and phase response calibration of moving-coil electromagnetic geophones, two parameters are needed: the geophone natural frequency, fo, and the geophone upper resonance frequency, fu. The phase-ellipse method is commonly used for the in situ determination of these parameters. For a given signal-to-noise ratio, the precision of the measurement of fo and fu depends on the phase sensitivity, dPHI/df. For some commercial geophones the phase sensitivity at fu can be an order of magnitude less than the sensitivity at fo. An improved phase-ellipse method with increased precision is presented. Compared to measurements made with the existing phase-ellipse methods, the method shows a 6- and 3-fold improvement in precision, respectively, on measurements of fo and fu on a commercial geophone. -from Authors
NASA Technical Reports Server (NTRS)
Flock, W. L.
1981-01-01
When high precision is required for range measurement on Earth space paths, it is necessary to correct as accurately as possible for excess range delays due to the dry air, water vapor, and liquid water content of the atmosphere. Calculations based on representative values of atmospheric parameters are useful for illustrating the order of magnitude of the expected delays. Range delay, time delay, and phase delay are simply and directly related. Doppler frequency variations or noise are proportional to the time rate of change of excess range delay. Tropospheric effects were examined as part of an overall consideration of the capability of precision two way ranging and Doppler systems.
Radiometer uncertainty equation research of 2D planar scanning PMMW imaging system
NASA Astrophysics Data System (ADS)
Hu, Taiyang; Xu, Jianzhong; Xiao, Zelong
2009-07-01
With advances in millimeter-wave technology, passive millimeter-wave (PMMW) imaging technology has received considerable attention and has established itself in a wide range of military and civil applications, such as remote sensing, blind landing, precision guidance and security inspection. The high transparency of clothing at millimeter wavelengths and the spatial resolution required to generate adequate images combine to make imaging at millimeter wavelengths a natural approach to screening people for concealed contraband detection; at the same time, the passive operation mode presents no safety hazard to the person under inspection. Based on a description of the design and engineering implementation of a W-band two-dimensional (2D) planar scanning imaging system, a series of scanning methods utilized in PMMW imaging are compared and analyzed, followed by a discussion of the operational principle of the 2D planar scanning mode. Furthermore, it is found that the traditional radiometer uncertainty equation, which is derived for a moving platform, does not hold under this 2D planar scanning mode, because there is no fixed relation between the scanning rates in the horizontal and vertical directions. Consequently, an improved radiometer uncertainty equation is derived in this paper by taking the total time spent on scanning and imaging into consideration. In addition, the factors that affect the quality of radiometric images are further investigated under the improved radiometer uncertainty equation, and some original results are presented and analyzed to demonstrate the significance and validity of this new methodology.
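The classical total-power radiometer uncertainty the paper starts from is dT = T_sys / sqrt(B * tau). The sketch below applies it with the per-pixel integration time of a 2D planar scan, where tau is the total imaging time divided by the number of pixels; all the numbers are illustrative assumptions, not the actual system's parameters.

```python
import math

def radiometric_sensitivity(t_sys, bandwidth_hz, tau_s):
    """Ideal total-power radiometer uncertainty: dT = T_sys / sqrt(B * tau)."""
    return t_sys / math.sqrt(bandwidth_hz * tau_s)

# Illustrative W-band numbers (assumed): 500 K system temperature, 2 GHz
# bandwidth, 60 s total imaging time shared over a 64 x 64 pixel raster
t_total, n_pixels = 60.0, 64 * 64
tau = t_total / n_pixels                 # per-pixel integration time, s
print(round(radiometric_sensitivity(500.0, 2e9, tau), 3))  # ~0.092 K
```

The example makes the paper's point concrete: in a 2D planar scan the achievable dT is set by the total scan time budget, not by a single scan rate as in the moving-platform derivation.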
Collaborative Genomics Study Advances Precision Oncology
A collaborative study conducted by two Office of Cancer Genomics (OCG) initiatives highlights the importance of integrating structural and functional genomics programs to improve cancer therapies, and more specifically, contribute to precision oncology treatments for children.
Beyond bankable dollars: establishing a business case for improving health care.
Bailit, Michael; Dyer, Mary Beth
2004-09-01
To address widespread deficiencies in the quality of health care, the authors argue that health care organizations need to be able to make a "business case" for improving quality--a compelling rationale for financial investment in quality improvement programs. The authors' framework for such a business case is organized around three broad areas: direct financial considerations, strategic considerations, and internal organizational considerations. Within these categories, they offer a total of 10 specific business case arguments, with examples, for investing in quality improvement.
Compressed sensing system considerations for ECG and EMG wireless biosensors.
Dixon, Anna M R; Allstot, Emily G; Gangopadhyay, Daibashish; Allstot, David J
2012-04-01
Compressed sensing (CS) is an emerging signal processing paradigm that enables sub-Nyquist processing of sparse signals such as electrocardiogram (ECG) and electromyogram (EMG) biosignals. Consequently, it can be applied to biosignal acquisition systems to reduce the data rate to realize ultra-low-power performance. CS is compared to conventional and adaptive sampling techniques and several system-level design considerations are presented for CS acquisition systems including sparsity and compression limits, thresholding techniques, encoder bit-precision requirements, and signal recovery algorithms. Simulation studies show that compression factors greater than 16X are achievable for ECG and EMG signals with signal-to-quantization noise ratios greater than 60 dB.
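A toy sketch of the CS encoding step described: a sparse signal is reduced to M = N/CF random projections. The Bernoulli (+/-1) measurement matrix and the compression factor of 16 are assumptions consistent with the abstract, not the authors' exact design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse toy "biosignal": N samples with K nonzero spikes
N, K, CF = 256, 8, 16                 # CF: compression factor, M = N // CF
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.normal(0.0, 1.0, K)

# CS encoder: M random +/-1 (Bernoulli) projections of the signal,
# a common choice for low-power hardware front ends
M = N // CF
phi = rng.choice([-1.0, 1.0], size=(M, N))
y = phi @ x                           # 16 measurements instead of 256
print(y.shape)                        # (16,)
```

A decoder would recover x from y by sparse reconstruction (e.g. basis pursuit); the encoder side shown here is what runs on the ultra-low-power biosensor.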
The Role of Big Data in the Management of Sleep-Disordered Breathing.
Budhiraja, Rohit; Thomas, Robert; Kim, Matthew; Redline, Susan
2016-06-01
Analysis of large-volume data holds promise for improving the application of precision medicine to sleep, including improving identification of patient subgroups who may benefit from alternative therapies. Big data used within the health care system also promises to facilitate end-to-end screening, diagnosis, and management of sleep disorders; improve the recognition of differences in presentation and susceptibility to sleep apnea; and lead to improved management and outcomes. To meet the vision of personalized, precision therapeutics and diagnostics and improving the efficiency and quality of sleep medicine will require ongoing efforts, investments, and change in our current medical and research cultures. Copyright © 2016 Elsevier Inc. All rights reserved.
Donald B.K. English; Susan M. Kocis; J. Ross Arnold; Stanley J. Zarnoch; Larry Warren
2003-01-01
In estimating recreation visitation at the National Forest level in the US, annual counts of a number of types of visitation proxy measures were used. The intent was to improve the overall precision of the visitation estimate by employing the proxy counts. The precision of visitation estimates at sites that had proxy information versus those that did not is examined....
Blasimme, Alessandro; Vayena, Effy
2016-11-04
Precision medicine promises to develop diagnoses and treatments that take individual variability into account. According to most specialists, turning this promise into reality will require adapting the established framework of clinical research ethics, and paying more attention to participants' attitudes towards sharing genotypic, phenotypic, lifestyle data and health records, and ultimately to their desire to be engaged as active partners in medical research.Notions such as participation, engagement and partnership have been introduced in bioethics debates concerning genetics and large-scale biobanking to broaden the focus of discussion beyond individual choice and individuals' moral interests. The uptake of those concepts in precision medicine is to be welcomed. However, as data and medical information from research participants in precision medicine cohorts will be collected on an individual basis, translating a participatory approach in this emerging area may prove cumbersome. Therefore, drawing on Joseph Raz's perfectionism, we propose a principle of respect for autonomous agents that, we reckon, can address many of the concerns driving recent scholarship on partnership and public participation, while avoiding some of the limitations these concept have in the context of precision medicine. Our approach offers a normative clarification to how becoming partners in precision is compatible with retaining autonomy.Realigning the value of autonomy with ideals of direct engagement, we show, can provide adequate normative orientation to precision medicine; it can do justice to the idea of moral pluralism by stressing the value of moral self-determination: and, finally, it can reconcile the notion of autonomy with other more communitarian values such as participation and solidarity.
Precision of FLEET Velocimetry Using High-Speed CMOS Camera Systems
NASA Technical Reports Server (NTRS)
Peters, Christopher J.; Danehy, Paul M.; Bathel, Brett F.; Jiang, Naibo; Calvert, Nathan D.; Miles, Richard B.
2015-01-01
Femtosecond laser electronic excitation tagging (FLEET) is an optical measurement technique that permits quantitative velocimetry of unseeded air or nitrogen using a single laser and a single camera. In this paper, we seek to determine the fundamental precision of the FLEET technique using high-speed complementary metal-oxide semiconductor (CMOS) cameras. Also, we compare the performance of several different high-speed CMOS camera systems for acquiring FLEET velocimetry data in air and nitrogen free-jet flows. The precision was defined as the standard deviation of a set of several hundred single-shot velocity measurements. Methods of enhancing the precision of the measurement were explored, such as row-wise digital binning of the signal in adjacent pixels (similar in concept to on-sensor binning, but performed in post-processing) and increasing the time delay between successive exposures. These techniques generally improved precision; however, binning provided the greatest improvement to the un-intensified camera systems, which had low signal-to-noise ratio. When binning row-wise by 8 pixels (about the thickness of the tagged region) and using an inter-frame delay of 65 microseconds, precisions of 0.5 meters per second in air and 0.2 meters per second in nitrogen were achieved. The camera comparison included a pco.dimax HD, a LaVision Imager scientific CMOS (sCMOS) and a Photron FASTCAM SA-X2, along with a two-stage LaVision HighSpeed IRO intensifier. Excluding the LaVision Imager sCMOS, the cameras were tested with and without intensification and with both short and long inter-frame delays. Use of intensification and longer inter-frame delay generally improved precision. Overall, the Photron FASTCAM SA-X2 exhibited the best performance in terms of greatest precision and highest signal-to-noise ratio, primarily because it had the largest pixels.
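The row-wise digital binning described can be sketched in a few lines: groups of adjacent rows are summed in post-processing to raise signal-to-noise. The image size is illustrative; the bin width of 8 rows follows the tagged-region thickness mentioned in the abstract.

```python
import numpy as np

def rowwise_bin(image, bin_rows=8):
    """Digital (post-processing) row-wise binning: sum each group of
    bin_rows adjacent rows; trailing rows that do not fill a group
    are discarded."""
    h, w = image.shape
    h_trim = (h // bin_rows) * bin_rows
    return image[:h_trim].reshape(h_trim // bin_rows, bin_rows, w).sum(axis=1)

img = np.ones((100, 64))       # stand-in for a FLEET camera frame
binned = rowwise_bin(img)
print(binned.shape)            # (12, 64): 96 rows kept, summed in groups of 8
```

Summing n rows grows the signal by n while uncorrelated noise grows only by sqrt(n), which is why binning helped most on the low-SNR un-intensified cameras.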
NASA Astrophysics Data System (ADS)
You, Daekeun; Simpson, Matthew; Antani, Sameer; Demner-Fushman, Dina; Thoma, George R.
2013-01-01
Pointers (arrows and symbols) are frequently used in biomedical images to highlight specific image regions of interest (ROIs) that are mentioned in figure captions and/or text discussion. Detection of pointers is the first step toward extracting relevant visual features from ROIs and combining them with textual descriptions for a multimodal (text and image) biomedical article retrieval system. Recently we developed a pointer recognition algorithm based on an edge-based pointer segmentation method, and subsequently reported improvements made on our initial approach involving the use of Active Shape Models (ASM) for pointer recognition and region growing-based method for pointer segmentation. These methods contributed to improving the recall of pointer recognition but not much to the precision. The method discussed in this article is our recent effort to improve the precision rate. Evaluation performed on two datasets and compared with other pointer segmentation methods show significantly improved precision and the highest F1 score.
Real-time Kinematic Positioning of INS Tightly Aided Multi-GNSS Ionospheric Constrained PPP
Gao, Zhouzheng; Shen, Wenbin; Zhang, Hongping; Niu, Xiaoji; Ge, Maorong
2016-01-01
Real-time Precise Point Positioning (PPP) technique is being widely applied for providing precise positioning services with the significant improvement on satellite precise products accuracy. With the rapid development of the multi-constellation Global Navigation Satellite Systems (multi-GNSS), currently, about 80 navigation satellites are operational in orbit. Obviously, PPP performance is dramatically improved with all satellites compared to that of GPS-only PPP. However, the performance of PPP could be evidently affected by unexpected and unavoidable severe observing environments, especially in the dynamic applications. Consequently, we apply Inertial Navigation System (INS) to the Ionospheric-Constrained (IC) PPP to overcome such drawbacks. The INS tightly aided multi-GNSS IC-PPP model can make full use of GNSS and INS observations to improve the PPP performance in terms of accuracy, availability, continuity, and convergence speed. Then, a set of airborne data is analyzed to evaluate and validate the improvement of multi-GNSS and INS on the performance of IC-PPP. PMID:27470270
NASA Technical Reports Server (NTRS)
Schutz, Bob E.
1993-01-01
Satellite Laser Ranging (SLR) has a rich history of development which began in the 1960s with 10 meter-level first generation systems. These systems evolved with order-of-magnitude improvements to the systems that now produce several-millimeter single-shot range precisions. What began, in part, as an interesting application of the new laser technology has become an essential component of modern, precision space geodesy, which in turn enables contributions to a variety of science areas. Modern space geodesy is the beneficiary of technological developments which have enabled precision geodetic measurements. Aside from SLR and its closely related technique, Lunar Laser Ranging (LLR), Very Long Baseline Interferometry (VLBI) has also made prominent science contributions. In recent years, the Global Positioning System (GPS) has demonstrated a rapidly growing popularity as the result of demonstrated low cost with high precision instrumentation. Other modern techniques such as DORIS have demonstrated the ability to make significant science contributions; furthermore, PRARE can be expected to contribute in its own right. An appropriate question is 'Why should several techniques be financially supported?' While there are several answers, I offer the opinion that, in consideration of the broad science areas that are the beneficiaries of space geodesy, no single technique can meet all the requirements and/or expectations of the science areas in which space geodesy contributes or has the potential for contributing. The more well-known science areas include plate tectonics, earthquake processes, Earth rotation/orientation, gravity (static and temporal), ocean circulation, and land and ice topography, to name a few applications. It is unfortunate that the modern space geodesy techniques are often viewed as competitive, but this view is usually encouraged by funding competition, especially in an era of growing needs but diminishing budgets.
The techniques are, for the most part, complementary and the ability to reduce the data to geodetic parameters from several techniques promotes confidence in the geophysical interpretations. In the following sections, the current SLR applications are reviewed in the context of the other techniques. The strengths and limitations of SLR are reviewed and speculation about the future prospects are offered.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kraisler, Eli; Kronik, Leeor
2014-05-14
The fundamental gap is a central quantity in the electronic structure of matter. Unfortunately, the fundamental gap is not generally equal to the Kohn-Sham gap of density functional theory (DFT), even in principle. The two gaps differ precisely by the derivative discontinuity, namely, an abrupt change in slope of the exchange-correlation energy as a function of electron number, expected across an integer-electron point. Popular approximate functionals are thought to be devoid of a derivative discontinuity, strongly compromising their performance for prediction of spectroscopic properties. Here we show that, in fact, all exchange-correlation functionals possess a derivative discontinuity, which arises naturally from the application of ensemble considerations within DFT, without any empiricism. This derivative discontinuity can be expressed in closed form using only quantities obtained in the course of a standard DFT calculation of the neutral system. For small, finite systems, addition of this derivative discontinuity indeed results in a greatly improved prediction for the fundamental gap, even when based on the most simple approximate exchange-correlation density functional, the local density approximation (LDA). For solids, the same scheme is exact in principle, but when applied to LDA it results in a vanishing derivative discontinuity correction. This failure is shown to be directly related to the failure of LDA in predicting fundamental gaps from total energy differences in extended systems.
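The relationship the abstract refers to can be written compactly: with I the ionization energy and A the electron affinity of the N-electron system, the fundamental gap exceeds the Kohn-Sham gap by the derivative discontinuity of the exchange-correlation energy.

```latex
E_{\mathrm{gap}} = I - A
  = \bigl(\varepsilon^{\mathrm{KS}}_{\mathrm{LUMO}} - \varepsilon^{\mathrm{KS}}_{\mathrm{HOMO}}\bigr)
    + \Delta_{\mathrm{xc}},
\qquad
I = E(N-1) - E(N), \quad A = E(N) - E(N+1)
```

Approximate functionals with a smooth dependence on electron number give Δxc = 0, which is why their Kohn-Sham gaps underestimate fundamental gaps; the ensemble construction described in the abstract supplies a nonzero Δxc in closed form.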
Tan, Michael; Wilson, Ian; Braganza, Vanessa; Ignatiadis, Sophia; Boston, Ray; Sundararajan, Vijaya; Cook, Mark J; D'Souza, Wendyl J
2015-10-01
We report the diagnostic validity of a selection algorithm for identifying epilepsy cases: a retrospective validation study of International Classification of Diseases 10th Revision Australian Modification (ICD-10AM)-coded hospital records and pharmaceutical data sampled from 300 consecutive potential epilepsy-coded cases and 300 randomly chosen cases without epilepsy from 3/7/2012 to 10/7/2013. Two epilepsy specialists independently validated the diagnosis of epilepsy. A multivariable logistic regression model was fitted to identify the optimum coding algorithm for epilepsy and was internally validated. One hundred fifty-eight of three hundred (52.6%) epilepsy-coded records and 0/300 (0%) nonepilepsy records were confirmed to have epilepsy. The kappa for interrater agreement was 0.89 (95% CI=0.81-0.97). The model utilizing epilepsy (G40), status epilepticus (G41) and ≥1 antiepileptic drug (AED) conferred the highest positive predictive value of 81.4% (95% CI=73.1-87.9) and a specificity of 99.9% (95% CI=99.9-100.0). The area under the receiver operating curve was 0.90 (95% CI=0.88-0.93). When combined with pharmaceutical data, the precision of case identification for an epilepsy data linkage design was considerably improved, offering potential for efficient and reasonably accurate case ascertainment in epidemiological studies. Copyright © 2015 Elsevier Inc. All rights reserved.
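The headline metrics of such a coding-algorithm validation follow directly from a confusion table. The counts below are made-up illustrations, not the study's data.

```python
def validation_metrics(tp, fp, tn, fn):
    """Positive predictive value, specificity and sensitivity
    from validation counts (true/false positives and negatives)."""
    ppv = tp / (tp + fp)             # precision of case identification
    specificity = tn / (tn + fp)
    sensitivity = tp / (tp + fn)
    return ppv, specificity, sensitivity

# Illustrative counts (assumed, not the study's table)
ppv, spec, sens = validation_metrics(tp=80, fp=20, tn=990, fn=10)
print(round(ppv, 3), round(spec, 3), round(sens, 3))   # 0.8 0.98 0.889
```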
Regulatory Aspects of Optical Methods and Exogenous Targets for Cancer Detection
Tummers, Willemieke S.; Warram, Jason M.; Tipirneni, Kiranya E.; Fengler, John; Jacobs, Paula; Shankar, Lalitha; Henderson, Lori; Ballard, Betsy; Pogue, Brian W.; Weichert, Jamey P.; Bouvet, Michael; Sorger, Jonathan; Contag, Christopher H.; Frangioni, John V.; Tweedle, Michael F.; Basilion, James P.; Gambhir, Sanjiv S.; Rosenthal, Eben L.
2017-01-01
Considerable advances in cancer-specific optical imaging have improved the precision of tumor resection. In comparison to traditional imaging modalities, this technology is unique in its ability to provide real-time feedback to the operating surgeon. Given the significant clinical implications of optical imaging, there is an urgent need to standardize surgical navigation tools and contrast agents to facilitate swift regulatory approval. Because fluorescence-enhanced surgery requires a combination of both device and drug, each may be developed in conjunction or separately, which are important considerations in the approval process. This report is the result of a one-day meeting held on May 4, 2016 with officials from the National Cancer Institute, the FDA, members of the American Society of Image-Guided Surgery, and members of the World Molecular Imaging Society, at which consensus methods for FDA-directed human testing and approval of investigational optical imaging devices, as well as contrast agents for surgical applications, were discussed. The goal of this workshop was to discuss FDA approval requirements and the expectations for approval of these novel drugs and devices, packaged separately or in combination, within the context of optical surgical navigation. In addition, the workshop provided clarity to the research community on data collection and trial design. Reported here are the specific discussion items and recommendations from this critical and timely meeting. PMID:28428283
A new perspective on nonprescription statins: an opportunity for patient education and involvement.
Fuster, Valentin
2007-09-01
Education of the public and encouragement of patients' involvement in their own health care have repeatedly been proven effective means of increasing health awareness, promoting lifestyle modifications, and improving early disease detection in a variety of clinical scenarios. Despite substantial efforts from different public and private organizations to educate the population on cardiovascular risk, coronary heart disease remains the leading cause of death in the United States, and its prevalence continues to grow. Therefore, alternative approaches with the potential to elicit a meaningful impact in the community deserve consideration. A nonprescription statin program could provide consumers with a tool of proven benefit in cardiovascular risk prevention. The magnitude of the target population (millions of subjects at intermediate to high risk), as well as the safety and efficacy profile of lovastatin 20 mg, support the consideration of this drug for "over-the-counter" availability. Moreover, a nonprescription statin program could represent a unique opportunity not only to enhance patients' involvement in primary prevention but also to reinforce the education of the public and to encourage interaction with health care providers. The success of such a program will undoubtedly require precise labeling of the risks and benefits of the therapy, as well as active support and participation from major medical organizations. In conclusion, nonprescription statin availability, through enhanced patient involvement, offers the potential for enormous public health benefit.
Water as consumed and its impact on the consumer--do we understand the variables?
Bates, A J
2000-01-01
Water is the most important natural resource in the world; without it, life cannot exist. In 1854 a cholera outbreak in London caused 10,000 deaths and positively linked enteric disease with bacterial contamination of drinking water by sewage pollution. Since then, adequate water hygiene standards and sewage purification have played the most significant role in disease eradication and public health improvements everywhere. Standards for drinking water have grown into an extensive range of microbiological and chemical parametric values, which has not increased consumer confidence, if the media is to be believed. Customers rightly expect that the water they drink is safe and wholesome. Standard setting is perceived as a precise science and meaningful to health. Is this justified, and do the scientists and regulators who derive and set the standards understand the uncertainties in the system? Water is the universal solvent; therefore it will never be pure, and it will contain impurities prior to and after treatment. Knowledge of its potential to become contaminated is necessary to understand the epidemiology associated with waterborne contaminants and their effects. Water use patterns vary considerably and affect assumptions based on toxicology derived from laboratory studies under tightly controlled conditions. Consideration must be given to the model systems used to assess toxicity and to translating results from the laboratory to the real world, if sensible, scientifically based water quality standards are to be set and achieved cost-effectively.
-Omic and Electronic Health Records Big Data Analytics for Precision Medicine
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.
2017-01-01
Objective: Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods: In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results: To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Conclusion: Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance: Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes, with a long-lasting societal impact. PMID:27740470
Localization of an Underwater Control Network Based on Quasi-Stable Adjustment.
Zhao, Jianhu; Chen, Xinhua; Zhang, Hongmei; Feng, Jie
2018-03-23
There exists a common problem in the localization of underwater control networks: the precision of the absolute coordinates of known points obtained by marine absolute measurement is poor, which seriously affects the precision of the whole network in traditional constraint adjustment. Therefore, because the precision of underwater baselines is good, we use them to carry out quasi-stable adjustment to amend the known points before constraint adjustment, so that the points better fit the network shape. In addition, we add unconstrained adjustment for quality control of the underwater baselines, the observations of both the quasi-stable and constrained adjustments, to eliminate unqualified baselines and improve the accuracy of the results of the two adjustments. Finally, the modified method is applied to a practical LBL (Long Baseline) experiment and obtains a mean point-location precision of 0.08 m, a 38% improvement over the traditional method.
Powell, A E; Davies, H T O; Bannister, J; Macrae, W A
2009-06-01
Previous national survey research has shown significant deficits in routine postoperative pain management in the UK. This study used an organizational change perspective to explore in detail the organizational challenges faced by three acute pain services in improving postoperative pain management. Case studies were conducted comprising documentary review and semi-structured interviews (n = 71) with anaesthetists, surgeons, nurses, other health professionals, and managers working in and around three broadly typical acute pain services. Although the precise details differed to some degree, the three acute pain services all faced the same broad range of inter-related challenges identified in the organizational change literature (i.e. structural, political, cultural, educational, emotional, and physical/technological challenges). The services were largely isolated from wider organizational objectives and activities and struggled to engage other health professionals in improving postoperative pain management against a background of limited resources, turbulent organizational change, and inter- and intra-professional politics. Despite considerable efforts, they struggled to address these challenges effectively. The literature on organizational change and quality improvement in health care suggests that it is only by addressing the multiple challenges in a comprehensive way across all levels of the organization and health-care system that sustained improvements in patient care can be secured. This helps to explain why the hard work and commitment of acute pain services over the years have not always resulted in significant improvements in routine postoperative pain management for all surgical patients. Using this literature and adopting a whole-organization quality improvement approach tailored to local circumstances may produce a step-change in the quality of routine postoperative pain management.
NASA Astrophysics Data System (ADS)
Skeffington, R. A.; Halliday, S. J.; Wade, A. J.; Bowes, M. J.; Loewenthal, M.
2015-05-01
The EU Water Framework Directive (WFD) requires that the ecological and chemical status of water bodies in Europe should be assessed, and action taken where possible to ensure that at least "good" quality is attained in each case by 2015. This paper is concerned with the accuracy and precision with which chemical status in rivers can be measured given certain sampling strategies, and how this can be improved. High-frequency (hourly) chemical data from four rivers in southern England were subsampled to simulate different sampling strategies for four parameters used for WFD classification: dissolved phosphorus, dissolved oxygen, pH and water temperature. These data sub-sets were then used to calculate the WFD classification for each site. Monthly sampling was less precise than weekly sampling, but the effect on WFD classification depended on the closeness of the range of concentrations to the class boundaries. In some cases, monthly sampling for a year could result in the same water body being assigned to three or four of the WFD classes with 95% confidence, due to random sampling effects, whereas with weekly sampling this was one or two classes for the same cases. In the most extreme case, the same water body could have been assigned to any of the five WFD quality classes. Weekly sampling considerably reduces the uncertainties compared to monthly sampling. The width of the weekly sampled confidence intervals was about 33% that of the monthly for P species and pH, about 50% for dissolved oxygen, and about 67% for water temperature. For water temperature, which is assessed as the 98th percentile in the UK, monthly sampling biases the mean downwards by about 1 °C compared to the true value, due to problems of assessing high percentiles with limited data. Low-frequency measurements will generally be unsuitable for assessing standards expressed as high percentiles. 
Confining sampling to the working week compared to all 7 days made little difference, but a modest improvement in precision could be obtained by sampling at the same time of day within a 3 h time window, and this is recommended. For parameters with a strong diel variation, such as dissolved oxygen, the value obtained, and thus possibly the WFD classification, can depend markedly on when in the cycle the sample was taken. Specifying this in the sampling regime would be a straightforward way to improve precision, but there needs to be agreement about how best to characterise risk in different types of river. These results suggest that in some cases it will be difficult to assign accurate WFD chemical classes or to detect likely trends using current sampling regimes, even for these largely groundwater-fed rivers. A more critical approach to sampling is needed to ensure that management actions are appropriate and supported by data.
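The paper's core experiment, subsampling a high-frequency record to mimic sparser monitoring, can be sketched as follows (synthetic data, not the actual river records; all names are hypothetical):

```python
import random, statistics

# Sketch of the subsampling idea: generate an hourly "concentration" series,
# then simulate weekly and monthly spot sampling and compare the spread of
# the resulting annual means. Synthetic data only.
random.seed(1)
hours = 24 * 365
series = [10 + 3 * random.gauss(0, 1) for _ in range(hours)]

def spot_sample_means(series, interval_h, n_trials=500):
    """Annual means obtained by sampling every `interval_h` hours,
    with a random start hour for each simulated strategy."""
    means = []
    for _ in range(n_trials):
        start = random.randrange(interval_h)
        means.append(statistics.mean(series[start::interval_h]))
    return means

weekly = spot_sample_means(series, 24 * 7)
monthly = spot_sample_means(series, 24 * 30)
# Fewer samples -> wider spread of the estimated annual mean, and hence
# more classification uncertainty near a class boundary:
print("weekly spread:", statistics.stdev(weekly))
print("monthly spread:", statistics.stdev(monthly))
```

The wider monthly spread is the mechanism behind the "three or four classes with 95% confidence" result: the confidence interval of the statistic straddles more class boundaries.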
Spatially distributed modeling of soil organic carbon across China with improved accuracy
NASA Astrophysics Data System (ADS)
Li, Qi-quan; Zhang, Hao; Jiang, Xin-ye; Luo, Youlin; Wang, Chang-quan; Yue, Tian-xiang; Li, Bing; Gao, Xue-song
2017-06-01
There is a need for more detailed spatial information on soil organic carbon (SOC) for the accurate estimation of SOC stocks and for earth system models. As it is effective to use environmental factors as auxiliary variables to improve the prediction accuracy of spatially distributed modeling, a combined method (HASM_EF) was developed to predict the spatial pattern of SOC across China using high accuracy surface modeling (HASM), artificial neural networks (ANN), and principal component analysis (PCA) to introduce land uses, soil types, climatic factors, topographic attributes, and vegetation cover as predictors. The performance of HASM_EF was compared with ordinary kriging (OK), with OK and HASM each combined with land uses and soil types (OK_LS and HASM_LS), and with regression kriging combined with land uses and soil types (RK_LS). Results showed that HASM_EF obtained the lowest prediction errors, and the ratio of performance to deviation (RPD) showed relative improvements of 89.91%, 63.77%, 55.86%, and 42.14%, respectively, compared to the other four methods. Furthermore, HASM_EF generated more detailed and more realistic spatial information on SOC. The improved performance of HASM_EF can be attributed to the introduction of more environmental factors, to explicit consideration of the multicollinearity of selected factors and the spatial nonstationarity and nonlinearity of relationships between SOC and selected factors, and to the performance of HASM and ANN. This method may serve as a useful tool in providing more precise spatial information on soil parameters for global modeling across large areas.
Peak Measurement for Vancomycin AUC Estimation in Obese Adults Improves Precision and Lowers Bias.
Pai, Manjunath P; Hong, Joseph; Krop, Lynne
2017-04-01
Vancomycin area under the curve (AUC) estimates may be skewed in obese adults due to weight-dependent pharmacokinetic parameters. We demonstrate that peak and trough measurements reduce bias and improve the precision of vancomycin AUC estimates in obese adults (n = 75) and validate this in an independent cohort (n = 31). The precision and mean percent bias of Bayesian vancomycin AUC estimates are comparable between covariate-dependent (R² = 0.774, 3.55%) and covariate-independent (R² = 0.804, 3.28%) models when peaks and troughs are measured but not when measurements are restricted to troughs only (R² = 0.557, 15.5%). Copyright © 2017 American Society for Microbiology.
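Why a peak measurement helps can be seen from simple one-compartment arithmetic: with a trough alone, the elimination rate, and hence the AUC, is poorly constrained. A hedged sketch (hypothetical times and concentrations, with the trough drawn at the end of the interval; the study itself used Bayesian estimation, which this does not reproduce):

```python
import math

# Two-sample AUC estimate over one dosing interval using linear-up /
# log-down trapezoids. All numbers below are illustrative assumptions.
def auc24_from_peak_trough(t_peak, c_peak, c_trough, tau):
    """Return (elimination rate k in 1/h, AUC normalized to 24 h)."""
    k = math.log(c_peak / c_trough) / (tau - t_peak)   # log-linear decline
    auc_interval = 0.5 * c_peak * t_peak               # linear rise to peak
    auc_interval += (c_peak - c_trough) / k            # log-trapezoid down
    return k, auc_interval * 24.0 / tau

k, auc24 = auc24_from_peak_trough(t_peak=2.0, c_peak=30.0, c_trough=10.0, tau=12.0)
print(f"k = {k:.3f} 1/h, AUC24 = {auc24:.0f} mg*h/L")
```

A trough-only estimate must borrow a population value of k; the peak pins k down for the individual patient, which is the bias/precision point the abstract makes.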
NASA Technical Reports Server (NTRS)
Lewandowski, W.
1994-01-01
The introduction of the GPS common-view method at the beginning of the 1980s led to an immediate and dramatic improvement of international time comparisons. Since then, further progress brought the precision and accuracy of GPS common-view intercontinental time transfer from tens of nanoseconds to a few nanoseconds, even with SA activated. This achievement was made possible by the use of ultra-precise ground antenna coordinates, post-processed precise ephemerides, double-frequency measurements of the ionosphere, and appropriate international coordination and standardization. This paper reviews developments and applications of the GPS common-view method during the last decade and comments on possible future improvements whose objective is to attain sub-nanosecond uncertainty.
[Precision nutrition in the era of precision medicine].
Chen, P Z; Wang, H
2016-12-06
Precision medicine has been increasingly incorporated into clinical practice and is enabling a new era for disease prevention and treatment. As an important constituent of precision medicine, precision nutrition has also been drawing more attention during physical examinations. The main aim of precision nutrition is to provide safe and efficient intervention methods for disease treatment and management, by fully considering the genetics, lifestyle (dietary, exercise and lifestyle choices), metabolic status, gut microbiota and physiological status (nutrient level and disease status) of individuals. Three major components should be considered in precision nutrition: individual criteria for sufficient nutritional status, biomarker monitoring or techniques for nutrient detection, and applicable therapeutic or intervention methods. It has been suggested that, in clinical practice, many inherited and chronic metabolic diseases might be prevented or managed through precision nutritional intervention. For generally healthy populations, because lifestyles, dietary factors, genetic factors and environmental exposures vary among individuals, precision nutrition is warranted to improve their physical activity and reduce disease risks. In summary, research and practice are leading toward precision nutrition becoming an integral constituent of clinical nutrition and disease prevention in the era of precision medicine.
Performance Analysis of Classification Methods for Indoor Localization in Vlc Networks
NASA Astrophysics Data System (ADS)
Sánchez-Rodríguez, D.; Alonso-González, I.; Sánchez-Medina, J.; Ley-Bosch, C.; Díaz-Vilariño, L.
2017-09-01
Indoor localization has gained considerable attention over the past decade because of the emergence of numerous location-aware services. Research works have been proposed on solving this problem by using wireless networks. Nevertheless, there is still much room for improvement in the quality of the proposed classification models. In recent years, the emergence of Visible Light Communication (VLC) has brought a brand new approach to high quality indoor positioning. Among its advantages, this new technology is immune to electromagnetic interference and has a smaller variance of received signal power compared to RF-based technologies. In this paper, a performance analysis of seventeen machine learning classifiers for indoor localization in VLC networks is carried out. The analysis is accomplished in terms of accuracy, average distance error, computational cost, training size, precision and recall measurements. Results show that most of the classifiers achieve an accuracy above 90%. The best tested classifier yielded 99.0% accuracy, with an average distance error of 0.3 centimetres.
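A toy version of such a fingerprint-classification benchmark can be sketched as follows. Everything here is an assumption for illustration: a synthetic 5 × 5 grid of location classes, four "luminaires" with a made-up inverse-square path-loss model, and a single 1-nearest-neighbour classifier standing in for the seventeen the paper tests:

```python
import math, random

# Toy fingerprint-localization benchmark, scored by the same two headline
# metrics the paper reports: accuracy and mean distance error.
random.seed(0)
lamps = [(0, 0), (0, 4), (4, 0), (4, 4)]

def fingerprint(x, y, noise=0.0):
    # received power from each lamp under a toy inverse-square model
    return [1.0 / (1.0 + (x - lx) ** 2 + (y - ly) ** 2) + random.gauss(0, noise)
            for lx, ly in lamps]

cells = [(x, y) for x in range(5) for y in range(5)]
train = [(fingerprint(x, y), (x, y)) for x, y in cells]

def predict(sample):
    # nearest stored fingerprint in signal-space Euclidean distance
    return min(train, key=lambda fp_c: sum((a - b) ** 2
               for a, b in zip(fp_c[0], sample)))[1]

probes = [(fingerprint(x, y, noise=0.01), (x, y)) for x, y in cells for _ in range(4)]
hits = sum(predict(fp) == cell for fp, cell in probes)
err = sum(math.dist(predict(fp), cell) for fp, cell in probes) / len(probes)
print(f"accuracy = {hits / len(probes):.0%}, mean distance error = {err:.2f} cells")
```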
RNA-seq Data: Challenges in and Recommendations for Experimental Design and Analysis.
Williams, Alexander G; Thomas, Sean; Wyman, Stacia K; Holloway, Alisha K
2014-10-01
RNA-seq is widely used to determine differential expression of genes or transcripts as well as identify novel transcripts, identify allele-specific expression, and precisely measure translation of transcripts. Thoughtful experimental design and choice of analysis tools are critical to ensure high-quality data and interpretable results. Important considerations for experimental design include number of replicates, whether to collect paired-end or single-end reads, sequence length, and sequencing depth. Common analysis steps in all RNA-seq experiments include quality control, read alignment, assigning reads to genes or transcripts, and estimating gene or transcript abundance. Our aims are two-fold: to make recommendations for common components of experimental design and assess tool capabilities for each of these steps. We also test tools designed to detect differential expression, since this is the most widespread application of RNA-seq. We hope that these analyses will help guide those who are new to RNA-seq and will generate discussion about remaining needs for tool improvement and development. Copyright © 2014 John Wiley & Sons, Inc.
Asynchronous adaptive time step in quantitative cellular automata modeling
Zhu, Hao; Pang, Peter YH; Sun, Yan; Dhar, Pawan
2004-01-01
Background: The behaviors of cells in metazoans are context dependent, thus large-scale multi-cellular modeling is often necessary, for which cellular automata are natural candidates. Two related issues are involved in cellular automata based multi-cellular modeling: how to introduce differential equation based quantitative computing to precisely describe cellular activity, and, upon that, how to address the heavy time consumption of simulation. Results: Building on a modified, language-based cellular automata system that we extended to allow ordinary differential equations in models, we introduce a method implementing asynchronous adaptive time steps in simulation that can considerably improve efficiency without a significant sacrifice of accuracy. An average speedup of 4-5 is achieved in the given example. Conclusions: Strategies for reducing time consumption in simulation are indispensable for large-scale, quantitative multi-cellular models, because even a small 100 × 100 × 100 tissue slab contains one million cells. A distributed, adaptive time step is a practical solution in a cellular automata environment. PMID:15222901
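The asynchronous adaptive time step idea can be sketched as an event queue in which each cell carries its own step size (a toy decay model, not the paper's modeling language; the step-doubling error control is a generic choice, not necessarily theirs):

```python
import heapq

# Each cell advances with its own step size, chosen by comparing one Euler
# step against two half steps; fast-changing cells take many small steps,
# slow cells few large ones, and a priority queue keeps cells ordered by
# their next update time. Toy model: dy/dt = -r * y per cell.
TOL, T_END = 1e-4, 5.0
cells = {i: {"y": 1.0, "r": 0.1 * (i + 1), "t": 0.0, "dt": 0.5} for i in range(8)}
steps = {i: 0 for i in cells}
queue = [(0.0, i) for i in cells]
heapq.heapify(queue)

while queue:
    t, i = heapq.heappop(queue)
    c = cells[i]
    dt = min(c["dt"], T_END - t)
    full = c["y"] * (1 - c["r"] * dt)                # one Euler step
    two_half = c["y"] * (1 - c["r"] * dt / 2) ** 2   # two Euler half steps
    err = abs(full - two_half)                       # local error estimate
    if err > TOL:                                    # reject: halve, retry at t
        c["dt"] = dt / 2
        heapq.heappush(queue, (t, i))
    else:                                            # accept the better value
        c["y"], c["t"] = two_half, t + dt
        steps[i] += 1
        if err < TOL / 4:                            # very accurate: widen step
            c["dt"] = dt * 2
        if c["t"] < T_END:
            heapq.heappush(queue, (c["t"], i))

print("steps per cell:", [steps[i] for i in range(8)])
```

The stiffer cells (larger r) end up taking more, smaller steps, which is the source of the speedup over a single global time step sized for the worst cell.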
Fornaro, Michele; Kardash, Lubna; Novello, Stefano; Fusco, Andrea; Anastasia, Annalisa; De Berardis, Domenico; Perna, Giampaolo; Carta, Mauro Giovanni
2018-03-01
Bipolar disorder (BD) is a considerable burden to the affected individual. The need for novel drug targets and improved drug design (DD) in BD is therefore clear. Areas covered: The following article provides a brief, narrative, clinician-oriented overview of the most promising novel pharmacological targets for BD along with a concise overview regarding the general DD process and the unmet needs relevant to BD. Expert opinion: A number of novel potential drug targets have been investigated. With the notable exception of the kynurenine pathway, available evidence is too scarce to highlight a definitive roadmap for forthcoming DD in BD. BD itself may present with different facets, as it is a polymorphic clinical spectrum. Therefore, promoting clinical-case stratification should be based on precision medicine, rather than on novel biological targets. Furthermore, the full release of raw study data to the scientific community and the development of uniform clinical trial standards (including more realistic outcomes) should be promoted to facilitate the DD process in BD.
Boosting quantum annealer performance via sample persistence
NASA Astrophysics Data System (ADS)
Karimi, Hamed; Rosenberg, Gili
2017-07-01
We propose a novel method for reducing the number of variables in quadratic unconstrained binary optimization problems, using a quantum annealer (or any sampler) to fix the values of a large portion of the variables to values that have a high probability of being optimal. The resulting problems are usually much easier for the quantum annealer to solve, because they are smaller and consist of disconnected components. This approach significantly increases the success rate and the number of observations of the best known energy value in samples obtained from the quantum annealer, compared with calling the quantum annealer directly, even when using fewer annealing cycles. Use of the method results in a considerable improvement in success metrics even for problems with high-precision couplers and biases, which are more challenging for the quantum annealer to solve. The results are further enhanced by applying the method iteratively and combining it with classical pre-processing. We present results for both Chimera graph-structured problems and embedded problems from a real-world application.
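The reduction step can be sketched independently of any annealer: collect samples from whatever solver is available, fix the variables whose values persist across the lowest-energy samples, and fold the fixed values into a smaller QUBO. The interface below is illustrative (random samples stand in for annealer output; function names are made up):

```python
import random

def qubo_energy(Q, x):
    # Q maps (i, j) index pairs to coefficients; x maps index -> 0/1
    return sum(v * x[i] * x[j] for (i, j), v in Q.items())

def fix_persistent(Q, samples, elite_frac=0.2):
    """Fix variables whose value is identical across the elite samples,
    and return the QUBO reduced over the remaining variables."""
    elite = sorted(samples, key=lambda s: qubo_energy(Q, s))
    elite = elite[: max(1, int(elite_frac * len(elite)))]
    fixed = {i: elite[0][i] for i in elite[0]
             if all(s[i] == elite[0][i] for s in elite)}
    reduced = {}
    for (i, j), v in Q.items():
        if i in fixed and j in fixed:       # constant term: dropped here
            continue
        if i in fixed or j in fixed:        # coupler folds into a linear term
            free, val = (j, fixed[i]) if i in fixed else (i, fixed[j])
            if val:
                reduced[(free, free)] = reduced.get((free, free), 0.0) + v
        else:
            reduced[(i, j)] = reduced.get((i, j), 0.0) + v
    return fixed, reduced

random.seed(3)
n = 10
Q = {(i, i): random.uniform(-1, 1) for i in range(n)}
Q.update({(i, j): random.uniform(-1, 1) for i in range(n) for j in range(i + 1, n)})
samples = [{i: random.randint(0, 1) for i in range(n)} for _ in range(50)]
fixed, reduced = fix_persistent(Q, samples)
print(len(fixed), "variables fixed,", len(reduced), "terms remain")
```

The reduced problem differs from the original only by a constant on the fixed assignment, so any solution of the reduced QUBO extends to one of the original.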
Genetics and Molecular Pathogenesis of Gastric Adenocarcinoma.
Tan, Patrick; Yeoh, Khay-Guan
2015-10-01
Gastric cancer (GC) is globally the fifth most common cancer and the third leading cause of cancer death. A complex disease arising from the interaction of environmental and host-associated factors, key contributors to GC's high mortality include its silent nature, late clinical presentation, and underlying biological and genetic heterogeneity. Achieving a detailed molecular understanding of the various genomic aberrations associated with GC will be critical to improving patient outcomes. Recent years have seen considerable progress in deciphering the genomic landscape of GC, identifying new molecular components such as ARID1A and RHOA, cellular pathways, and tissue populations associated with gastric malignancy and progression. The Cancer Genome Atlas (TCGA) project is a landmark in the molecular characterization of GC. Key challenges for the future will involve translating these molecular findings into clinical utility, by enabling novel strategies for early GC detection and precision therapies for individual GC patients. Copyright © 2015 AGA Institute. Published by Elsevier Inc. All rights reserved.
Analysis of Waveform Retracking Methods in Antarctic Ice Sheet Based on CRYOSAT-2 Data
NASA Astrophysics Data System (ADS)
Xiao, F.; Li, F.; Zhang, S.; Hao, W.; Yuan, L.; Zhu, T.; Zhang, Y.; Zhu, C.
2017-09-01
Satellite altimetry plays an important role in many geoscientific and environmental studies of the Antarctic ice sheet. Ranging accuracy is degraded near coasts or over non-ocean surfaces due to waveform contamination. A post-processing technique, known as waveform retracking, can be used to retrack the corrupted waveform and in turn improve the ranging accuracy. In 2010, the CryoSat-2 satellite was launched with the Synthetic aperture Interferometric Radar ALtimeter (SIRAL) onboard. This paper discusses satellite altimetry waveform retracking methods. Six retracking methods, including the OCOG method, the threshold method with 10%, 25% and 50% threshold levels, and the linear and exponential 5-β parametric methods, are used to retrack CryoSat-2 waveforms over the transect from Zhongshan Station to Dome A. The results show that the threshold retracker performs best when both the waveform retracking success rate and the RMS of the retracking distance corrections are considered. The linear 5-β parametric retracker gives the best waveform retracking precision, but cannot make full use of the waveform data.
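A threshold retracker of the kind compared in the paper can be sketched in a few lines. The synthetic ramp waveform and the use of the OCOG amplitude as the reference level are illustrative assumptions, not SIRAL specifics; the 10/25/50% variants in the text differ only in `level`:

```python
def ocog_amplitude(w):
    # OCOG amplitude: square root of the ratio of 4th- to 2nd-power sums
    return (sum(p ** 4 for p in w) / sum(p ** 2 for p in w)) ** 0.5

def threshold_retrack(w, level=0.5):
    """Return the fractional gate where the leading edge first crosses
    level * OCOG amplitude, with linear interpolation between gates."""
    thr = level * ocog_amplitude(w)
    for k in range(1, len(w)):
        if w[k] >= thr > w[k - 1]:
            return (k - 1) + (thr - w[k - 1]) / (w[k] - w[k - 1])
    return None  # no crossing found (e.g. fully corrupted waveform)

# synthetic waveform: noise floor, linear leading edge, then a plateau
wave = [0.02] * 10 + [0.1 * i for i in range(1, 11)] + [1.0] * 20
gate = threshold_retrack(wave, level=0.5)
print(f"retracking gate = {gate:.2f}")
```

The retracked gate feeds a range correction proportional to the offset from the nominal tracking gate, which is where the "retracking distance corrections" in the abstract come from.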
Accurately measuring volcanic plume velocity with multiple UV spectrometers
Williams-Jones, Glyn; Horton, Keith A.; Elias, Tamar; Garbeil, Harold; Mouginis-Mark, Peter J; Sutton, A. Jeff; Harris, Andrew J. L.
2006-01-01
A fundamental problem with all ground-based remotely sensed measurements of volcanic gas flux is the difficulty in accurately measuring the velocity of the gas plume. Since a representative wind speed and direction are used as proxies for the actual plume velocity, there can be considerable uncertainty in reported gas flux values. Here we present a method that uses at least two time-synchronized, simultaneously recording UV spectrometers (FLYSPECs) placed a known distance apart. By analyzing the time-varying structure of SO2 concentration signals at each instrument, the plume velocity can be accurately determined. Experiments were conducted on Kīlauea (USA) and Masaya (Nicaragua) volcanoes in March and August 2003 at plume velocities between 1 and 10 m s−1. Concurrent ground-based anemometer measurements differed from FLYSPEC-measured plume speeds by up to 320%. This multi-spectrometer method allows for the accurate remote measurement of plume velocity and can therefore greatly improve the precision of volcanic or industrial gas flux measurements.
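The velocity estimate rests on a time-lag (cross-correlation) calculation between the two SO2 records: the downwind instrument sees a delayed copy of the upwind signal, so the lag maximizing the cross-correlation gives the transit time, and velocity = separation / lag. A minimal sketch with synthetic signals (the FLYSPEC processing details are not reproduced here):

```python
import math, random

# Synthetic SO2 records: the downwind trace is the upwind trace delayed by
# a known number of samples, plus noise. All numbers are illustrative.
random.seed(7)
dt, sep, true_v = 1.0, 30.0, 5.0             # sample period (s), spacing (m), speed (m/s)
lag_true = int(sep / true_v / dt)            # 6 samples of delay
upwind = [math.sin(0.1 * t) + 0.5 * random.gauss(0, 1) for t in range(600)]
downwind = [0.0] * lag_true + upwind[:-lag_true]

def best_lag(a, b, max_lag):
    # lag that maximizes the (unnormalized) cross-correlation of a against b
    def xcorr(lag):
        return sum(a[t] * b[t + lag] for t in range(len(a) - max_lag))
    return max(range(max_lag + 1), key=xcorr)

lag = best_lag(upwind, downwind, max_lag=20)
print(f"plume speed = {sep / (lag * dt):.1f} m/s")
```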
On a methodology for robust segmentation of nonideal iris images.
Schmid, Natalia A; Zuo, Jinyu
2010-06-01
Iris biometric is one of the most reliable biometrics with respect to performance. However, this reliability is a function of the ideality of the data. One of the most important steps in processing nonideal data is reliable and precise segmentation of the iris pattern from the remaining background. In this paper, a segmentation methodology that aims at compensating for various nonidealities contained in iris images during segmentation is proposed. The virtue of this methodology lies in its capability to reliably segment nonideal imagery that is simultaneously affected by such factors as specular reflection, blur, lighting variation, occlusion, and off-angle viewing. We demonstrate the robustness of our segmentation methodology by evaluating ideal and nonideal data sets, namely, the Chinese Academy of Sciences iris data version 3 interval subdirectory, the iris challenge evaluation data, the West Virginia University (WVU) data, and the WVU off-angle data. Furthermore, we compare our performance to that of our implementations of Camus and Wildes's algorithm and Masek's algorithm, and demonstrate considerable improvement in segmentation performance over the aforementioned algorithms.
A portable analyser for the measurement of ammonium in marine waters.
Amornthammarong, Natchanon; Zhang, Jia-Zhong; Ortner, Peter B; Stamates, Jack; Shoemaker, Michael; Kindel, Michael W
2013-03-01
A portable ammonium analyser was developed and used to measure in situ ammonium in the marine environment. The analyser incorporates an improved LED photodiode-based fluorescence detector (LPFD). This system is more sensitive and considerably smaller than previous systems and incorporates a pre-filtering subsystem enabling measurements in turbid, sediment-laden waters. Over the typical range for ammonium in marine waters (0–10 μM), the response is linear (r(2) = 0.9930) with a limit of detection (S/N ratio > 3) of 10 nM. The working range for marine waters is 0.05–10 μM. Repeatability is 0.3% (n = 10) at an ammonium level of 2 μM. Results from automated operation in 15 min cycles over 16 days showed good overall precision (RSD = 3%, n = 660). The system was field tested at three shallow South Florida sites. Diurnal cycles and possibly a tidal influence were expressed in the concentration variability observed.
Zhang, Ding Sheng-Zi; Jiang, Yang; Wei, Dan; Wei, Xunbin; Xu, Hong; Gu, Hongchen
2018-06-21
With the increasing demands for high-throughput multiplexed bioassays, quantum dot (QD)-encoded microbeads as biocarriers for various bioreactions have attracted considerable attention. However, three key requirements for these biocarriers are still longstanding issues: a stable fluorescence intensity, a large encoding capacity and abundant surface functional groups. Here, a novel one-pot strategy is developed, generating functionalized QD-encoded microspheres with a strong fluorescence intensity and optical stability. With poly(styrene-co-maleic anhydride) (PSMA) molecules as mediators, the encapsulation of QDs and carboxylation of the bead surface are integrated together, greatly improving the preparation efficiency and guaranteeing their potential application in biodetection. Moreover, the mechanism for preparing QD-doped beads is further proposed, which helps to precisely manipulate the preparation process and accurately encode the beads. Through this approach, a single- and dual-color barcode library of QD-encoded microspheres has been successfully established, which demonstrates their great potential in suspension arrays.
Transient recycling of resected bone to facilitate mandibular reconstruction--a technical note.
Lee, Jing-Wei; Tsai, Shin-Sheng; Kuo, Yao-Lung
2006-10-01
Mandibular reconstruction requires considerable sculptural skills. The intriguingly complex configuration of the structure is difficult to reproduce. It is thus imperative for surgeons to seek a technique that improves the precision of the reconstruction. A 55-year-old male presented with a full thickness cancer (T4+) of his left cheek. Radical ablative surgery resulted in an extensive loss of bone and soft tissue mandating major reconstruction. The resected bony specimen was thoroughly denuded, autoclaved, and then placed back into its original site so that the mandible resumed its pre-surgical configuration. A reconstruction plate was applied to maintain structural stability, then the "recycled bone" was used as a template and replaced with a free fibular graft. The patient fared well and a follow-up panoramic radiograph demonstrated good alignment and symmetry of the reconstructed mandible. This method is a viable option for segmental mandibulectomy defect repair in selected cases. Using this technique, it is possible to restore the original bony contour expediently and accurately.
High-precision x-ray spectroscopy of highly charged ions with microcalorimeters
NASA Astrophysics Data System (ADS)
Kraft-Bermuth, S.; Andrianov, V.; Bleile, A.; Echler, A.; Egelhof, P.; Grabitz, P.; Ilieva, S.; Kilbourne, C.; Kiselev, O.; McCammon, D.; Meier, J.
2013-09-01
The precise determination of the energy of the Lyman α1 and α2 lines in hydrogen-like heavy ions provides a sensitive test of quantum electrodynamics in very strong Coulomb fields. To improve the experimental precision, the new detector concept of microcalorimeters is now exploited for such measurements. These detectors consist of compensated-doped silicon thermistors and Pb or Sn absorbers, chosen for high quantum efficiency in the 40-70 keV energy range where the Doppler-shifted Lyman lines are located. For the first time, a microcalorimeter was applied in an experiment to precisely determine the transition energy of the Lyman lines of lead ions at the experimental storage ring at GSI. The energy of the Ly α1 line, E(Ly-α1, 207Pb81+) = (77937 ± 12(stat) ± 25(syst)) eV, agrees within error bars with theoretical predictions. To improve the experimental precision further, a new detector array with more pixels and better energy resolution was commissioned and successfully applied in an experiment to determine the Lyman-α lines of gold ions (197Au78+).
Dotette: Programmable, high-precision, plug-and-play droplet pipetting.
Fan, Jinzhen; Men, Yongfan; Hao Tseng, Kuo; Ding, Yi; Ding, Yunfeng; Villarreal, Fernando; Tan, Cheemeng; Li, Baoqing; Pan, Tingrui
2018-05-01
Manual micropipettes are the most heavily used liquid-handling devices in biological and chemical laboratories; however, they suffer from low precision for volumes under 1 μl and from inevitable human error. For a manual device, the human errors introduced pose potential risks of failed experiments, inaccurate results, and financial costs. Meanwhile, low precision under 1 μl can cause severe quantification errors and high heterogeneity of outcomes, becoming a bottleneck of reaction miniaturization for quantitative research in biochemical labs. Here, we report Dotette, a programmable, plug-and-play microfluidic pipetting device based on nanoliter liquid printing. With automated control, protocols designed on computers can be downloaded directly into Dotette, enabling programmable operation. Utilizing continuous nanoliter droplet dispensing, the precision of volume control has been improved from the traditional 20%-50% to less than 5% over the range of 100 nl to 1000 nl. Such a highly automated, plug-and-play add-on to existing pipetting devices not only improves precise quantification in low-volume liquid handling and reduces chemical consumption, but also facilitates and automates a variety of biochemical and biological operations.
Holman, B W B; Alvarenga, T I R C; van de Ven, R J; Hopkins, D L
2015-07-01
The Warner-Bratzler shear force (WBSF) of 335 lamb m. longissimus lumborum (LL) caudal and cranial ends was measured to examine and simulate the effect of replicate number (r: 1-8) on the precision of mean WBSF estimates, and to compare LL caudal and cranial end WBSF means. All LL were sourced from two experimental flocks as part of the Information Nucleus slaughter programme (CRC for Sheep Industry Innovation) and analysed using a Lloyd texture analyser with a Warner-Bratzler blade attachment. WBSF data were natural-logarithm (ln) transformed before statistical analysis. Mean ln(WBSF) precision improved as r increased; however, the practical implications support r = 6, as precision improves only marginally with additional replicates. Increasing LL sample replication results in better ln(WBSF) precision than increasing r, provided that sample replicates are removed from the same LL end. Cranial end mean WBSF was 11.2 ± 1.3% higher than the caudal end. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
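The diminishing return from extra replicates follows directly from the 1/√r behaviour of the standard error of the mean. The sketch below illustrates this with an assumed (hypothetical) within-end SD of ln(WBSF), not the study's actual data:

```python
import numpy as np

sigma = 0.25   # assumed within-end SD of ln(WBSF) replicates (hypothetical)

# Standard error of the mean falls as 1/sqrt(r); gains are marginal past r ~ 6
sems = {r: sigma / np.sqrt(r) for r in range(1, 9)}
for r, sem in sems.items():
    print(f"r = {r}: SEM of mean ln(WBSF) = {sem:.4f}")

# Fractional gain from one extra replicate shrinks quickly
gain_1_to_2 = 1 - sems[2] / sems[1]   # ~29%
gain_6_to_7 = 1 - sems[7] / sems[6]   # ~7%
print(f"gain 1->2: {gain_1_to_2:.1%}, gain 6->7: {gain_6_to_7:.1%}")
```

Going from 1 to 2 replicates cuts the SEM by about 29%, while going from 6 to 7 gains only about 7%, consistent with the practical choice of r = 6.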
Dong, Ling-Bo; Liu, Zhao-Gang; Li, Feng-Ri; Jiang, Li-Chun
2013-09-01
Using branch analysis data for 955 standard branches from 60 sampled trees in 12 sampling plots of a Pinus koraiensis plantation at Mengjiagang Forest Farm in Heilongjiang Province, Northeast China, and based on linear mixed-effect model theory and methods, models were developed for predicting branch variables, including primary branch diameter, length, and angle. Accounting for the tree effect, the MIXED module of SAS software was used to fit the prediction models. The results indicated that the fitting precision of the models could be improved by choosing appropriate random-effect parameters and an appropriate variance-covariance structure. Correlation structures, including compound symmetry (CS), first-order autoregressive [AR(1)], and first-order autoregressive moving average [ARMA(1,1)], were then added to the optimal branch-size mixed-effect model. AR(1) significantly improved the fitting precision of the branch diameter and length mixed-effect models, but none of the three structures improved the precision of the branch angle mixed-effect model. To describe heteroscedasticity in the mixed-effect models, the CF1 and CF2 variance functions were added; CF1 significantly improved the fit of the branch angle mixed model, whereas CF2 significantly improved the fit of the branch diameter and length mixed models. Model validation confirmed that the mixed-effect models improved prediction precision compared with traditional regression models for branch size prediction in Pinus koraiensis plantations.
Tactile display landing safety and precision improvements for the Space Shuttle
NASA Astrophysics Data System (ADS)
Olson, John M.
A tactile display belt using 24 electro-mechanical tactile transducers (tactors) was used to determine if a modified tactile display system, known as the Tactile Situation Awareness System (TSAS) improved the safety and precision of a complex spacecraft (i.e. the Space Shuttle Orbiter) in guided precision approaches and landings. The goal was to determine if tactile cues enhance safety and mission performance through reduced workload, increased situational awareness (SA), and an improved operational capability by increasing secondary cognitive workload capacity and human-machine interface efficiency and effectiveness. Using both qualitative and quantitative measures such as NASA's Justiz Numerical Measure and Synwork1 scores, an Overall Workload (OW) measure, the Cooper-Harper rating scale, and the China Lake Situational Awareness scale, plus Pre- and Post-Flight Surveys, the data show that tactile displays decrease OW, improve SA, counteract fatigue, and provide superior warning and monitoring capacity for dynamic, off-nominal, high concurrent workload scenarios involving complex, cognitive, and multi-sensory critical scenarios. Use of TSAS for maintaining guided precision approaches and landings was generally intuitive, reduced training times, and improved task learning effects. Ultimately, the use of a homogeneous, experienced, and statistically robust population of test pilots demonstrated that the use of tactile displays for Space Shuttle approaches and landings with degraded vehicle systems, weather, and environmental conditions produced substantial improvements in safety, consistency, reliability, and ease of operations under demanding conditions. Recommendations for further analysis and study are provided in order to leverage the results from this research and further explore the potential to reduce the risk of spaceflight and aerospace operations in general.
High Precision Piezoelectric Linear Motors for Operations at Cryogenic Temperatures and Vacuum
NASA Technical Reports Server (NTRS)
Wong, D.; Carman, G.; Stam, M.; Bar-Cohen, Y.; Sen, A.; Henry, P.; Bearman, G.; Moacanin, J.
1995-01-01
The use of an electromechanical device for optically positioning a mirror system during the pre-project phase of the Pluto Fast Flyby mission was evaluated at JPL. The device under consideration was a piezoelectric-driven linear motor in which a time-varying electric field induces displacements ranging from sub-micron to millimeters, with positioning accuracy within nanometers.
ERIC Educational Resources Information Center
Terzi, Lorella
2007-01-01
The ideal of educational equality is fundamentally grounded in the egalitarian principle that social and institutional arrangements should be designed to give equal consideration to "all". However, beyond this broad stipulation, the precise content of the ideal of educational equality is more difficult to determine. In this article, I aim to…
The Education of Royalty in the Eighteenth Century: George IV and William IV
ERIC Educational Resources Information Center
Clarke, M. L.
1978-01-01
George IV, the Prince of Wales, and William IV, his younger brother, both the sons of George III, were given all the educational advantage one could be granted in the eighteenth century. The precise curriculum and practices of their teachers are discussed with an evaluation of both students as a moral for future consideration. (RK)
ERIC Educational Resources Information Center
National Heart, Lung, and Blood Inst. (DHHS/NIH), Bethesda, MD.
Precise and accurate cholesterol measurements are required to identify and treat individuals with high blood cholesterol levels. However, the current state of reliability of blood cholesterol measurements suggests that considerable inaccuracy in cholesterol testing exists. This report describes the Laboratory Standardization Panel findings on the…
Sensitivity of leaf size and shape to climate: Global patterns and paleoclimatic applications
Peppe, D.J.; Royer, D.L.; Cariglino, B.; Oliver, S.Y.; Newman, S.; Leight, E.; Enikolopov, G.; Fernandez-Burgos, M.; Herrera, F.; Adams, J.M.; Correa, E.; Currano, E.D.; Erickson, J.M.; Hinojosa, L.F.; Hoganson, J.W.; Iglesias, A.; Jaramillo, C.A.; Johnson, K.R.; Jordan, G.J.; Kraft, N.J.B.; Lovelock, E.C.; Lusk, C.H.; Niinemets, U.; Penuelas, J.; Rapson, G.; Wing, S.L.; Wright, I.J.
2011-01-01
Paleobotanists have long used models based on leaf size and shape to reconstruct paleoclimate. However, most models incorporate a single variable or use traits that are not physiologically or functionally linked to climate, limiting their predictive power. Further, they often underestimate paleotemperature relative to other proxies. Here we quantify leaf-climate correlations from 92 globally distributed, climatically diverse sites, and explore potential confounding factors. Multiple linear regression models for mean annual temperature (MAT) and mean annual precipitation (MAP) are developed and applied to nine well-studied fossil floras. We find that leaves in cold climates typically have larger, more numerous teeth, and are more highly dissected. Leaf habit (deciduous vs evergreen), local water availability, and phylogenetic history all affect these relationships. Leaves in wet climates are larger and have fewer, smaller teeth. Our multivariate MAT and MAP models offer moderate improvements in precision over univariate approaches (±4.0 vs ±4.8 °C for MAT) and strong improvements in accuracy. For example, our provisional MAT estimates for most North American fossil floras are considerably warmer and in better agreement with independent paleoclimate evidence. Our study demonstrates that the inclusion of additional leaf traits that are functionally linked to climate improves paleoclimate reconstructions. This work also illustrates the need for better understanding of the impact of phylogeny and leaf habit on leaf-climate relationships. © 2011 The Authors. New Phytologist © 2011 New Phytologist Trust.
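A multivariate leaf-climate model of this kind is, at its core, a multiple linear regression. The sketch below fits MAT to two leaf traits and applies the calibration to a fossil flora; all trait values and temperatures are invented for illustration, not the calibration data of this study.

```python
import numpy as np

# Hypothetical site means: [tooth count per leaf area, log leaf area]
X = np.array([[8.0, 6.0], [5.5, 6.5], [3.0, 7.5], [1.2, 8.0], [0.5, 7.0]])
mat = np.array([4.0, 9.0, 15.0, 21.0, 25.0])   # mean annual temperature, deg C

# Multiple linear regression with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, mat, rcond=None)

# Apply the calibration to a (hypothetical) fossil flora
fossil = np.array([1.0, 4.2, 7.2])             # [intercept, teeth, log area]
pred = fossil @ coef
print(f"predicted MAT = {pred:.1f} C")
```

In this toy calibration the tooth-count coefficient comes out negative, matching the qualitative pattern reported above (more teeth in colder climates).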
Using commodity accelerometers and gyroscopes to improve speed and accuracy of JanusVF
NASA Astrophysics Data System (ADS)
Hutson, Malcolm; Reiners, Dirk
2010-01-01
Several critical limitations exist in the currently available commercial tracking technologies for fully-enclosed virtual reality (VR) systems. While several 6DOF solutions can be adapted to work in fully-enclosed spaces, they still include elements of hardware that can interfere with the user's visual experience. JanusVF introduced a tracking solution for fully-enclosed VR displays that achieves comparable performance to available commercial solutions but without artifacts that can obscure the user's view. JanusVF employs a small, high-resolution camera that is worn on the user's head, but faces backwards. The VR rendering software draws specific fiducial markers with known size and absolute position inside the VR scene behind the user but in view of the camera. These fiducials are tracked by ARToolkitPlus and integrated by a single-constraint-at-a-time (SCAAT) filter to update the head pose. In this paper we investigate the addition of low-cost accelerometers and gyroscopes such as those in Nintendo Wii remotes, the Wii Motion Plus, and the Sony Sixaxis controller to improve the precision and accuracy of JanusVF. Several enthusiast projects have implemented these units as basic trackers or for gesture recognition, but none so far have created true 6DOF trackers using only the accelerometers and gyroscopes. Our original experiments were repeated after adding the low-cost inertial sensors, showing considerable improvements and noise reduction.
Management of recurrent or metastatic thyroid cancer.
Tahara, Makoto
2018-01-01
Recently, vascular endothelial growth factor receptor (VEGFR)-targeted tyrosine kinase inhibitors (TKIs) have become available for the treatment of recurrent or metastatic thyroid cancer. However, a number of clinical challenges that impact the use of VEGFR-targeted TKI in daily clinical practice have arisen. Toxicity is considerable, to the extent that most physicians hesitate to start VEGFR-targeted TKI and prefer to continue a watch-and-wait approach until the patient's disease markedly worsens. This delayed use of VEGFR-targeted TKI leads to a higher incidence of serious adverse events than was reported in clinical trials. Moreover, the watch-and-wait approach has several demerits, including a worsening of quality of life, worsening of outcomes in patients of older age or with follicular thyroid cancer and increased risk of brain metastasis or bleeding. Thus, optimal timing for the start of VEGFR-targeted TKI requires careful consideration. Moreover, management of VEGFR-targeted TKI toxicities requires appropriate supportive care, well-organised infrastructure in the outpatient clinic and patient education. Future treatment will progress to precision medicine based on molecular testing. Promotion of precision medicine requires the establishment of a system of easy access to molecular testing and the promotion of translational research for the development of new drugs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Rongyu; Zhao, Changyin; Zhang, Xiaoxiang, E-mail: cyzhao@pmo.ac.cn
The data reduction method for optical space debris observations has many similarities with the one adopted for surveying near-Earth objects; however, owing to several specific issues, image degradation is particularly critical, which makes it difficult to obtain precise astrometry. An automatic image reconstruction method based on mathematical morphology operators was developed to improve astrometric precision for space debris. Variable structuring elements along multiple directions are adopted for image transformation, and all the resultant images are then stacked to obtain a final result. To investigate its efficiency, trial observations were made with Global Positioning System satellites, and the astrometric accuracy improvement was assessed by comparison with the reference positions. The results of our experiments indicate that the influence of degradation in astrometric CCD images is reduced, and the position accuracy of both objects and stars is distinctly improved. Our technique will contribute significantly to optical data reduction and high-precision astrometry for space debris.
Study on application of adaptive fuzzy control and neural network in the automatic leveling system
NASA Astrophysics Data System (ADS)
Xu, Xiping; Zhao, Zizhao; Lan, Weiyong; Sha, Lei; Qian, Cheng
2015-04-01
This paper discusses the application of adaptive fuzzy control and the neural-network BP algorithm in a large-platform automatic leveling control system. The purpose is to develop a measurement system with fast platform leveling, so that precision measurement work can be carried out quickly and the efficiency of precision measurement is improved. The paper focuses on the analysis of an automatic leveling system based on a fuzzy controller, combining the fuzzy controller with a BP neural network and using the BP algorithm to refine the expert rules, thereby constructing an adaptive fuzzy control system. The learning rate of the BP algorithm is also adjusted at run time to accelerate convergence. The simulation results show that the proposed control method can effectively improve the leveling precision of the automatic leveling system and shorten the leveling time.
Precision aerial application for site-specific rice crop management
USDA-ARS?s Scientific Manuscript database
Precision agriculture includes different technologies that allow agricultural professional to use information management tools to optimize agriculture production. The new technologies allow aerial application applicators to improve application accuracy and efficiency, which saves time and money for...
Five critical elements to ensure the precision medicine.
Chen, Chengshui; He, Mingyan; Zhu, Yichun; Shi, Lin; Wang, Xiangdong
2015-06-01
Precision medicine has emerged as a new area and therapeutic strategy; it has been practiced in individuals, has brought unexpected successes, and has gained high attention from both professional and social perspectives as a new path to improve the treatment and prognosis of patients. A number of new components will appear or be discovered, among which clinical bioinformatics integrates clinical phenotypes and informatics with bioinformatics, computational science, mathematics, and systems biology. In addition to those tools, precision medicine calls for more accurate and repeatable methodologies for the identification and validation of gene discovery. Precision medicine will bring new therapeutic strategies, drug discovery and development, and gene-oriented treatment. There is an urgent need to identify and validate disease-specific, mechanism-based, or epigenetics-dependent biomarkers to monitor precision medicine, and to develop "precision" regulations to guard its application.
Murat, Sema; Gurbuz, Ayhan; Isayev, Abulfaz; Dokmez, Bahadir; Cetin, Unsun
2012-01-01
The majority of maxillary defects can be rehabilitated with conventional simple obturator prosthesis. However, inadequate retention, stability and support may be associated with the use of an obturator. Precision attachments have been used to retain obturators for some time. The use of precision attachments in a dentate maxillectomy patient can yield significant functional improvement while maintaining the obturator’s aesthetic advantages. This clinical report describes the prosthetic rehabilitation of two maxillary defects with an obturator retained using extracoronal resilient precision attachments. PMID:22509126
NASA Astrophysics Data System (ADS)
Martín Furones, Angel; Anquela Julián, Ana Belén; Dimas-Pages, Alejandro; Cos-Gayón, Fernando
2017-08-01
Precise point positioning (PPP) is a well-established Global Navigation Satellite System (GNSS) technique that requires information only from the receiver (or rover) to obtain high-precision position coordinates. This is a very interesting and promising technique because it eliminates the need for a reference station near the rover receiver, or a network of reference stations, thus reducing the cost of a GNSS survey. From a computational perspective, there are two ways to solve the system of observation equations produced by static PPP: in a single step (so-called batch adjustment) or with a sequential adjustment/filter. The results of each should be the same if both are well implemented. However, if a sequential solution is needed (that is, not only the final coordinates, but also those at previous GNSS epochs), as in convergence studies, computing a batch solution becomes very time consuming, owing to the matrix inversions that accumulate with each consecutive epoch. This is not a problem for the filter solution, which uses information computed in the previous epoch to solve the current epoch. Filter implementations, however, need extra consideration of user dynamics and parameter state variation between observation epochs, with appropriate stochastic updates of the parameter variances from epoch to epoch. These filtering considerations are not needed in batch adjustment, which makes it attractive. The main objective of this research is to significantly reduce the computation time required to obtain sequential results using batch adjustment. The new method we implemented in the adjustment process led to a mean reduction in computation time of 45%.
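One way to obtain epoch-by-epoch batch solutions cheaply (a sketch of the general idea, not necessarily the authors' exact method) is to accumulate the normal equations and re-solve only the small parameter-sized system at each epoch:

```python
import numpy as np

rng = np.random.default_rng(1)
n_params, n_epochs = 4, 200
x_true = rng.normal(size=n_params)

N = np.zeros((n_params, n_params))   # accumulated normal matrix A^T A
b = np.zeros(n_params)               # accumulated right-hand side A^T y
As, ys, solutions = [], [], []

for epoch in range(n_epochs):
    A = rng.normal(size=(6, n_params))           # this epoch's design matrix
    y = A @ x_true + 0.01 * rng.normal(size=6)   # this epoch's observations
    # Sequential update: only a small n_params x n_params system is
    # re-solved per epoch, instead of re-forming the whole batch problem.
    N += A.T @ A
    b += A.T @ y
    solutions.append(np.linalg.solve(N, b))
    As.append(A)
    ys.append(y)

# The running solution at the last epoch equals the one-shot batch solution
batch, *_ = np.linalg.lstsq(np.vstack(As), np.concatenate(ys), rcond=None)
print(np.allclose(solutions[-1], batch))   # True
```

Each epoch then costs one fixed-size solve rather than a factorization whose dimension grows with the accumulated observations, which is the kind of saving the abstract's 45% figure refers to.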
Optical Telescope System-Level Design Considerations for a Space-Based Gravitational Wave Mission
NASA Technical Reports Server (NTRS)
Livas, Jeffrey C.; Sankar, Shannon R.
2016-01-01
The study of the Universe through gravitational waves will yield a revolutionary new perspective on the Universe, which has been intensely studied using electromagnetic signals in many wavelength bands. A space-based gravitational wave observatory will enable access to a rich array of astrophysical sources in the measurement band from 0.1 to 100 mHz, and nicely complement observations from ground-based detectors as well as pulsar timing arrays by sampling a different range of compact object masses and astrophysical processes. The observatory measures gravitational radiation by precisely monitoring the tiny change in the proper distance between pairs of freely falling proof masses. These masses are separated by millions of kilometers and, using a laser heterodyne interferometric technique, the change in their proper separation is detected to approx. 10 pm over timescales of 1000 seconds, a fractional precision of better than one part in 10(exp 19). Optical telescopes are essential for the implementation of this precision displacement measurement. In this paper we describe some of the key system level design considerations for the telescope subsystem in a mission context. The reference mission for this purpose is taken to be the enhanced Laser Interferometry Space Antenna mission (eLISA), a strong candidate for the European Space Agency's Cosmic Visions L3 launch opportunity in 2034. We will review the flow-down of observatory level requirements to the telescope subsystem, particularly pertaining to the effects of telescope dimensional stability and scattered light suppression, two performance specifications which are somewhat different from the usual requirements for an image forming telescope.
Elevation data fitting and precision analysis of Google Earth in road survey
NASA Astrophysics Data System (ADS)
Wei, Haibin; Luan, Xiaohan; Li, Hanchao; Jia, Jiangkun; Chen, Zhao; Han, Leilei
2018-05-01
Objective: In order to improve the efficiency of road surveys and save manpower and material resources, this paper applies Google Earth to the feasibility study stage of road survey and design. Because Google Earth elevation data lack precision, the paper focuses on several fitting and interpolation methods for improving the data precision, so as to meet the accuracy requirements of road survey and design specifications. Method: Based on the elevation differences at a limited number of common control points, the elevation difference at any other point can be fitted or interpolated; the precise elevation is then obtained by subtracting the fitted elevation difference from the Google Earth data. Quadratic polynomial surface fitting, cubic polynomial surface fitting, the V4 interpolation method in MATLAB, and a neural network method are used to process the Google Earth elevation data, with internal conformity, external conformity, and the cross-correlation coefficient as evaluation indexes. Results: The V4 interpolation method has zero residual at the fitting points, but its external conformity is the largest and its accuracy improvement is the worst, so it is ruled out. Both the internal and external conformity of the cubic polynomial surface fitting method are better than those of the quadratic method. The neural network method has a fitting effect similar to the cubic polynomial surface fitting method, but performs better where elevation differences are larger. Because the neural network method is a less controllable fitting model, the cubic polynomial surface fitting method should be used as the main method, with the neural network method as an auxiliary in areas of larger elevation difference.
Conclusions: The cubic polynomial surface fitting method can markedly improve the precision of Google Earth elevation data. After the precision improvement, the error of the data in hilly terrain meets the requirements of the specifications, and the data can be used in the feasibility study stage of road survey and design.
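A cubic polynomial surface fit of the elevation differences can be sketched as an ordinary least-squares problem; the control-point coordinates and the synthetic difference surface below are hypothetical:

```python
import numpy as np

def cubic_surface_terms(x, y):
    """Design-matrix columns for a full cubic polynomial surface in (x, y)."""
    return np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y,
                            x ** 3, x * x * y, x * y * y, y ** 3])

# Hypothetical control points: plan coordinates and the elevation difference
# (surveyed elevation minus Google Earth elevation) measured at each one.
rng = np.random.default_rng(2)
x = rng.uniform(0.0, 1.0, 40)
y = rng.uniform(0.0, 1.0, 40)
dz = 5.0 + 3.0 * x - 2.0 * y + 1.5 * x * y    # synthetic difference surface, m

# Least-squares fit of the difference surface
coef, *_ = np.linalg.lstsq(cubic_surface_terms(x, y), dz, rcond=None)

# Correct a new Google Earth elevation by the fitted difference
xq, yq = np.array([0.4]), np.array([0.7])
dz_fit = (cubic_surface_terms(xq, yq) @ coef)[0]
print(f"fitted elevation correction at (0.4, 0.7): {dz_fit:.2f} m")
```

The fitted correction is then subtracted from the raw Google Earth elevation at any query point, exactly as the Method section describes for the real control-point data.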
Secondary reconstruction of maxillofacial trauma.
Castro-Núñez, Jaime; Van Sickels, Joseph E
2017-08-01
Craniomaxillofacial trauma is one of the most complex clinical conditions in contemporary maxillofacial surgery. Vital structures and possible functional and esthetic sequelae are important considerations following this type of trauma and intervention. Despite the best efforts of the primary surgery, there are a group of patients that will have poor outcomes requiring secondary reconstruction to restore form and function. The purpose of this study is to review current concepts on secondary reconstruction to the maxillofacial complex. The evaluation of a posttraumatic patient for a secondary reconstruction must include an assessment of the different subunits of the upper face, middle face, and lower face. Virtual surgical planning and surgical guides represent the most important innovations in secondary reconstruction over the past few years. Intraoperative navigational surgery/computed-assisted navigation is used in complex cases. Facial asymmetry can be corrected or significantly improved by segmentation of the computerized tomography dataset and mirroring of the unaffected side by means of virtual surgical planning. Navigational surgery/computed-assisted navigation allows for a more precise surgical correction when secondary reconstruction involves the replacement of extensive anatomical areas. The use of technology can result in custom-made replacements and prebent plates, which are more stable and resistant to fracture because of metal fatigue. Careful perioperative evaluation is the key to positive outcomes of secondary reconstruction after trauma. The advent of technological tools has played a capital role in helping the surgical team perform a given treatment plan in a more precise and predictable manner.
Liu, Jiangang; Wang, Guangyao; Chu, Qingquan; Chen, Fu
2017-07-01
Nitrogen (N) application significantly increases maize yield; however, unreasonable use of N fertilizer is common in China. The analysis of crop yield gaps can reveal the limiting factors for yield improvement, but practical strategies for narrowing the yield gaps of household farms are lacking. The objectives of this study were to assess the yield gap of summer maize using an integrative method and to develop strategies for narrowing the maize yield gap through precise N fertilization. The results indicated that there was a significant difference in maize yield among fields, with a low level of variation. Additionally, significant differences in N application rate were observed among fields, with high variability. Based on long-term simulation results, the optimal N application rate was 193 kg ha-1, with a corresponding maximum attainable yield (AYmax) of 10 318 kg ha-1. A considerable difference between farmers' yields and AYmax was observed, and the agronomic efficiency of applied N fertilizer (AEN) in farmers' fields was low. The integrative method lays a foundation for exploring the specific factors constraining crop yield gaps at the field scale and for developing strategies for rapid site-specific N management. Optimization strategies to narrow the maize yield gap include increasing N application rates and adjusting the N application schedule. © 2016 Society of Chemical Industry.
Assez, Nathalie; Smith, Grégoire; Adriansen, Christophe; Aboukais, Wissam; Wiel, Eric; Goldstein, Patrick
2012-08-01
Acute initial management of patients with acute coronary syndrome (ACS) is based on a precise clinical and electrocardiographic diagnosis. Initial risk stratification in the pre-hospital phase is the key step. The last step, adequate patient routing, is decided based on emergency level and reperfusion strategies, considered right from the pre-hospital phase. The management of a patient with an ACS requires close collaboration between emergency physicians and cardiologists, according to simplified protocols for easier access to catheterisation. The next challenges for the pre-hospital management of ACS are based on:
- precise knowledge of new antiplatelet and anticoagulant drugs by emergency physicians, in order to adjust their prescriptions to the patient profile;
- developing co-operation between hospitals, according to regional specificities (geographic considerations and distribution of PCI centres), in order to reduce access time to catheterisation rooms;
- organising the healthcare network, in which the SAMU has an essential role in coordinating the different medical actors;
- regular analysis of the evolution of our professional practices, considering, e.g., the guidelines of the "HAS" (French official healthcare guidelines institute);
- integrating pre-hospital medicine in health prevention programmes;
- improving our understanding of how the population presents with coronary artery disease, in order to encourage patients and their families to call the EMS as soon as possible.
The challenge of the emergency physician is to adapt the strategies to the patient's needs.
Regulation of alternative splicing at the single-cell level.
Faigenbloom, Lior; Rubinstein, Nimrod D; Kloog, Yoel; Mayrose, Itay; Pupko, Tal; Stein, Reuven
2015-12-28
Alternative splicing is a key cellular mechanism for generating distinct isoforms, whose relative abundances regulate critical cellular processes. It is therefore essential that inclusion levels of alternative exons be tightly regulated. However, how the precision of inclusion levels among individual cells is governed is poorly understood. Using single-cell gene expression, we show that the precision of inclusion levels of alternative exons is determined by the degree of evolutionary conservation at their flanking intronic regions. Moreover, the inclusion levels of alternative exons, as well as the expression levels of the transcripts harboring them, also contribute to this precision. We further show that alternative exons whose inclusion levels are considerably changed during stem cell differentiation are also subject to this regulation. Our results imply that alternative splicing is coordinately regulated to achieve accuracy in relative isoform abundances and that such accuracy may be important in determining cell fate. © 2015 The Authors. Published under the terms of the CC BY 4.0 license.
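The inclusion level of an alternative exon is commonly estimated per cell as the fraction of reads supporting inclusion; its spread across cells then quantifies the precision discussed above. A minimal sketch with hypothetical read counts (the study's actual estimation procedure may differ):

```python
from statistics import mean, pstdev

def inclusion_level(inc_reads, exc_reads):
    # Fraction of reads supporting exon inclusion (often called PSI)
    total = inc_reads + exc_reads
    return inc_reads / total if total else float("nan")

# Hypothetical per-cell read counts (inclusion, exclusion) for one exon
cells = [(30, 10), (28, 12), (33, 7), (29, 11)]
psis = [inclusion_level(i, e) for i, e in cells]
print(round(mean(psis), 3), round(pstdev(psis), 3))  # 0.75 0.047
```

A tightly regulated exon would show a small standard deviation of PSI across cells at a given mean inclusion level.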
Precision Half-life Measurement of 25Al
NASA Astrophysics Data System (ADS)
Long, Jacob; Ahn, Tan; Allen, Jacob; Bardayan, Daniel; Becchetti, Fredrich; Blankstein, Drew; Brodeur, Maxime; Burdette, Daniel; Frentz, Bryce; Hall, Matthew; Kelly, James; Kolata, James; O'Malley, Patrick; Schultz, Bradley; Strauss, Sabrina; Valverde, Adrian; TwinSol Collaboration
2017-09-01
In recent years, precision measurements have led to considerable advances in several areas of physics, including fundamental symmetries. Precise determination of ft values for superallowed mixed transitions between mirror nuclides could provide an avenue to test the theoretical corrections used to extract the Vud matrix element from superallowed pure Fermi transitions. Calculation of the ft value requires the half-life, branching ratio, and Q value. 25Al decay is of particular interest as its half-life is derived from a series of conflicting measurements, and the largest uncertainty on the ft value stems from the half-life uncertainty. The half-life was determined by β counting of 25Al implanted in a Ta foil, which was removed from the beam for counting. The 25Al beam was produced by a transfer reaction and separated by the TwinSol facility of the Nuclear Science Laboratory of the University of Notre Dame. The 25Al results will be presented along with preliminary results of more recent half-life measurements. This work was supported by the National Science Foundation.
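The half-life enters through the exponential decay law, N(t) = N0 exp(-λt) with T1/2 = ln 2/λ. A minimal sketch of extracting T1/2 from a decay curve by a log-linear least-squares fit, on synthetic noise-free data (the actual analysis of β-counting data involves backgrounds, dead time and more):

```python
import math

def fit_half_life(times, counts):
    """Least-squares fit of ln(counts) vs. time: slope = -lambda, T1/2 = ln2/lambda."""
    n = len(times)
    ys = [math.log(c) for c in counts]
    tbar = sum(times) / n
    ybar = sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(times, ys))
             / sum((t - tbar) ** 2 for t in times))
    return math.log(2) / -slope

# Synthetic decay curve with an assumed true half-life of 7.18 s, no noise
true_t12 = 7.18
times = [float(i) for i in range(20)]
counts = [1e5 * 0.5 ** (t / true_t12) for t in times]
print(round(fit_half_life(times, counts), 2))  # 7.18
```

With real counting data the fit would be weighted by the Poisson uncertainties of the counts.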
Some considerations about the use of different sensors, in coordinate measuring of the small parts
NASA Astrophysics Data System (ADS)
Drăgan, L.
2017-05-01
The paper presents some particular aspects of high-precision measurement of small parts manufactured by injection moulding. Coordinate measuring machines (CMMs) are widely used for measuring parts of the most varied shapes, dimensions and materials. The influence of hygroscopicity on the geometrical properties of polyamide parts is studied experimentally using different types of measuring sensors. A few cover-type pieces with precision dimensional features and shape tolerances were selected. They were measured with the sensors of a ScopeCheck S 400 CMM, together with dehumidifying equipment. Starting from the need for high-precision measurement of the geometric characteristics of injection-moulded plastic parts, it was found that hygroscopicity has a significant influence. Three types of measuring sensors were used under different post-manufacture storage conditions. It was observed that the influence of humidity is significantly reduced if the parts are kept in a desiccator or vacuum dryer.
Precision irrigation for improving crop water management
USDA-ARS?s Scientific Manuscript database
Precision irrigation is gaining attention from the agricultural industry as a means to optimize water inputs, reduce environmental degradation from runoff or deep percolation, and maintain crop yields. This presentation will discuss the mechanical and software framework of the irrigation scheduling sup...
The prospects of pulsar timing with new-generation radio telescopes and the Square Kilometre Array
NASA Astrophysics Data System (ADS)
Stappers, B. W.; Keane, E. F.; Kramer, M.; Possenti, A.; Stairs, I. H.
2018-05-01
Pulsars are highly magnetized and rapidly rotating neutron stars. As they spin, the lighthouse-like beam of radio emission from their magnetic poles sweeps across the Earth with a regularity approaching that of the most precise clocks known. This precision combined with the extreme environments in which they are found, often in compact orbits with other neutron stars and white dwarfs, makes them excellent tools for studying gravity. Present and near-future pulsar surveys, especially those using the new generation of telescopes, will find more extreme binary systems and pulsars that are more precise `clocks'. These telescopes will also greatly improve the precision to which we can measure the arrival times of the pulses. The Square Kilometre Array will revolutionize pulsar searches and timing precision. The increased number of sources will reveal rare sources, including possibly a pulsar-black hole binary, which can provide the most stringent tests of strong-field gravity. The improved timing precision will reveal new phenomena and also allow us to make a detection of gravitational waves in the nanohertz frequency regime. It is here where we expect to see the signature of the binary black holes that are formed as galaxies merge throughout cosmological history. This article is part of a discussion meeting issue `The promises of gravitational-wave astronomy'.
Motor imagery training improves precision of an upper limb movement in patients with hemiparesis.
Grabherr, Luzia; Jola, Corinne; Berra, Gilberto; Theiler, Robert; Mast, Fred W
2015-01-01
In healthy participants, beneficial effects of motor imagery training on movement execution have been shown for precision, strength, and speed. In the clinical context, it is still debated whether motor imagery provides an effective rehabilitation technique in patients with motor deficits. To compare the effectiveness of two different types of movement training: motor imagery vs. motor execution. Twenty-five patients with hemiparesis were assigned to one of two training groups: the imagery or the execution-training group. Both groups completed a baseline test before they received six training sessions, each of which was followed by a test session. Using a novel and precisely quantifiable test, we assessed how accurately patients performed an upper limb movement. Both training groups improved performance over the six test sessions but the improvement was significantly larger in the imagery group. That is, the imagery group was able to perform more precise movements than the execution group after the sixth training session while there was no difference at the beginning of the training. The results provide evidence for the benefit of motor imagery training in patients with hemiparesis and thus suggest the integration of cognitive training in conventional physiotherapy practice.
Improved Slip Casting Of Ceramic Models
NASA Technical Reports Server (NTRS)
Buck, Gregory M.; Vasquez, Peter; Hicks, Lana P.
1994-01-01
Improved technique of investment slip casting developed for making precise ceramic wind-tunnel models. Needed in wind-tunnel experiments to verify predictions of aerothermodynamical computer codes. Ceramic materials used because of their low heat conductivities and ability to survive high temperatures. Present improved slip-casting technique enables casting of highly detailed models from aqueous or nonaqueous solutions. Wet shell molds peeled off models to ensure precise and undamaged details. Used at NASA Langley Research Center to form superconducting ceramic components from nonaqueous slip solutions. Technique has many more applications when ceramic materials developed further for such high-strength/ temperature components as engine parts.
Augmenting endogenous repair of soft tissues with nanofibre scaffolds
Snelling, Sarah; Dakin, Stephanie; Carr, Andrew
2018-01-01
As our ability to engineer nanoscale materials has developed, we can now influence endogenous cellular processes with increasing precision. Consequently, the use of biomaterials to induce and guide the repair and regeneration of tissues is a rapidly developing area. This review focuses on soft tissue engineering; it discusses the types of biomaterial scaffold available before exploring physical, chemical and biological modifications to synthetic scaffolds. We consider how these properties, in combination, can provide a precise design process with the potential to meet the requirements of the injured and diseased soft tissue niche. Finally, we frame our discussion within clinical trial design and the regulatory framework, the consideration of which is fundamental to the successful translation of new biomaterials. PMID:29695606
One-Centimeter Orbits in Near-Real Time: The GPS Experience on OSTM/JASON-2
NASA Technical Reports Server (NTRS)
Haines, Bruce; Armatys, Michael; Bar-Sever, Yoaz; Bertiger, Willy; Desai, Shailen; Dorsey, Angela; Lane, Christopher; Weiss, Jan
2010-01-01
The advances in Precise Orbit Determination (POD) over the past three decades have been driven in large measure by the increasing demands of satellite altimetry missions. Since the launch of Seasat in 1978, both tracking-system technologies and orbit modeling capabilities have evolved considerably. The latest in a series of precise (TOPEX-class) altimeter missions is the Ocean Surface Topography Mission (OSTM, also Jason-2). GPS-based orbit solutions for this mission are accurate to 1-cm (radial RMS) within 3-5 hrs of real time. These GPS-based orbit products provide the basis for a near-real time sea-surface height product that supports increasingly diverse applications of operational oceanography and climate forecasting.
[Galen's "On bones for beginners" translation from the Greek text and discussion].
Sakai, Tatsuo; Ikeda, Reitaro; Sawai, Tadashi
2007-09-01
Galen's article "On bones for beginners" was translated literally from the Greek text (Kühn's edition, vol. 2, pp. 732-778) into Japanese, applying the knowledge of modern anatomy. The previous Latin and English translations were utilized as references for the present translation. The present study has revealed that many of the current basic vocabularies for the bones and junctions were established already in Galen's treatises, but have changed their meanings and usages considerably. It has become also apparent that, for the skull, Galen did not observe individual bones but distinguished them by precise observations on the sutures of the skull in monkeys. The precise understanding of Galenic anatomy provides essential information to understand the origin of current anatomy.
Optical track width measurements below 100 nm using artificial neural networks
NASA Astrophysics Data System (ADS)
Smith, R. J.; See, C. W.; Somekh, M. G.; Yacoot, A.; Choi, E.
2005-12-01
This paper discusses the feasibility of using artificial neural networks (ANNs), together with a high precision scanning optical profiler, to measure very fine track widths considerably below the diffraction limit of a conventional optical microscope. The ANN is trained using optical profiles obtained from tracks of known widths; the network is then assessed by applying it to test profiles. The optical profiler is an ultra-stable common path scanning interferometer, which provides extremely precise surface measurements. Preliminary results, obtained with a 0.3 NA objective lens and a laser wavelength of 633 nm, show that the system is capable of measuring a 50 nm track width with a standard deviation of less than 4 nm.
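As a toy illustration of the calibration idea only (not the authors' network, profile model or data), a small feed-forward network can be trained to regress track width from simulated profiles. The soft-edge profile model, the network size and the training schedule below are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def profile(width, n=32):
    # Toy optical profile: a track of the given width with soft (blurred) edges
    x = np.linspace(-1.0, 1.0, n)
    return 1.0 / (1.0 + np.exp(-(width / 2 - np.abs(x)) / 0.15))

# Calibration set: known (arbitrary-unit) widths -> simulated profiles
widths = rng.uniform(0.05, 0.8, 400)
X = np.stack([profile(w) for w in widths])
y = widths

# One-hidden-layer network trained by plain gradient descent
W1 = rng.normal(0.0, 0.3, (32, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.3, 16);       b2 = 0.0
lr = 0.3
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    err = h @ W2 + b2 - y             # prediction error
    gW2 = h.T @ err / len(y); gb2 = err.mean()
    gh = np.outer(err, W2) * (1.0 - h ** 2)   # backprop through tanh
    gW1 = X.T @ gh / len(y);  gb1 = gh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

h = np.tanh(X @ W1 + b1)
rms = float(np.sqrt(np.mean((h @ W2 + b2 - y) ** 2)))
print(rms)  # training RMS error, well below the spread of the widths
```

The point of the approach is that the network learns the mapping from diffraction-blurred profile shape back to width, which a simple threshold on the profile cannot provide below the diffraction limit.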
Motion Simulation Analysis of Rail Weld CNC Fine Milling Machine
NASA Astrophysics Data System (ADS)
Mao, Huajie; Shu, Min; Li, Chao; Zhang, Baojun
The CNC fine milling machine is a new advanced piece of equipment for precision machining of rail welds, offering high precision, high efficiency, low environmental pollution and other technical advantages. The motion performance of this machine directly affects its machining accuracy and stability, making it an important design consideration. Based on the design drawings, this article completed 3D modeling of a 60 kg/m rail weld CNC fine milling machine using Solidworks. The geometry was then imported into Adams for motion simulation analysis. The displacement, velocity, angular velocity and other kinematic parameter curves of the main components were obtained in post-processing; these provide a scientific basis for the design and development of this machine.
Entangling measurements for multiparameter estimation with two qubits
NASA Astrophysics Data System (ADS)
Roccia, Emanuele; Gianani, Ilaria; Mancino, Luca; Sbroscia, Marco; Somma, Fabrizia; Genoni, Marco G.; Barbieri, Marco
2018-01-01
Carefully tailoring the quantum state of probes offers the capability of investigating matter at unprecedented precision. Rarely, however, is the interaction with the sample fully encompassed by a single parameter, and the information contained in the probe needs to be partitioned among multiple parameters. There then exist practical bounds on the ultimate joint-estimation precision, set by the unavailability of a single optimal measurement for all parameters. Here, we discuss how these considerations are modified for two-level quantum probes (qubits) by the use of two copies and entangling measurements. We find that the joint estimation of phase and phase diffusion benefits from such collective measurement, while for multiple phases no enhancement can be observed. We demonstrate this in a proof-of-principle photonics setup.
The role of precision agriculture for improved nutrient management on farms.
Hedley, Carolyn
2015-01-01
Precision agriculture uses proximal and remote sensor surveys to delineate and monitor within-field variations in soil and crop attributes, guiding variable rate control of inputs, so that in-season management can be responsive, e.g. matching strategic nitrogen fertiliser application to site-specific field conditions. It has the potential to improve production and nutrient use efficiency, ensuring that nutrients do not leach from or accumulate in excessive concentrations in parts of the field, which creates environmental problems. The discipline emerged in the 1980s with the advent of affordable global positioning systems (GPS), and has further developed with access to an array of affordable soil and crop sensors, improved computer power and software, and equipment with precision application control, e.g. variable rate fertiliser and irrigation systems. Precision agriculture focusses on improving nutrient use efficiency at the appropriate scale, requiring (1) appropriate decision support systems (e.g. digital prescription maps), and (2) equipment capable of varying application at these different scales, e.g. the footprint of a single irrigation sprinkler or a fertiliser top-dressing aircraft. This article reviews the rapid development of this discipline, and uses New Zealand as a case study example, as it is a country where agriculture drives economic growth. Here, the high yield potentials on often young, variable soils provide opportunities for effective financial return from investment in these new technologies. © 2014 Society of Chemical Industry.
NASA Astrophysics Data System (ADS)
Huang, Kuo-Ting; Chen, Hsi-Chao; Lin, Ssu-Fan; Lin, Ke-Ming; Syue, Hong-Ye
2012-09-01
While tin-doped indium oxide (ITO) has been extensively applied in flexible electronics, the problem of residual stress presents many obstacles to overcome. This study investigated the residual stress of flexible electronics with a double beam shadow moiré interferometer, and focused on improving precision with phase shifting interferometry (PSI). According to the out-of-plane displacement equation, the theoretical error depends on the grating pitch and the angle between the incident light and the CCD. The angle error could be reduced to 0.03% by an angle shift of 10°, because the double beam interferometer is a symmetrical system. However, the experimental error of the double beam moiré interferometer still reached 2.2% owing to noise from vibration and the interferograms. In order to improve the measurement precision, PSI was introduced into the double beam shadow moiré interferometer. The wavefront phase was reconstructed from five interferograms with the Hariharan algorithm. Measurement results on a standard cylinder indicate that the error can be reduced from 2.2% to less than 1% with PSI. The deformation of flexible electronics can be reconstructed quickly and the residual stress calculated with the Stoney correction formula. This shadow moiré interferometer with PSI can improve the precision of residual stress measurement for flexible electronics.
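The Hariharan algorithm mentioned above recovers the wavefront phase from five frames shifted in phase by π/2 each. A minimal sketch for a single pixel, on synthetic fringe intensities (frame model and values are illustrative):

```python
import math

def hariharan_phase(i1, i2, i3, i4, i5):
    """Hariharan five-frame algorithm for frames phase-shifted by pi/2.

    Frame model: I_k = A + B*cos(phi + (k-3)*pi/2), k = 1..5,
    which gives phi = atan2(2*(I2 - I4), 2*I3 - I1 - I5).
    """
    return math.atan2(2 * (i2 - i4), 2 * i3 - i1 - i5)

# Check on synthetic fringes with a known phase of 0.7 rad
phi, A, B = 0.7, 5.0, 2.0
frames = [A + B * math.cos(phi + (k - 3) * math.pi / 2) for k in range(1, 6)]
print(round(hariharan_phase(*frames), 3))  # 0.7
```

Applying this per pixel across the five interferograms yields the wrapped phase map, from which the out-of-plane deformation follows via the displacement equation.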
Long, Jean-Alexandre; Daanen, Vincent; Moreau-Gaudry, Alexandre; Troccaz, Jocelyne; Rambeaud, Jean-Jacques; Descotes, Jean-Luc
2007-11-01
The objective of this study was to determine the added value of real-time three-dimensional (4D) ultrasound guidance of prostatic biopsies on a prostate phantom in terms of the precision of guidance and distribution. A prostate phantom was constructed. A real-time 3D ultrasonograph connected to a transrectal 5.9 MHz volumic transducer was used. Fourteen operators performed 336 biopsies with 2D guidance then 4D guidance according to a 12-biopsy protocol. Biopsy tracts were modelled by segmentation in a 3D ultrasound volume. Specific software allowed visualization of biopsy tracts in the reference prostate and evaluated the zone biopsied. A comparative study was performed to determine the added value of 4D guidance compared to 2D guidance by evaluating the precision of entry points and target points. The distribution was evaluated by measuring the volume investigated and by a redundancy ratio of the biopsy points. The precision of the biopsy protocol was significantly improved by 4D guidance (p = 0.037). No increase of the biopsy volume and no improvement of the distribution of biopsies were observed with 4D compared to 2D guidance. The real-time 3D ultrasound-guided prostate biopsy technique on a phantom model appears to improve the precision and reproducibility of a biopsy protocol, but the distribution of biopsies does not appear to be improved.
Design and control of the precise tracking bed based on complex electromechanical design theory
NASA Astrophysics Data System (ADS)
Ren, Changzhi; Liu, Zhao; Wu, Liao; Chen, Ken
2010-05-01
Precise tracking technology is widely used in astronomical instruments, satellite tracking and aeronautic test beds. The precise ultra-low-speed tracking drive is a highly integrated electromechanical system, for which a complex electromechanical design method is adopted to improve the efficiency, reliability and quality of the system over the design and manufacturing cycle. The precise tracking bed is an ultra-exact, ultra-low-speed, high-precision instrument with huge inertia, whose mechanisms and ultra-low-speed operating regime differ from those of general technology. This paper explores the design process based on complex electromechanical optimizing design theory; a non-PID control method with CMAC forward feedback is used in the servo system of the precise tracking bed, and some simulation results are discussed.
Saragoza, Philip; White, Stephen G
2016-12-01
Workplace predatory violence has been the focus of increased study over the past 30 years, leading to a more sophisticated understanding of the factors that contribute to it and important considerations for its assessment and management. Risk assessment professionals involved in workplace violence consultations should be mindful of issues specific to the workplace context and the principles of threat assessment, in order to provide a more precise opinion of risk and to inform critical decisions regarding the employment status of the individual of concern, security measures, possible treatment options, and other management responses, while remaining mindful of the employee's rights. Copyright © 2016 Elsevier Inc. All rights reserved.
A 10 micron laser heterodyne spectrometer for remote detection of trace gases
NASA Technical Reports Server (NTRS)
Mumma, M. J.; Kostiuk, T.; Buhl, D.
1978-01-01
Infrared heterodyne spectroscopy provides a means of measuring the intensity profiles of individual rotation-vibration spectral lines with high sensitivity. Considerable effort has been expended on optimizing these instruments for remote measurements of gases in planetary atmospheres with the result that present-generation spectrometers are beginning to provide new and startling results on the planets. The fundamental principles of laser heterodyne spectroscopy are discussed. Detailed considerations of the optical design and the electronic design of the spectral-line receiver are given. Representative results obtained with this spectrometer are discussed, including precision frequency measurements of NH3 (nu-2) lines, detection of auroral emission from Jupiter, and measurements of terrestrial O3 and CO2.
Precision Medicine, Cardiovascular Disease and Hunting Elephants.
Joyner, Michael J
2016-01-01
Precision medicine postulates improved prediction, prevention, diagnosis and treatment of disease based on patient specific factors especially DNA sequence (i.e., gene) variants. Ideas related to precision medicine stem from the much anticipated "genetic revolution in medicine" arising seamlessly from the human genome project (HGP). In this essay I deconstruct the concept of precision medicine and raise questions about the validity of the paradigm in general and its application to cardiovascular disease. Thus far precision medicine has underperformed based on the vision promulgated by enthusiasts. While niche successes for precision medicine are likely, the promises of broad based transformation should be viewed with skepticism. Open discussion and debate related to precision medicine are urgently needed to avoid misapplication of resources, hype, iatrogenic interventions, and distraction from established approaches with ongoing utility. Failure to engage in such debate will lead to negative unintended consequences from a revolution that might never come. Copyright © 2016 Elsevier Inc. All rights reserved.
EFICAz2: enzyme function inference by a combined approach enhanced by machine learning.
Arakaki, Adrian K; Huang, Ying; Skolnick, Jeffrey
2009-04-13
We previously developed EFICAz, an enzyme function inference approach that combines predictions from non-completely overlapping component methods. Two of the four components in the original EFICAz are based on the detection of functionally discriminating residues (FDRs). FDRs distinguish between members of an enzyme family that are homofunctional (classified under the EC number of interest) or heterofunctional (annotated with another EC number or lacking enzymatic activity). Each of the two FDR-based components is associated with one of two specific kinds of enzyme families. EFICAz exhibits high precision performance, except when the maximal test to training sequence identity (MTTSI) is lower than 30%. To improve EFICAz's performance in this regime, we: i) increased the number of predictive components and ii) took advantage of consensual information from the different components to make the final EC number assignment. We have developed two new EFICAz components, analogous to the two FDR-based components, where the discrimination between homofunctional and heterofunctional members is based on the evaluation, via Support Vector Machine models, of all the aligned positions between the query sequence and the multiple sequence alignments associated with the enzyme families. Benchmark results indicate that: i) the new SVM-based components outperform their FDR-based counterparts, and ii) both SVM-based and FDR-based components generate unique predictions. We developed classification tree models to optimally combine the results from the six EFICAz components into a final EC number prediction. The new implementation of our approach, EFICAz2, exhibits highly improved prediction precision at MTTSI < 30% compared to the original EFICAz, with only a slight decrease in prediction recall.
A comparative analysis of enzyme function annotation of the human proteome by EFICAz2 and KEGG shows that: i) when both sources make EC number assignments for the same protein sequence, the assignments tend to be consistent and ii) EFICAz2 generates considerably more unique assignments than KEGG. Performance benchmarks and the comparison with KEGG demonstrate that EFICAz2 is a powerful and precise tool for enzyme function annotation, with multiple applications in genome analysis and metabolic pathway reconstruction. The EFICAz2 web service is available at: http://cssb.biology.gatech.edu/skolnick/webservice/EFICAz2/index.html.
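EFICAz2 combines its six components with classification tree models; as a deliberately simplified stand-in, the combination step can be sketched as a plain consensus vote over component EC predictions (this voting rule is hypothetical and is not EFICAz2's actual tree logic):

```python
from collections import Counter

def combine_predictions(component_ecs):
    """Consensus EC assignment from multiple component predictions.

    Hypothetical rule: accept the EC number predicted by the most
    components; abstaining components (None) are ignored, and ties
    yield no call. EFICAz2 itself uses trained classification trees.
    """
    votes = Counter(ec for ec in component_ecs if ec is not None)
    if not votes:
        return None
    (top, n), *rest = votes.most_common()
    if rest and rest[0][1] == n:
        return None  # tie -> no confident assignment
    return top

# Three of four components agree; one abstains, one disagrees
print(combine_predictions(["1.1.1.1", "1.1.1.1", None, "2.7.1.2"]))  # 1.1.1.1
```

A learned combiner improves on such a fixed rule precisely because different components are reliable in different sequence-identity regimes.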
A Simple Method to Improve Autonomous GPS Positioning for Tractors
Gomez-Gil, Jaime; Alonso-Garcia, Sergio; Gómez-Gil, Francisco Javier; Stombaugh, Tim
2011-01-01
Error is always present in the GPS guidance of a tractor along a desired trajectory. One way to reduce GPS guidance error is by improving the tractor positioning. The most commonly used ways to do this are either by employing more precise GPS receivers and differential corrections or by employing GPS together with some other local positioning systems such as electronic compasses or Inertial Navigation Systems (INS). However, both are complex and expensive solutions. In contrast, this article presents a simple and low cost method to improve tractor positioning when only a GPS receiver is used as the positioning sensor. The method is based on placing the GPS receiver ahead of the tractor, and on applying kinematic laws of tractor movement, or a geometric approximation, to obtain the midpoint position and orientation of the tractor rear axle more precisely. This precision improvement is produced by the fusion of the GPS data with tractor kinematic control laws. Our results reveal that the proposed method effectively reduces the guidance GPS error along a straight trajectory. PMID:22163917
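The geometric approximation described, projecting the antenna position back along the heading to the rear-axle midpoint, can be sketched as follows (the mounting offset, heading source and coordinate conventions are assumptions for illustration):

```python
import math

def rear_axle_from_gps(x_ant, y_ant, heading_rad, offset):
    """Rear-axle midpoint from a GPS antenna mounted `offset` metres
    ahead of the axle along the tractor's heading (planar geometry;
    heading measured from the +x axis, counter-clockwise)."""
    return (x_ant - offset * math.cos(heading_rad),
            y_ant - offset * math.sin(heading_rad))

# Antenna 2 m ahead of the axle, tractor heading along +x (0 rad)
print(rear_axle_from_gps(10.0, 5.0, 0.0, 2.0))  # (8.0, 5.0)
```

In practice the heading itself must be estimated, e.g. from successive GPS fixes or from the kinematic model, which is where the fusion with the tractor's motion laws enters.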
De Andrés, Fernando; Terán, Santiago; Hernández, Francisco; Terán, Enrique; LLerena, Adrián
2016-12-01
Genetic variations within the cytochrome P450 (CYP450) superfamily of drug metabolizing enzymes confer substantial person-to-person and between-population differences in pharmacokinetics, and by extension, highly variable clinical effects of medicines. In this context, "personalized medicine," "precision medicine," and "stratified medicine" are related concepts attributed to what is essentially targeted therapeutics and companion diagnostics, aimed at improving safety and effectiveness of health interventions. We report here, to the best of our knowledge, the first comparative clinical pharmacogenomics study, in an Ecuadorian population sample, of five key CYP450s involved in drug metabolism: CYP1A2, CYP2C9, CYP2C19, CYP2D6, and CYP3A4. In 139 unrelated, medication-free, and healthy Ecuadorian subjects, we measured the phenotypic activity of these drug metabolism pathways using the CEIBA multiplexed phenotyping cocktail. The subjects were genotyped for each CYP450 enzyme gene as well. Notably, based on the CYP450 metabolic phenotypes estimated by the genotype data, 0.75% and 3.10% of the subjects were genotypic poor metabolizers (gPMs) for CYP2C19 and CYP2D6, respectively. Additionally, on the other extreme, genotype-estimated ultrarapid metabolizer (gUMs) phenotype was represented by 15.79% of CYP2C19, and 5.43% of CYP2D6. There was, however, considerable discordance between directly measured phenotypes (mPMs and mUMs) and the above genotype-estimated enzyme phenotypes. For example, among individuals genotypically carrying enhanced activity alleles (gUMs), many showed a lower actual drug metabolism capacity than expected by their genotypes, even lower than individuals with reduced or no activity alleles. In conclusion, for personalized medicine in the Ecuadorian population, we recommend CYP450 multiplexed phenotyping, or genotyping and phenotyping in tandem, rather than CYP450 genotypic tests alone. 
Additionally, we recommend, in consideration of equity, ethical, and inclusive representation in global science, further precision medicine research and funding in support of neglected or understudied populations worldwide.
Research on the impact factors of GRACE precise orbit determination by dynamic method
NASA Astrophysics Data System (ADS)
Guo, Nan-nan; Zhou, Xu-hua; Li, Kai; Wu, Bin
2018-07-01
With the successful use of GPS-only-based POD (precise orbit determination), more and more satellites carry onboard GPS receivers to support their orbit accuracy requirements. Onboard receivers provide continuous GPS observations with high precision and have become an indispensable way to obtain the orbits of LEO satellites. Precise orbit determination of LEO satellites plays an important role in their applications, and numerous factors should be considered in the POD processing. In this paper, several factors that impact precise orbit determination are analyzed, namely the satellite altitude, the time-variable earth gravity field, the GPS satellite clock error and accelerometer observations. The GRACE satellites provide an ideal platform for studying the influence of these factors on precise orbit determination using zero-difference GPS data. The factors are quantitatively analyzed for their effect on dynamic orbit accuracy using GRACE observations from 2005 to 2011 with the SHORDE software. The study indicates that: (1) as the altitude of the GRACE satellites decreased from 480 km to 460 km over the seven years, the 3D (three-dimensional) position accuracy of the GRACE orbits was about 3-4 cm based on long data spans; (2) the accelerometer data improve the 3D position accuracy of GRACE by about 1 cm; (3) the accuracy of the zero-difference dynamic orbit is about 6 cm with GPS satellite clock error products at a 5 min sampling interval, and can be raised to 4 cm if clock error products with a 30 s sampling interval are adopted; (4) the time-variable part of the earth gravity field model improves the 3D position accuracy of GRACE by about 0.5-1.5 cm. Based on this study, we quantitatively analyze the factors that affect precise orbit determination of LEO satellites. This study plays an important role in improving the accuracy of LEO satellite orbit determination.
Improving BeiDou precise orbit determination using observations of onboard MEO satellite receivers
NASA Astrophysics Data System (ADS)
Ge, Haibo; Li, Bofeng; Ge, Maorong; Shen, Yunzhong; Schuh, Harald
2017-12-01
In recent years, the precise orbit determination (POD) of the regional Chinese BeiDou Navigation Satellite System (BDS) has been a hot topic because of its special constellation, consisting of five geostationary earth orbit (GEO) satellites and five inclined geosynchronous satellite orbit (IGSO) satellites besides four medium earth orbit (MEO) satellites since the end of 2012. GEO and IGSO satellites play an important role in regional BDS applications. However, this brings a great challenge to the POD, especially for the GEO satellites, due to their geostationary orbits. Though a number of studies have been carried out to improve the POD performance of GEO satellites, the result is still much worse than that of IGSO and MEO, particularly in the along-track direction. The major reason is that the geostationary character of a GEO satellite results in a bad geometry with respect to the ground tracking network. In order to improve the tracking geometry of the GEO satellites, a possible strategy is to mount global navigation satellite system (GNSS) receivers on MEO satellites to collect the signals from GEO/IGSO GNSS satellites, so that these observations can be used to improve GEO/IGSO POD. We extended our POD software package to simulate all the related observations and to assimilate the MEO-onboard GNSS observations in orbit determination. Based on GPS and BDS constellations, simulation studies are undertaken for various tracking scenarios. The impact of the onboard GNSS observations is investigated carefully and presented in detail. The results show that MEO-onboard observations can significantly improve the orbit precision of GEO satellites from metres to decimetres, especially in the along-track direction. The POD results of IGSO satellites also benefit from the MEO-onboard data, and the precision can be improved by more than 50% in the 3D direction.
A study of microwave downconverters operating in the K sub u band
NASA Technical Reports Server (NTRS)
Fellers, R. G.; Simpson, T. L.; Tseng, B.
1982-01-01
A computer program for parametric amplifier design is developed with special emphasis on practical design considerations for microwave integrated circuit degenerate amplifiers. Precision measurement techniques are developed to obtain a more realistic varactor equivalent circuit. The existing theory of the parametric amplifier is modified to include this equivalent circuit, and microwave properties, such as loss characteristics and circuit discontinuities, are investigated.
Measurement of the Tidal Dissipation in Multiple Stars
NASA Astrophysics Data System (ADS)
Tokovinin, Andrei
2007-08-01
Considerable effort has been spent to date in measuring the period of tidal circularisation in close binaries as a function of age, in order to constrain the tidal dissipation theory. Here we evaluate a new, direct method of measuring the tidal dissipation by precise timings of periastron passages in a very eccentric binary. The example of the 41 Dra system is studied in some detail.
[Diabolus, disease, and Nietzsche: considerations and interpretations].
Barański, J
2001-01-01
Disease shows its dual nature: it carries suffering and pain, but also means trespassing the sluggish limits of healthy life. This trespassing confirms the distance separating man from nature and, at the same time, confirms man's spirituality. Nietzsche's disease and work support this interpretation, so precisely expressed by Thomas Mann in his novel Doctor Faustus: disease contains the diabolus of human biology and human spirituality.
Quasi-Monochromatic Visual Environments and the Resting Point of Accommodation
1988-01-01
accommodation. No statistically significant differences were revealed to support the possibility of color-mediated differential regression to resting... discussed with respect to the general findings of the total sample as well as the specific behavior of individual participants. The summarized statistics... remaining ten varied considerably with respect to the averaged trends reported in the above descriptive statistics as well as with respect to precision
Imaging of Stellar Surfaces with the Navy Precision Optical Interferometer
2015-09-18
geostationary satellite with the Navy Prototype Optical Interferometer," in Proc. Optical and Infrared Interferometry II, W. C. Danchi, F... Cormier, "Imaging of geostationary satellites with the MRO interferometer," in Proc. Advanced Maui Optical and Space Surveillance Technologies... geostationary satellites: Signal-to-noise considerations," in Proc. Advanced Maui Optical and Space Surveillance Technologies Conference, 2011. 6. D
Monte Carlo isotopic inventory analysis for complex nuclear systems
NASA Astrophysics Data System (ADS)
Phruksarojanakun, Phiphat
Monte Carlo Inventory Simulation Engine (MCise) is a newly developed method for calculating the isotopic inventory of materials. It offers the promise of modeling materials with complex processes and irradiation histories, which pose challenges for current deterministic tools, and has strong analogies to Monte Carlo (MC) neutral particle transport. The analog method, including considerations for simple, complex and loop flows, is fully developed. In addition, six variance reduction tools provide unique capabilities of MCise to improve the statistical precision of MC simulations. Forced Reaction forces an atom to undergo a desired number of reactions in a given irradiation environment. Biased Reaction Branching primarily focuses on improving statistical results for isotopes that are produced through rare reaction pathways. Biased Source Sampling aims at increasing the frequency with which rare initial isotopes are sampled as starting particles. Reaction Path Splitting increases the population by splitting the atom at each reaction point, creating one new atom for each decay or transmutation product. Delta Tracking is recommended for high-frequency pulsing to reduce the computing time. Lastly, Weight Window is introduced as a strategy to control the large deviations of weight caused by the use of variance reduction techniques. A figure of merit is necessary to compare the efficiency of different variance reduction techniques. A number of candidate figures of merit are explored, two of which are robust and subsequently used: one is based on the relative error of a known target isotope (1/R_T^2) and the other on the overall detection limit corrected by the relative error (1/(D_k R_T^2)). An automated Adaptive Variance-reduction Adjustment (AVA) tool is developed to iteratively define parameters for some variance reduction techniques in a problem with a target isotope. Sample problems demonstrate that AVA improves both the precision and accuracy of a target result in an efficient manner.
Potential applications of MCise include molten salt fueled reactors and liquid breeders in fusion blankets. As an example, the inventory analysis of a liquid actinide fuel in the In-Zinerator, a sub-critical power reactor driven by a fusion source, is examined. The result confirms that MCise is a reliable tool for inventory analysis of complex nuclear systems.
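The tally-and-figure-of-merit idea above can be illustrated with a minimal sketch (not MCise itself): an analog MC tally of a decay fraction, its relative error R, and a figure of merit of the 1/R_T^2 form; the atom count and decay probability are illustrative.

```python
import math
import random

def decay_fraction_tally(n_atoms, p_decay, seed=1):
    """Analog MC sketch: simulate n_atoms, tally how many decay,
    and report the estimate together with its relative error R."""
    rng = random.Random(seed)
    decays = sum(rng.random() < p_decay for _ in range(n_atoms))
    p_hat = decays / n_atoms
    # relative error of a binomial tally: sqrt((1 - p) / hits)
    rel_err = math.sqrt((1.0 - p_hat) / decays) if decays else float("inf")
    return p_hat, rel_err

p_hat, r = decay_fraction_tally(100_000, 0.02)
fom = 1.0 / r**2  # figure of merit of the 1/R_T^2 form (cost factor omitted)
```

Variance reduction techniques like those listed above aim to raise this figure of merit: a biased scheme that lowers R for the same cost scores higher.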
Precision Measurement of Distribution of Film Thickness on Pendulum for Experiment of G
NASA Astrophysics Data System (ADS)
Liu, Lin-Xia; Guan, Sheng-Guo; Liu, Qi; Zhang, Ya-Ting; Shao, Cheng-Gang; Luo, Jun
2009-09-01
Distribution of film thickness coated on the pendulum of measuring the Newton gravitational constant G is determined with a weighing method by means of a precision mass comparator. The experimental result shows that the gold film on the pendulum will contribute a correction of -24.3 ppm to our G measurement with an uncertainty of 4.3 ppm, which is significant for improving the G value with high precision.
Poor drug distribution as a possible explanation for the results of the PRECISE trial.
Sampson, John H; Archer, Gary; Pedain, Christoph; Wembacher-Schröder, Eva; Westphal, Manfred; Kunwar, Sandeep; Vogelbaum, Michael A; Coan, April; Herndon, James E; Raghavan, Raghu; Brady, Martin L; Reardon, David A; Friedman, Allan H; Friedman, Henry S; Rodríguez-Ponce, M Inmaculada; Chang, Susan M; Mittermeyer, Stephan; Croteau, David; Puri, Raj K
2010-08-01
Convection-enhanced delivery (CED) is a novel intracerebral drug delivery technique with considerable promise for delivering therapeutic agents throughout the CNS. Despite this promise, Phase III clinical trials employing CED have failed to meet clinical end points. Although this may be due to inactive agents or a failure to rigorously validate drug targets, the authors have previously demonstrated that catheter positioning plays a major role in drug distribution using this technique. The purpose of the present work was to retrospectively analyze the expected drug distribution based on catheter positioning data available from the CED arm of the PRECISE trial. Data on catheter positioning from all patients randomized to the CED arm of the PRECISE trial were available for analyses. BrainLAB iPlan Flow software was used to estimate the expected drug distribution. Only 49.8% of catheters met all positioning criteria. Still, catheter positioning score (hazard ratio 0.93, p = 0.043) and the number of optimally positioned catheters (hazard ratio 0.72, p = 0.038) had a significant effect on progression-free survival. Estimated coverage of relevant target volumes was low, however, with only 20.1% of the 2-cm penumbra surrounding the resection cavity covered on average. Although tumor location and resection cavity volume had no effect on coverage volume, estimations of drug delivery to relevant target volumes did correlate well with catheter score (p < 0.003), and optimally positioned catheters had larger coverage volumes (p < 0.002). Only overall survival (p = 0.006) was higher for investigators considered experienced after adjusting for patient age and Karnofsky Performance Scale score. The potential efficacy of drugs delivered by CED may be severely constrained by ineffective delivery in many patients. Routine use of software algorithms and alternative catheter designs and infusion parameters may improve the efficacy of drugs delivered by CED.
Thruster Limitation Consideration for Formation Flight Control
NASA Technical Reports Server (NTRS)
Xu, Yunjun; Fitz-Coy, Norman; Mason, Paul
2003-01-01
Physical constraints of any real system can have a drastic effect on its performance. Some of the more recognized constraints are actuator and sensor saturation and bandwidth, power consumption, sampling rate (sensor and control-loop) and computation limits. These constraints can degrade a system's performance, such as settling time, overshoot, rise time, and stability margins. In order to address these issues, researchers have investigated the use of robust and nonlinear controllers that can incorporate uncertainty and constraints into the controller design. For instance, uncertainties can be addressed in the synthesis model used in such algorithms as H(sub infinity) or mu. There is a significant amount of literature addressing this type of problem. However, there is one constraint that has not often been considered: actuator authority resolution. In this work, thruster resolution and controller schemes to compensate for its effects are investigated for position and attitude control of a Low Earth Orbit formation flight system. In many academic problems, actuators are assumed to have infinite resolution. In real system applications, such as formation flight systems, the actuators will not have infinite resolution. High-precision formation flying requires the relative position and the relative attitude to be controlled on the order of millimeters and arc-seconds, respectively. Therefore, the minimum force resolution is a significant concern in this application. Without sufficient actuator resolution, the system may be unable to attain the required pointing and position control precision. Furthermore, fuel may be wasted due to high-frequency chattering when attempting to provide fine control with inadequate actuators. To address this issue, a sliding mode controller with boundary layer control is developed to provide the best control under the resolution constraints.
A genetic algorithm is used to optimize the controller parameters according to a state-error and fuel-consumption criterion. The tradeoffs and effects of the minimum force limitation on performance are studied and compared to the case without the limitation. Furthermore, two methods are proposed to reduce chattering and improve precision.
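The chattering problem and the boundary-layer remedy can be sketched in a few lines (hypothetical gain, layer width and thruster resolution, not the paper's controller): the discontinuous sign(s) term of a sliding mode law is replaced by a saturation inside a boundary layer of width phi, and the command is then quantized to the thruster's finite resolution.

```python
def thrust_command(s, k=1.0, phi=0.05, resolution=0.01):
    """Sliding-mode command u = -k * sat(s/phi): the saturation replaces
    the discontinuous sign(s) inside a boundary layer of width phi,
    and the result is quantized to the finite thruster resolution."""
    sat = max(-1.0, min(1.0, s / phi))         # boundary-layer smoothing
    u = -k * sat
    return resolution * round(u / resolution)  # finite actuator authority

inside = thrust_command(0.001)   # small sliding variable: gentle command
outside = thrust_command(0.5)    # far from the surface: saturated command
```

Inside the boundary layer the command scales smoothly with the sliding variable instead of switching between full positive and full negative thrust, which is what suppresses high-frequency chattering with a coarse actuator.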
Precision manufacturing for clinical-quality regenerative medicines.
Williams, David J; Thomas, Robert J; Hourd, Paul C; Chandra, Amit; Ratcliffe, Elizabeth; Liu, Yang; Rayment, Erin A; Archer, J Richard
2012-08-28
Innovations in engineering applied to healthcare make a significant difference to people's lives. Market growth is guaranteed by demographics. Regulation and requirements for good manufacturing practice-extreme levels of repeatability and reliability-demand high-precision process and measurement solutions. Emerging technologies using living biological materials add complexity. This paper presents some results of work demonstrating the precision automated manufacture of living materials, particularly the expansion of populations of human stem cells for therapeutic use as regenerative medicines. The paper also describes quality engineering techniques for precision process design and improvement, and identifies the requirements for manufacturing technology and measurement systems evolution for such therapies.
The Use of Scale-Dependent Precision to Increase Forecast Accuracy in Earth System Modelling
NASA Astrophysics Data System (ADS)
Thornes, Tobias; Duben, Peter; Palmer, Tim
2016-04-01
At the current pace of development, it may be decades before the 'exa-scale' computers needed to resolve individual convective clouds in weather and climate models become available to forecasters, and such machines will incur very high power demands. But the resolution could be improved today by switching to more efficient, 'inexact' hardware with which variables can be represented in 'reduced precision'. Currently, all numbers in our models are represented as double-precision floating points - each requiring 64 bits of memory - to minimise rounding errors, regardless of spatial scale. Yet observational and modelling constraints mean that values of atmospheric variables are inevitably known less precisely on smaller scales, suggesting that this may be a waste of computer resources. More accurate forecasts might therefore be obtained by taking a scale-selective approach whereby the precision of variables is gradually decreased at smaller spatial scales to optimise the overall efficiency of the model. To study the effect of reducing precision to different levels on multiple spatial scales, we here introduce a new model atmosphere developed by extending the Lorenz '96 idealised system to encompass three tiers of variables - which represent large-, medium- and small-scale features - for the first time. In this chaotic but computationally tractable system, the 'true' state can be defined by explicitly resolving all three tiers. The abilities of low resolution (single-tier) double-precision models and similar-cost high resolution (two-tier) models in mixed-precision to produce accurate forecasts of this 'truth' are compared. The high resolution models outperform the low resolution ones even when small-scale variables are resolved in half-precision (16 bits). This suggests that using scale-dependent levels of precision in more complicated real-world Earth System models could allow forecasts to be made at higher resolution and with improved accuracy. 
If adopted, this new paradigm would represent a revolution in numerical modelling that could be of great benefit to the world.
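The reduced-precision idea above can be sketched with the standard single-tier Lorenz '96 equations (a simplification of the three-tier system described in the abstract): step one copy of the state in double precision and one in half precision, and measure the rounding-induced drift. The forcing, step size and step count here are illustrative.

```python
import numpy as np

def l96_step(x, F=8.0, dt=0.05, dtype=np.float64):
    """One forward-Euler step of the single-tier Lorenz '96 system,
    dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F,
    evaluated entirely at the requested floating-point precision."""
    x = x.astype(dtype)
    dxdt = (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F
    return (x + dtype(dt) * dxdt).astype(dtype)

x0 = np.linspace(0.0, 1.0, 8)
x64, x16 = x0.copy(), x0.copy()
for _ in range(10):
    x64 = l96_step(x64, dtype=np.float64)   # 64-bit reference trajectory
    x16 = l96_step(x16, dtype=np.float16)   # 16-bit 'inexact hardware' run
err = np.max(np.abs(x64 - x16.astype(np.float64)))  # rounding-induced drift
```

In a scale-selective configuration, only the small-scale tier would be stepped at float16 while the large-scale tier keeps higher precision, trading rounding error on poorly observed scales for resolution elsewhere.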
NASA Astrophysics Data System (ADS)
Gu, Defeng; Ju, Bing; Liu, Junhong; Tu, Jia
2017-09-01
Precise relative position determination is a prerequisite for radar interferometry by formation flying satellites. It has been shown that this can be achieved by high-quality, dual-frequency GPS receivers that provide precise carrier-phase observations. The precise baseline determination between satellites flying in formation can significantly improve the accuracy of interferometric products, and has become a topic of research interest. The key technologies of baseline determination using spaceborne dual-frequency GPS for the Gravity Recovery and Climate Experiment (GRACE) formation are presented, including zero-difference (ZD) reduced dynamic orbit determination, double-difference (DD) reduced dynamic relative orbit determination, integer ambiguity resolution and relative receiver antenna phase center variation (PCV) estimation. We propose an independent baseline determination method based on a new strategy of integer ambiguity resolution and correction of relative receiver antenna PCVs, and implement the method in the NUDTTK software package. The algorithms have been tested using flight data over a period of 120 days from GRACE. With the original strategy of integer ambiguity resolution based on Melbourne-Wübbena (M-W) combinations, the average success rate is 85.6%, and the baseline precision is 1.13 mm. With the new strategy of integer ambiguity resolution based on an a priori relative orbit, the average success rate and baseline precision are improved by 5.8% and 0.11 mm respectively. A relative ionosphere-free phase pattern estimation result is given in this study, and with correction of relative receiver antenna PCVs, the baseline precision is further improved, significantly, by 0.34 mm. For ZD reduced dynamic orbit determination, the three-dimensional (3D) orbit precision for each GRACE satellite (A or B) is about 2.5 cm compared to Jet Propulsion Laboratory (JPL) post-processed science orbits.
For DD reduced dynamic relative orbit determination, the final baseline precision for the two-satellite GRACE formation is 0.68 mm, validated by K-Band Ranging (KBR) observations, and an average ambiguity success rate of about 91.4% is achieved.
A historical perspective of VR water management for improved crop production
USDA-ARS's Scientific Manuscript database
Variable-rate water management, or the combination of precision agriculture technology and irrigation, has been enabled by many of the same technologies as other precision agriculture tools. However, adding variable-rate capability to existing irrigation equipment design, or designing new equipment ...
Improving precision of forage yield trials: A case study
USDA-ARS's Scientific Manuscript database
Field-based agronomic and genetic research relies heavily on the data generated from field evaluations. Therefore, it is imperative to optimize the precision of yield estimates in cultivar evaluation trials to make reliable selections. Experimental error in yield trials is sensitive to several facto...
Using UAVs to enhance the quality of precision agriculture
USDA-ARS's Scientific Manuscript database
Recent studies by USDA Agricultural Research Service (ARS) have indicated potential for significant improvement in the quality and application of Precision Agriculture products through the use of very high resolution imagery. An assessment of potential platforms to collect such imagery at an afford...
A Surface-Coupled Optical Trap with 1-bp Precision via Active Stabilization
Okoniewski, Stephen R.; Carter, Ashley R.; Perkins, Thomas T.
2017-01-01
Optical traps can measure bead motions with Å-scale precision. However, using this level of precision to infer 1-bp motion of molecular motors along DNA is difficult, since a variety of noise sources degrade instrumental stability. In this chapter, we detail how to improve instrumental stability by (i) minimizing laser pointing, mode, polarization, and intensity noise using an acousto-optical-modulator mediated feedback loop and (ii) minimizing sample motion relative to the optical trap using a 3-axis piezo-electric-stage mediated feedback loop. These active techniques play a critical role in achieving a surface stability of 1 Å in 3D over tens of seconds and a 1-bp stability and precision in a surface-coupled optical trap over a broad bandwidth (Δf = 0.03–2 Hz) at low force (6 pN). These active stabilization techniques can also aid other biophysical assays that would benefit from improved laser stability and/or Å-scale sample stability, such as atomic force microscopy and super-resolution imaging. PMID:27844426
Testing the standard model by precision measurement of the weak charges of quarks.
Young, R D; Carlini, R D; Thomas, A W; Roche, J
2007-09-21
In a global analysis of the latest parity-violating electron scattering measurements on nuclear targets, we demonstrate a significant improvement in the experimental knowledge of the weak neutral-current lepton-quark interactions at low energy. The precision of this new result, combined with earlier atomic parity-violation measurements, places tight constraints on the size of possible contributions from physics beyond the standard model. Consequently, this result improves the lower-bound on the scale of relevant new physics to approximately 1 TeV.
NASA Astrophysics Data System (ADS)
Chen, Ming; Guo, Jiming; Li, Zhicai; Zhang, Peng; Wu, Junli; Song, Weiwei
2017-04-01
BDS precise orbit determination is a key element of BDS applications, but the inadequate number of ground stations and the poor distribution of the network are the main reasons for its low accuracy. In this paper, BDS precise orbit determination results are obtained by using the IGS MGEX stations and the Chinese national reference stations. The orbit determination accuracy for GEO, IGSO and MEO is 10.3 cm, 2.8 cm and 3.2 cm, and the radial accuracy is 1.6 cm, 1.9 cm and 1.5 cm, respectively. The influence of the ground reference station distribution on BDS precise orbit determination is studied. The results show that the Chinese national reference stations contribute significantly to BDS orbit determination: the overlap precision of the GEO/IGSO/MEO satellites is improved by 15.5%, 57.5% and 5.3%, respectively, after adding the Chinese stations. Finally, the results are verified with ODOP (orbit distribution of precision) and SLR. Key words: BDS precise orbit determination; accuracy assessment; Chinese national reference stations; reference station distribution; orbit distribution of precision
Precise Point Positioning Based on BDS and GPS Observations
NASA Astrophysics Data System (ADS)
Gao, ZhouZheng; Zhang, Hongping; Shen, Wenbin
2014-05-01
The BeiDou Navigation Satellite System (BDS) attained the ability to provide initial navigation and precise positioning services for the Asia-Pacific region at the end of 2012, with a constellation of 5 Geostationary Earth Orbit (GEO), 5 Inclined Geosynchronous Orbit (IGSO) and 4 Medium Earth Orbit (MEO) satellites. By 2020, it will consist of 5 GEO, 3 IGSO and 27 MEO satellites and provide a global navigation service similar to GPS and GLONASS. As is well known, GPS precise point positioning (PPP) is a powerful tool for crustal deformation monitoring, GPS meteorology, orbit determination of low earth orbit satellites, high-accuracy kinematic positioning, etc. However, its accuracy and convergence time are influenced by the quality of the pseudo-range observations and the observing geometry between the user and the global navigation satellite system (GNSS) satellites. Usually, it takes more than 30 minutes, or even hours, to obtain centimeter-level position accuracy for PPP using GPS dual-frequency observations only. In recent years, much research has been done to solve this problem. One approach is to smooth the pseudo-range observations with carrier-phase observations to improve pseudo-range accuracy, which can improve the initial PPP position accuracy and shorten the PPP convergence time. Another scheme is to improve the position dilution of precision (PDOP) with multi-GNSS observations. Now that BDS serves the whole Asia-Pacific region, it is possible to use GPS and BDS together for precise positioning. In addition, studies of the GNSS PDOP distribution show that BDS can improve the PDOP markedly. Therefore, it is necessary to investigate PPP performance using both GPS and BDS observations, especially in the Asia-Pacific region at present. In this paper, we focus on the influence of BDS on GPS PPP, mainly in three respects: BDS PPP accuracy, PDOP improvement, and the convergence time of PPP based on GPS and BDS observations.
Here, the GPS and BDS dual-constellation data are collected from the BeiDou Experimental Tracking Stations (BETS) network built by Wuhan University, and the BDS precise orbit and clock products are provided by the GNSS center of Wuhan University. After an introduction to the GPS+BDS PPP mathematical model and the error correction models, we analyze the influence of BDS on GPS PPP in detail using the computed results. The statistics show that BDS PPP can reach centimeter level and that BDS improves the PDOP markedly. Moreover, the convergence time and position stability of GPS+BDS PPP are better than those of GPS-only PPP.
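The PDOP improvement from adding a second constellation can be illustrated with a minimal sketch (hypothetical line-of-sight directions, not BETS data): PDOP is the square root of the trace of the position block of (AᵀA)⁻¹ for the single-epoch design matrix, and adding satellite rows to A can only shrink it.

```python
import numpy as np

def pdop(unit_vectors):
    """PDOP from receiver-to-satellite unit vectors: design matrix
    A = [u | 1] (three direction cosines plus a clock column),
    cofactor matrix Q = (A^T A)^-1, PDOP = sqrt(trace of 3x3 position block)."""
    U = np.asarray(unit_vectors, dtype=float)
    A = np.hstack([U, np.ones((len(U), 1))])
    Q = np.linalg.inv(A.T @ A)
    return float(np.sqrt(np.trace(Q[:3, :3])))

def unit(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

# hypothetical sky plot: four GPS-like satellites, then four more (e.g. BDS)
gps = [unit(v) for v in [(0, 0, 1), (0.9, 0, 0.44),
                         (-0.45, 0.78, 0.44), (-0.45, -0.78, 0.44)]]
bds = [unit(v) for v in [(0.5, 0.5, 0.7), (-0.5, 0.5, 0.7),
                         (0.5, -0.5, 0.7), (0.7, 0.1, 0.7)]]
pdop_gps, pdop_both = pdop(gps), pdop(gps + bds)  # pdop_both < pdop_gps
```

Because the combined AᵀA dominates the GPS-only one in the positive semidefinite order, the dual-constellation PDOP is never larger, which is the geometric mechanism behind the faster convergence reported above.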
Labrique, Alain; Blynn, Emily; Ahmed, Saifuddin; Gibson, Dustin; Pariyo, George; Hyder, Adnan A
2017-05-05
In low- and middle-income countries (LMICs), historically, household surveys have been carried out by face-to-face interviews to collect survey data related to risk factors for noncommunicable diseases. The proliferation of mobile phone ownership and the access it provides in these countries offers a new opportunity to remotely conduct surveys with increased efficiency and reduced cost. However, the near-ubiquitous ownership of phones, high population mobility, and low cost require a re-examination of statistical recommendations for mobile phone surveys (MPS), especially when surveys are automated. As with landline surveys, random digit dialing remains the most appropriate approach to develop an ideal survey-sampling frame. Once the survey is complete, poststratification weights are generally applied to reduce estimate bias and to adjust for selectivity due to mobile ownership. Since weights increase design effects and reduce sampling efficiency, we introduce the concept of automated active strata monitoring to improve representativeness of the sample distribution to that of the source population. Although some statistical challenges remain, MPS represent a promising emerging means for population-level data collection in LMICs. ©Alain Labrique, Emily Blynn, Saifuddin Ahmed, Dustin Gibson, George Pariyo, Adnan A Hyder. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 05.05.2017.
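The poststratification step mentioned above can be sketched as follows (hypothetical strata and shares): each stratum's weight is its known population share divided by its share of the completed sample, so the weighted sample matches the source population.

```python
def poststrat_weights(sample_counts, population_shares):
    """w_h = (population share of stratum h) / (sample share of stratum h);
    applying w_h re-balances the sample toward the known population."""
    n = sum(sample_counts.values())
    return {h: population_shares[h] / (sample_counts[h] / n)
            for h in sample_counts}

# hypothetical mobile-phone survey: urban respondents over-represented
weights = poststrat_weights({"urban": 700, "rural": 300},
                            {"urban": 0.55, "rural": 0.45})
# the weighted sample size equals the raw one: sum(n_h * w_h) == n
```

The design-effect cost noted in the abstract shows up here directly: the more the weights depart from 1, the larger the variance inflation, which is what active strata monitoring during fieldwork tries to limit.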
An evaluation of the treatment of risk and uncertainties in the IPCC reports on climate change.
Aven, Terje; Renn, Ortwin
2015-04-01
Few global threats rival global climate change in scale and potential consequence. The principal international authority assessing climate risk is the Intergovernmental Panel on Climate Change (IPCC). Through repeated assessments the IPCC has devoted considerable effort and interdisciplinary competence to articulating a common characterization of climate risk and uncertainties. We have reviewed the assessment and its foundation for the Fifth Assessment Reports published in 2013 and 2014, in particular the guidance note for lead authors of the fifth IPCC assessment report on consistent treatment of uncertainties. Our analysis shows that the work carried out by the IPCC falls short of providing a theoretically and conceptually convincing foundation for the treatment of risk and uncertainties. The main reasons for our assessment are: (i) the concept of risk is given too narrow a definition (a function of consequences and probability/likelihood); and (ii) the reports lack precision in delineating their concepts and methods. The goal of this article is to contribute to improving the handling of uncertainty and risk in future IPCC studies, thereby obtaining a more theoretically substantiated characterization as well as enhanced scientific quality for risk analysis in this area. Several suggestions for how to improve the risk and uncertainty treatment are provided. © 2014 Society for Risk Analysis.
Bertaccini, Diego; Vaca, Sebastian; Carapito, Christine; Arsène-Ploetze, Florence; Van Dorsselaer, Alain; Schaeffer-Reiss, Christine
2013-06-07
In silico gene prediction has proven to be prone to errors, especially regarding the precise localization of start codons, and these errors spread into subsequent biological studies. Therefore, the high-throughput characterization of protein N-termini is becoming an emerging challenge in the proteomics and especially proteogenomics fields. The trimethoxyphenyl phosphonium (TMPP) labeling approach (N-TOP) is an efficient N-terminomic approach that allows the characterization of both N-terminal and internal peptides in a single experiment. Due to its permanent positive charge, TMPP labeling strongly affects MS/MS fragmentation, resulting in unadapted scoring of TMPP-derivatized peptide spectra by classical search engines. This behavior has led to difficulties in validating TMPP-derivatized peptide identifications with the usual score filtering, and thus to low or underestimated numbers of identified N-termini. We present herein a new strategy (dN-TOP) that overcomes this limitation, allowing confident and automated N-terminal peptide validation thanks to combined labeling with light and heavy TMPP reagents. We show how this double labeling increases the number of validated N-terminal peptides. This strategy represents a considerable improvement over the well-established N-TOP method, with enhanced and accelerated data processing that makes it fully compatible with high-throughput proteogenomics studies.
Evaporation Using Planet Cubesats and the PT-JPL Model: A Precision Agriculture Application
NASA Astrophysics Data System (ADS)
Aragon, B.; Houborg, R.; Tu, K. P.; Fisher, J.; McCabe, M.
2017-12-01
With an increasing demand to feed growing populations, coupled with the overexploitation of aquifers that supply water to irrigated agriculture, we require an improved understanding of the availability and use of water resources, particularly in arid and semi-arid environments. Remote sensing techniques can provide insight into farm-scale hydrological systems by computing crop-water use via estimates of evaporation and transpiration (ET). However, remote sensing driven ET retrievals have often been limited by spatial and temporal scales. The launches of Sentinel-2A/B provide some of the best satellite platforms for optical imagery, with 10 m pixel resolution and a 5-day revisit time. However, even with the considerable improvements that these provide over comparable systems such as Landsat, cloud cover and other atmospheric influences can reduce image availability. CubeSats, such as those from Planet, are relaxing such constraints by offering daily global coverage at 3 m spatial resolution. Here we examine the performance of the first ET retrievals derived from Planet data using the Priestley-Taylor Jet Propulsion Laboratory (PT-JPL) model, adapted to instantaneous measurements. The retrievals were assessed across a range of crop-cover, moisture and meteorological conditions using an eddy covariance flux tower installed over an irrigated farmland in Saudi Arabia.
Investigation of improving MEMS-type VOA reliability
NASA Astrophysics Data System (ADS)
Hong, Seok K.; Lee, Yeong G.; Park, Moo Y.
2003-12-01
MEMS technologies have been applied to many areas, such as optical communications, gyroscopes and bio-medical components. In the optical communication field, MEMS technologies are essential, especially in multi-dimensional optical switches and Variable Optical Attenuators (VOAs). This paper describes the process for the development of MEMS-type VOAs with good optical performance and improved reliability. Generally, MEMS VOAs have been fabricated by a silicon micro-machining process, precise fibre alignment and a sophisticated packaging process. Because the device is composed of many structures with various materials, it is difficult to make it reliable. We developed MEMS-type VOAs with failure modes considered (FMEA: Failure Mode and Effects Analysis) from the initial design step: we predicted the critical failure factors, revised the design accordingly, and confirmed the reliability by preliminary testing. The predicted failure factors were moisture, the bonding strength of the wire between the MEMS chip and the TO-CAN, and instability of the supplied signals. Statistical quality control tools (ANOVA, t-test and so on) were used to control these potential failure factors and derive optimum manufacturing conditions. In summary, we successfully developed reliable MEMS-type VOAs with good optical performance by controlling potential failure factors with statistical quality control tools. The developed VOAs passed international reliability standards (Telcordia GR-1221-CORE).
Habre, Walid
2018-06-01
This review highlights the requirements for harmonization of training, certification and continuous professional development and discusses the implications for the anesthesia management of children in Europe. A large prospective cohort study, the Anaesthesia PRactice In Children Observational Trial (APRICOT), revealed a high incidence of severe perioperative critical events and a large variability in anesthesia practice across 33 European countries. Notably, quality improvement programs that precisely define the requirements for managing anesthesia care in children have been implemented in North America. These programs, with the introduction of an incident-reporting system at local and national levels, could contribute to the improvement of anesthesia care for children in Europe. The main factors that likely contributed to the APRICOT study results are discussed with the goal of defining clear requirement guidelines for anesthetizing children. Emphasis is placed on the importance of an incident-reporting system that can be used both for a competency-based curriculum for postgraduate training and for continuous professional development. Variability in training, as well as in available resources, equipment and facilities, limits the generalization of some of the APRICOT results. Finally, the impact on case outcome of the total number of pediatric cases attended by the anesthesiologist should be taken into consideration, along with the level of expertise of the anesthesiologist for complex pediatric anesthesia cases.
Boeker, Martin; Vach, Werner; Motschall, Edith
2013-10-26
Recent research indicates a high recall in Google Scholar searches for systematic reviews. These reports raised high expectations of Google Scholar as a unified and easy-to-use search interface. However, studies on the coverage of Google Scholar rarely used the search interface in a realistic approach but instead merely checked for the existence of gold-standard references. In addition, the severe limitations of the Google Scholar search interface must be taken into consideration when comparing it with professional literature retrieval tools. The objectives of this work are to measure the relative recall and precision of searches with Google Scholar under conditions derived from the structured search procedures conventional in scientific literature retrieval, and to provide an overview of the current advantages and disadvantages of the Google Scholar search interface in scientific literature retrieval. General and MEDLINE-specific search strategies were retrieved from 14 Cochrane systematic reviews. The Cochrane search strategies were translated into Google Scholar search expressions as faithfully as possible under consideration of the original search semantics. The references of the included studies from the Cochrane reviews were checked for their inclusion in the result sets of the Google Scholar searches, and relative recall and precision were calculated. We investigated Cochrane reviews with between 11 and 70 included references each, 396 references in total. The Google Scholar searches returned result sets of between 4,320 and 67,800 hits, 291,190 hits in total. The relative recall of the individual searches ranged from a minimum of 76.2% to a maximum of 100% (7 searches); their precision ranged from 0.05% to 0.92%. The overall relative recall across all searches was 92.9%; the overall precision was 0.13%. The reported relative recall must be interpreted with care.
It is a quality indicator of Google Scholar confined to an experimental setting that cannot be reproduced in routine systematic retrieval because of the severe limitations of the Google Scholar search interface. Currently, Google Scholar does not provide elements necessary for systematic scientific literature retrieval, such as tools for incremental query optimization, export of large numbers of references, a visual search builder, or a history function. Google Scholar is not ready to serve as a professional search tool for tasks where a structured retrieval methodology is necessary.
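The relative recall and precision measures used above follow directly from the raw counts. A minimal sketch (the per-search sets are hypothetical; the 368-reference count is an illustrative value consistent with the reported overall figures of 92.9% recall over 396 references and 0.13% precision over 291,190 hits):

```python
def relative_recall(gold_refs, retrieved):
    """Fraction of the gold-standard references found in the result set."""
    found = len(set(gold_refs) & set(retrieved))
    return found / len(gold_refs)

def precision_against_gold(gold_refs, retrieved):
    """Fraction of retrieved hits that are gold-standard references."""
    found = len(set(gold_refs) & set(retrieved))
    return found / len(retrieved)

# Overall figures consistent with the study's reported totals:
overall_recall = 368 / 396        # ~0.929, i.e. 92.9%
overall_precision = 368 / 291190  # ~0.0013, i.e. 0.13%
```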
UAV-Based Hyperspectral Remote Sensing for Precision Agriculture: Challenges and Opportunities
NASA Astrophysics Data System (ADS)
Angel, Y.; Parkes, S. D.; Turner, D.; Houborg, R.; Lucieer, A.; McCabe, M.
2017-12-01
Modern agricultural production relies on monitoring crop status by observing and measuring variables such as soil condition, plant health, fertilizer and pesticide effect, irrigation and crop yield. Managing all of these factors is a considerable challenge for crop producers. As such, providing integrated technological solutions that enable improved diagnostics of field condition to maximize profits, while minimizing environmental impacts, would be of much interest. Such challenges can be addressed by implementing remote sensing systems such as hyperspectral imaging to produce precise biophysical indicator maps across the various cycles of crop development. Recent progress in unmanned aerial vehicles (UAVs) has advanced traditional satellite-based capabilities, providing a capacity for high spatial, spectral and temporal response. However, while some hyperspectral sensors have been developed for use onboard UAVs, significant investment is required to develop a system and data processing workflow that retrieves accurately georeferenced mosaics. Here we explore the use of a pushbroom hyperspectral camera integrated on board a multi-rotor UAV system to measure surface reflectance in 272 distinct spectral bands across a wavelength range spanning 400-1000 nm, and outline the requirements for sensor calibration, integration onto a stable UAV platform enabling accurate positional data, flight planning, and development of data post-processing workflows for georeferenced mosaics. The provision of high-quality and geo-corrected imagery facilitates the development of metrics of vegetation health that can be used to identify potential problems such as production inefficiencies, diseases and nutrient deficiencies, and other data-streams to enable improved crop management. 
Immense opportunities remain to be exploited in the implementation of UAV-based hyperspectral sensing (and its combination with other imaging systems) to provide a transferable and scalable integrated framework for crop growth monitoring and yield prediction. Here we explore some of the challenges and issues in translating the available technological capacity into a useful and usable image collection and processing flow-path that enables these potential applications to be better realized.
NASA Technical Reports Server (NTRS)
Jung, Tae-Won; Lindholm, Fredrik A.; Neugroschel, Arnost
1987-01-01
An improved measurement system for electrical short-circuit current decay is presented that extends applicability of the method to silicon solar cells having an effective lifetime as low as 1 microsec. The system uses metal/oxide/semiconductor transistors as voltage-controlled switches. Advances in theory developed here increase precision and sensitivity in the determination of the minority-carrier recombination lifetime and recombination velocity. A variation of the method, which exploits measurements made on related back-surface field and back-ohmic contact devices, further improves precision and sensitivity. The improvements are illustrated by application to 15 different silicon solar cells.
NASA Technical Reports Server (NTRS)
Vigue, Y.; Lichten, S. M.; Muellerschoen, R. J.; Blewitt, G.; Heflin, M. B.
1993-01-01
Data collected from a worldwide 1992 experiment were processed at JPL to determine precise orbits for the satellites of the Global Positioning System (GPS). A filtering technique was tested to improve modeling of solar-radiation pressure force parameters for GPS satellites. The new approach improves orbit quality for eclipsing satellites by a factor of two, with typical results in the 25- to 50-cm range. The resultant GPS-based estimates for geocentric coordinates of the tracking sites, which include the three DSN sites, are accurate to 2 to 8 cm, roughly equivalent to 3 to 10 nrad of angular measure.
NASA Astrophysics Data System (ADS)
Profe, Jörn; Ohlendorf, Christian
2017-04-01
XRF scanning has been the state-of-the-art technique for geochemical analyses in marine and lacustrine sedimentology for more than a decade. However, little attention has been paid to data precision and technical limitations so far. Using homogenized, dried and powdered samples (certified geochemical reference standards and samples from a lithologically-contrasting loess-paleosol sequence) minimizes many adverse effects that influence the XRF signal when analyzing wet sediment cores. This allows the investigation of data precision under ideal conditions and, at the same time, documents a new application of the XRF core-scanner technology. Reliable interpretation of XRF results requires evaluating the data precision of single elements as a function of X-ray tube, measurement time, sample compaction and quality of peak fitting. Data precision is quantified by ten-fold measurement of each sample. The data precision of XRF measurements theoretically obeys Poisson statistics. Fe and Ca exhibit the largest deviations from Poisson statistics; the same elements show the smallest mean relative standard deviations, in the range from 0.5% to 1%. This represents the technical limit of data precision achievable by the installed detector. Measurement times ≥ 30 s yield mean relative standard deviations below 4% for most elements. The quality of peak fitting is only relevant for elements with overlapping fluorescence lines, such as Ba, Ti and Mn, or for elements with low concentrations, such as Y. Differences in sample compaction are marginal and do not change the mean relative standard deviation considerably. Data precision is in the range reported for geochemical reference standards measured by conventional techniques. Therefore, XRF scanning of discrete samples provides a cost- and time-efficient alternative to conventional multi-element analyses. 
As best trade-off between economical operation and data quality, we recommend a measurement time of 30 s resulting in a total scan time of 30 minutes for 30 samples.
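Since fluorescence counting ideally obeys Poisson statistics, the expected relative standard deviation of a measured peak scales as the inverse square root of the accumulated counts. A minimal sketch of that counting-statistics floor (the count values are illustrative, not from the study):

```python
import math

def poisson_rsd_percent(counts):
    """Relative standard deviation (%) expected from pure Poisson
    counting statistics: sigma/N = sqrt(N)/N = 1/sqrt(N)."""
    return 100.0 / math.sqrt(counts)

# Reaching the ~1% floor reported for strong lines (Fe, Ca) requires on
# the order of 10,000 accumulated counts in the fluorescence peak,
# while a 4% RSD corresponds to only ~625 counts.
```

This is why longer measurement times (≥ 30 s) push the mean relative standard deviation down for most elements: more integration time means more counts per peak.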
A decade of precision agriculture impacts on grain yield and yield variation
USDA-ARS?s Scientific Manuscript database
Targeting management practices and inputs with precision agriculture has high potential to meet some of the grand challenges of sustainability in the coming century, including simultaneously improving crop yields and reducing environmental impacts. Although the potential is high, few studies have do...
Haslem, Derrick S.; Van Norman, S. Burke; Fulde, Gail; Knighton, Andrew J.; Belnap, Tom; Butler, Allison M.; Rhagunath, Sharanya; Newman, David; Gilbert, Heather; Tudor, Brian P.; Lin, Karen; Stone, Gary R.; Loughmiller, David L.; Mishra, Pravin J.; Srivastava, Rajendu; Ford, James M.; Nadauld, Lincoln D.
2017-01-01
Purpose: The advent of genomic diagnostic technologies such as next-generation sequencing has recently enabled the use of genomic information to guide targeted treatment in patients with cancer, an approach known as precision medicine. However, clinical outcomes, including survival and the cost of health care associated with precision cancer medicine, have been challenging to measure and remain largely unreported. Patients and Methods: We conducted a matched cohort study of 72 patients with metastatic cancer of diverse subtypes in the setting of a large, integrated health care delivery system. We analyzed the outcomes of 36 patients who received genomic testing and targeted therapy (precision cancer medicine) between July 1, 2013, and January 31, 2015, compared with 36 historical control patients who received standard chemotherapy (n = 29) or best supportive care (n = 7). Results: The average progression-free survival was 22.9 weeks for the precision medicine group and 12.0 weeks for the control group (P = .002) with a hazard ratio of 0.47 (95% CI, 0.29 to 0.75) when matching on age, sex, histologic diagnosis, and previous lines of treatment. In a subset analysis of patients who received all care within the Intermountain Healthcare system (n = 44), per patient charges per week were $4,665 in the precision treatment group and $5,000 in the control group (P = .126). Conclusion: These findings suggest that precision cancer medicine may improve survival for patients with refractory cancer without increasing health care costs. Although the results of this study warrant further validation, this precision medicine approach may be a viable option for patients with advanced cancer. PMID:27601506
The prospects of pulsar timing with new-generation radio telescopes and the Square Kilometre Array.
Stappers, B W; Keane, E F; Kramer, M; Possenti, A; Stairs, I H
2018-05-28
Pulsars are highly magnetized and rapidly rotating neutron stars. As they spin, the lighthouse-like beam of radio emission from their magnetic poles sweeps across the Earth with a regularity approaching that of the most precise clocks known. This precision, combined with the extreme environments in which they are found, often in compact orbits with other neutron stars and white dwarfs, makes them excellent tools for studying gravity. Present and near-future pulsar surveys, especially those using the new generation of telescopes, will find more extreme binary systems and pulsars that are more precise 'clocks'. These telescopes will also greatly improve the precision to which we can measure the arrival times of the pulses. The Square Kilometre Array will revolutionize pulsar searches and timing precision. The increased number of sources will reveal rare sources, including possibly a pulsar-black hole binary, which can provide the most stringent tests of strong-field gravity. The improved timing precision will reveal new phenomena and also allow us to make a detection of gravitational waves in the nanohertz frequency regime. It is here where we expect to see the signature of the binary black holes that are formed as galaxies merge throughout cosmological history. This article is part of a discussion meeting issue 'The promises of gravitational-wave astronomy'.
Studies into the averaging problem: Macroscopic gravity and precision cosmology
NASA Astrophysics Data System (ADS)
Wijenayake, Tharake S.
2016-08-01
With the tremendous improvement in the precision of available astrophysical data in the recent past, it becomes increasingly important to examine some of the underlying assumptions behind the standard model of cosmology and take into consideration nonlinear and relativistic corrections which may affect it at percent precision level. Due to its mathematical rigor and fully covariant and exact nature, Zalaletdinov's macroscopic gravity (MG) is arguably one of the most promising frameworks to explore nonlinearities due to inhomogeneities in the real Universe. We study the application of MG to precision cosmology, focusing on developing a self-consistent cosmology model built on the averaging framework that adequately describes the large-scale Universe and can be used to study real data sets. We first implement an algorithmic procedure using computer algebra systems to explore new exact solutions to the MG field equations. After validating the process with an existing isotropic solution, we derive a new homogeneous, anisotropic and exact solution. Next, we use the simplest (and currently only) solvable homogeneous and isotropic model of MG and obtain an observable function for cosmological expansion using some reasonable assumptions on light propagation. We find that the principal modification to the angular diameter distance is through the change in the expansion history. We then linearize the MG field equations and derive a framework that contains large-scale structure, but the small scale inhomogeneities have been smoothed out and encapsulated into an additional cosmological parameter representing the averaging effect. We derive an expression for the evolution of the density contrast and peculiar velocities and integrate them to study the growth rate of large-scale structure. We find that increasing the magnitude of the averaging term leads to enhanced growth at late times. 
Thus, for the same matter content, the growth rate of large-scale structure in the MG model is stronger than that of the standard model. Finally, we constrain the MG model using Cosmic Microwave Background temperature anisotropy data, the distance to supernovae data, the galaxy power spectrum, the weak lensing tomography shear-shear cross-correlations and the baryonic acoustic oscillations. We find that for this model the averaging density parameter is very small and does not cause any significant shift in the other cosmological parameters. However, it can lead to increased errors on some cosmological parameters such as the Hubble constant and the amplitude of the linear matter spectrum at the scale of 8 h^{-1} Mpc. Further studies are needed to explore other solutions and models of MG as well as their effects on precision cosmology.
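For orientation, in the standard-model limit the linear density contrast integrated in such analyses obeys the familiar growth equation; the MG averaging term would enter as an additional effective source. This is a schematic sketch of the standard equation, not the authors' exact MG expression:

```latex
% Standard linear growth of the density contrast \delta in an
% expanding background with Hubble rate H and mean density \bar{\rho}:
\ddot{\delta} + 2H\dot{\delta} - 4\pi G \bar{\rho}\,\delta = 0
% In the MG model, an averaging density parameter modifies the
% effective source term, which is what enhances late-time growth.
```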
Sliding mode control of magnetic suspensions for precision pointing and tracking applications
NASA Technical Reports Server (NTRS)
Misovec, Kathleen M.; Flynn, Frederick J.; Johnson, Bruce G.; Hedrick, J. Karl
1991-01-01
A recently developed nonlinear control method, sliding mode control, is examined as a means of advancing the achievable performance of space-based precision pointing and tracking systems that use nonlinear magnetic actuators. Analytic results indicate that sliding mode control improves performance compared to linear control approaches. In order to realize these performance improvements, precise knowledge of the plant is required. Additionally, the interaction of an estimating scheme and the sliding mode controller has not been fully examined in the literature. Estimation schemes were designed for use with this sliding mode controller that do not seriously degrade system performance. The authors designed and built a laboratory testbed to determine the feasibility of utilizing sliding mode control in these types of applications. Using this testbed, experimental verification of the authors' analyses is ongoing.
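The core of a sliding mode controller of the kind examined here is a switching law defined on a sliding surface in the error phase plane. The following is a generic, hypothetical sketch, not the authors' controller: the gains lam and K and the boundary-layer width phi are illustrative, and for a magnetic suspension the commanded force would still need to be inverted through the nonlinear actuator model (roughly force proportional to current squared over gap squared):

```python
def sliding_mode_command(e, e_dot, lam=10.0, K=5.0, phi=0.05):
    """Generic sliding mode control law (illustrative gains).

    s = e_dot + lam*e defines the sliding surface; the switching term
    -K*sign(s) drives the state onto it.  A saturation over a boundary
    layer of width phi replaces the pure sign function to limit the
    chattering that pure switching would excite.
    """
    s = e_dot + lam * e
    sat = max(-1.0, min(1.0, s / phi))  # smoothed sign(s)
    return -K * sat
```

Once the state reaches the surface (s = 0), the tracking error decays exponentially at rate lam regardless of matched plant uncertainty, which is the robustness property that motivates sliding mode control but also explains why precise plant knowledge is needed to size K.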
Measurement of the Michel parameter {rho} in normal muon decay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tu, X.; Amann, J.F.; Bolton, R.D.
1995-07-10
A new measurement of the Michel parameter ρ in normal muon decay has been performed using the MEGA positron spectrometer. Over 500 million triggers were recorded and the data are currently being analyzed. The previous result has a precision on the value of ρ of ±0.0026. The present experiment expects to improve the precision to ±0.0008 or better. The improved result will be a precise test of the standard model of electroweak interactions for a purely leptonic process. It will also provide a better constraint on the W_R-W_L mixing angle in the left-right symmetric models. © 1995 American Institute of Physics.
D'Agostino, M F; Sanz, J; Martínez-Castro, I; Giuffrè, A M; Sicari, V; Soria, A C
2014-07-01
Statistical analysis has been used for the first time to evaluate the dispersion of quantitative data in the solid-phase microextraction (SPME) followed by gas chromatography-mass spectrometry (GC-MS) analysis of blackberry (Rubus ulmifolius Schott) volatiles, with the aim of improving their precision. Experimental and randomly simulated data were compared using different statistical parameters (correlation coefficients, Principal Component Analysis loadings and eigenvalues). Non-random factors were shown to contribute significantly to total dispersion; groups of volatile compounds could be associated with these factors. A significant improvement in precision was achieved when considering percent concentration ratios, rather than percent values, among blackberry volatiles with a similar dispersion behavior. As a novel contribution complementing this main objective, non-random dispersion trends were also evidenced in data from simple blackberry model systems. Although the influence of the type of matrix on data precision was proved, the model systems did not allow a better understanding of the dispersion patterns in real samples. The approach used here was validated for the first time through the multicomponent characterization of Italian blackberries from different harvest years.
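The precision gain from using concentration ratios can be seen with replicate data in which two compounds with similar dispersion behavior drift together from run to run. The numbers below are hypothetical stand-ins, not data from the paper:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (%) of replicate measurements."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Percent peak areas of two volatiles over four hypothetical SPME runs;
# both rise and fall together, so their ratio is far more stable.
a = [10.0, 12.0, 9.0, 11.5]
b = [20.0, 24.5, 17.9, 23.2]
ratios = [x / y for x, y in zip(a, b)]
# rsd_percent(a) and rsd_percent(b) are each above 10%, while
# rsd_percent(ratios) is near 1%: the common run-to-run drift cancels.
```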
NASA Technical Reports Server (NTRS)
Lemoine, F. G.; Zelensky, N. P.; Luthcke, S. B.; Rowlands, D. D.; Beckley, B. D.; Klosko, S. M.
2006-01-01
Launched in the summer of 1992, TOPEX/POSEIDON (T/P) was a joint mission between NASA and the Centre National d'Etudes Spatiales (CNES), the French space agency, to make precise radar altimeter measurements of the ocean surface. After 13 remarkably successful years of mapping the ocean surface, T/P lost its ability to maneuver and was decommissioned in January 2006. T/P revolutionized the study of the Earth's oceans by vastly exceeding pre-launch estimates of the surface height accuracy recoverable from radar altimeter measurements. The precision orbit lies at the heart of the altimeter measurement, providing the reference frame from which the radar altimeter measurements are made. The expected quality of orbit knowledge had limited the measurement accuracy expectations of past altimeter missions, and it still remains a major component in the error budget of all altimeter missions. This paper describes critical improvements made to the T/P orbit time series over the 13 years of precise orbit determination (POD) provided by the GSFC Space Geodesy Laboratory. The POD improvements from the pre-launch T/P radial orbit accuracy expectation and mission requirement of 13 cm to an expected accuracy of about 1.5 cm with today's latest orbits are discussed. The latest orbits, with 1.5 cm RMS radial accuracy, represent a significant improvement over the 2.0 cm accuracy orbits currently available on the T/P Geophysical Data Record (GDR) altimeter product.
ERIC Educational Resources Information Center
DePasquale, Nicole; Hill-Briggs, Felicia; Darrell, Linda; Boyer, LaPricia Lewis; Ephraim, Patti; Boulware, L. Ebony
2012-01-01
Live kidney transplantation (LKT) is underused by patients with end-stage renal disease. Easily implementable and effective interventions to improve patients' early consideration of LKT are needed. The Talking About Live Kidney Donation (TALK) social worker intervention (SWI) improved consideration and pursuit of LKT among patients with…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoang-Do, Ngoc-Tram; Hoang, Van-Hung; Le, Van-Hoang
2013-05-15
The Feranchuk-Komarov operator method is developed by combining it with the Levi-Civita transformation in order to construct analytical solutions of the Schrödinger equation for a two-dimensional exciton in a uniform magnetic field of arbitrary strength. As a result, analytical expressions for the energy of the ground and excited states are obtained with a very high precision of up to four decimal places. Notably, the precision is uniformly stable over the whole range of the magnetic field. This advantage arises from the consideration of the asymptotic behaviour of the wave-functions in a strong magnetic field. The results could be used for various physical analyses, and the method used here could also be applied to other atomic systems.
Impact Theory of Mass Extinctions and the Invertebrate Fossil Record
NASA Astrophysics Data System (ADS)
Alvarez, Walter; Kauffman, Erle G.; Surlyk, Finn; Alvarez, Luis W.; Asaro, Frank; Michel, Helen V.
1984-03-01
There is much evidence that the Cretaceous-Tertiary boundary was marked by a massive meteorite impact. Theoretical consideration of the consequences of such an impact predicts sharp extinctions in many groups of animals precisely at the boundary. Paleontological data clearly show gradual declines in diversity over the last 1 to 10 million years in various invertebrate groups. Reexamination of data from careful studies of the best sections shows that, in addition to undergoing the decline, four groups (ammonites, cheilostomate bryozoans, brachiopods, and bivalves) were affected by sudden truncations precisely at the iridium anomaly that marks the boundary. The paleontological record thus bears witness to terminal-Cretaceous extinctions on two time scales: a slow decline unrelated to the impact and a sharp truncation synchronous with and probably caused by the impact.
Curcumin Nanomedicine: A Road to Cancer Therapeutics
Yallapu, Murali M.; Jaggi, Meena; Chauhan, Subhash C.
2013-01-01
Cancer is the second leading cause of death in the United States. Conventional therapies cause widespread systemic toxicity and lead to serious side effects which prohibit their long term use. Additionally, in many circumstances tumor resistance and recurrence is commonly observed. Therefore, there is an urgent need to identify suitable anticancer therapies that are highly precise with minimal side effects. Curcumin is a natural polyphenol molecule derived from the Curcuma longa plant which exhibits anticancer, chemo-preventive, chemo- and radio-sensitization properties. Curcumin’s widespread availability, safety, low cost and multiple cancer fighting functions justify its development as a drug for cancer treatment. However, various basic and clinical studies elucidate curcumin’s limited efficacy due to its low solubility, high rate of metabolism, poor bioavailability and pharmacokinetics. A growing list of nanomedicine(s) using first line therapeutic drugs have been approved or are under consideration by the Food and Drug Administration (FDA) to improve human health. These nanotechnology strategies may help to overcome challenges and ease the translation of curcumin from bench to clinical application. Prominent research is reviewed which shows that advanced drug delivery of curcumin (curcumin nanoformulations or curcumin nanomedicine) is able to leverage therapeutic benefits by improving bioavailability and pharmacokinetics which in turn improves binding, internalization and targeting of tumor(s). Outcomes using these novel drug delivery systems have been discussed in detail. This review also describes the tumor-specific drug delivery system(s) that can be highly effective in destroying tumors. Such new approaches are expected to lead to clinical trials and to improve cancer therapeutics. PMID:23116309
Can APEX Represent In-Field Spatial Variability and Simulate Its Effects On Crop Yields?
USDA-ARS?s Scientific Manuscript database
Precision agriculture, from variable rate nitrogen application to precision irrigation, promises improved management of resources by considering the spatial variability of topography and soil properties. Hydrologic models need to simulate the effects of this variability if they are to inform about t...
An Interoperable Electronic Medical Record-Based Platform for Personalized Predictive Analytics
ERIC Educational Resources Information Center
Abedtash, Hamed
2017-01-01
Precision medicine refers to the delivering of customized treatment to patients based on their individual characteristics, and aims to reduce adverse events, improve diagnostic methods, and enhance the efficacy of therapies. Among efforts to achieve the goals of precision medicine, researchers have used observational data for developing predictive…
Spatial variability effects on precision and power of forage yield estimation
USDA-ARS?s Scientific Manuscript database
Spatial analyses of yield trials are important, as they adjust cultivar means for spatial variation and improve the statistical precision of yield estimation. While the relative efficiency of spatial analysis has been frequently reported in several yield trials, its application on long-term forage y...
Precision capacitor has improved temperature and operational stability
NASA Technical Reports Server (NTRS)
Brookshier, W. K.; Lewis, R. N.
1967-01-01
Vacuum dielectric capacitor is fabricated from materials with very low temperature coefficients of expansion. This precision capacitor in the 1000-2000 picofarad range has a near-zero temperature coefficient of capacitance, eliminates ion chamber action caused by air ionization in the dielectric, and minimizes electromagnetic field charging effects.
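The practical meaning of a near-zero temperature coefficient can be illustrated with the linear drift model commonly used for precision capacitors. The values below are illustrative, not measurements from the report:

```python
def capacitance_after_drift(c0_pf, tc_ppm_per_degc, delta_t_degc):
    """Capacitance (pF) after a temperature excursion, assuming a
    linear temperature coefficient in parts per million per deg C:
    C(T) = C0 * (1 + TC * dT)."""
    return c0_pf * (1.0 + tc_ppm_per_degc * 1e-6 * delta_t_degc)

# A 1000 pF capacitor with a residual TC of 1 ppm/degC changes by only
# 0.05 pF over a 50 degC excursion, versus 5 pF at 100 ppm/degC --
# which is why low-expansion construction materials matter here.
```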
Regulatory Aspects of Optical Methods and Exogenous Targets for Cancer Detection.
Tummers, Willemieke S; Warram, Jason M; Tipirneni, Kiranya E; Fengler, John; Jacobs, Paula; Shankar, Lalitha; Henderson, Lori; Ballard, Betsy; Pogue, Brian W; Weichert, Jamey P; Bouvet, Michael; Sorger, Jonathan; Contag, Christopher H; Frangioni, John V; Tweedle, Michael F; Basilion, James P; Gambhir, Sanjiv S; Rosenthal, Eben L
2017-05-01
Considerable advances in cancer-specific optical imaging have improved the precision of tumor resection. In comparison to traditional imaging modalities, this technology is unique in its ability to provide real-time feedback to the operating surgeon. Given the significant clinical implications of optical imaging, there is an urgent need to standardize surgical navigation tools and contrast agents to facilitate swift regulatory approval. Because fluorescence-enhanced surgery requires a combination of both device and drug, each may be developed in conjunction or separately, which are important considerations in the approval process. This report is the result of a one-day meeting held on May 4, 2016 with officials from the National Cancer Institute, the FDA, members of the American Society of Image-Guided Surgery, and members of the World Molecular Imaging Society, which discussed consensus methods for FDA-directed human testing and approval of investigational optical imaging devices as well as contrast agents for surgical applications. The goal of this workshop was to discuss FDA approval requirements and the expectations for approval of these novel drugs and devices, packaged separately or in combination, within the context of optical surgical navigation. In addition, the workshop acted to provide clarity to the research community on data collection and trial design. Reported here are the specific discussion items and recommendations from this critical and timely meeting. Cancer Res; 77(9); 2197-206. ©2017 AACR.
Souillard-Mandar, William; Davis, Randall; Rudin, Cynthia; Au, Rhoda; Libon, David J.; Swenson, Rodney; Price, Catherine C.; Lamar, Melissa; Penney, Dana L.
2015-01-01
The Clock Drawing Test – a simple pencil and paper test – has been used for more than 50 years as a screening tool to differentiate normal individuals from those with cognitive impairment, and has proven useful in helping to diagnose cognitive dysfunction associated with neurological disorders such as Alzheimer’s disease, Parkinson’s disease, and other dementias and conditions. We have been administering the test using a digitizing ballpoint pen that reports its position with considerable spatial and temporal precision, making available far more detailed data about the subject’s performance. Using pen stroke data from these drawings categorized by our software, we designed and computed a large collection of features, then explored the tradeoffs in performance and interpretability in classifiers built using a number of different subsets of these features and a variety of different machine learning techniques. We used traditional machine learning methods to build prediction models that achieve high accuracy. We operationalized widely used manual scoring systems so that we could use them as benchmarks for our models. We worked with clinicians to define guidelines for model interpretability, and constructed sparse linear models and rule lists designed to be as easy to use as scoring systems currently used by clinicians, but more accurate. While our models will require additional testing for validation, they offer the possibility of substantial improvement in detecting cognitive impairment earlier than currently possible, a development with considerable potential impact in practice. PMID:27057085
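The kind of clinician-friendly scoring model described above can be sketched as a tiny point-based rule system. The feature names, point values, and threshold below are invented for illustration and are not the authors' actual model:

```python
# Hypothetical point-based screening rules in the spirit of the sparse,
# clinician-friendly models described; feature names, points and the
# threshold are invented for illustration, not the authors' actual model.
RULES = [
    ("missing_numbers", 2),        # points added when the feature is present
    ("clock_hands_absent", 3),
    ("long_total_drawing_time", 1),
]
THRESHOLD = 3                      # at or above: flag for clinical follow-up

def screen(drawing_features):
    """Sum the points for present features; flag when the score crosses the threshold."""
    score = sum(pts for name, pts in RULES if drawing_features.get(name))
    return score, score >= THRESHOLD

score, flagged = screen({"missing_numbers": True, "long_total_drawing_time": True})
```

A model of this shape can be scored by hand in seconds, which is the interpretability property the authors aim to preserve while improving accuracy.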
NASA Technical Reports Server (NTRS)
Dargush, G. F.; Banerjee, P. K.; Shi, Y.
1992-01-01
As part of the continuing effort at NASA LeRC to improve both the durability and reliability of hot section Earth-to-orbit engine components, significant enhancements must be made in existing finite element and finite difference methods, and advanced techniques, such as the boundary element method (BEM), must be explored. The BEM was chosen as the basic analysis tool because the critical variables (temperature, flux, displacement, and traction) can be very precisely determined with a boundary-based discretization scheme. Additionally, model preparation is considerably simplified compared to the more familiar domain-based methods. Furthermore, the hyperbolic character of high speed flow is captured through the use of an analytical fundamental solution, eliminating the dependence of the solution on the discretization pattern. The price that must be paid in order to realize these advantages is that any BEM formulation requires a considerable amount of analytical work, which is typically absent in the other numerical methods. All of the research accomplishments of a multi-year program aimed toward the development of a boundary element formulation for the study of hot fluid-structure interaction in Earth-to-orbit engine hot section components are detailed. Most of the effort was directed toward the examination of fluid flow, since BEM's for fluids are at a much less developed state. However, significant strides were made, not only in the analysis of thermoviscous fluids, but also in the solution of the fluid-structure interaction problem.
High-Precision Distribution of Highly Stable Optical Pulse Trains with 8.8 × 10−19 instability
Ning, B.; Zhang, S. Y.; Hou, D.; Wu, J. T.; Li, Z. B.; Zhao, J. Y.
2014-01-01
The high-precision distribution of optical pulse trains via fibre links has had a considerable impact in many fields. In most published work, the accuracy is still fundamentally limited by unavoidable noise sources, such as thermal and shot noise from conventional photodiodes and thermal noise from mixers. Here, we demonstrate a new high-precision timing distribution system that uses a highly precise phase detector to markedly reduce the effect of these limitations. Instead of using photodiodes and microwave mixers, we use several fibre Sagnac-loop-based optical-microwave phase detectors (OM-PDs) to achieve optical-electrical conversion and phase measurements, thereby suppressing the sources of noise and achieving ultra-high accuracy. The results of a distribution experiment using a 10-km fibre link indicate that our system exhibits a residual instability of 2.0 × 10−15 at 1 s and 8.8 × 10−19 at 40,000 s and an integrated timing jitter as low as 3.8 fs in a bandwidth of 1 Hz to 100 kHz. This low instability and timing jitter make it possible for our system to be used in the distribution of optical-clock signals or in applications that require extremely accurate frequency/time synchronisation. PMID:24870442
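Instabilities of the kind reported here are typically quantified with the overlapping Allan deviation. A minimal sketch of that computation, assuming fractional-frequency input data sampled at a fixed interval (the noise level and sample count below are illustrative, not the paper's data):

```python
import numpy as np

def overlapping_adev(y, tau0, m):
    """Overlapping Allan deviation of fractional-frequency samples y,
    taken at interval tau0, evaluated at averaging time m*tau0."""
    y = np.asarray(y, dtype=float)
    x = np.concatenate(([0.0], np.cumsum(y))) * tau0   # phase (time-error) record
    n = x.size
    # all overlapping second differences of the phase at lag m
    d2 = x[2 * m:] - 2.0 * x[m:n - m] + x[:n - 2 * m]
    avar = np.sum(d2 ** 2) / (2.0 * (n - 2 * m) * (m * tau0) ** 2)
    return np.sqrt(avar)

# white frequency noise: ADEV should fall roughly as 1/sqrt(tau)
rng = np.random.default_rng(0)
y = rng.normal(0.0, 1e-15, 100_000)
adev_1s = overlapping_adev(y, tau0=1.0, m=1)
adev_100s = overlapping_adev(y, tau0=1.0, m=100)
```

For white frequency noise the deviation falls as the square root of the averaging time, which is why long averaging times can reach the 10−19 level from a much larger short-term instability.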
NASA Technical Reports Server (NTRS)
Davis, D. W.; Corfu, F.; Krogh, T. E.
1986-01-01
The underlying mechanisms of Archean tectonics and the degree to which modern plate tectonic models are applicable early in Earth's history continue to be a subject of considerable debate. A precise knowledge of the timing of geological events is of the utmost importance in studying this problem. The high precision U-Pb method has been applied in recent years to rock units in many areas of the Superior Province. Most of these data have precisions of about + or - 2-3 Ma. The resulting detailed chronologies of local igneous development and the regional age relationships furnish tight constraints on any Archean tectonic model. Superior province terrains can be classified into 3 types: (1) low grade areas dominated by meta-volcanic rocks (greenstone belts); (2) high grade, largely metaplutonic areas with abundant orthogneiss and foliated to massive I-type granitoid bodies; and (3) high grade areas with abundant metasediments, paragneiss and S-type plutons. Most of the U-Pb age determinations have been done on type 1 terrains with very few having been done in type 3 terrains. A compilation of over 120 ages indicates that the major part of igneous activity took place in the period 2760-2670 Ma, known as the Kenoran event. This event was ubiquitous throughout the Superior Province.
Accuracy or precision: Implications of sample design and methodology on abundance estimation
Kowalewski, Lucas K.; Chizinski, Christopher J.; Powell, Larkin A.; Pope, Kevin L.; Pegg, Mark A.
2015-01-01
Sampling by spatially replicated counts (point-count) is an increasingly popular method of estimating the population size of organisms. Challenges exist when sampling by the point-count method: it is often impractical to sample the entire area of interest and impossible to detect every individual present. Ecologists encounter logistical limitations that force them to sample either a few large sample units or many small sample units, introducing biases to sample counts. We generated a computer environment and simulated sampling scenarios to test the role of the number of samples, sample unit area, number of organisms, and distribution of organisms in the estimation of population sizes using N-mixture models. Many sample units of small area provided estimates that were consistently closer to true abundance than sample scenarios with few sample units of large area. However, sample scenarios with few sample units of large area provided more precise abundance estimates than those derived from sample scenarios with many sample units of small area. Accuracy and precision of abundance estimates should be considered during the sample design process, with study goals and objectives fully recognized; in practice, however, and with consequence, they are often an afterthought that occurs during data analysis.
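The accuracy-versus-precision distinction at the heart of this study can be illustrated with a toy simulation, a deliberately simplified stand-in for the authors' N-mixture models (the abundance, detection probability, and estimators below are invented for illustration): a naive count is precise but biased low, while a detection-corrected estimate is nearly unbiased but noisier.

```python
import numpy as np

rng = np.random.default_rng(1)
N_TRUE = 500        # true abundance in the surveyed area (hypothetical)
P_DETECT = 0.6      # per-individual detection probability (hypothetical)
N_REPS = 2000

naive, corrected = [], []
for _ in range(N_REPS):
    count = rng.binomial(N_TRUE, P_DETECT)                 # one point-count survey
    p_hat = np.clip(rng.normal(P_DETECT, 0.1), 0.3, 0.9)   # noisy detectability estimate
    naive.append(count)              # ignores detection: precise but biased low
    corrected.append(count / p_hat)  # detection-corrected: ~unbiased but noisier

naive = np.array(naive, dtype=float)
corrected = np.array(corrected)
```

Comparing the two estimators' means against `N_TRUE` shows the accuracy gap, while comparing their standard deviations shows the precision gap; a good sample design balances both rather than optimizing one.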
Precision medicine driven by cancer systems biology.
Filipp, Fabian V
2017-03-01
Molecular insights from genome and systems biology are influencing how cancer is diagnosed and treated. We critically evaluate big data challenges in precision medicine. The melanoma research community has identified distinct subtypes involving chronic sun-induced damage and the mitogen-activated protein kinase driver pathway. In addition, despite low mutation burden, non-genomic mitogen-activated protein kinase melanoma drivers are found in membrane receptors, metabolism, or epigenetic signaling with the ability to bypass central mitogen-activated protein kinase molecules and activate a similar program of mitogenic effectors. Mutation hotspots, structural modeling, UV signature, and genomic as well as non-genomic mechanisms of disease initiation and progression are taken into consideration to identify resistance mutations and novel drug targets. A comprehensive precision medicine profile of a malignant melanoma patient illustrates future rational drug targeting strategies. Network analysis emphasizes an important role of epigenetic and metabolic master regulators in oncogenesis. Co-occurrence of driver mutations in signaling, metabolic, and epigenetic factors highlights how cumulative alterations of our genomes and epigenomes progressively lead to uncontrolled cell proliferation. Precision insights have the ability to identify independent molecular pathways suitable for drug targeting. Synergistic treatment combinations of orthogonal modalities including immunotherapy, mitogen-activated protein kinase inhibitors, epigenetic inhibitors, and metabolic inhibitors have the potential to overcome immune evasion, side effects, and drug resistance.
Motion Control and Coupled Oscillators
1995-01-01
efficiency considerations have also been of interest (see [7] for related discussion). Apparently, cf. Figure 2, the paramecium gets around in a fluid...the paramecium, in other contexts of animal movement, dynamic influences play an important part (e.g. in walking, trotting and galloping gaits of...precisely this mechanism, variously associated with geometric phases, area rules, and Lie bracket generation, that has had a crucial
Seismic Motion Stability, Measurement and Precision Control.
1979-12-01
tiltmeter. Tilt was corrected by changing air pressure in one bank of isolators to maintain the reference tiltmeter at null well within the 0.1 arcsecond...frequency rotations (0-0.1 Hz), a high quality, two-axis tiltmeter is used. The azimuth orientation angle could be measured with a four-position gyro...compassing system with considerably less accuracy than the tiltmeters. However, it would provide a continuous automatic azimuth determination update every
ERIC Educational Resources Information Center
Raiche, Gilles; Blais, Jean-Guy
In a computerized adaptive test (CAT), it would be desirable to obtain an acceptable precision of the proficiency level estimate using an optimal number of items. Decreasing the number of items is accompanied, however, by a certain degree of bias when the true proficiency level differs significantly from the a priori estimate. G. Raiche (2000) has…
Unmanned Surface Combatant Considerations for Concept Exploration
2011-06-01
counterparts who become fatigued. The USC systems could perform missions without having to consider the health and morale of the crew. Arguably the...fourth objective was to evaluate the USC relevant technologies and identify potential design issues. The result was relevant technologies that...and issue mines. R MIW 8 (U) Conduct precise navigation. G MIW 9 (U) Conduct airborne mine countermeasures. R MIW 10 (U) Provide for air operations
2013-09-01
S.V. Prasad and R. Asthana, "Aluminum Metal-Matrix Composites for Automotive Applications: Tribological Considerations," Tribology Letters, 11...seeing widespread use in thermal management, precision equipment, and automotive applications where composition and microstructure are tailored to...Key applications include high specific stiffness panels and beams, fluid flow structures, thermal management substrates, and blast wave mitigation
[Rational approach of leucocytosis in adults].
Bron, D
2013-09-01
Several factors may influence an increased white blood cell count (physical and/or emotional stimuli, infections, inflammation, drugs, toxins, etc.). The best approach to identifying these disorders is a physical examination and a precise history. Particular attention should be paid to the appearance of lymph nodes and/or splenomegaly. Among the severe haematological disorders to take into consideration, acute leukemias, chronic leukemias and myeloproliferative disorders should be excluded.
NASA Astrophysics Data System (ADS)
Various papers on navigation satellites are presented. The general topics considered include: overview and status of GPS, kinematic positioning, international developments and perspective on satellite positioning, test range applications, civil applications, and receiver developments and equipment. Consideration is given to multisensor integration, military applications, differential operation, integrity, propagation phenomena and measurement networks, and precise time and time transfer.
Lamberti, A; Vanlanduit, S; De Pauw, B; Berghmans, F
2014-03-24
Fiber Bragg Gratings (FBGs) can be used as sensors for strain, temperature and pressure measurements. For this purpose, the ability to determine the Bragg peak wavelength with adequate wavelength resolution and accuracy is essential. However, conventional peak detection techniques, such as the maximum detection algorithm, can yield inaccurate and imprecise results, especially when the Signal to Noise Ratio (SNR) and the wavelength resolution are poor. Other techniques, such as the cross-correlation demodulation algorithm, are more precise and accurate but require considerably higher computational effort. To overcome these problems, we developed a novel fast phase correlation (FPC) peak detection algorithm, which computes the wavelength shift in the reflected spectrum of an FBG sensor. This paper analyzes the performance of the FPC algorithm for different values of the SNR and wavelength resolution. Using simulations and experiments, we compared the FPC with the maximum detection and cross-correlation algorithms. The FPC method demonstrated a detection precision and accuracy comparable with those of cross-correlation demodulation and considerably higher than those obtained with the maximum detection technique. Additionally, the FPC proved to be about 50 times faster than the cross-correlation. It is therefore a promising tool for future implementation in real-time systems or in embedded hardware intended for FBG sensor interrogation.
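The contrast between maximum detection and correlation-based demodulation can be sketched as follows. This uses a simplified Gaussian FBG spectrum and a template-scan correlation, not the authors' FPC algorithm; all parameters (grid step, noise level, applied shift) are invented for illustration:

```python
import numpy as np

def fbg_peak(wl, center, fwhm=0.2):
    """Idealized Gaussian stand-in for an FBG reflection spectrum (nm units)."""
    return np.exp(-4.0 * np.log(2.0) * (wl - center) ** 2 / fwhm ** 2)

step = 0.005                                    # interrogator wavelength step, nm
wl = np.arange(1549.0, 1551.0, step)
true_shift = 0.0137                             # applied Bragg wavelength shift, nm
rng = np.random.default_rng(7)
meas = fbg_peak(wl, 1550.0 + true_shift) + rng.normal(0.0, 0.05, wl.size)

# maximum detection: quantized to the measurement grid and noise-sensitive
max_shift = wl[np.argmax(meas)] - 1550.0

# correlation demodulation: scan a reference template over a fine shift grid
shifts = np.arange(-0.05, 0.05, 0.0005)
scores = [float(np.dot(meas, fbg_peak(wl, 1550.0 + s))) for s in shifts]
xc_shift = shifts[int(np.argmax(scores))]
```

Because the correlation averages over the whole spectrum, `xc_shift` resolves shifts well below the 5 pm sampling step, while `max_shift` cannot; the cost is the extra dot products per candidate shift, which is the overhead the FPC algorithm is designed to avoid.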
High precision localization of intracerebral hemorrhage based on 3D MPR on head CT images
NASA Astrophysics Data System (ADS)
Sun, Jianyong; Hou, Xiaoshuai; Sun, Shujie; Zhang, Jianguo
2017-03-01
The key step in minimally invasive intracerebral hemorrhage surgery is precisely locating the hematoma in the brain before and during surgery, which can significantly improve the success rate of hematoma puncture. We designed a 3D computerized surgical plan (CSP) workstation to precisely locate brain hematomas based on the Multi-Planar Reconstruction (MPR) visualization technique. We used ten patients' CT/MR studies to verify our CSP intracerebral hemorrhage localization method. Based on physician assessment and comparison with manual measurements, the output of the CSP workstation for hematoma surgery is more precise and reliable than the manual procedure.
Achieving High Resolution Timer Events in Virtualized Environment.
Adamczyk, Blazej; Chydzinski, Andrzej
2015-01-01
Virtual Machine Monitors (VMMs) have become popular in different application areas. Some applications may require generating timer events with high resolution and precision. This, however, may be challenging due to the complexity of VMMs. In this paper we focus on the timer functionality provided by five different VMMs: Xen, KVM, Qemu, VirtualBox and VMWare. Firstly, we evaluate the resolutions and precisions of their timer events. Apparently, the provided resolutions and precisions are far too low for some applications (e.g. networking applications with quality of service). Then, using Xen virtualization, we demonstrate an improved timer design that greatly enhances both the resolution and precision of the achieved timer events.
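Timer-event behavior of the kind evaluated here can be probed from user space with a short script. This is a generic sketch, not the authors' benchmark; it measures the mean achieved interval and jitter of sleep-based timer events, and running it inside a guest versus on bare metal exposes the VMM's timer overhead:

```python
import statistics
import time

def timer_jitter(interval_s=0.001, n=200):
    """Fire n sleep-based timer events; return mean achieved interval and jitter."""
    marks = []
    for _ in range(n):
        time.sleep(interval_s)            # request a timer event after interval_s
        marks.append(time.perf_counter()) # timestamp when the event actually fired
    intervals = [b - a for a, b in zip(marks, marks[1:])]
    # mean shows achievable resolution; stdev shows precision (jitter)
    return statistics.fmean(intervals), statistics.stdev(intervals)

mean_interval, jitter = timer_jitter()
```

`time.sleep` never returns early, so the mean interval is a lower bound on the achievable event period; the standard deviation captures the scheduling and virtualization jitter the paper quantifies.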
Precision Medicine Approaches to Diabetic Kidney Disease: Tissue as an Issue.
Gluck, Caroline; Ko, Yi-An; Susztak, Katalin
2017-05-01
Precision medicine approaches, which tailor medications to specific individuals, have made paradigm-shifting improvements for patients with certain cancer types. Such approaches, however, have not been implemented for patients with diabetic kidney disease. Precision medicine could offer new avenues for the development of novel diagnostics, prognostics and targeted therapeutics. Genetic studies associated with multiscalar omics datasets from tissues and cell types of interest in well-characterized cohorts are needed to change the current paradigm. In this review, we will discuss precision medicine approaches that the nephrology community can take to analyze tissue samples to develop new therapeutics for patients with diabetic kidney disease.
[Precision medicine : a required approach for the general internist].
Waeber, Gérard; Cornuz, Jacques; Gaspoz, Jean-Michel; Guessous, Idris; Mooser, Vincent; Perrier, Arnaud; Simonet, Martine Louis
2017-01-18
The general internist cannot be a passive bystander of the anticipated medical revolution induced by precision medicine. The latter aims to improve the predicted and/or clinical course of an individual by integrating all biological, genetic, environmental, phenotypic and psychosocial knowledge of a person. In this article, national and international initiatives in the field of precision medicine are discussed, as well as the potential financial and ethical limitations of personalized medicine. The question is not whether precision medicine will be part of everyday life, but rather how to integrate the general internist early into multidisciplinary teams to ensure optimal information and a shared decision-making process with patients and individuals.
NASA Astrophysics Data System (ADS)
Qian, Elaine A.; Wixtrom, Alex I.; Axtell, Jonathan C.; Saebi, Azin; Jung, Dahee; Rehak, Pavel; Han, Yanxiao; Moully, Elamar Hakim; Mosallaei, Daniel; Chow, Sylvia; Messina, Marco S.; Wang, Jing Yang; Royappa, A. Timothy; Rheingold, Arnold L.; Maynard, Heather D.; Král, Petr; Spokoyny, Alexander M.
2017-04-01
The majority of biomolecules are intrinsically atomically precise, an important characteristic that enables rational engineering of their recognition and binding properties. However, imparting a similar precision to hybrid nanoparticles has been challenging because of the inherent limitations of existing chemical methods and building blocks. Here we report a new approach to form atomically precise and highly tunable hybrid nanomolecules with well-defined three-dimensionality. Perfunctionalization of atomically precise clusters with pentafluoroaryl-terminated linkers produces size-tunable rigid cluster nanomolecules. These species are amenable to facile modification with a variety of thiol-containing molecules and macromolecules. Assembly proceeds at room temperature within hours under mild conditions, and the resulting nanomolecules exhibit high stabilities because of their full covalency. We further demonstrate how these nanomolecules grafted with saccharides can exhibit dramatically improved binding affinity towards a protein. Ultimately, the developed strategy allows the rapid generation of precise molecular assemblies to investigate multivalent interactions.
Department of Defense Precise Time and Time Interval program improvement plan
NASA Technical Reports Server (NTRS)
Bowser, J. R.
1981-01-01
The United States Naval Observatory is responsible for ensuring uniformity in precise time and time interval operations including measurements, the establishment of overall DOD requirements for time and time interval, and the accomplishment of objectives requiring precise time and time interval with minimum cost. An overview of the objectives, the approach to the problem, the schedule, and a status report, including significant findings relative to organizational relationships, current directives, principal PTTI users, and future requirements as currently identified by the users are presented.
Accuracy of active chirp linearization for broadband frequency modulated continuous wave ladar.
Barber, Zeb W; Babbitt, Wm Randall; Kaylor, Brant; Reibel, Randy R; Roos, Peter A
2010-01-10
As the bandwidth and linearity of frequency modulated continuous wave chirp ladar increase, the resulting range resolution, precision, and accuracy improve correspondingly. An analysis of a very broadband (several THz) and highly linear (<1 ppm) chirped ladar system based on active chirp linearization is presented. Residual chirp nonlinearity and material dispersion are analyzed for their effect on the dynamic range, precision, and accuracy of the system. Measurement precision and accuracy approaching the part-per-billion level are predicted.
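The link between chirp bandwidth and range resolution follows from the standard FMCW relation ΔR = c/(2B). A quick numeric sketch (the 2 THz bandwidth is an assumed example in the "several THz" regime the paper describes, and the 10 m stand-off is likewise illustrative):

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution(bandwidth_hz):
    """Ideal FMCW range resolution: delta_R = c / (2 * B)."""
    return C / (2.0 * bandwidth_hz)

dr = range_resolution(2e12)        # ~75 micrometres for an assumed 2 THz chirp
ppb_error_at_10m = 1e-9 * 10.0     # part-per-billion fractional accuracy at 10 m: 10 nm
```

The two numbers illustrate why both bandwidth (resolution) and chirp linearity (fractional accuracy) must scale together to reach the predicted performance.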
A Study on Improvement of Machining Precision in a Medical Milling Robot
NASA Astrophysics Data System (ADS)
Sugita, Naohiko; Osa, Takayuki; Nakajima, Yoshikazu; Mori, Masahiko; Saraie, Hidenori; Mitsuishi, Mamoru
Minimal invasiveness and increased precision have recently become important issues in orthopedic surgery. The femur and tibia must be cut precisely for successful knee arthroplasty. The recent trend towards Minimally Invasive Surgery (MIS) has increased surgical difficulty, since the incision length and open access area are small. In this paper, the results of a deformation analysis of the robot and an active compensation method for robot deformation, based on an error map, are proposed and evaluated.
Pollock, George G.
1997-01-01
Two power supplies are combined to control a furnace. A main power supply heats the furnace in the traditional manner, while the power from the auxiliary supply is introduced as a current flow through charged particles existing due to ionized gas or thermionic emission. The main power supply provides the bulk heating power and the auxiliary supply provides a precise and fast power source such that the precision of the total power delivered to the furnace is improved.
Research on the high-precision non-contact optical detection technology for banknotes
NASA Astrophysics Data System (ADS)
Jin, Xiaofeng; Liang, Tiancai; Luo, Pengfeng; Sun, Jianfeng
2015-09-01
In this paper, high-precision laser interferometry is introduced for the optical measurement of banknotes. Taking advantage of the laser's short wavelength and high sensitivity, adhesive tape and cavities on banknotes can be detected efficiently. Compared with current measurement devices, including mechanical wheel, infrared and ultrasonic measurement devices, laser interferometry offers higher precision and reliability. This will improve the detection of banknote feature information in financial electronic equipment.
Smith, Alan D; Motley, Darlene
2009-01-01
Technology in healthcare environments has increasingly become a vital way to communicate information in a safe, reliable, precise and secure manner. Healthcare is an arena that is constantly changing and very fast paced, but adoption of electronic prescribing (e-prescribing) has been comparatively slow and painful in the USA. Medical professionals need a system to communicate medications and diagnoses, with patients' safety as the major consideration, especially given the many complexities associated with drug interactions and allergies. Via multivariate analysis and linear regression analysis, it was found that the degree of e-prescribing acceptance is highly predictable from the constructs of Technological Sophistication, Operational Factors and Maturity Factors, which are very stable ease-of-use variables derived from the TAM Model by Davis (1989).
Genetic testing for inherited ocular disease: delivering on the promise at last?
Gillespie, Rachel L; Hall, Georgina; Black, Graeme C
2014-01-01
Genetic testing is of increasing clinical utility for diagnosing inherited eye disease. Clarifying a clinical diagnosis is important for accurate estimation of prognosis, facilitating genetic counselling and management of families, and in the future will direct gene-specific therapeutic strategies. Often, precise diagnosis of genetic ophthalmic conditions is complicated by genetic heterogeneity, a difficulty that so-called 'next-generation sequencing' technologies promise to overcome. Despite considerable counselling and ethical complexities, next-generation sequencing stands to revolutionize clinical practice. This will necessitate considerable adjustment to standard practice but has the power to deliver a personalized approach to genomic medicine for many more patients and enhance the potential for preventing vision loss. © 2013 Royal Australian and New Zealand College of Ophthalmologists.
Nudges and coercion: conceptual, empirical, and normative considerations.
Cratsley, Kelso
2015-01-01
Given that the concept of coercion remains a central concern for bioethics, Quigley's (Monash Bioethics Rev 32:141-158, 2014) recent article provides a helpful analysis of its frequent misapplication in debates over the use of 'nudges'. In this commentary I present a generally sympathetic response to Quigley's argument while also raising several issues that are important for the larger debates about nudges and coercion. I focus on several closely related topics, including the definition of coercion, the role of empirical research, and the normative concerns at the core of these disputes. I suggest that while a degree of precision is certainly required when deploying the relevant concepts, perhaps informed by empirical data, we need to continue to push these debates towards more pressing normative considerations.
Ayn, Caitlyn; Robinson, Lynne; Nason, April; Lovas, John
2017-04-01
Professional communication skills have a significant impact on dental patient satisfaction and health outcomes. Communication skills training has been shown to improve the communication skills of dental students. Therefore, strengthening communication skills training in dental education shows promise for improving dental patient satisfaction and outcomes. The aim of this study was to facilitate the development of dental communication skills training through a scoping review with compilation of a list of considerations, design of an example curriculum, and consideration of barriers and facilitators to adoption of such training. A search to identify studies of communication skills training interventions and programs was conducted. Search queries were run in three databases using both text strings and controlled terms (MeSH), yielding 1,833 unique articles. Of these, 35 were full-text reviewed, and 17 were included in the final synthesis. Considerations presented in the articles were compiled into 15 considerations. These considerations were grouped into four themes: the value of communication skills training, the role of instructors, the importance of accounting for diversity, and the structure of communication skills training. An example curriculum reflective of these considerations is presented, and consideration of potential barriers and facilitators to implementation are discussed. Application and evaluation of these considerations are recommended in order to support and inform future communication skills training development.
Customization of UWB 3D-RTLS Based on the New Uncertainty Model of the AoA Ranging Technique
Jachimczyk, Bartosz; Dziak, Damian; Kulesza, Wlodek J.
2017-01-01
The increased potential and effectiveness of Real-time Locating Systems (RTLSs) substantially influence their application spectrum. They are widely used, inter alia, in the industrial sector, healthcare, home care, and in logistic and security applications. The research aims to develop an analytical method to customize UWB-based RTLS in order to improve their localization performance in terms of accuracy and precision. The analytical uncertainty model of Angle of Arrival (AoA) localization in a 3D indoor space, which is the foundation of the customization concept, is established in a working environment. Additionally, a suitable angular-based 3D localization algorithm is introduced. The paper investigates the following issues: the influence of the proposed correction vector on the localization accuracy; the impact of the system’s configuration and LS’s relative deployment on the localization precision distribution map. The advantages of the method are verified by comparing them with a reference commercial RTLS localization engine. The results of simulations and physical experiments prove the value of the proposed customization method. The research confirms that the analytical uncertainty model is a valid representation of the RTLS’ localization uncertainty in terms of accuracy and precision and can be useful for its performance improvement. The research shows that Angle of Arrival localization in a 3D indoor space, applying the simple angular-based localization algorithm and correction vector, improves localization accuracy and precision to the point that the system challenges the advanced localization engine of the reference hardware. Moreover, the research guides the deployment of location sensors to enhance the localization precision. PMID:28125056
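An angular-based 3D localization step of the general kind described can be sketched as a least-squares intersection of AoA rays. The sensor layout, tag position, and solver below are invented for illustration; this is not the paper's algorithm or its correction vector:

```python
import numpy as np

def bearing(az, el):
    """Unit direction vector from azimuth/elevation angles (radians)."""
    return np.array([np.cos(el) * np.cos(az),
                     np.cos(el) * np.sin(az),
                     np.sin(el)])

def locate(sensors, directions):
    """Least-squares point closest to all measured AoA rays."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(sensors, directions):
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray direction
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# hypothetical room: four ceiling-mounted location sensors and one tag
tag = np.array([2.0, 3.0, 1.5])
sensors = [np.array([0.0, 0.0, 2.5]), np.array([5.0, 0.0, 2.5]),
           np.array([0.0, 6.0, 2.5]), np.array([5.0, 6.0, 2.5])]

directions = []
for p in sensors:
    v = tag - p
    az = np.arctan2(v[1], v[0])
    el = np.arcsin(v[2] / np.linalg.norm(v))
    directions.append(bearing(az, el))

est = locate(sensors, directions)    # recovers the tag position from angles alone
```

With noise-free angles the rays intersect exactly; with noisy angles the same solve returns the point minimizing the summed squared perpendicular distances to the rays, which is where sensor deployment geometry starts to shape the precision map the paper studies.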
Carotenoids in Staple Cereals: Metabolism, Regulation, and Genetic Manipulation
Zhai, Shengnan; Xia, Xianchun; He, Zhonghu
2016-01-01
Carotenoids play a critical role in animal and human health. Animals and humans are unable to synthesize carotenoids de novo, and therefore rely upon the diet as a source of these compounds. However, major staple cereals often contain only small amounts of carotenoids in their grains. Consequently, there is considerable interest in genetic manipulation of carotenoid content in cereal grain. In this review, we focus on carotenoid metabolism and regulation in non-green plant tissues, as well as genetic manipulation in staple cereals such as rice, maize, and wheat. Significant progress has been made in three aspects: (1) seven carotenogenic genes play vital roles in carotenoid regulation in non-green plant tissues, including 1-deoxyxylulose-5-phosphate synthase influencing isoprenoid precursor supply; phytoene synthase, β-cyclase, and ε-cyclase controlling biosynthesis; 1-hydroxy-2-methyl-2-(E)-butenyl 4-diphosphate reductase and carotenoid cleavage dioxygenases responsible for degradation; and the orange gene conditioning the sequestration sink; (2) provitamin A-biofortified crops, such as rice and maize, were developed by either metabolic engineering or marker-assisted breeding; (3) quantitative trait loci for carotenoid content on chromosomes 3B, 7A, and 7B were consistently identified, eight carotenogenic genes comprising 23 loci were detected, and 10 gene-specific markers for carotenoid accumulation were developed and applied in wheat improvement. A comprehensive and deeper understanding of the regulatory mechanisms of carotenoid metabolism in crops will improve the precision of efforts to raise carotenoid content. Genomic selection and gene editing are emerging as transformative technologies for provitamin A biofortification. PMID:27559339
Three Research Strategies of Neuroscience and the Future of Legal Imaging Evidence.
Jun, Jinkwon; Yoo, Soyoung
2018-01-01
Neuroscientific imaging evidence (NIE) has become an integral part of the criminal justice system in the United States. However, in most legal cases, NIE is submitted and used only to mitigate penalties because the court does not recognize it as substantial evidence, considering its lack of reliability. Nevertheless, we here discuss how neuroscience is expected to improve the use of NIE in the legal system. For this purpose, we classified the efforts of neuroscientists into three research strategies: cognitive subtraction, the data-driven approach, and the brain-manipulation approach. Cognitive subtraction is outdated and problematic; consequently, the court deemed it to be an inadequate approach in terms of legal evidence in 2012. In contrast, the data-driven and brain manipulation approaches, which are state-of-the-art approaches, have overcome the limitations of cognitive subtraction. The data-driven approach brings data science into the field and is benefiting immensely from the development of research platforms that allow automatized collection, analysis, and sharing of data. This broadens the scale of imaging evidence. The brain-manipulation approach uses high-functioning tools that facilitate non-invasive and precise human brain manipulation. These two approaches are expected to have synergistic effects. Neuroscience has strived to improve the evidential reliability of NIE, with considerable success. With the support of cutting-edge technologies, and the progress of these approaches, the evidential status of NIE will be improved and NIE will become an increasingly important part of legal practice.
Pavan, Gabriela; Godoy, Julia Almeida; Monteiro, Ricardo Tavares; Moreschi, Hugo Karling; Nogueira, Eduardo Lopes; Spanemberg, Lucas
2016-01-01
Assessment of the results of treatment for mental disorders becomes more complete when the patient's perspective is incorporated. Here, we aimed to evaluate the psychometric properties and application of the Perceived Change Scale - Patient version (PCS-P) in a sample of inpatients with mental disorders. One hundred and ninety-one psychiatric inpatients answered the PCS-P and the Patients' Satisfaction with Mental Health Services Scale (SATIS) and were evaluated in terms of clinical and sociodemographic data. An exploratory factor analysis (EFA) was performed and internal consistency was calculated. The clinical impressions of the patient, family, and physician were correlated with the patient's perception of change. The EFA indicated a psychometrically suitable four-factor solution. The PCS-P exhibited a coherent relationship with SATIS and had a Cronbach's alpha value of 0.856. No correlations were found between the physician's clinical global impression of improvement and the patient's perception of change, although a moderate positive correlation was found between the patients' clinical global impression of improvement and the change perceived by the patient. The PCS-P exhibited adequate psychometric properties in a sample of inpatients with mental disorders. The patient's perception of change is an important dimension for evaluation of outcomes in the treatment of mental disorders and differs from the physician's clinical impression of improvement. Evaluation of positive and negative perceptions of the various dimensions of the patient's life enables more precise consideration of the patient's priorities and interests.
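The internal-consistency figure quoted above (Cronbach's alpha of 0.856) follows a standard formula: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch with made-up item scores, not the PCS-P data:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / var(total score)),
    using sample variances (ddof=1).
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Perfectly correlated items give alpha = 1.0 (toy data, 3 respondents x 3 items)
print(cronbach_alpha([[1, 1, 1],
                      [2, 2, 2],
                      [3, 3, 3]]))  # → 1.0
```

Values around 0.85, as reported for the PCS-P, indicate that the items consistently measure a common construct.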
Nowacki, Maciej; Peterson, Margarita; Kloskowski, Tomasz; McCabe, Eleanor; Guiral, Delia Cortes; Polom, Karol; Pietkun, Katarzyna; Zegarska, Barbara; Pokrywczynska, Marta; Drewa, Tomasz; Roviello, Franco; Medina, Edward A.; Habib, Samy L.; Zegarski, Wojciech
2017-01-01
The treatment of peritoneal surface malignancies has changed considerably over the last thirty years. Unfortunately, palliative care is often the only current treatment for peritoneal carcinomatosis (PC). Two primary intraperitoneal chemotherapeutic methods are used. The first is a combination of cytoreductive surgery (CRS) and Hyperthermic IntraPEritoneal Chemotherapy (HIPEC), which has become the gold standard for many cases of PC. The second is Pressurized IntraPeritoneal Aerosol Chemotherapy (PIPAC), a promising direction for minimally invasive and safe drug delivery. These methods were improved through multicenter studies and clinical trials that yielded important insights and solutions. Major methodological advances have been made through nanomedicine, specifically nanoparticles. Here, we present the latest advances in nanoparticles and their application to precision diagnostics and improved treatment strategies for PC. These advances will likely develop both the HIPEC and PIPAC methods used in in vitro and in vivo studies. Several benefits of using nanoparticles will be discussed, including: 1) nanoparticles as drug delivery systems; 2) nanoparticles and Near Infrared (NIR) irradiation; 3) use of nanoparticles in perioperative diagnostics and individualized treatment planning; 4) use of nanoparticles as anticancer dressings, hydrogels, and active beads for optimal recurrence prevention; and 5) the current in vitro and in vivo studies and clinical trials of nanoparticles. This review highlights the use of nanoparticles as novel tools for improving drug delivery in the effective treatment of patients with peritoneal carcinomatosis. PMID:29100461
Learning to rank diversified results for biomedical information retrieval from multiple features.
Wu, Jiajin; Huang, Jimmy; Ye, Zheng
2014-01-01
Different from traditional information retrieval (IR), promoting diversity in IR takes into consideration the relationships between documents in order to promote novelty and reduce redundancy, thus providing diversified results that satisfy various user intents. Diversity IR in the biomedical domain is especially important, as biologists sometimes want diversified results pertinent to their query. A combined learning-to-rank (LTR) framework is learned through a general ranking model (gLTR) and a diversity-biased model. The former is learned from general ranking features by a conventional learning-to-rank approach; the latter is constructed with added diversity-indicating features, which are extracted based on the retrieved passages' topics (detected using Wikipedia) and the ranking order produced by the general learning-to-rank model; final ranking results are given by a combination of both models. Compared with the BM25 and DirKL baselines on the 2006 and 2007 collections, gLTR achieves 0.2292 (+16.23% and +44.1% improvement over BM25 and DirKL, respectively) and 0.1873 (+15.78% and +39.0% improvement over BM25 and DirKL, respectively) in terms of aspect-level mean average precision (Aspect MAP). The LTR method outperforms gLTR on the 2006 and 2007 collections with 4.7% and 2.4% improvement in terms of Aspect MAP. The learning-to-rank method is an efficient way to perform biomedical information retrieval, and the diversity-biased features are beneficial for promoting diversity in ranking results.
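The final step the abstract mentions, combining the general and diversity-biased models, is not specified in detail there. A minimal sketch assuming a simple linear interpolation of the two models' scores; the mixing weight alpha, the document ids, and the scores are all hypothetical:

```python
def combine_and_rank(general_scores, diversity_scores, alpha=0.7):
    """Re-rank documents by a weighted sum of a general LTR score and a
    diversity-biased score (alpha is an assumed mixing weight, not taken
    from the paper).  Returns document ids, best first."""
    combined = {doc: alpha * g + (1 - alpha) * diversity_scores.get(doc, 0.0)
                for doc, g in general_scores.items()}
    return sorted(combined, key=combined.get, reverse=True)

general = {"d1": 0.9, "d2": 0.8, "d3": 0.5}     # relevance-only scores
diversity = {"d1": 0.1, "d2": 0.9, "d3": 0.8}   # d2/d3 cover novel aspects
print(combine_and_rank(general, diversity))      # → ['d2', 'd1', 'd3']
```

Note how d2 overtakes d1 once aspect novelty is weighed in, which is exactly the behavior Aspect MAP rewards.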