Bryan A. Black; Daniel Griffin; Peter van der Sleen; Alan D. Wanamaker; James H. Speer; David C. Frank; David W. Stahle; Neil Pederson; Carolyn A. Copenheaver; Valerie Trouet; Shelly Griffin; Bronwyn M. Gillanders
2016-01-01
High-resolution biogenic and geologic proxies in which one increment or layer is formed per year are crucial to describing natural ranges of environmental variability in Earth's physical and biological systems. However, dating controls are necessary to ensure temporal precision and accuracy; simple counts cannot ensure that all layers are placed correctly in time...
A high-accuracy UV spectral radiation system
NASA Astrophysics Data System (ADS)
Lin, Guan-yu; Yu, Lei; Xu, Dian; Cao, Dian-sheng; Yu, Yu-Xiang
2016-10-01
A UV spectral radiation detector and a visible observation telescope are designed around a coaxial optical system. To reduce the effect of incident-light polarization and improve detection precision, a polarizer is needed in the light path: a high-precision UV depolarizer, a retarder stack of four quartz plates, is placed in front of the Seya-Namioka dispersion unit. Coherent detection is applied by multiplying the modulated light signal with a reference signal, and the adjustable phase-sensitive detector ensures the stability of UV spectral radiation detection. A lock-in amplifier is used in the electrical system to further improve measurement accuracy. To ensure precise measurement, the phase-sensitive detector is adjusted so that the output offset is no more than 10 mV before each measurement, which ensures that the stability of the measured radiation spectrum is better than 1 percent.
40 CFR 89.310 - Analyzer accuracy and specifications.
Code of Federal Regulations, 2010 CFR
2010-07-01
... to § 89.323. (c) Emission measurement accuracy—Bag sampling. (1) Good engineering practice dictates... generally not be used. (2) Some high resolution read-out systems, such as computers, data loggers, and so..., using good engineering judgement, below 15 percent of full scale are made to ensure the accuracy of the...
40 CFR 91.314 - Analyzer accuracy and specifications.
Code of Federal Regulations, 2014 CFR
2014-07-01
... deflection should generally not be used. (2) Some high resolution read-out systems, such as computers, data...-second time interval. (b) Operating procedure for analyzers and sampling system. Follow the start-up and... systems may be used provided that additional calibrations are made to ensure the accuracy of the...
40 CFR 89.310 - Analyzer accuracy and specifications.
Code of Federal Regulations, 2011 CFR
2011-07-01
... to § 89.323. (c) Emission measurement accuracy—Bag sampling. (1) Good engineering practice dictates... generally not be used. (2) Some high resolution read-out systems, such as computers, data loggers, and so..., using good engineering judgement, below 15 percent of full scale are made to ensure the accuracy of the...
40 CFR 89.310 - Analyzer accuracy and specifications.
Code of Federal Regulations, 2013 CFR
2013-07-01
... to § 89.323. (c) Emission measurement accuracy—Bag sampling. (1) Good engineering practice dictates... generally not be used. (2) Some high resolution read-out systems, such as computers, data loggers, and so..., using good engineering judgement, below 15 percent of full scale are made to ensure the accuracy of the...
40 CFR 89.310 - Analyzer accuracy and specifications.
Code of Federal Regulations, 2012 CFR
2012-07-01
... to § 89.323. (c) Emission measurement accuracy—Bag sampling. (1) Good engineering practice dictates... generally not be used. (2) Some high resolution read-out systems, such as computers, data loggers, and so..., using good engineering judgement, below 15 percent of full scale are made to ensure the accuracy of the...
40 CFR 89.310 - Analyzer accuracy and specifications.
Code of Federal Regulations, 2014 CFR
2014-07-01
... to § 89.323. (c) Emission measurement accuracy—Bag sampling. (1) Good engineering practice dictates... generally not be used. (2) Some high resolution read-out systems, such as computers, data loggers, and so..., using good engineering judgement, below 15 percent of full scale are made to ensure the accuracy of the...
45 CFR 261.62 - What must a State do to verify the accuracy of its work participation information?
Code of Federal Regulations, 2012 CFR
2012-10-01
... ensure a consistent measurement of the work participation rates, including the quality assurance... work participation information? 261.62 Section 261.62 Public Welfare Regulations Relating to Public..., DEPARTMENT OF HEALTH AND HUMAN SERVICES ENSURING THAT RECIPIENTS WORK How Do We Ensure the Accuracy of Work...
45 CFR 261.62 - What must a State do to verify the accuracy of its work participation information?
Code of Federal Regulations, 2011 CFR
2011-10-01
... ensure a consistent measurement of the work participation rates, including the quality assurance... work participation information? 261.62 Section 261.62 Public Welfare Regulations Relating to Public..., DEPARTMENT OF HEALTH AND HUMAN SERVICES ENSURING THAT RECIPIENTS WORK How Do We Ensure the Accuracy of Work...
45 CFR 261.62 - What must a State do to verify the accuracy of its work participation information?
Code of Federal Regulations, 2013 CFR
2013-10-01
... ensure a consistent measurement of the work participation rates, including the quality assurance... work participation information? 261.62 Section 261.62 Public Welfare Regulations Relating to Public..., DEPARTMENT OF HEALTH AND HUMAN SERVICES ENSURING THAT RECIPIENTS WORK How Do We Ensure the Accuracy of Work...
45 CFR 261.62 - What must a State do to verify the accuracy of its work participation information?
Code of Federal Regulations, 2014 CFR
2014-10-01
... ensure a consistent measurement of the work participation rates, including the quality assurance... work participation information? 261.62 Section 261.62 Public Welfare Regulations Relating to Public..., DEPARTMENT OF HEALTH AND HUMAN SERVICES ENSURING THAT RECIPIENTS WORK How Do We Ensure the Accuracy of Work...
Meuter, Renata F I; Lacherez, Philippe F
2016-03-01
We aimed to assess the impact of task demands and individual characteristics on threat detection in baggage screeners. Airport security staff work under time constraints to ensure optimal threat detection. Understanding the impact of individual characteristics and task demands on performance is vital to ensure accurate threat detection. We examined threat detection in baggage screeners as a function of event rate (i.e., number of bags per minute) and time on task across 4 months. We measured performance in terms of the accuracy of detection of Fictitious Threat Items (FTIs) randomly superimposed on X-ray images of real passenger bags. Analyses of the percentage of correct FTI identifications (hits) show that longer shifts with high baggage throughput result in worse threat detection. Importantly, these significant performance decrements emerge within the first 10 min of these busy screening shifts only. Longer shift lengths, especially when combined with high baggage throughput, increase the likelihood that threats go undetected. Shorter shift rotations, although perhaps difficult to implement during busy screening periods, would ensure more consistently high vigilance in baggage screeners and, therefore, optimal threat detection and passenger safety. © 2015, Human Factors and Ergonomics Society.
Method of measuring thermal conductivity of high performance insulation
NASA Technical Reports Server (NTRS)
Hyde, E. H.; Russell, L. D.
1968-01-01
Method accurately measures the thermal conductivity of high-performance sheet insulation as a discrete function of temperature. It permits measurements to be made at temperature drops of approximately 10 degrees F across the insulation and ensures measurement accuracy by minimizing longitudinal heat losses in the system.
45 CFR 261.62 - What must a State do to verify the accuracy of its work participation information?
Code of Federal Regulations, 2010 CFR
2010-10-01
..., DEPARTMENT OF HEALTH AND HUMAN SERVICES ENSURING THAT RECIPIENTS WORK How Do We Ensure the Accuracy of Work... internal controls to ensure compliance with the procedures; and (5) Submit to the Secretary for approval... countable work activity; and (5) A description of the internal controls that the State has implemented to...
77 FR 47850 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-10
... function; (2) the accuracy of the estimated burden; (3) ways to enhance the quality, utility, and clarity... care provided by managed care organizations under contract to CMS is of high quality. One way of ensuring high quality care in Medicare Managed Care Organizations (MCOs), or more commonly referred to as...
Development of CFRP mirrors for space telescopes
NASA Astrophysics Data System (ADS)
Utsunomiya, Shin; Kamiya, Tomohiro; Shimizu, Ryuzo
2013-09-01
CFRP (carbon fiber reinforced plastics) has superior properties of high specific elasticity and low thermal expansion for satellite telescope structures. However, difficulties in achieving the required surface accuracy and ensuring stability in orbit have discouraged the application of CFRP to main mirrors. We have developed ultra-lightweight, high-precision CFRP mirrors of sandwich structures composed of CFRP skins and CFRP cores using a replica technique. The shape accuracy of the demonstrated mirrors of 150 mm in diameter was 0.8 μm RMS (root mean square) and the surface roughness was 5 nm RMS as fabricated. Further optimization of the fabrication process conditions to improve surface accuracy was studied using flat sandwich panels. The surface accuracy of flat CFRP sandwich panels of 150 mm square was thereby improved to a flatness of 0.2 μm RMS with a surface roughness of 6 nm RMS. The surface accuracy versus size of the trial models indicates a high possibility of fabricating mirrors over 1 m in size with a surface accuracy of 1 μm. The feasibility of CFRP mirrors for low-temperature applications was examined for the JASMINE project as an example, and the stability of the surface accuracy of CFRP mirrors against temperature and moisture is discussed.
Li, Yongkai; Yi, Ming; Zou, Xiufen
2014-01-01
To gain insights into the mechanisms of cell fate decision in a noisy environment, the effects of intrinsic and extrinsic noises on cell fate are explored at the single cell level. Specifically, we theoretically define the impulse of Cln1/2 as an indication of cell fates. The strong dependence between the impulse of Cln1/2 and cell fates is exhibited. Based on the simulation results, we illustrate that increasing intrinsic fluctuations causes the parallel shift of the separation ratio of Whi5P but that increasing extrinsic fluctuations leads to the mixture of different cell fates. Our quantitative study also suggests that the strengths of intrinsic and extrinsic noises around an approximate linear model can ensure a high accuracy of cell fate selection. Furthermore, this study demonstrates that the selection of cell fates is an entropy-decreasing process. In addition, we reveal that cell fates are significantly correlated with the range of entropy decreases. PMID:25042292
An Improved Method of AGM for High Precision Geolocation of SAR Images
NASA Astrophysics Data System (ADS)
Zhou, G.; He, C.; Yue, T.; Huang, W.; Huang, Y.; Li, X.; Chen, Y.
2018-05-01
In order to take full advantage of SAR images, it is necessary to obtain high-precision geolocation for each image. During geometric correction, precise image geolocation is important both to ensure the accuracy of the correction and to extract effective mapping information from the images. This paper presents an improved analytical geolocation method (IAGM) that determines the high-precision geolocation of each pixel in a digital SAR image. The method is based on the analytical geolocation method (AGM) proposed by X. K. Yuan, which aims at solving the RD model. Tests were conducted using a RADARSAT-2 SAR image. Comparing the predicted feature geolocation with the position determined from a high-precision orthophoto, results indicate that an accuracy of 50 m is attainable with this method. Error sources are analyzed, and some recommendations for improving image location accuracy in future spaceborne SARs are given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mezzacappa, Anthony; Endeve, Eirik; Hauck, Cory D.
We extend the positivity-preserving method of Zhang & Shu [49] to simulate the advection of neutral particles in phase space using curvilinear coordinates. The ability to utilize these coordinates is important for non-equilibrium transport problems in general relativity and also in science and engineering applications with specific geometries. The method achieves high-order accuracy using Discontinuous Galerkin (DG) discretization of phase space and strong stability-preserving Runge-Kutta (SSP-RK) time integration. Special care is taken to ensure that the method preserves strict bounds for the phase space distribution function f; i.e., f ∈ [0, 1]. The combination of suitable CFL conditions and the use of the high-order limiter proposed in [49] is sufficient to ensure positivity of the distribution function. However, to ensure that the distribution function satisfies the upper bound, the discretization must, in addition, preserve the divergence-free property of the phase space flow. Proofs that highlight the necessary conditions are presented for general curvilinear coordinates, and the details of these conditions are worked out for some commonly used coordinate systems (i.e., spherical polar spatial coordinates in spherical symmetry and cylindrical spatial coordinates in axial symmetry, both with spherical momentum coordinates). Results from numerical experiments, including one example in spherical symmetry adopting the Schwarzschild metric, demonstrate that the method achieves high-order accuracy and that the distribution function satisfies the maximum principle.
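The bound-enforcing step described in this abstract can be illustrated with a minimal sketch of a Zhang-Shu-type linear scaling limiter: the DG nodal values of a cell are rescaled toward the cell average so that all values stay in [0, 1] while the average is preserved. The equal-weight average and the function name are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def bound_preserving_limit(nodal_values, f_min=0.0, f_max=1.0):
    """Linear scaling limiter sketch: pull nodal values toward the cell
    average just enough to enforce f_min <= f <= f_max, preserving the
    average (assumes equal quadrature weights, and an average already
    inside the bounds)."""
    f_bar = nodal_values.mean()
    big, small = nodal_values.max(), nodal_values.min()
    theta = 1.0
    if big > f_max:
        theta = min(theta, (f_max - f_bar) / (big - f_bar))
    if small < f_min:
        theta = min(theta, (f_bar - f_min) / (f_bar - small))
    return f_bar + theta * (nodal_values - f_bar)
```

The limiter is conservative by construction: the rescaling is affine about the cell average, so the average (and hence the cell mass) is unchanged.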
NASA Astrophysics Data System (ADS)
Boscheri, Walter; Dumbser, Michael; Loubère, Raphaël; Maire, Pierre-Henri
2018-04-01
In this paper we develop a conservative cell-centered Lagrangian finite volume scheme for the solution of the hydrodynamics equations on unstructured multidimensional grids. The method is derived from the Eucclhyd scheme discussed in [47,43,45]. It is second-order accurate in space and is combined with the a posteriori Multidimensional Optimal Order Detection (MOOD) limiting strategy to ensure robustness and stability at shock waves. Second-order of accuracy in time is achieved via the ADER (Arbitrary high order schemes using DERivatives) approach. A large set of numerical test cases is proposed to assess the ability of the method to achieve effective second order of accuracy on smooth flows, maintaining an essentially non-oscillatory behavior on discontinuous profiles, general robustness ensuring physical admissibility of the numerical solution, and precision where appropriate.
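The a posteriori MOOD strategy mentioned in this abstract can be sketched in a few lines: compute a high-order candidate update, detect cells where it violates admissibility criteria, and recompute those cells with a robust low-order fallback. The function and criterion below are illustrative, not the authors' scheme:

```python
def mood_update(candidate, fallback, is_admissible):
    """A posteriori MOOD-style correction (illustrative sketch): keep the
    high-order candidate value in each cell, but where the detection
    criteria flag it as non-admissible, substitute the robust low-order
    fallback value for that cell."""
    return [c if is_admissible(c) else f
            for c, f in zip(candidate, fallback)]
```

In an actual scheme the admissibility test would combine physical checks (e.g. positivity of density and pressure) with numerical ones (e.g. a discrete maximum principle), and the fallback would be a first-order update of the same cell.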
Improved hairline crack detector and poor shell-quality eggs
USDA-ARS?s Scientific Manuscript database
Cracks frequently occur throughout various points of egg collection and processing and there are numerous high-speed online commercial crack detectors in use. The accuracy of crack detectors is validated by USDA human graders to ensure that they are in compliance with voluntary grade standards USDA...
Study on high-precision measurement of long radius of curvature
NASA Astrophysics Data System (ADS)
Wu, Dongcheng; Peng, Shijun; Gao, Songtao
2016-09-01
It is hard to obtain high-precision measurements of the radius of curvature (ROC) because many factors affect the measurement accuracy, and for long radii some factors matter more than others. This paper first investigates which factors are related to the long measurement distance and analyses the uncertainty of the measurement. It then studies the influence of the support conditions and of the adjustment error at the cat's-eye and confocal positions. Finally, a convex surface with a 1055-micrometer radius of curvature was measured in a high-precision laboratory. Experimental results show that a proper steady support (three-point support) can guarantee high-precision ROC measurement, and that calibrating the gain at the cat's-eye and confocal positions helps to locate these positions precisely and thus increases the measurement accuracy. With this process, high-precision measurement of a long ROC is realized.
Pardo, Scott; Simmons, David A
2016-09-01
The relationship between International Organization for Standardization (ISO) accuracy criteria and mean absolute relative difference (MARD), 2 methods for assessing the accuracy of blood glucose meters, is complex. While lower MARD values are generally better than higher MARD values, it is not possible to define a particular MARD value that ensures a blood glucose meter will satisfy the ISO accuracy criteria. The MARD value that ensures passing the ISO accuracy test can be described only as a probabilistic range. In this work, a Bayesian model is presented to represent the relationship between ISO accuracy criteria and MARD. Under the assumptions made in this work, there is nearly a 100% chance of satisfying ISO 15197:2013 accuracy requirements if the MARD value is between 3.25% and 5.25%. © 2016 Diabetes Technology Society.
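The two accuracy metrics compared in this abstract are easy to state concretely. The sketch below computes MARD and the per-reading ISO 15197:2013 criterion (within ±15 mg/dL of reference when the reference is below 100 mg/dL, otherwise within ±15%); the function names and the pass-fraction summary are illustrative, not the authors' Bayesian model:

```python
def mard(meter, reference):
    """Mean absolute relative difference (%) between paired meter and
    reference glucose readings."""
    pairs = list(zip(meter, reference))
    return 100.0 * sum(abs(m - r) / r for m, r in pairs) / len(pairs)

def iso_15197_pass_fraction(meter, reference):
    """Fraction of readings meeting the ISO 15197:2013 per-reading limit:
    +/-15 mg/dL when reference < 100 mg/dL, else +/-15% of reference.
    (The standard requires 95% of readings to meet this limit.)"""
    ok = sum(
        (abs(m - r) <= 15) if r < 100 else (abs(m - r) / r <= 0.15)
        for m, r in zip(meter, reference)
    )
    return ok / len(meter)
```

The point made in the abstract is visible here: MARD averages over the whole sample, while the ISO criterion is a per-reading tolerance, so no single MARD value can guarantee that 95% of individual readings fall inside the band.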
Towards SSVEP-based, portable, responsive Brain-Computer Interface.
Kaczmarek, Piotr; Salomon, Pawel
2015-08-01
A Brain-Computer Interface in a motion control application requires high system responsiveness and accuracy. An SSVEP interface consisting of 2-8 stimuli and a 2-channel EEG amplifier is presented in this paper. The observed stimulus is recognized based on a canonical correlation calculated in a 1-second window, ensuring high interface responsiveness. A threshold classifier with hysteresis (T-H) is proposed for recognition. The results suggest that the T-H classifier significantly increases classifier performance (yielding an accuracy of 76%, while maintaining an average false positive detection rate for stimuli other than the observed one of 2-13%, depending on stimulus frequency). It was shown that the parameters of the T-H classifier that maximize the true positive rate can be estimated by gradient-based search, since a single maximum was observed. Moreover, preliminary results on a test group (N=4) suggest that for the T-H classifier there exists a set of parameters for which the system accuracy is similar to that obtained with a user-trained classifier.
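A threshold classifier with hysteresis of the kind described here can be sketched as a small state machine: a stimulus is declared active when its correlation score rises above an upper threshold, and released only when the score falls below a lower one, which suppresses flickering decisions near a single threshold. The threshold values and class name below are illustrative assumptions:

```python
class HysteresisThreshold:
    """Threshold classifier with hysteresis (illustrative sketch):
    activate when the score exceeds t_on, deactivate only when it
    drops below t_off < t_on."""

    def __init__(self, t_on=0.6, t_off=0.4):
        assert t_off < t_on
        self.t_on, self.t_off = t_on, t_off
        self.active = False

    def update(self, score):
        # While active, stay active until the score falls below t_off;
        # while inactive, switch on only when the score exceeds t_on.
        if self.active:
            self.active = score >= self.t_off
        else:
            self.active = score > self.t_on
        return self.active
```

In an SSVEP setting, `score` would be the canonical correlation between the EEG window and the reference signals of one stimulus frequency, with one such classifier per stimulus.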
Bryant, Ginelle A.; Haack, Sally L.; North, Andrew M.
2013-01-01
Objective. To compare student accuracy in measuring normal and high blood pressures using a simulator arm. Methods. In this prospective, single-blind, study involving third-year pharmacy students, simulator arms were programmed with prespecified normal and high blood pressures. Students measured preset normal and high diastolic and systolic blood pressure using a crossover design. Results. One hundred sixteen students completed both blood pressure measurements. There was a significant difference between the accuracy of high systolic blood pressure (HSBP) measurement and normal systolic blood pressure (NSBP) measurement (mean HSBP difference 8.4 ± 10.9 mmHg vs NSBP 3.6 ± 6.4 mmHg; p<0.001). However, there was no difference between the accuracy of high diastolic blood pressure (HDBP) measurement and normal diastolic blood pressure (NDBP) measurement (mean HDBP difference 6.8 ± 9.6 mmHg vs. mean NDBP difference 4.6 ± 4.5 mmHg; p=0.089). Conclusions. Pharmacy students may need additional instruction and experience with taking high blood pressure measurements to ensure they are able to accurately assess this important vital sign. PMID:23788809
Bottenberg, Michelle M; Bryant, Ginelle A; Haack, Sally L; North, Andrew M
2013-06-12
To compare student accuracy in measuring normal and high blood pressures using a simulator arm. In this prospective, single-blind, study involving third-year pharmacy students, simulator arms were programmed with prespecified normal and high blood pressures. Students measured preset normal and high diastolic and systolic blood pressure using a crossover design. One hundred sixteen students completed both blood pressure measurements. There was a significant difference between the accuracy of high systolic blood pressure (HSBP) measurement and normal systolic blood pressure (NSBP) measurement (mean HSBP difference 8.4 ± 10.9 mmHg vs NSBP 3.6 ± 6.4 mmHg; p<0.001). However, there was no difference between the accuracy of high diastolic blood pressure (HDBP) measurement and normal diastolic blood pressure (NDBP) measurement (mean HDBP difference 6.8 ± 9.6 mmHg vs. mean NDBP difference 4.6 ± 4.5 mmHg; p=0.089). Pharmacy students may need additional instruction and experience with taking high blood pressure measurements to ensure they are able to accurately assess this important vital sign.
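The comparison reported in this study (mean ± SD of absolute measurement errors, with a paired test between conditions) can be sketched as follows; the data values and function names are illustrative, not the study's dataset:

```python
from math import sqrt
from statistics import mean, stdev

def abs_error_summary(measured, preset):
    """Mean and SD of absolute errors of measured readings against the
    preset simulator value."""
    errs = [abs(m - p) for m, p in zip(measured, preset)]
    return mean(errs), stdev(errs)

def paired_t(errors_a, errors_b):
    """Paired t statistic for two error samples from the same subjects
    (each subject measured under both conditions)."""
    d = [a - b for a, b in zip(errors_a, errors_b)]
    n = len(d)
    return mean(d) / (stdev(d) / sqrt(n))
```

With the study's crossover design, `errors_a` and `errors_b` would be each student's absolute errors under the high and normal blood pressure conditions, respectively.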
Recent developments in heterodyne laser interferometry at Harbin Institute of Technology
NASA Astrophysics Data System (ADS)
Hu, P. C.; Tan, J. B. B.; Yang, H. X. X.; Fu, H. J. J.; Wang, Q.
2013-01-01
In order to fulfill the requirements for high-resolution and high-precision heterodyne interferometric technologies and instruments, the laser interferometry group of HIT has developed several novel techniques for high-resolution and high-precision heterodyne interferometers, such as high-accuracy laser frequency stabilization, dynamic sub-nanometer-resolution phase interpolation, and dynamic nonlinearity measurement. Based on a novel lock-point correction method and an asymmetric thermal structure, the frequency-stabilized laser achieves a long-term stability of 1.2×10-8 and can be steadily stabilized even in air flowing at up to 1 m/s. To achieve dynamic sub-nanometer resolution in laser heterodyne interferometers, a novel phase interpolation method based on a digital delay line is proposed. Experimental results show that the proposed 0.62 nm phase interpolator, built with a 64-multiple PLL and an 8-tap digital delay line, achieves a static accuracy better than 0.31 nm and a dynamic accuracy better than 0.62 nm over velocities ranging from -2 m/s to 2 m/s. Meanwhile, an accurate beam polarization measuring setup is proposed to verify the polarization state of the beam from the dual-frequency laser head, and a dynamic optical nonlinearity measuring setup is built to measure the optical nonlinearity of the heterodyne system accurately and quickly. Analysis and experimental results show that the beam polarization measuring setup achieves an accuracy of 0.03° in the ellipticity angles and 0.04° in the non-orthogonality angle, and the optical nonlinearity measuring setup achieves an accuracy of 0.13°.
A new head phantom with realistic shape and spatially varying skull resistivity distribution.
Li, Jian-Bo; Tang, Chi; Dai, Meng; Liu, Geng; Shi, Xue-Tao; Yang, Bin; Xu, Can-Hua; Fu, Feng; You, Fu-Sheng; Tang, Meng-Xing; Dong, Xiu-Zhen
2014-02-01
Brain electrical impedance tomography (EIT) is an emerging method for monitoring brain injuries. To effectively evaluate brain EIT systems and reconstruction algorithms, we have developed a novel head phantom that features realistic anatomy and spatially varying skull resistivity. The head phantom was created with three layers, representing scalp, skull, and brain tissues. The fabrication process entailed 3-D printing of the anatomical geometry for mold creation followed by casting to ensure high geometrical precision and accuracy of the resistivity distribution. We evaluated the accuracy and stability of the phantom. Results showed that the head phantom achieved high geometric accuracy, accurate skull resistivity values, and good stability over time and in the frequency domain. Experimental impedance reconstructions performed using the head phantom and computer simulations were found to be consistent for the same perturbation object. In conclusion, this new phantom could provide a more accurate test platform for brain EIT research.
Importance of Calibration/Validation Traceability for Multi-Sensor Imaging Spectrometry Applications
NASA Technical Reports Server (NTRS)
Thome, K.
2017-01-01
Knowledge of calibration traceability is essential for ensuring the quality of data products relying on multiple sensors, and this is especially true for imaging spectrometers. The current work discusses the expected impact of imaging spectrometers in ensuring radiometric traceability for both multispectral and hyperspectral products. The Climate Absolute Radiance and Refractivity Observatory Pathfinder mission is used to show the role that high-accuracy imaging spectrometers can play in understanding test sites used for vicarious calibration of sensors. The associated Solar, Lunar for Absolute Reflectance Imaging Spectroradiometer calibration demonstration system is used to illustrate recent advances in laboratory radiometric calibration approaches that will allow the use of imaging spectrometers as calibration standards and ensure the consistency of the multiple imaging spectrometers expected to be on orbit in the next decade.
Highly efficient, very low-thrust transfer to geosynchronous orbit - Exact and approximate solutions
NASA Astrophysics Data System (ADS)
Redding, D. C.
1984-04-01
An overview is provided of the preflight, postflight, and accuracy analysis of the Titan IIIC launch vehicle, which injects payloads into geosynchronous orbits. The postflight trajectory reconstruction plays an important role in determining payload injection accuracy. Furthermore, the postflight analysis provides useful information about the characteristics of measuring instruments subjected to a flight environment. Suitable approaches for meeting mission specifications, trajectory requirements, and instrument constraints are considered, taking into account the importance of preflight trajectory analysis activities. Gimbal-flip avoidance algorithms in the flight software and considerable beta-gimbal analysis ensure a singularity-free trajectory.
A Framework for Human Microbiome Research
2012-06-14
determined that many components of data production and processing can contribute errors and artefacts. We investigated methods that avoid these errors and...protocol that ensured consistency in the high-throughput production. To maximize accuracy and consistency, protocols were evaluated primarily using a...future benefits, this resource may promote the development of novel prophylactic strategies such as the application of prebiotics and probiotics to
Sun, Ting; Xing, Fei; You, Zheng; Wang, Xiaochu; Li, Bin
2014-03-10
The star tracker is one of the most promising attitude measurement devices widely used in spacecraft for its high accuracy. High dynamic performance is becoming its major restriction, and requires immediate focus and promotion. A star image restoration approach based on the motion degradation model of variable angular velocity is proposed in this paper. This method can overcome the problem of energy dispersion and signal to noise ratio (SNR) decrease resulting from the smearing of the star spot, thus preventing failed extraction and decreased star centroid accuracy. Simulations and laboratory experiments are conducted to verify the proposed methods. The restoration results demonstrate that the described method can recover the star spot from a long motion trail to the shape of Gaussian distribution under the conditions of variable angular velocity and long exposure time. The energy of the star spot can be concentrated to ensure high SNR and high position accuracy. These features are crucial to the subsequent star extraction and the whole performance of the star tracker.
Automated seamline detection along skeleton for remote sensing image mosaicking
NASA Astrophysics Data System (ADS)
Zhang, Hansong; Chen, Jianyu; Liu, Xin
2015-08-01
The automatic generation of a seamline along the overlap-region skeleton is a key problem in the mosaicking of remote sensing (RS) images. As RS image resolution improves, it is necessary to ensure rapid and accurate processing under complex conditions. An automated seamline detection method for RS image mosaicking, based on image objects and overlap-region contour contraction, is therefore introduced, ensuring the universality and efficiency of mosaicking. Experiments show that this method can select seamlines with great speed and high accuracy over arbitrary overlap regions, enabling rapid RS image mosaicking in surveying and mapping production.
Gain-Compensating Circuit For NDE and Ultrasonics
NASA Technical Reports Server (NTRS)
Kushnick, Peter W.
1987-01-01
High-frequency gain-compensating circuit designed for general use in nondestructive evaluation and ultrasonic measurements. Controls gain of ultrasonic receiver as function of time to aid in measuring attenuation of samples with high losses; for example, human skin and graphite/epoxy composites. Features high signal-to-noise ratio, large signal bandwidth and large dynamic range. Control bandwidth of 5 MHz ensures accuracy of control signal. Currently being used for retrieval of more information from ultrasonic signals sent through composite materials that have high losses, and to measure skin-burn depth in humans.
Highly accurate symplectic element based on two variational principles
NASA Astrophysics Data System (ADS)
Qing, Guanghui; Tian, Jia
2018-02-01
To meet the stability requirement on numerical results, the mathematical theory of classical mixed methods is relatively complex. Generalized mixed methods, however, are automatically stable, and their construction is simple and straightforward. In this paper, based on the seminal idea of generalized mixed methods, a simple, stable, and highly accurate 8-node noncompatible symplectic element (NCSE8) was developed by combining the modified Hellinger-Reissner mixed variational principle with the minimum energy principle. To ensure the accuracy of in-plane stress results, a simultaneous-equation approach was also suggested. Numerical experimentation shows that the accuracy of the stress results of NCSE8 is nearly the same as that of displacement methods, and they are in good agreement with the exact solutions when the mesh is relatively fine. NCSE8 has the advantages of a clear concept, easy implementation in a finite element computer program, higher accuracy, and wide applicability to various linear-elasticity problems with compressible and nearly incompressible materials. NCSE8 may prove even more advantageous for fracture problems due to its better stress accuracy.
NASA Astrophysics Data System (ADS)
Vorontsov, S. V.; Kuvshinov, M. I.; Narozhnyi, A. T.; Popov, V. A.; Solov'ev, V. P.; Yuferev, V. I.
2017-12-01
A reactor with a destructible core (RIR reactor) generating a pulse with an output of 1.5 × 1019 fissions and a full width at half maximum of 2.5 μs was developed and tested at VNIIEF. In the course of investigation, a computational-experimental method for laboratory calibration of the reactor was created and worked out. This method ensures a high accuracy of predicting the energy release in a real experiment with excess reactivity of 3βeff above prompt criticality. A transportable explosion-proof chamber was also developed, which ensures the safe localization of explosion products of the core of small-sized nuclear devices and charges of high explosives with equivalent mass of up to 100 kg of TNT.
NASA Astrophysics Data System (ADS)
Scholz, Pascal A.; Andrianov, Victor; Echler, Artur; Egelhof, Peter; Kilbourne, Caroline; Kiselev, Oleg; Kraft-Bermuth, Saskia; McCammon, Dan
2017-10-01
X-ray spectroscopy of highly charged heavy ions provides a sensitive test of quantum electrodynamics in very strong Coulomb fields. One limitation on the current accuracy of such experiments is the energy resolution of available X-ray detectors for energies up to 100 keV. To improve this accuracy, a novel detector concept, that of microcalorimeters, is exploited for this kind of measurement. The microcalorimeters used in the present experiments consist of silicon thermometers, ensuring a high dynamic range, and of absorbers made of high-Z material to provide high X-ray absorption efficiency. Recently, alongside a previously used detector, a new compact detector design, housed in a new dry cryostat equipped with a pulse tube cooler, was deployed during a test beamtime at the experimental storage ring (ESR) of the GSI facility in Darmstadt. A U89+ beam at 75 MeV/u and a 124Xe54+ beam at various beam energies, both interacting with an internal gas-jet target, were used in different cycles. This test was an important benchmark for designing a larger array with improved lateral sensitivity and statistical accuracy.
High-throughput accurate-wavelength lens-based visible spectrometer.
Bell, Ronald E; Scotti, Filippo
2010-10-01
A scanning visible spectrometer has been prototyped to complement fixed-wavelength transmission grating spectrometers for charge exchange recombination spectroscopy. Fast f/1.8 200 mm commercial lenses are used with a large 2160 mm⁻¹ grating for high throughput. A stepping-motor controlled sine drive positions the grating, which is mounted on a precision rotary table. A high-resolution optical encoder on the grating stage allows the grating angle to be measured with an absolute accuracy of 0.075 arc sec, corresponding to a wavelength error ≤0.005 Å. At this precision, changes in grating groove density due to thermal expansion and variations in the refractive index of air are important. An automated calibration procedure determines all the relevant spectrometer parameters to high accuracy. Changes in bulk grating temperature, atmospheric temperature, and pressure are monitored between the time of calibration and the time of measurement to ensure a persistent wavelength calibration.
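The quoted conversion from encoder accuracy to wavelength error can be sanity-checked with the grating equation. This is an illustrative sketch assuming a Littrow-like geometry; the instrument's actual mount, and the function names here, are not from the paper:

```python
import math

def littrow_wavelength_error(grooves_per_mm, angle_err_arcsec, wavelength_nm, order=1):
    """Propagate grating-angle error to wavelength via the Littrow grating
    equation m*lam = 2*d*sin(theta), so d(lam) = (2*d/m)*cos(theta)*d(theta)."""
    d_nm = 1.0e6 / grooves_per_mm                      # groove spacing, nm
    theta = math.asin(order * wavelength_nm / (2.0 * d_nm))
    dtheta = math.radians(angle_err_arcsec / 3600.0)   # arcsec -> rad
    return (2.0 * d_nm / order) * math.cos(theta) * dtheta  # error in nm

# 0.075 arcsec on a 2160 mm^-1 grating at 500 nm (1 nm = 10 Angstroms):
err_angstrom = 10.0 * littrow_wavelength_error(2160, 0.075, 500.0)
```

With the stated 2160 mm⁻¹ groove density and 0.075 arcsec angle accuracy, the propagated error stays below the quoted 0.005 Å bound across the visible range under this assumed geometry.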
ERIC Educational Resources Information Center
Dunn, Peter
2008-01-01
Quality encompasses a very broad range of ideas in learning materials, yet the accuracy of the content is often overlooked as a measure of quality. Various aspects of accuracy are briefly considered, and the issue of computational accuracy is then considered further. When learning materials are produced containing the results of mathematical…
Ensuring the Consistency of Silicide Coatings
NASA Technical Reports Server (NTRS)
Ramani, V.; Lampson, F. K.
1982-01-01
Diagram specifies optimum fusing time for given thicknesses of refractory metal-silicide coatings on columbium C-103 substrates. Adherence to indicated fusion times ensures consistent coatings and avoids underdiffusion and overdiffusion. Accuracy of diagram has been confirmed by tests.
A Kinematic Calibration Process for Flight Robotic Arms
NASA Technical Reports Server (NTRS)
Collins, Curtis L.; Robinson, Matthew L.
2013-01-01
The Mars Science Laboratory (MSL) robotic arm is ten times more massive than any Mars robotic arm before it, yet with similar accuracy and repeatability positioning requirements. In order to assess and validate these requirements, a higher-fidelity model and calibration processes were needed. Kinematic calibration of robotic arms is a common and necessary process to ensure good positioning performance. Most methodologies assume a rigid arm, high-accuracy data collection, and some kind of optimization of kinematic parameters. A new detailed kinematic and deflection model of the MSL robotic arm was formulated in the design phase and used to update the initial positioning and orientation accuracy and repeatability requirements. This model included a higher-fidelity link stiffness matrix representation, as well as a link level thermal expansion model. In addition, it included an actuator backlash model. Analytical results highlighted the sensitivity of the arm accuracy to its joint initialization methodology. Because of this, a new technique for initializing the arm joint encoders through hardstop calibration was developed. This involved selecting arm configurations to use in Earth-based hardstop calibration that had corresponding configurations on Mars with the same joint torque to ensure repeatability in the different gravity environment. The process used to collect calibration data for the arm included the use of multiple weight stand-in turrets with enough metrology targets to reconstruct the full six-degree-of-freedom location of the rover and tool frames. The follow-on data processing of the metrology data utilized a standard differential formulation and linear parameter optimization technique.
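The "standard differential formulation and linear parameter optimization" mentioned above can be sketched on a toy problem. The 2-link planar arm, its parameter set, and all numbers below are hypothetical stand-ins, not the MSL arm model: position residuals are linearized in the kinematic parameters and corrected by iterated linear least squares.

```python
import numpy as np

def fk(q, params):
    """Forward kinematics of a hypothetical 2-link planar arm.
    params = [L1, L2, off1, off2]: link lengths and joint-encoder offsets."""
    L1, L2, o1, o2 = params
    t1 = q[0] + o1
    t2 = t1 + q[1] + o2
    return np.array([L1 * np.cos(t1) + L2 * np.cos(t2),
                     L1 * np.sin(t1) + L2 * np.sin(t2)])

def calibrate(qs, meas, params0, iters=5, eps=1e-6):
    """Differential calibration: build a finite-difference Jacobian of tool
    position w.r.t. the kinematic parameters, then iterate least squares."""
    p = np.asarray(params0, dtype=float)
    for _ in range(iters):
        J_blocks, resid = [], []
        for q, m in zip(qs, meas):
            x = fk(q, p)
            resid.append(m - x)
            cols = []
            for k in range(len(p)):
                dp = p.copy()
                dp[k] += eps
                cols.append((fk(q, dp) - x) / eps)
            J_blocks.append(np.array(cols).T)
        J = np.vstack(J_blocks)
        r = np.concatenate(resid)
        p = p + np.linalg.lstsq(J, r, rcond=None)[0]
    return p

true_p = np.array([1.02, 0.79, 0.01, -0.02])   # "as built" (hypothetical)
nominal = np.array([1.00, 0.80, 0.00, 0.00])   # design values (hypothetical)
rng = np.random.default_rng(0)
qs = rng.uniform(-1.5, 1.5, size=(12, 2))       # calibration poses
meas = [fk(q, true_p) for q in qs]              # noiseless "metrology" data
est = calibrate(qs, meas, nominal)
```

Real calibrations add measurement noise, deflection, and backlash terms; the linear-update structure stays the same.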
DOTD standards for GPS data collection accuracy : [tech summary].
DOT National Transportation Integrated Search
2015-09-01
Positional data collection efforts performed by personnel and contractors of the Louisiana Department of Transportation and Development (DOTD) require a reliable and consistent measurement framework for ensuring accuracy and precision. Global Na...
High-order centered difference methods with sharp shock resolution
NASA Technical Reports Server (NTRS)
Gustafsson, Bertil; Olsson, Pelle
1994-01-01
In this paper we consider high-order centered finite difference approximations of hyperbolic conservation laws. We propose different ways of adding artificial viscosity to obtain sharp shock resolution. For the Riemann problem we give simple explicit formulas for obtaining stationary one- and two-point shocks. This can be done for any order of accuracy. It is shown that the addition of artificial viscosity is equivalent to ensuring the Lax k-shock condition. We also show numerical experiments that verify the theoretical results.
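A minimal sketch of the core idea, assuming a generic Rusanov-type viscosity coefficient rather than the paper's explicit one/two-point shock formulas:

```python
import numpy as np

def burgers_centered(nx=200, T=0.4, cfl=0.4):
    """Second-order centered differencing of u_t + (u^2/2)_x = 0 with
    Rusanov-type artificial viscosity eps = 0.5*dx*max|u| (a generic choice)."""
    x = np.linspace(-1.0, 1.0, nx)
    dx = x[1] - x[0]
    u = np.where(x < 0.0, 1.0, 0.0)   # Riemann data: shock moving at speed 1/2
    t = 0.0
    while t < T - 1e-12:
        a = float(np.abs(u).max())
        dt = min(cfl * dx / a, T - t)
        f = 0.5 * u ** 2
        eps = 0.5 * dx * a
        un = u.copy()
        un[1:-1] = (u[1:-1]
                    - dt / (2.0 * dx) * (f[2:] - f[:-2])
                    + eps * dt / dx ** 2 * (u[2:] - 2.0 * u[1:-1] + u[:-2]))
        u = un
        t += dt
    return x, u

x, u = burgers_centered()
```

With CFL 0.4 this viscosity choice keeps the scheme monotone, so the Riemann shock stays between the left and right states without oscillation; the paper's formulas sharpen the shock to one or two points.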
CoLiTec software - detection of the near-zero apparent motion
NASA Astrophysics Data System (ADS)
Khlamov, Sergii V.; Savanevych, Vadym E.; Briukhovetskyi, Olexandr B.; Pohorelov, Artem V.
2017-06-01
In this article we describe the CoLiTec software for fully automated frame processing. CoLiTec can process the big data of observation results, as well as data formed continuously during observation. The tasks it solves include frame brightness equalization, moving-object detection, astrometry, photometry, etc. Along with high efficiency in big data processing, CoLiTec also ensures high accuracy of data measurements. A comparative analysis of the functional characteristics and positional accuracy of the CoLiTec and Astrometrica software was performed, demonstrating the benefits of CoLiTec for wide-field and low-quality frames. The efficiency of the CoLiTec software has been proved by about 700,000 observations and over 1,500 preliminary discoveries.
Du, Lei; Sun, Qiao; Cai, Changqing; Bai, Jie; Fan, Zhe; Zhang, Yue
2018-01-01
Traffic speed meters are important legal measuring instruments used for traffic speed enforcement and must be tested and verified in the field every year, using a vehicular mobile standard speed-measuring instrument, to ensure their speed-measuring performance. The non-contact optical speed sensor and the GPS speed sensor are the two most common types of standard speed-measuring instruments. The non-contact optical speed sensor requires extremely high installation accuracy, and its speed-measuring error is nonlinear and uncorrectable. The speed-measuring accuracy of the GPS speed sensor drops rapidly if the number of received satellites is insufficient, which often occurs in urban high-rise regions, tunnels, and mountainous regions. In this paper, a new standard speed-measuring instrument using a dual-antenna Doppler radar sensor is proposed based on a tradeoff between the installation accuracy requirement and the usage region limitation; it has no specific requirement on mounting distance, no limitation on usage regions, and can automatically compensate for the effect of an inclined installation angle on its speed-measuring accuracy. Theoretical model analysis, simulated speed measurement results, and field experimental results compared against a high-accuracy GPS speed sensor showed that the dual-antenna Doppler radar sensor is effective and reliable as a new standard speed-measuring instrument. PMID:29621142
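One way to see how a dual-antenna arrangement can compensate an inclined installation angle is the classical two-beam Doppler geometry below. This is an illustrative derivation, not the authors' sensor design; the carrier frequency, beam angle, and tilt are assumed values:

```python
import math

C = 299_792_458.0    # speed of light, m/s

def doppler(v, f0, angle):
    """Doppler shift of a beam at `angle` to the direction of travel."""
    return 2.0 * v * f0 * math.cos(angle) / C

def dual_antenna_speed(f1, f2, f0, theta):
    """Speed from two beams nominally at +/-theta: the unknown common tilt
    delta satisfies (f1 - f2)/(f1 + f2) = tan(theta)*tan(delta), so both
    delta and v can be recovered in closed form."""
    delta = math.atan(((f1 - f2) / (f1 + f2)) / math.tan(theta))
    v = C * (f1 + f2) / (4.0 * f0 * math.cos(theta) * math.cos(delta))
    return v, delta

# Simulated check: 30 m/s vehicle, 3 deg unknown mounting tilt (all assumed).
f0 = 24.15e9
theta = math.radians(45.0)
delta_true = math.radians(3.0)
f1 = doppler(30.0, f0, theta - delta_true)
f2 = doppler(30.0, f0, theta + delta_true)
v_est, delta_est = dual_antenna_speed(f1, f2, f0, theta)
```

A single-antenna sensor with the same tilt would read cos(theta+delta)/cos(theta) of the true speed, an error it cannot distinguish from the speed itself.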
The role of dosimetry audit in lung SBRT multi-centre clinical trials.
Clark, Catharine H; Hurkmans, Coen W; Kry, Stephen F
2017-12-01
Stereotactic Body Radiotherapy (SBRT) in the lung is a challenging technique which requires high quality clinical trials to answer the unresolved clinical questions. Quality assurance of these clinical trials not only ensures the safety of the treatment of the participating patients but also minimises the variation in treatment, thus allowing the lowest number of patient treatments to answer the trial question. This review addresses the role of dosimetry audits in the quality assurance process and considers what can be done to ensure the highest accuracy of dose calculation and delivery and its assessment in multi-centre trials. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Improving accuracy of unbound resilient modulus testing
DOT National Transportation Integrated Search
1997-07-01
The P46 Laboratory Startup and Quality Control Procedure was developed to ensure the accuracy and reliability of the resilient modulus data produced while testing soil and aggregate materials using closed-loop servo-hydraulic systems. It was develope...
NASA Astrophysics Data System (ADS)
Stewart, James M. P.; Ansell, Steve; Lindsay, Patricia E.; Jaffray, David A.
2015-12-01
Advances in precision microirradiators for small animal radiation oncology studies have provided the framework for novel translational radiobiological studies. Such systems target radiation fields at the scale required for small animal investigations, typically through a combination of on-board computed tomography image guidance and fixed, interchangeable collimators. Robust targeting accuracy of these radiation fields remains challenging, particularly at the millimetre scale field sizes achievable by the majority of microirradiators. Consistent and reproducible targeting accuracy is further hindered as collimators are removed and inserted during a typical experimental workflow. This investigation quantified this targeting uncertainty and developed an online method based on a virtual treatment isocenter to actively ensure high performance targeting accuracy for all radiation field sizes. The results indicated that the two-dimensional field placement uncertainty was as high as 1.16 mm at isocenter, with simulations suggesting this error could be reduced to 0.20 mm using the online correction method. End-to-end targeting analysis of a ball bearing target on radiochromic film sections showed an improved targeting accuracy with the three-dimensional vector targeting error across six different collimators reduced from 0.56 ± 0.05 mm (mean ± SD) to 0.05 ± 0.05 mm for an isotropic imaging voxel size of 0.1 mm.
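An online correction of this kind can be reduced to a simple idea: image the delivered field, locate its intensity-weighted centroid relative to the intended target, and shift the stage by the negative of that offset. The sketch below is a hypothetical minimal version, not the paper's virtual-isocenter method; the pixel size matches the 0.1 mm voxel quoted above:

```python
import numpy as np

def field_centroid(img, pixel_mm=0.1):
    """Intensity-weighted centroid of a 2-D field image, in mm relative to
    the image centre (taken here as the intended target position)."""
    ny, nx = img.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    w = img.sum()
    cx = (img * xx).sum() / w
    cy = (img * yy).sum() / w
    return ((cx - (nx - 1) / 2.0) * pixel_mm,
            (cy - (ny - 1) / 2.0) * pixel_mm)

def stage_correction(img, pixel_mm=0.1):
    """Offset to apply to the stage so the next field lands on target."""
    dx, dy = field_centroid(img, pixel_mm)
    return -dx, -dy
```

In practice the correction would be measured against fiducials imaged in the same coordinate frame, since the image centre is not in general the treatment isocenter.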
Cheng, Yufeng; Jin, Shuying; Wang, Mi; Zhu, Ying; Dong, Zhipeng
2017-06-20
The linear array push broom imaging mode is widely used for high resolution optical satellites (HROS). Using double cameras attached by a high-rigidity support along with push broom imaging is one method to enlarge the field of view while ensuring high resolution. High accuracy image mosaicking is the key factor in the geometric quality of complete stitched satellite imagery. This paper proposes a high accuracy image mosaicking approach based on the big virtual camera (BVC) in the double-camera system on the GaoFen2 optical remote sensing satellite (GF2). A big virtual camera can be built according to the rigorous imaging model of a single camera; then, each single image strip obtained by each TDI-CCD detector can be re-projected to the virtual detector of the big virtual camera coordinate system using forward-projection and backward-projection to obtain the corresponding single virtual image. After on-orbit calibration and relative orientation, the complete final virtual image can be obtained by stitching the single virtual images together based on their coordinate information on the big virtual detector image plane. The paper uses the concept of the big virtual camera to obtain a stitched image and the corresponding high accuracy rational function model (RFM) for concurrent post-processing. Experiments verified that the proposed method can achieve seamless mosaicking while maintaining geometric accuracy.
Wang, Mi; Fan, Chengcheng; Yang, Bo; Jin, Shuying; Pan, Jun
2016-01-01
Satellite attitude accuracy is an important factor affecting the geometric processing accuracy of high-resolution optical satellite imagery. To address the problem that the accuracy of the Yaogan-24 remote sensing satellite's on-board attitude data processing is not high enough to meet its image geometry processing requirements, we developed an approach involving on-ground attitude data processing and verification against the digital orthophoto map (DOM) and digital elevation model (DEM) of a geometric calibration field. The approach focuses on three modules: on-ground processing based on a bidirectional filter, overall weighted smoothing and fitting, and evaluation in the geometric calibration field. Our experimental results demonstrate that the proposed on-ground processing method is both robust and feasible, ensuring the quality of the observation data and the convergence and stability of the parameter estimation model. In addition, both Euler angles and quaternions can be used to build a mathematical fitting model, while the orthogonal polynomial fitting model is more suitable for modeling the attitude parameters. Furthermore, compared to the image geometric processing results based on on-board attitude data, the accuracy of uncontrolled and relative geometric positioning can be increased by about 50%. PMID:27483287
NASA Technical Reports Server (NTRS)
Oluwole, Oluwayemisi O.; Wong, Hsi-Wu; Green, William
2012-01-01
AdapChem software enables high efficiency, low computational cost, and enhanced accuracy in computational fluid dynamics (CFD) numerical simulations used for combustion studies. The software dynamically allocates smaller, reduced chemical models instead of the larger, full chemistry models to evolve the calculation while maintaining the same accuracy for steady-state CFD reacting flow simulations. The software enables detailed chemical kinetic modeling in combustion CFD simulations. AdapChem adapts the reaction mechanism used in the CFD to the local reaction conditions. Instead of a single, comprehensive reaction mechanism throughout the computation, a dynamic distribution of smaller, reduced models is used to capture the chemical kinetics accurately at a fraction of the cost of the traditional single-mechanism approach.
Verification of Software: The Textbook and Real Problems
NASA Technical Reports Server (NTRS)
Carlson, Jan-Renee
2006-01-01
The process of verification, or determining the order of accuracy of computational codes, can be problematic when working with large, legacy computational methods that have been used extensively in industry or government. Verification does not ensure that the computer program is producing a physically correct solution; it ensures merely that the observed order of accuracy of solutions is the same as the theoretical order of accuracy. The Method of Manufactured Solutions (MMS) is one of several ways of determining the order of accuracy. MMS is used here to verify a series of computer codes progressing in sophistication from "textbook" to "real life" applications. The degree of numerical precision in the computations considerably influenced the range of mesh density needed to achieve the theoretical order of accuracy, even for 1-D problems. The choice of manufactured solutions and mesh form shifted the observed order in specific areas but not in general. Solution residual (iterative) convergence was not always achieved for 2-D Euler manufactured solutions. L2-norm convergence differed from variable to variable; therefore, an observed order of accuracy could not be determined conclusively in all cases, the cause of which is currently under investigation.
NASA Technical Reports Server (NTRS)
Holekamp, Kara; Aaron, David; Thome, Kurtis
2006-01-01
Radiometric calibration of commercial imaging satellite products is required to ensure that science and application communities can better understand their properties. Inaccurate radiometric calibrations can lead to erroneous decisions and invalid conclusions and can limit intercomparisons with other systems. To address this calibration need, satellite at-sensor radiance values were compared to those estimated by each independent team member to determine the sensor's radiometric accuracy. The combined results of this evaluation provide the user community with an independent assessment of these commercially available high spatial resolution sensors' absolute calibration values.
Introducing a feedback training system for guided home rehabilitation.
Kohler, Fabian; Schmitz-Rode, Thomas; Disselhorst-Klug, Catherine
2010-01-15
As the number of people requiring orthopaedic intervention is growing, individualized physiotherapeutic rehabilitation and adequate postoperative care become increasingly relevant. The chances of improvement in the patient's condition are directly related to the performance and consistency of the physiotherapeutic exercises. In this paper a smart, cost-effective and easy-to-use Feedback Training System for home rehabilitation based on standard resistive elements is introduced. It ensures high accuracy of the exercises performed and offers guidance and control to the patient through direct feedback about the performance of the movements. Forty-six patients were recruited and performed standard physiotherapeutic training to evaluate the system. The results show a significant increase in the patients' ability to reproduce even simple physiotherapeutic exercises when supported by the Feedback Training System. Thus physiotherapeutic training can be extended into the home environment whilst ensuring a high quality of training.
[Data validation methods and discussion on Chinese materia medica resource survey].
Zhang, Yue; Ma, Wei-Feng; Zhang, Xiao-Bo; Zhu, Shou-Dong; Guo, Lan-Ping; Wang, Xing-Xing
2013-07-01
Since the beginning of the fourth national survey of Chinese materia medica resources, 22 provinces have conducted pilot surveys. The survey teams have reported an immense amount of data, which places very high demands on the construction of the database system. In order to ensure quality, it is necessary to check and validate the data in the database system. Data validation is an important method of ensuring the validity, integrity and accuracy of census data. This paper comprehensively introduces the data validation system of the database for the fourth national survey of Chinese materia medica resources, and further improves the design ideas and programs of data validation. The purpose of this study is to help the survey work proceed smoothly.
SFOL Pulse: A High Accuracy DME Pulse for Alternative Aircraft Position and Navigation.
Kim, Euiho; Seo, Jiwon
2017-09-22
In the Federal Aviation Administration's (FAA) performance based navigation strategy announced in 2016, the FAA stated that it would retain and expand the Distance Measuring Equipment (DME) infrastructure to ensure resilient aircraft navigation capability in the event of a Global Navigation Satellite System (GNSS) outage. However, the main drawback of the DME as a GNSS backup system is that it requires a significant expansion of the current DME ground infrastructure because of its poor distance measuring accuracy of over 100 m. This paper introduces a method to improve DME distance measuring accuracy by using a new DME pulse shape. The proposed pulse shape was developed using genetic algorithms and is less susceptible to multipath effects, so that the ranging error is reduced by 36.0-77.3% compared to the Gaussian and Smoothed Concave Polygon DME pulses, depending on the noise environment.
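A small experiment illustrates why pulse shape matters for DME ranging: with a conventional Gaussian pulse, a delayed multipath replica shifts the half-amplitude timing point. This sketch uses assumed pulse parameters and a simple threshold detector, not the SFOL pulse or the authors' receiver model:

```python
import numpy as np

def gaussian_pulse(t, t0=0.0, fwhm_us=3.5):
    """Gaussian DME-like pulse with ~3.5 us half-amplitude width (assumed)."""
    sigma = fwhm_us / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return np.exp(-0.5 * ((t - t0) / sigma) ** 2)

def half_amp_crossing(t, p):
    """Rising-edge 50%-amplitude crossing time, by linear interpolation;
    DME ranging is referenced to this point on the pulse."""
    half = 0.5 * p.max()
    i = int(np.argmax(p >= half))           # first sample at/above threshold
    frac = (half - p[i - 1]) / (p[i] - p[i - 1])
    return t[i - 1] + frac * (t[i] - t[i - 1])

t = np.arange(-10.0, 10.0, 0.01)            # time axis, microseconds
clean = gaussian_pulse(t)
# A 30%-amplitude multipath replica delayed by 1 us distorts the slow
# Gaussian edge and biases the timing point.
multipath = clean + 0.3 * gaussian_pulse(t, t0=1.0)
bias_us = half_amp_crossing(t, multipath) - half_amp_crossing(t, clean)
```

A fraction of a microsecond of timing bias corresponds to tens of metres of range error, which is the kind of multipath susceptibility a steeper-edged pulse is designed to reduce.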
PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Frederick, J. M.
2016-12-01
In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification determines whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process, with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describe four essential elements of high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information.
Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
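The benchmark-versus-analytical-solution pattern such a suite follows can be sketched in miniature: solve a transport-like PDE numerically, compare against its closed-form solution, and report a pass/fail against a documented tolerance. This toy 1-D heat-conduction check is illustrative, not one of PFLOTRAN's QA tests:

```python
import numpy as np

def diffuse(n=64, T=0.5):
    """Explicit finite-difference solution of u_t = u_xx on (0, pi),
    u(0)=u(pi)=0, u(x,0)=sin(x)."""
    x = np.linspace(0.0, np.pi, n + 1)
    h = x[1] - x[0]
    dt = 0.4 * h * h                 # stable: dt/h^2 <= 1/2
    u = np.sin(x)
    t = 0.0
    while t < T - 1e-12:
        step = min(dt, T - t)
        u[1:-1] += step / h ** 2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
        t += step
    return x, u, t

def qa_check(tol=1e-3):
    """Compare against the closed-form solution u = exp(-t)*sin(x) and
    report (error, pass/fail) as a QA harness would."""
    x, u, t = diffuse()
    err = float(np.max(np.abs(u - np.exp(-t) * np.sin(x))))
    return err, err < tol

err, ok = qa_check()
```

The documented tolerance is the contract: a regression that degrades the discretization shows up as a failed benchmark rather than a silent loss of accuracy.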
Assessment of Rocketborne and Airborne Infrared Data.
1978-11-30
photometers can be used to ensure photometric accuracy of the video signals. Accuracy is otherwise limited by "shading" nonuniformity and other... [The remainder of this record is garbled table residue; the recoverable fragments mention a reflective subsatellite at variable altitudes, a limb-occultation orbit with SWIR/LWIR heterodyne detection, an NO profile at 70-150 km, and an Nd:YAG source.]
NASA Astrophysics Data System (ADS)
Gao, Chunfeng; Wei, Guo; Wang, Qi; Xiong, Zhenyu; Wang, Qun; Long, Xingwu
2016-10-01
As indispensable equipment in inertial technology testing, the three-axis turntable is widely used in the calibration of various types of inertial navigation systems (INS). In order to ensure the calibration accuracy of an INS, the initial state of the turntable must be accurately measured. However, the traditional measuring method requires a lot of external equipment (such as a level instrument, north seeker, autocollimator, etc.), and the test procedure is complex and inefficient. It is therefore relatively difficult for inertial measurement equipment manufacturers to realize self-inspection of the turntable. Owing to the high-precision attitude information provided by a laser gyro strapdown inertial navigation system (SINS) after fine alignment, we can use it as the attitude reference for initial-state measurement of the three-axis turntable. Based on the principle that a fixed rotation vector increment is not affected by the measuring point, we use the laser gyro INS and the encoder of the turntable to provide the attitudes of the turntable mounting plate. In this way, high-accuracy measurement of the perpendicularity error and initial attitude of the three-axis turntable has been achieved.
40 CFR 98.294 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... scales or methods used for accounting purposes. (3) Document the procedures used to ensure the accuracy of the monthly measurements of trona consumed. (b) If you calculate CO2 process emissions based on... your facility, or methods used for accounting purposes. (3) Document the procedures used to ensure the...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-10
... standards that require the use of the best available technology for ensuring the full reliability and... available technology for ensuring the full reliability and accuracy of urine drug tests, while reflecting..., cutoffs, specimen validity, collection, collection devices, and testing. II. Solicitation of Comments: As...
Accurate label-free 3-part leukocyte recognition with single cell lens-free imaging flow cytometry.
Li, Yuqian; Cornelis, Bruno; Dusa, Alexandra; Vanmeerbeeck, Geert; Vercruysse, Dries; Sohn, Erik; Blaszkiewicz, Kamil; Prodanov, Dimiter; Schelkens, Peter; Lagae, Liesbet
2018-05-01
Three-part white blood cell differentials, which are key to routine blood workups, are typically performed in centralized laboratories on conventional hematology analyzers operated by highly trained staff. With the trend toward miniaturized blood analysis tools for the point of need, in order to accelerate turnaround times and move routine blood testing away from centralized facilities, our group has developed a highly miniaturized holographic imaging system for generating lens-free images of white blood cells in suspension. Analysis and classification of its output data constitute the final crucial step in ensuring appropriate accuracy of the system. In this work, we use reference holographic images of single white blood cells in suspension to establish an accurate ground truth and increase classification accuracy. We also automate the entire workflow for analyzing the output and demonstrate a clear improvement in the accuracy of the 3-part classification. High-dimensional optical and morphological features are extracted from reconstructed digital holograms of single cells using the ground-truth images, and advanced machine learning algorithms are investigated and implemented to obtain 99% classification accuracy. Representative features of the three white blood cell subtypes are selected and give comparable results, with a focus on rapid cell recognition and decreased computational cost. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
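The classification stage can be sketched with a deliberately simple stand-in: synthetic two-feature data for the three subtypes and a nearest-centroid classifier. The feature values and class geometry below are invented for illustration; the paper's actual features are high-dimensional and its classifiers more advanced:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented two-feature stand-ins (e.g. size, granularity) for the three
# leukocyte subtypes: lymphocytes, monocytes, granulocytes.
centres = np.array([[7.0, 0.2], [15.0, 0.4], [12.0, 0.9]])
X = np.vstack([c + 0.5 * rng.standard_normal((200, 2)) for c in centres])
y = np.repeat([0, 1, 2], 200)

def fit_nearest_centroid(X, y):
    """One centroid per class; prediction = nearest centroid."""
    return np.array([X[y == k].mean(axis=0) for k in np.unique(y)])

def predict(model, X):
    d2 = ((X[:, None, :] - model[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)

model = fit_nearest_centroid(X, y)
acc = float((predict(model, X) == y).mean())
```

The point of the ground-truth reference images in the paper is exactly to make the training labels for such a classifier trustworthy; with well-separated features, even this simple rule approaches the reported accuracy.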
Jensen, Pamela K; Wujcik, Chad E; McGuire, Michelle K; McGuire, Mark A
2016-01-01
Simple high-throughput procedures were developed for the direct analysis of glyphosate [N-(phosphonomethyl)glycine] and aminomethylphosphonic acid (AMPA) in human and bovine milk and human urine matrices. Samples were extracted with an acidified aqueous solution on a high-speed shaker. Stable isotope labeled internal standards were added with the extraction solvent to ensure accurate tracking and quantitation. An additional cleanup procedure using partitioning with methylene chloride was required for milk matrices to minimize the presence of matrix components that can impact the longevity of the analytical column. Both analytes were analyzed directly, without derivatization, by liquid chromatography tandem mass spectrometry using two separate precursor-to-product transitions that ensure and confirm the accuracy of the measured results. Method performance was evaluated during validation through a series of assessments that included linearity, accuracy, precision, selectivity, ionization effects and carryover. Limits of quantitation (LOQ) were determined to be 0.1 and 10 µg/L (ppb) for urine and milk, respectively, for both glyphosate and AMPA. Mean recoveries for all matrices were within 89-107% at three separate fortification levels including the LOQ. Precision for replicates was ≤ 7.4% relative standard deviation (RSD) for milk and ≤ 11.4% RSD for urine across all fortification levels. All human and bovine milk samples used for selectivity and ionization effects assessments were free of any detectable levels of glyphosate and AMPA. Some of the human urine samples contained trace levels of glyphosate and AMPA, which were background subtracted for accuracy assessments. Ionization effects testing showed no significant biases from the matrix. A successful independent external validation was conducted using the more complicated milk matrices to demonstrate method transferability.
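The recovery and precision figures reported above come from standard formulas, sketched here with invented replicate values (the 89-107% recoveries and ≤7.4% RSD quoted in the abstract are the paper's observed results, not acceptance limits set by this sketch):

```python
import statistics

def percent_recovery(measured, fortified):
    """Mean recovery (%) of replicates at one fortification level."""
    return 100.0 * statistics.mean(measured) / fortified

def percent_rsd(measured):
    """Relative standard deviation (%), the precision metric used above."""
    return 100.0 * statistics.stdev(measured) / statistics.mean(measured)

# Five invented replicate results for milk fortified at 10 ug/L glyphosate:
reps = [9.6, 10.1, 9.9, 10.4, 9.8]
recovery = percent_recovery(reps, 10.0)
rsd = percent_rsd(reps)
```

For background-containing samples such as the urine described above, the measured background would be subtracted from each replicate before computing recovery.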
Understanding vs. Competency: The Case of Accuracy Checking Dispensed Medicines in Pharmacy
ERIC Educational Resources Information Center
James, K. Lynette; Davies, J. Graham; Kinchin, Ian; Patel, Jignesh P.; Whittlesea, Cate
2010-01-01
Ensuring the competence of healthcare professionals is core to undergraduate and postgraduate education. Undergraduate pharmacy students and pre-registration graduates are required to demonstrate competence at dispensing and accuracy checking medicines. However, competence differs from understanding. This study determined the competence and…
Study of multi-functional precision optical measuring system for large scale equipment
NASA Astrophysics Data System (ADS)
Jiang, Wei; Lao, Dabao; Zhou, Weihu; Zhang, Wenying; Jiang, Xingjian; Wang, Yongxi
2017-10-01
The effective application of high-performance measurement technology can greatly improve large-scale equipment manufacturing capability. Measuring geometric parameters such as size, attitude and position therefore requires a measurement system that combines high precision, multiple functions and portability. However, existing instruments each have shortcomings. A laser tracker must work with a cooperative target and can hardly meet the requirements of measurement in extreme environments. A total station is mainly intended for outdoor surveying and mapping and rarely achieves the accuracy demanded in industrial metrology. A photogrammetry system can measure many points over a wide area, but its range is limited and the station must be moved repeatedly. This paper presents a non-contact opto-electronic measuring instrument that can both scan a measurement path and track a cooperative target. The system is built on several key technologies: absolute distance measurement, two-dimensional angle measurement, automatic target recognition and accurate aiming, precision control, assembly of a complex mechanical system, and multi-functional 3D visualization software. The absolute distance measurement module ensures high-accuracy ranging, while the two-dimensional angle measuring module provides precise angle measurement. The system is suited to non-contact measurement of large-scale equipment; it can safeguard the quality and performance of such equipment throughout the manufacturing process and improve the manufacturing capability of large-scale, high-end equipment.
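The instrument's core measurement principle, combining an absolute distance with two measured angles to fix a 3D point, can be sketched as a spherical-to-Cartesian conversion. This is a generic sketch of how distance/angle instruments work internally, not the paper's actual implementation; the function name and example values are illustrative.

```python
import math

def polar_to_cartesian(distance, azimuth_deg, elevation_deg):
    """Convert an absolute distance and two measured angles (azimuth,
    elevation) into Cartesian coordinates, as a laser tracker or total
    station does internally."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return x, y, z

# Example: a target 10 m away at 30 deg azimuth, 45 deg elevation.
x, y, z = polar_to_cartesian(10.0, 30.0, 45.0)
```

The angular resolution dominates the positional error at long range, which is why the paper treats the two-dimensional angle measuring module as a key technology in its own right.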
Predict the fatigue life of crack based on extended finite element method and SVR
NASA Astrophysics Data System (ADS)
Song, Weizhen; Jiang, Zhansi; Jiang, Hui
2018-05-01
The extended finite element method (XFEM) and support vector regression (SVR) are combined to predict the fatigue life of plate cracks. First, XFEM is employed to calculate stress intensity factors (SIFs) for given crack sizes. A prediction model is then built from the functional relationship between the SIFs and the fatigue life or crack length, and is used to predict the SIFs at other crack sizes or load cycles. Because the accuracy of the forward Euler method is ensured only by a small step size, a new prediction method is presented to resolve this issue. Numerical examples demonstrate that the proposed method allows a larger step size while retaining high accuracy.
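The step-size sensitivity of forward Euler that motivates the paper can be illustrated with a Paris-law crack-growth integration. The material constants and crack sizes below are invented for the demonstration and are not taken from the paper; the closed-form life (valid for m ≠ 2) is included only to check the integration error.

```python
import math

# Paris-law crack growth: da/dN = C * (dK)^m, with dK = Y * dsigma * sqrt(pi * a).
# All constants are illustrative, not from the paper.
C, m, Y, dsigma = 1e-11, 3.0, 1.0, 100.0   # units chosen for illustration
a0, ac = 1e-3, 1e-2                        # initial and critical crack sizes (m)

def euler_life(step):
    """Forward-Euler count of cycles to grow the crack from a0 to ac."""
    a, cycles = a0, 0
    while a < ac:
        dK = Y * dsigma * math.sqrt(math.pi * a)
        a += step * C * dK**m
        cycles += step
    return cycles

def analytic_life():
    """Closed-form cycle count for m != 2, used to check Euler's error."""
    k = C * (Y * dsigma * math.sqrt(math.pi))**m
    p = 1.0 - m / 2.0
    return (ac**p - a0**p) / (k * p)
```

Because the growth rate increases with crack length, Euler under-grows the crack on each step and overestimates the life, and the overestimate grows with the step size; this is exactly the error the surrogate-based prediction aims to avoid at large steps.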
Model Predictions and Observed Performance of JWST's Cryogenic Position Metrology System
NASA Technical Reports Server (NTRS)
Lunt, Sharon R.; Rhodes, David; DiAntonio, Andrew; Boland, John; Wells, Conrad; Gigliotti, Trevis; Johanning, Gary
2016-01-01
The James Webb Space Telescope cryogenic testing requires measurement systems that both obtain a very high degree of accuracy and can function in that environment. Close-range photogrammetry was identified as meeting those criteria. Testing the capability of a close-range photogrammetric system prior to its existence is a challenging problem. Computer simulation was chosen over building a scaled mock-up to allow for increased flexibility in testing various configurations. Extensive validation work was done to ensure that the actual as-built system met accuracy and repeatability requirements. The simulated image data predicted the uncertainty in measurement to be within specification, and this prediction was borne out experimentally. Uncertainty at all levels was verified experimentally to be less than 0.1 millimeters.
Towards automatic SAR-optical stereogrammetry over urban areas using very high resolution imagery
NASA Astrophysics Data System (ADS)
Qiu, Chunping; Schmitt, Michael; Zhu, Xiao Xiang
2018-04-01
In this paper we discuss the potential and challenges regarding SAR-optical stereogrammetry for urban areas, using very-high-resolution (VHR) remote sensing imagery. Since we do this mainly from a geometrical point of view, we first analyze the height reconstruction accuracy to be expected for different stereogrammetric configurations. Then, we propose a strategy for simultaneous tie point matching and 3D reconstruction, which exploits an epipolar-like search window constraint. To drive the matching and ensure some robustness, we combine different established hand-crafted similarity measures. For the experiments, we use real test data acquired by the Worldview-2, TerraSAR-X and MEMPHIS sensors. Our results show that SAR-optical stereogrammetry using VHR imagery is generally feasible with 3D positioning accuracies in the meter-domain, although the matching of these strongly hetereogeneous multi-sensor data remains very challenging.
Arrester Resistive Current Measuring System Based on Heterogeneous Network
NASA Astrophysics Data System (ADS)
Zhang, Yun Hua; Li, Zai Lin; Yuan, Feng; Hou Pan, Feng; Guo, Zhan Nan; Han, Yue
2018-03-01
A metal oxide arrester (MOA) suffers from aging and insulation degradation due to long-term impulse voltages and environmental stress, and the magnitude and trend of its resistive current reflect its health condition. Conventional wired MOA detection requires long cables and is cumbersome to operate, while existing wireless methods face poor data synchronization and instability. A novel synchronous measurement system for arrester resistive current based on a heterogeneous network is therefore proposed, which simplifies the calculation process and improves the synchronization, accuracy and stability of the measuring system. The system combines a LoRa wireless network, a high-speed wireless personal area network and process-layer communication to monitor arrester operating conditions. Field test data show that the system offers high accuracy, strong anti-interference capability and good synchronization, helping to ensure stable operation of the power grid.
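The quantity being measured, the resistive (in-phase) component of the leakage current, can be recovered by synchronous detection against the voltage reference, which is why synchronization between the wireless nodes matters. A minimal sketch with made-up signal parameters, unrelated to the paper's hardware:

```python
import math

# Illustrative extraction of the resistive component of arrester leakage
# current: project the total current onto the voltage reference phase.
F, FS, N = 50.0, 10000.0, 2000          # mains frequency, sample rate, samples
I_R, I_C = 0.10e-3, 0.50e-3             # true resistive / capacitive amplitudes (A)

t = [n / FS for n in range(N)]
u = [math.sin(2 * math.pi * F * tt) for tt in t]                 # voltage reference
i = [I_R * math.sin(2 * math.pi * F * tt)                        # in-phase (resistive)
     + I_C * math.cos(2 * math.pi * F * tt) for tt in t]         # quadrature (capacitive)

# Synchronous detection over an integer number of cycles: correlating the
# current with the voltage-phase sine rejects the capacitive component.
i_r_est = 2.0 / N * sum(ii * uu for ii, uu in zip(i, u))
```

A timing offset between the voltage and current records rotates the phase reference and mixes the capacitive component into the estimate, which is why the wired approach's synchronization had to be reproduced over the heterogeneous network.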
2007-12-01
Treasury instruments of the United States and these earnings grow the fund's balance. The Corporation was appropriated separate funds to pay for...in fiscal 2003 data quality was an important issue. The Corporation is highly decentralized and relies on data flow to ensure accuracy of financial...information. Information necessarily flows from the Corporation to its grantees and from grantees to the Corporation. Information on new volunteers
NASA Astrophysics Data System (ADS)
Akilan, A.; Nagasubramanian, V.; Chaudhry, A.; Reddy, D. Rajesh; Sudheer Reddy, D.; Usha Devi, R.; Tirupati, T.; Radhadevi, P. V.; Varadan, G.
2014-11-01
Block adjustment is a technique for large-area mapping from images obtained by different remote sensing satellites. The challenge is to handle, at the system level, a huge number of satellite images from different sources with different resolutions and accuracies. This paper describes a system with various tools and techniques to effectively handle the end-to-end chain of large-area mapping and production, with a good level of automation and provisions for intuitive analysis of the final results in 3D and 2D environments. In addition, the interfaces for using open-source ortho and DEM references (e.g., ETM, SRTM) and for displaying ESRI shapes of the image footprints are explained. Rigorous theory, mathematical modelling, workflow automation and sophisticated software engineering are combined to ensure high photogrammetric accuracy and productivity. Major building blocks of the block adjustment solution, including the georeferencing, geo-capturing and geo-modelling tools, are explained in this paper. To provide an optimal bundle block adjustment solution with high-precision results, the system has been optimized at many stages to fully exploit the available hardware resources. Robustness is ensured by handling failures in the automatic procedure and saving the process state at every stage so that processing can be restored from the point of interruption. Results obtained from the various stages of the system are presented.
Total Water Content Measurements with an Isokinetic Sampling Probe
NASA Technical Reports Server (NTRS)
Reehorst, Andrew L.; Miller, Dean R.; Bidwell, Colin S.
2010-01-01
The NASA Glenn Research Center has developed a Total Water Content (TWC) Isokinetic Sampling Probe. Since it is sensitive to neither cloud water particle phase nor size, it is particularly attractive for supporting supercooled large droplet and high ice water content aircraft icing studies. The instrument comprises the Sampling Probe, Sample Flow Control, and Water Vapor Measurement subsystems. Analysis and testing have been conducted on the subsystems to ensure their proper function and accuracy. End-to-end bench testing has also been conducted to ensure the reliability of the entire instrument system. A Stokes-number-based collection efficiency correction was developed to correct for probe thickness effects. The authors further discuss the need to ensure that no condensation occurs within the instrument plumbing. Instrument measurements are compared with facility calibrations from testing in the NASA Glenn Icing Research Tunnel and discussed. There appear to be liquid water content and droplet size effects in the differences between the two measurement techniques.
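The Stokes number underlying the collection efficiency correction is the standard inertial parameter comparing the particle relaxation time to the flow time scale around the probe lip. The sketch below uses the textbook definition with illustrative values, not the actual probe geometry:

```python
def stokes_number(rho_p, d_p, velocity, mu, length):
    """Particle Stokes number: ratio of the particle relaxation time to the
    flow residence time around a body of characteristic size `length`."""
    tau_p = rho_p * d_p**2 / (18.0 * mu)   # Stokes-drag relaxation time (s)
    return tau_p * velocity / length

# Illustrative values: 20 um water droplet, 100 m/s airstream, 1 cm
# characteristic probe dimension (not the actual probe geometry).
stk = stokes_number(1000.0, 20e-6, 100.0, 1.8e-5, 0.01)
```

Large Stokes numbers mean droplets travel ballistically and are captured; small ones follow streamlines around the probe, so collection efficiency falls off, which is the effect the correction compensates.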
40 CFR 86.1338-84 - Emission measurement accuracy.
Code of Federal Regulations, 2010 CFR
2010-07-01
... engineering practice dictates that exhaust emission sample analyzer readings below 15 percent of full scale... computers, data loggers, etc., can provide sufficient accuracy and resolution below 15 percent of full scale... spaced points, using good engineering judgement, below 15 percent of full scale are made to ensure the...
ERIC Educational Resources Information Center
Feedback, 1984
1984-01-01
This issue of FEEDBACK, a newsletter produced by the Austin Independent School District Office of Research and Evaluation (ORE), illustrates the accuracy, validity, and fairness of ORE reports. The independence of the reports is explained. Internal and external quality controls are used to ensure reliability and accuracy of the reports.…
The uncertainty of crop yield projections is reduced by improved temperature response functions
USDA-ARS?s Scientific Manuscript database
Increasing the accuracy of crop productivity estimates is a key element in planning adaptation strategies to ensure global food security under climate change. Process-based crop models are effective means to project climate impact on cr...
40 CFR 86.1338-84 - Emission measurement accuracy.
Code of Federal Regulations, 2013 CFR
2013-07-01
... engineering practice dictates that exhaust emission sample analyzer readings below 15 percent of full scale... computers, data loggers, etc., can provide sufficient accuracy and resolution below 15 percent of full scale... spaced points, using good engineering judgement, below 15 percent of full scale are made to ensure the...
40 CFR 86.1338-84 - Emission measurement accuracy.
Code of Federal Regulations, 2012 CFR
2012-07-01
... engineering practice dictates that exhaust emission sample analyzer readings below 15 percent of full scale... computers, data loggers, etc., can provide sufficient accuracy and resolution below 15 percent of full scale... spaced points, using good engineering judgement, below 15 percent of full scale are made to ensure the...
40 CFR 86.1338-84 - Emission measurement accuracy.
Code of Federal Regulations, 2011 CFR
2011-07-01
... engineering practice dictates that exhaust emission sample analyzer readings below 15 percent of full scale... computers, data loggers, etc., can provide sufficient accuracy and resolution below 15 percent of full scale... spaced points, using good engineering judgement, below 15 percent of full scale are made to ensure the...
Engelberg, Jesse A; Retallack, Hanna; Balassanian, Ronald; Dowsett, Mitchell; Zabaglo, Lila; Ram, Arishneel A; Apple, Sophia K; Bishop, John W; Borowsky, Alexander D; Carpenter, Philip M; Chen, Yunn-Yi; Datnow, Brian; Elson, Sarah; Hasteh, Farnaz; Lin, Fritz; Moatamed, Neda A; Zhang, Yanhong; Cardiff, Robert D
2015-11-01
Hormone receptor status is an integral component of decision-making in breast cancer management. IHC4 score is an algorithm that combines hormone receptor, HER2, and Ki-67 status to provide a semiquantitative prognostic score for breast cancer. High accuracy and low interobserver variance are important to ensure the score is accurately calculated; however, few previous efforts have been made to measure or decrease interobserver variance. We developed a Web-based training tool, called "Score the Core" (STC) using tissue microarrays to train pathologists to visually score estrogen receptor (using the 300-point H score), progesterone receptor (percent positive), and Ki-67 (percent positive). STC used a reference score calculated from a reproducible manual counting method. Pathologists in the Athena Breast Health Network and pathology residents at associated institutions completed the exercise. By using STC, pathologists improved their estrogen receptor H score and progesterone receptor and Ki-67 proportion assessment and demonstrated a good correlation between pathologist and reference scores. In addition, we collected information about pathologist performance that allowed us to compare individual pathologists and measures of agreement. Pathologists' assessment of the proportion of positive cells was closer to the reference than their assessment of the relative intensity of positive cells. Careful training and assessment should be used to ensure the accuracy of breast biomarkers. This is particularly important as breast cancer diagnostics become increasingly quantitative and reproducible. Our training tool is a novel approach for pathologist training that can serve as an important component of ongoing quality assessment and can improve the accuracy of breast cancer prognostic biomarkers. Copyright © 2015 Elsevier Inc. All rights reserved.
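The 300-point H score that STC trains pathologists to estimate is a weighted sum of staining intensities. A minimal sketch of the arithmetic; the dictionary-based interface is our own illustration, not part of STC:

```python
def h_score(pct_by_intensity):
    """300-point H score: sum over intensity levels 1-3 of
    (intensity x percent of cells staining at that level).
    `pct_by_intensity` maps intensity level -> percent of cells (0-100)."""
    score = sum(level * pct_by_intensity.get(level, 0.0) for level in (1, 2, 3))
    if not 0 <= score <= 300:
        raise ValueError("percentages are inconsistent")
    return score

# Example: 10% weak, 20% moderate, 30% strong staining (40% negative).
score = h_score({1: 10, 2: 20, 3: 30})
```

Because the score weights the intensity estimate three times as heavily at the strong end, the paper's finding that pathologists judge proportions better than relative intensities identifies the main lever for reducing interobserver variance.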
Nallasivan, S; Gillott, T; Kamath, S; Blow, L; Goddard, V
2011-06-01
Record Keeping Standards is a development led by the Royal College of Physicians of London (RCP) Health Informatics Unit and funded by the National Health Service (NHS) Connecting for Health. A supplementary report produced by the RCP makes a number of recommendations based on a study held at an acute hospital trust. We audited the medical notes and coding to assess the accuracy, documentation by the junior doctors and also to correlate our findings with the RCP audit. Northern Lincolnshire & Goole Hospitals NHS Foundation Trust has 114,000 'finished consultant episodes' per year. A total of 100 consecutive medical (50) and rheumatology (50) discharges from Diana Princess of Wales Hospital from August-October 2009 were reviewed. The results showed an improvement in coding accuracy (10% errors), comparable to the RCP audit but with 5% documentation errors. Physician involvement needs enhancing to improve the effectiveness and to ensure clinical safety.
Electrocardiogram transmission - The state of the art.
NASA Technical Reports Server (NTRS)
Firstenberg, A.; Huston, S. W.; Olsen, D. E.; Hahn, P. M.
1971-01-01
A comparative analysis of available clinical EKG telemetry systems was conducted. Although present-day electrocardiogram diagnosis requires a high degree of measurement accuracy, there exist wide variations in the performance characteristics of the telemeters marketed today, necessitating careful consideration of specifications prior to procurement. The authors have endeavored to provide physicians with a clear understanding, in terms of the effects on the electrocardiogram, of the factors they must evaluate in order to ensure high-fidelity EKG reproduction. A tabulation of comparative parameter values for each unit, obtained from manufacturers' specifications and substantiated by standardized performance tests conducted in our laboratory, is presented.
Haworth, Annette; Kearvell, Rachel; Greer, Peter B; Hooton, Ben; Denham, James W; Lamb, David; Duchesne, Gillian; Murray, Judy; Joseph, David
2009-03-01
A multi-centre clinical trial for prostate cancer patients provided an opportunity to introduce conformal radiotherapy with dose escalation. To verify adequate treatment accuracy prior to patient recruitment, centres submitted details of a set-up accuracy study (SUAS). We report the results of the SUAS, the variation in clinical practice and the strategies used to help centres improve treatment accuracy. The SUAS required each of the 24 participating centres to collect data on at least 10 pelvic patients imaged on a minimum of 20 occasions. Software was provided for data collection and analysis. Support to centres was provided through educational lectures, the trial quality assurance team and an information booklet. Only two centres had recently carried out a SUAS prior to the trial opening. Systematic errors were generally smaller than those previously reported in the literature. The questionnaire identified many differences in patient set-up protocols. As a result of participating in this QA activity more than 65% of centres improved their treatment delivery accuracy. Conducting a pre-trial SUAS has led to improvement in treatment delivery accuracy in many centres. Treatment techniques and set-up accuracy varied greatly, demonstrating a need to ensure an on-going awareness for such studies in future trials and with the introduction of dose escalation or new technologies.
"Small Talk": Developing Fluency, Accuracy, and Complexity in Speaking
ERIC Educational Resources Information Center
Hunter, James
2012-01-01
A major issue that continues to challenge language teachers is how to ensure that learners develop accuracy and complexity in their speaking, as well as fluency. Teachers know that too much corrective feedback (CF) can make learners reluctant to speak, while not enough may allow their errors to become entrenched. Furthermore, there is controversy…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-02
...'s CMRS E911 location requirements without ensuring that time is taken to study location technologies... accuracy requirements on interconnected VoIP service without further study.'' A number of commenters... study the technical, operational and economic issues related to the provision of ALI for interconnected...
NASA Astrophysics Data System (ADS)
Xu, Jing; Wang, Yu-Tian; Liu, Xiao-Fei
2015-04-01
Edible blend oil is a mixture of vegetable oils. A qualified blend oil can meet the daily human requirement for the two essential fatty acids and thereby provide balanced nutrition. Because each vegetable oil has a different composition, the proportions of the constituent oils determine the nutritional components of the blend. A high-precision quantitative method for determining vegetable oil contents in blend oil is therefore needed. Three-dimensional fluorescence spectroscopy offers high selectivity, high sensitivity and high efficiency, and efficient extraction and full use of the information in three-dimensional fluorescence spectra improve measurement accuracy. A novel quantitative analysis method based on quasi-Monte-Carlo integration is proposed to improve measurement sensitivity and reduce random error. The partial least squares method is used to solve the resulting nonlinear equations and avoid the effects of multicollinearity. Recovery rates for blend oils mixed from peanut, soybean and sunflower oil were calculated to verify the accuracy of the method; they were higher than those of the linear method commonly used for component concentration measurement.
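The quasi-Monte-Carlo integration at the heart of the proposed method replaces pseudo-random sample points with a low-discrepancy sequence, which reduces the random error of the integral estimate. A minimal sketch using a 2D Halton sequence and a toy integrand; the paper's actual integrand and sequence choice are assumptions here:

```python
def halton(index, base):
    """Van der Corput radical inverse of `index` in `base` (one Halton axis)."""
    result, f = 0.0, 1.0 / base
    while index > 0:
        result += f * (index % base)
        index //= base
        f /= base
    return result

def qmc_integrate(f, n):
    """Quasi-Monte-Carlo estimate of the integral of f over the unit square,
    using a 2D Halton sequence (bases 2 and 3) as the low-discrepancy nodes."""
    total = sum(f(halton(i, 2), halton(i, 3)) for i in range(1, n + 1))
    return total / n

# Toy integrand: the integral of x*y over [0,1]^2 is exactly 0.25.
estimate = qmc_integrate(lambda x, y: x * y, 4096)
```

Low-discrepancy nodes cover the spectral domain more evenly than random sampling, so the estimate converges faster than the O(n^-1/2) Monte-Carlo rate for smooth integrands.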
Assessing and Ensuring GOES-R Magnetometer Accuracy
NASA Technical Reports Server (NTRS)
Kronenwetter, Jeffrey; Carter, Delano R.; Todirita, Monica; Chu, Donald
2016-01-01
The GOES-R magnetometer accuracy requirement is 1.7 nanoteslas (nT). During quiet times (100 nT), accuracy is defined as absolute mean plus 3 sigma. During storms (300 nT), accuracy is defined as absolute mean plus 2 sigma. To achieve this, the sensor itself has better than 1 nT accuracy. Because zero offset and scale factor drift over time, it is also necessary to perform annual calibration maneuvers. To predict performance, we used covariance analysis and attempted to corroborate it with simulations. Although not perfect, the two generally agree and show the expected behaviors. With the annual calibration regimen, these predictions suggest that the magnetometers will meet their accuracy requirements.
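The accuracy definitions quoted above (absolute mean plus 3 sigma in quiet conditions, plus 2 sigma in storms) amount to a simple statistic over the measurement residuals. A sketch with invented residuals, not flight data:

```python
import statistics

def accuracy_metric(errors, k):
    """GOES-R style accuracy figure: absolute mean error plus k standard
    deviations (k = 3 for quiet conditions, k = 2 for storms)."""
    return abs(statistics.mean(errors)) + k * statistics.pstdev(errors)

# Illustrative magnetometer residuals in nT (made up for the example).
residuals = [0.2, -0.1, 0.4, 0.0, -0.3, 0.1, 0.2, -0.2]
quiet = accuracy_metric(residuals, 3)   # compare against the 1.7 nT requirement
storm = accuracy_metric(residuals, 2)
```

The mean term captures the slowly drifting zero offset that the annual calibration maneuvers remove, while the sigma term captures the noise floor; both must stay small for the combined figure to beat 1.7 nT.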
On a fast calculation of structure factors at a subatomic resolution.
Afonine, P V; Urzhumtsev, A
2004-01-01
In the last decade, the progress of protein crystallography allowed several protein structures to be solved at a resolution higher than 0.9 A. Such studies provide researchers with important new information reflecting very fine structural details. The signal from these details is very weak with respect to that corresponding to the whole structure. Its analysis requires high-quality data, which previously were available only for crystals of small molecules, and a high accuracy of calculations. The calculation of structure factors using direct formulae, traditional for 'small-molecule' crystallography, allows a relatively simple accuracy control. For macromolecular crystals, diffraction data sets at a subatomic resolution contain hundreds of thousands of reflections, and the number of parameters used to describe the corresponding models may reach the same order. Therefore, the direct way of calculating structure factors becomes very time expensive when applied to large molecules. These problems of high accuracy and computational efficiency require a re-examination of computer tools and algorithms. The calculation of model structure factors through an intermediate generation of an electron density [Sayre (1951). Acta Cryst. 4, 362-367; Ten Eyck (1977). Acta Cryst. A33, 486-492] may be much more computationally efficient, but contains some parameters (grid step, 'effective' atom radii etc.) whose influence on the accuracy of the calculation is not straightforward. At the same time, the choice of parameters within safety margins that largely ensure a sufficient accuracy may result in a significant loss of the CPU time, making it close to the time for the direct-formulae calculations. The impact of the different parameters on the computer efficiency of structure-factor calculation is studied. 
It is shown that an appropriate choice of these parameters allows the structure factors to be obtained with a high accuracy and in a significantly shorter time than that required when using the direct formulae. Practical algorithms for the optimal choice of the parameters are suggested.
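The two routes compared in the paper, direct summation versus evaluation through an intermediate electron density, can be illustrated in one dimension. In the toy below the atoms sit exactly on grid nodes so the two routes agree to rounding error; off-grid atoms would introduce the grid-step and atom-radius dependence the paper studies. All positions and scattering factors are invented.

```python
import cmath

N = 64                                               # grid points per unit cell
atoms = [(0.125, 6.0), (0.500, 8.0), (0.750, 1.0)]   # (fractional x, scattering factor)

def direct_sf(h):
    """Direct formula: F(h) = sum_j f_j * exp(2*pi*i*h*x_j)."""
    return sum(f * cmath.exp(2j * cmath.pi * h * x) for x, f in atoms)

# Density route: accumulate point scatterers on the grid, then evaluate the
# same Fourier sum over grid nodes (a hand-rolled DFT for clarity; production
# codes use an FFT here, which is where the speed-up comes from).
rho = [0.0] * N
for x, f in atoms:
    rho[round(x * N) % N] += f

def grid_sf(h):
    return sum(r * cmath.exp(2j * cmath.pi * h * g / N) for g, r in enumerate(rho))
```

For real atoms the density is a smooth sum of Gaussians rather than point masses, and the grid step and effective atom radii then control the accuracy/CPU-time trade-off that the paper optimizes.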
NASA Astrophysics Data System (ADS)
Jayasekare, Ajith S.; Wickramasuriya, Rohan; Namazi-Rad, Mohammad-Reza; Perez, Pascal; Singh, Gaurav
2017-07-01
A continuous update of building information is necessary in today's urban planning. Digital images acquired by remote sensing platforms at appropriate spatial and temporal resolutions provide an excellent data source for this. In particular, high-resolution satellite images are often used to retrieve objects such as rooftops through feature extraction. However, high-resolution images acquired over built-up areas are affected by noise such as shadows, which reduces the accuracy of feature extraction. Feature extraction relies heavily on the reflectance purity of objects, which is difficult to achieve in complex urban landscapes. An attempt was made to increase the reflectance purity of building rooftops affected by shadows. In addition to the multispectral (MS) image, derivatives thereof, namely normalized difference vegetation index and principal component (PC) images, were incorporated in generating the probability image. This hybrid probability image largely eliminated the effect of shadows on rooftop extraction, particularly for light-colored roofs. The PC image was also used for image segmentation, which further increased the accuracy compared to segmentation performed on an MS image. Results show that the presented method achieves higher rooftop extraction accuracy (70.4%) in vegetation-rich urban areas than traditional methods.
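The NDVI ingredient of the hybrid probability image helps separate vegetation from rooftops, including shadowed light-colored roofs. A minimal sketch with made-up band reflectances (not Worldview-2 data):

```python
def ndvi(nir, red):
    """Normalized difference vegetation index, in [-1, 1]: vegetation reflects
    strongly in NIR and absorbs red, so vegetated pixels score high."""
    return (nir - red) / (nir + red) if (nir + red) != 0 else 0.0

# Illustrative per-pixel (NIR, red) reflectances for three surface types.
pixels = {
    "tree": (0.50, 0.08),
    "bright_roof": (0.30, 0.28),
    "shadowed_roof": (0.05, 0.04),
}
scores = {name: ndvi(nir, red) for name, (nir, red) in pixels.items()}
```

Because NDVI is a band ratio, it is fairly insensitive to the overall brightness reduction a shadow causes, which is one reason it complements the raw MS bands in the probability image.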
A comparison study between MLP and convolutional neural network models for character recognition
NASA Astrophysics Data System (ADS)
Ben Driss, S.; Soua, M.; Kachouri, R.; Akil, M.
2017-05-01
Optical Character Recognition (OCR) systems are designed to operate on text contained in scanned documents and images. They include text detection and character recognition, in which characters are described and then classified. In the classification step, characters are identified according to their features or template descriptions, and a given classifier is employed to identify them. In this context, we previously proposed the unified character descriptor (UCD) to represent characters based on their features, with matching employed for classification. This recognition scheme achieves good OCR accuracy on homogeneous scanned documents, but it cannot discriminate characters with high font variation and distortion. To improve recognition, classifiers based on neural networks can be used. The multilayer perceptron (MLP) ensures high recognition accuracy when robustly trained, and the convolutional neural network (CNN) is currently gaining popularity for its high performance. However, both CNN and MLP may suffer from the large amount of computation in the training phase. In this paper, we establish a comparison between MLP and CNN. We provide the MLP with the UCD descriptor and an appropriate network configuration. For the CNN, we employ the convolutional network designed for handwritten and machine-printed character recognition (LeNet-5) and adapt it to support 62 classes, including both digits and letters. In addition, GPU parallelization is studied to speed up both classifiers. Based on our experiments, we demonstrate that the real-time CNN used is about twice as effective as the MLP when classifying characters.
Energy calibration of CALET onboard the International Space Station
NASA Astrophysics Data System (ADS)
Asaoka, Y.; Akaike, Y.; Komiya, Y.; Miyata, R.; Torii, S.; Adriani, O.; Asano, K.; Bagliesi, M. G.; Bigongiari, G.; Binns, W. R.; Bonechi, S.; Bongi, M.; Brogi, P.; Buckley, J. H.; Cannady, N.; Castellini, G.; Checchia, C.; Cherry, M. L.; Collazuol, G.; Di Felice, V.; Ebisawa, K.; Fuke, H.; Guzik, T. G.; Hams, T.; Hareyama, M.; Hasebe, N.; Hibino, K.; Ichimura, M.; Ioka, K.; Ishizaki, W.; Israel, M. H.; Javaid, A.; Kasahara, K.; Kataoka, J.; Kataoka, R.; Katayose, Y.; Kato, C.; Kawanaka, N.; Kawakubo, Y.; Kitamura, H.; Krawczynski, H. S.; Krizmanic, J. F.; Kuramata, S.; Lomtadze, T.; Maestro, P.; Marrocchesi, P. S.; Messineo, A. M.; Mitchell, J. W.; Miyake, S.; Mizutani, K.; Moiseev, A. A.; Mori, K.; Mori, M.; Mori, N.; Motz, H. M.; Munakata, K.; Murakami, H.; Nakagawa, Y. E.; Nakahira, S.; Nishimura, J.; Okuno, S.; Ormes, J. F.; Ozawa, S.; Pacini, L.; Palma, F.; Papini, P.; Penacchioni, A. V.; Rauch, B. F.; Ricciarini, S.; Sakai, K.; Sakamoto, T.; Sasaki, M.; Shimizu, Y.; Shiomi, A.; Sparvoli, R.; Spillantini, P.; Stolzi, F.; Takahashi, I.; Takayanagi, M.; Takita, M.; Tamura, T.; Tateyama, N.; Terasawa, T.; Tomida, H.; Tsunesada, Y.; Uchihori, Y.; Ueno, S.; Vannuccini, E.; Wefel, J. P.; Yamaoka, K.; Yanagita, S.; Yoshida, A.; Yoshida, K.; Yuda, T.
2017-05-01
In August 2015, the CALorimetric Electron Telescope (CALET), designed for long exposure observations of high energy cosmic rays, docked with the International Space Station (ISS) and shortly thereafter began to collect data. CALET will measure the cosmic ray electron spectrum over the energy range of 1 GeV to 20 TeV with a very high resolution of 2% above 100 GeV, based on a dedicated instrument incorporating an exceptionally thick 30 radiation-length calorimeter with both total absorption and imaging (TASC and IMC) units. Each TASC readout channel must be carefully calibrated over the extremely wide dynamic range of CALET that spans six orders of magnitude in order to obtain a degree of calibration accuracy matching the resolution of energy measurements. These calibrations consist of calculating the conversion factors between ADC units and energy deposits, ensuring linearity over each gain range, and providing a seamless transition between neighboring gain ranges. This paper describes these calibration methods in detail, along with the resulting data and associated accuracies. The results presented in this paper show that a sufficient accuracy was achieved for the calibrations of each channel in order to obtain a suitable resolution over the entire dynamic range of the electron spectrum measurement.
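The requirement that neighboring gain ranges hand over seamlessly can be expressed as a continuity constraint on the per-range linear ADC-to-energy maps. The sketch below uses invented pedestals, gains and switch-over point, not CALET's calibration constants; the low-gain offset is derived so the two ranges agree exactly at the boundary:

```python
# High-gain range: E = HG_GAIN * (adc - HG_PED), valid up to the switch-over.
# The low-gain slope is 30x coarser; its offset is derived so the two ranges
# hand over continuously at the switch-over ADC value. All values invented.
HG_PED, HG_GAIN, SWITCH = 100.0, 0.05, 3000
LG_PED, LG_GAIN = 120.0, 1.50
LG_OFFSET = HG_GAIN * (SWITCH - HG_PED) - LG_GAIN * (SWITCH - LG_PED)

def adc_to_energy(adc, gain_range):
    """Convert one channel's ADC count to deposited energy (MeV equivalent)."""
    if gain_range == "high":
        return HG_GAIN * (adc - HG_PED)
    return LG_GAIN * (adc - LG_PED) + LG_OFFSET
```

Chaining such constraints across all gain ranges is what lets a six-decade dynamic range be covered without discontinuities in the reconstructed energy deposit.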
Contact-aware simulations of particulate Stokesian suspensions
NASA Astrophysics Data System (ADS)
Lu, Libin; Rahimian, Abtin; Zorin, Denis
2017-10-01
We present an efficient, accurate, and robust method for simulation of dense suspensions of deformable and rigid particles immersed in Stokesian fluid in two dimensions. We use a well-established boundary integral formulation for the problem as the foundation of our approach. This type of formulation, with a high-order spatial discretization and an implicit and adaptive time discretization, has been shown to handle complex interactions between particles with high accuracy. Yet, for dense suspensions, very small time-steps or expensive implicit solves as well as a large number of discretization points are required to avoid non-physical contact and intersections between particles, leading to infinite forces and numerical instability. Our method maintains the accuracy of previous methods at a significantly lower cost for dense suspensions. The key idea is to ensure an interference-free configuration by introducing explicit contact constraints into the system. While such constraints are unnecessary in the continuous formulation, in the discrete form of the problem they make it possible to eliminate catastrophic loss of accuracy by preventing contact explicitly. Introducing contact constraints results in a significant increase in stable time-step size for explicit time-stepping, and a reduction in the number of points adequate for stability.
NASA Astrophysics Data System (ADS)
Ding, Xiang; Li, Fei; Zhang, Jiyan; Liu, Wenli
2016-10-01
Raman spectrometers are usually calibrated periodically to ensure the measurement accuracy of their Raman shift. A combination of a monocrystalline silicon chip and a low-pressure discharge lamp is proposed as a candidate reference standard for Raman shift. A high-precision calibration technique is developed to accurately determine the standard value of the silicon's Raman shift around 520 cm-1. The technique is described and illustrated by measuring a silicon chip against three atomic spectral lines of a neon lamp. A commercial Raman spectrometer is employed and its Raman shift error characteristics are investigated. Error sources are evaluated based on theoretical analysis and experiments, including the sample factor, the instrumental factor, the laser factor and random factors. Experimental results show that the expanded uncertainty of the silicon's Raman shift around 520 cm-1 can achieve 0.3 cm-1 (k=2), which is more accurate than most currently used reference materials. The results are validated by comparison measurements between three Raman spectrometers. The technique thus remarkably enhances the accuracy of Raman shift, making it possible to use the silicon chip and the lamp to calibrate Raman spectrometers.
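The uncertainty budget described above, which combines sample, instrumental, laser, and random contributions into an expanded uncertainty with coverage factor k = 2, can be sketched as follows. This is a minimal illustration; the component values are hypothetical, not the authors' actual budget:

```python
import math

def expanded_uncertainty(components, k=2.0):
    """Combine independent standard uncertainties in quadrature
    and apply the coverage factor k (k=2 ~ 95% confidence)."""
    u_c = math.sqrt(sum(u * u for u in components))
    return k * u_c

# Hypothetical standard uncertainties (cm^-1) for the error sources
# named in the abstract: sample, instrument, laser, random effects.
u_sources = [0.08, 0.10, 0.05, 0.06]
U = expanded_uncertainty(u_sources, k=2.0)
```

The same routine can be reused for any independent error sources; correlated contributions would need covariance terms, which are omitted here.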
Optimized star sensors laboratory calibration method using a regularization neural network.
Zhang, Chengfen; Niu, Yanxiong; Zhang, Hao; Lu, Jiazhen
2018-02-10
High-precision ground calibration is essential to ensure the performance of star sensors. However, complex distortion and multi-error coupling have brought great difficulties to traditional calibration methods, especially for large field-of-view (FOV) star sensors. Although increasing the complexity of models is an effective way to improve calibration accuracy, it significantly increases the demand for calibration data. In order to achieve high-precision calibration of star sensors with a large FOV, a novel laboratory calibration method based on a regularization neural network is proposed. A multi-layer neural network is designed to directly represent the mapping between the star vector and the corresponding star point coordinate. To ensure the generalization performance of the network, regularization strategies are incorporated into the network structure and the training algorithm. Simulation and experiment results demonstrate that the proposed method can achieve high precision with less calibration data and without any other a priori information. Compared with traditional methods, the calibration error of the star sensor decreased by about 30%. The proposed method can satisfy the precision requirements of large-FOV star sensors.
Code of Federal Regulations, 2014 CFR
2014-10-01
... comprehensive Tribal IV-D agencies must have in place to ensure the security and privacy of Computerized Tribal... ensure the security and privacy of Computerized Tribal IV-D Systems and Office Automation? (a..., accuracy, completeness, access to, and use of data in the Computerized Tribal IV-D System and Office...
Impact of Menu Sequencing on Internet-Based Educational Module Selection
ERIC Educational Resources Information Center
Bensley, Robert; Brusk, John J.; Rivas, Jason; Anderson, Judith V.
2006-01-01
Patterns of Internet-based menu item selection can occur for a number of reasons, many of which may not be based on interest in topic. It then becomes important to ensure menu order is devised in a way that ensures the greatest accuracy in matching user need with selection. This study examined the impact of menu rotation on the selection of…
Liu, Zhao; Zheng, Chaorong; Wu, Yue
2017-09-01
Wind profilers have been widely adopted to observe wind field information in the atmosphere for different purposes, but the accuracy of their observations is limited by various noises and disturbances and hence needs to be further improved. In this paper, data measured under strong wind conditions using a 1290-MHz boundary layer profiler (BLP) are quality controlled via a composite quality control (QC) procedure proposed by the authors. Then, through comparison with data measured by radiosonde flights (balloon observations), the critical thresholds in the composite QC procedure, including the consensus average threshold T1 and the vertical shear threshold T3, are systematically discussed, and the performance of the BLP operated under precipitation is also evaluated. It is found that, to ensure high accuracy and a high data collectable rate, the optimal range of subsets is 4 m/s. Although the number of data rejected by the combined algorithm of vertical shear examination and small median test is quite limited, the algorithm proves quite useful for recognizing outliers with a large discrepancy, and the optimal wind shear threshold T3 can be recommended as 5 m/s per 100 m. During patchy precipitation, the quality of data measured by the four oblique beams (using the DBS measuring technique) can still be ensured. After the BLP data are quality controlled by the composite QC procedure, the output shows good agreement with the balloon observations.
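The two QC thresholds discussed above, consensus averaging over a subset range and a vertical shear limit, can be sketched as follows. This is a schematic reading of the procedure with assumed windowing details, not the authors' implementation:

```python
def consensus_average(samples, window=4.0):
    """Consensus averaging: find the largest subset of wind samples
    (m/s) that fits within a `window`-wide band, then average it.
    `window` mirrors the 4 m/s subset range found optimal in the study."""
    best = []
    for s in samples:
        subset = [x for x in samples if abs(x - s) <= window / 2.0]
        if len(subset) > len(best):
            best = subset
    return sum(best) / len(best) if best else None

def shear_check(wind_profile, heights, threshold=5.0):
    """Flag levels where vertical wind shear exceeds `threshold`
    (m/s per 100 m), analogous to the T3 test in the composite QC."""
    flags = []
    for i in range(1, len(wind_profile)):
        shear = abs(wind_profile[i] - wind_profile[i - 1]) \
            / (heights[i] - heights[i - 1]) * 100.0
        flags.append(shear > threshold)
    return flags
```

An outlier such as a 25 m/s spike among ~10 m/s samples falls outside the consensus subset and is discarded before averaging.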
NASA Astrophysics Data System (ADS)
Li, Xingchang; Zhang, Zhiyu; Hu, Haifei; Li, Yingjie; Xiong, Ling; Zhang, Xuejun; Yan, Jiwang
2018-04-01
On-machine measurements can improve the form accuracy of optical surfaces in single-point diamond turning applications; however, commercially available linear variable differential transformer sensors are inaccurate and can potentially scratch the surface. We present an on-machine measurement system based on capacitive displacement sensors for high-precision optical surfaces. In the proposed system, a position-trigger method of measurement was developed to ensure strict correspondence between the measurement points and the measurement data with no intervening time-delay. In addition, a double-sensor measurement was proposed to reduce the electric signal noise during spindle rotation. Using the proposed system, the repeatability of 80-nm peak-to-valley (PV) and 8-nm root-mean-square (RMS) was achieved through analyzing four successive measurement results. The accuracy of 109-nm PV and 14-nm RMS was obtained by comparing with the interferometer measurement result. An aluminum spherical mirror with a diameter of 300 mm was fabricated, and the resulting measured form error after one compensation cut was decreased to 254 nm in PV and 52 nm in RMS. These results confirm that the measurements of the surface form errors were successfully used to modify the cutting tool path during the compensation cut, thereby ensuring that the diamond turning process was more deterministic. In addition, the results show that the noise level was significantly reduced with the reference sensor even under a high rotational speed.
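The peak-to-valley (PV) and root-mean-square (RMS) figures of merit quoted above are standard statistics of a measured form-error profile; a minimal sketch follows (the mean-removal step is an assumption about how residuals are referenced, not taken from the paper):

```python
import math

def form_error_stats(profile):
    """Peak-to-valley (PV) and root-mean-square (RMS) of a measured
    form-error profile (e.g. in nm), the two figures of merit used
    to report repeatability and accuracy in the abstract."""
    mean = sum(profile) / len(profile)
    residuals = [p - mean for p in profile]
    pv = max(residuals) - min(residuals)
    rms = math.sqrt(sum(r * r for r in residuals) / len(residuals))
    return pv, rms
```

For 2D surface maps the same formulas apply with the profile flattened to a single list of height residuals.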
NASA Astrophysics Data System (ADS)
Smith, J.; Gambacorta, A.; Barnet, C.; Smith, N.; Goldberg, M.; Pierce, B.; Wolf, W.; King, T.
2016-12-01
This work presents an overview of the NPP and J1 CrIS high-resolution operational channel selection. Our methodology focuses on the spectral sensitivity characteristics of the available channels in order to maximize information content and spectral purity. These aspects are key to ensuring accuracy in the retrieval products, particularly for trace gases. We will provide a demonstration of its global optimality by analyzing different test cases that are of particular interest to our JPSS Proving Ground and Risk Reduction user applications. A focus will be on high-resolution trace gas retrieval capability in the context of the Alaska fire initiatives.
NASA Astrophysics Data System (ADS)
Kubalska, J. L.; Preuss, R.
2013-12-01
Digital Surface Models (DSM) are increasingly used in GIS databases as standalone products. They are also necessary for creating other products such as 3D city models, true-ortho imagery and object-oriented classification. This article presents the results of DSM generation for the classification of vegetation in urban areas. The source data allowed DSMs to be produced using both an image-matching method and ALS data. The DSM from digital images, obtained with a Vexcel UltraCam-D digital camera, was created in INPHO's Match-T. This program optimizes the configuration of the image-matching process, which ensures high accuracy and minimizes gap areas. The accuracy of this process was analyzed by comparing the DSM generated in Match-T with the DSM generated from ALS data. Because of the intended further use of the generated DSM, it was decided to create the model in a GRID structure with a cell size of 1 m. With this parameter a differential model from both DSMs was also built, which allowed the relative accuracy of the compared models to be determined. The analysis indicates that DSM generation with the multi-image matching method is competitive with creating the same surface model from ALS data. Thus, when digital images with high overlap are available, the additional acquisition of ALS data seems to be unnecessary.
Zhao, Yinzhi; Zhang, Peng; Guo, Jiming; Li, Xin; Wang, Jinling; Yang, Fei; Wang, Xinzhe
2018-06-20
Due to the great influence of multipath effects, noise, and clock errors on pseudorange measurements, the carrier-phase double-difference equation is widely used in high-precision indoor pseudolite positioning. The initial position is mostly determined by the known point initialization (KPI) method, after which the ambiguities can be fixed with the LAMBDA method. In this paper, a new method that achieves high-precision indoor pseudolite positioning without the KPI is proposed. The initial coordinates can be obtained quickly enough to meet the accuracy requirement of the indoor LAMBDA method. The detailed process of the method is as follows. Aiming at a low-cost single-frequency pseudolite system, the static differential pseudolite system (DPL) method is used to quickly obtain low-accuracy positioning coordinates of the rover station. Then, the ambiguity function method (AFM) is used to search for the coordinates in the corresponding epoch. The coordinates obtained by the AFM meet the initial accuracy requirement of the LAMBDA method, so that the double-difference carrier-phase ambiguities can be correctly fixed. Following these steps, high-precision indoor pseudolite positioning can be realized. Several experiments, including static and dynamic tests, were conducted to verify the feasibility of the new method. According to the results, initial coordinates with decimeter-level accuracy can be obtained through the DPL. For the AFM part, a one-meter search scope with two- or four-centimeter search steps is used to ensure centimeter-level precision together with high search efficiency. After dealing with the problem of multiple peaks caused by the cosine form of the ambiguity function, the coordinate information of the maximum ambiguity function value (AFV) is taken as the initial value for LAMBDA, and the ambiguities can be fixed quickly.
The new method provides accuracies at the centimeter level for dynamic experiments and at the millimeter level for static ones.
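The ambiguity function search at the heart of the method can be sketched as follows. The AFV of a candidate position sums cosine terms that are insensitive to the integer ambiguities, so a grid search over the ~1 m scope at 2-4 cm steps peaks near the true position. This is a schematic single-receiver version with made-up geometry, not the authors' double-difference implementation:

```python
import math

def ambiguity_function_value(candidate, phases, stations, wavelength):
    """Ambiguity function value (AFV) for one candidate rover position.
    Each term is the cosine of the misfit between the observed carrier
    phase (in cycles) and the predicted geometric range; adding any
    integer number of cycles leaves the term unchanged, so the AFV is
    insensitive to the unknown integer ambiguities."""
    afv = 0.0
    for phi, station in zip(phases, stations):
        misfit = phi - math.dist(candidate, station) / wavelength
        afv += math.cos(2.0 * math.pi * misfit)
    return afv
```

In use, the AFV is evaluated on a grid of candidate positions around the coarse DPL solution, and the maximizer seeds the LAMBDA ambiguity resolution.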
Assessing and Ensuring GOES-R Magnetometer Accuracy
NASA Technical Reports Server (NTRS)
Carter, Delano R.; Todirita, Monica; Kronenwetter, Jeffrey; Chu, Donald
2016-01-01
The GOES-R magnetometer subsystem accuracy requirement is 1.7 nanoteslas (nT). During quiet times (100 nT), accuracy is defined as the absolute mean plus 3 sigma; during storms (300 nT), it is defined as the absolute mean plus 2 sigma. Error comes both from outside the magnetometers, e.g., spacecraft fields and misalignments, and from inside, e.g., zero offset and scale factor errors. Because zero offset and scale factor drift over time, annual calibration maneuvers will be necessary. To predict performance before launch, we have used Monte Carlo simulations and covariance analysis. Both behave as expected, and their accuracy predictions agree within 30%. With the proposed calibration regimen, both suggest that the GOES-R magnetometer subsystem will meet its accuracy requirements.
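The accuracy definition used above (absolute mean plus 2 or 3 sigma) is easy to evaluate in a Monte Carlo setting; a minimal sketch with hypothetical bias and noise values follows (not the GOES-R analysis itself):

```python
import random
import statistics

def magnetometer_accuracy(errors, n_sigma):
    """Accuracy as defined for GOES-R: absolute mean error plus
    n_sigma standard deviations (3 for quiet times, 2 for storms)."""
    return abs(statistics.mean(errors)) + n_sigma * statistics.pstdev(errors)

# Monte Carlo sketch with a hypothetical 0.2 nT bias and 0.4 nT noise.
random.seed(0)
errors = [0.2 + random.gauss(0.0, 0.4) for _ in range(10000)]
quiet_accuracy = magnetometer_accuracy(errors, 3)  # compare to 1.7 nT
```

A covariance analysis would instead propagate the error budget analytically; the Monte Carlo estimate above is the empirical counterpart that the two-way 30% agreement refers to.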
The utility of low-density genotyping for imputation in the Thoroughbred horse
2014-01-01
Background Despite the dramatic reduction in the cost of high-density genotyping that has occurred over the last decade, it remains one of the limiting factors for obtaining the large datasets required for genomic studies of disease in the horse. In this study, we investigated the potential for low-density genotyping and subsequent imputation to address this problem. Results Using the haplotype phasing and imputation program, BEAGLE, it is possible to impute genotypes from low- to high-density (50K) in the Thoroughbred horse with reasonable to high accuracy. Analysis of the sources of variation in imputation accuracy revealed dependence both on the minor allele frequency of the single nucleotide polymorphisms (SNPs) being imputed and on the underlying linkage disequilibrium structure. Whereas equidistant spacing of the SNPs on the low-density panel worked well, optimising SNP selection to increase their minor allele frequency was advantageous, even when the panel was subsequently used in a population of different geographical origin. Replacing base pair position with linkage disequilibrium map distance reduced the variation in imputation accuracy across SNPs. Whereas a 1K SNP panel was generally sufficient to ensure that more than 80% of genotypes were correctly imputed, other studies suggest that a 2K to 3K panel is more efficient to minimize the subsequent loss of accuracy in genomic prediction analyses. The relationship between accuracy and genotyping costs for the different low-density panels suggests that a 2K SNP panel would represent good value for money. Conclusions Low-density genotyping with a 2K SNP panel followed by imputation provides a compromise between cost and accuracy that could promote more widespread genotyping, and hence the use of genomic information in horses.
In addition to offering a low cost alternative to high-density genotyping, imputation provides a means to combine datasets from different genotyping platforms, which is becoming necessary since researchers are starting to use the recently developed equine 70K SNP chip. However, more work is needed to evaluate the impact of between-breed differences on imputation accuracy. PMID:24495673
NASA Technical Reports Server (NTRS)
Strapp, John W.; Lilie, Lyle E.; Ratvasky, Thomas P.; Davison, Craig R.; Dumont, Christopher J.
2016-01-01
A new Isokinetic Total Water Content Evaporator (IKP2) was downsized from a prototype instrument, specifically to make airborne measurements of hydrometeor total water content (TWC) in deep tropical convective clouds to assess the new ice crystal Appendix D icing envelope. The probe underwent numerous laboratory and wind tunnel investigations to ensure reliable operation under the difficult high altitude/speed/TWC conditions under which other TWC instruments have been known to either fail, or have unknown performance characteristics. The article tracks the testing and modifications of the IKP2 probe to ensure its readiness for three flight campaigns in 2014 and 2015. Comparisons are made between the IKP2 and the NASA Icing Research Tunnel reference values in liquid conditions, and to an exploratory technique estimating ice water content from a bulk ice capture cylinder method in glaciated conditions. These comparisons suggest that the initial target of 20 percent accuracy in TWC has been achieved and likely exceeded for tested TWC values in excess of about 0.5 g m-3. Uncertainties in the ice water content reference method have been identified. Complications are introduced in the necessary subtraction of an independently measured background water vapour concentration, errors of which are small at the colder flight temperatures, but increase rapidly with increasing temperature, and ultimately limit the practical use of the instrument in a tropical convective atmosphere to conditions colder than about 0 degrees C. A companion article in this conference traces the accuracy of the components of the IKP2 to derive estimated system accuracy.
Design and calibration of a novel transient radiative heat flux meter for a spacecraft thermal test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheng, Chunchen; Hu, Peng, E-mail: hupeng@ustc.edu.cn; Cheng, Xiaofang
2016-06-15
Radiative heat flux measurement is significantly important for a spacecraft thermal test. To satisfy the requirements of both high accuracy and fast response, a novel transient radiative heat flux meter was developed. Its thermal receiver consists of a central thermal receiver and two thermally guarded annular plates, which ensure that the temperature distribution of the central thermal receiver is uniform enough for the lumped heat capacity method to be reasonably applied in a transient radiative heat flux measurement. This design can also take accurate measurements regardless of spacecraft surface temperature and incident radiation spectrum. The measurement principle was elaborated and the coefficients were calibrated. Experimental results from testing a blackbody furnace and a xenon lamp show that this novel transient radiative heat flux meter can measure transient radiative heat flux up to 1400 W/m2 with high accuracy and a response time of less than 10 s.
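The lumped heat capacity method mentioned above reduces to a simple relation: with a uniform receiver temperature, the absorbed flux equals the receiver's heat capacity per unit area times its rate of temperature rise. A minimal sketch with illustrative numbers, not the calibrated coefficients of the actual meter:

```python
def radiative_flux_lumped(mass, spec_heat, area, dT_dt, absorptivity=1.0):
    """Lumped-heat-capacity estimate of incident radiative flux (W/m^2):
    all absorbed radiation goes into heating the uniform-temperature
    central receiver. The guard plates justify neglecting lateral losses.
    mass [kg], spec_heat [J/(kg K)], area [m^2], dT_dt [K/s]."""
    return mass * spec_heat * dT_dt / (area * absorptivity)

# Illustrative receiver: 10 g, c = 400 J/(kg K), 20 cm^2, heating at 0.7 K/s.
flux = radiative_flux_lumped(0.01, 400.0, 0.002, 0.7)
```

In a real calibration, the absorptivity and any residual loss terms would be folded into empirically determined coefficients, as the abstract describes.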
Relevance of deep learning to facilitate the diagnosis of HER2 status in breast cancer
NASA Astrophysics Data System (ADS)
Vandenberghe, Michel E.; Scott, Marietta L. J.; Scorer, Paul W.; Söderberg, Magnus; Balcerzak, Denis; Barker, Craig
2017-04-01
Tissue biomarker scoring by pathologists is central to defining the appropriate therapy for patients with cancer. Yet, inter-pathologist variability in the interpretation of ambiguous cases can affect diagnostic accuracy. Modern artificial intelligence methods such as deep learning have the potential to supplement pathologist expertise to ensure constant diagnostic accuracy. We developed a computational approach based on deep learning that automatically scores HER2, a biomarker that defines patient eligibility for anti-HER2 targeted therapies in breast cancer. In a cohort of 71 breast tumour resection samples, automated scoring showed a concordance of 83% with a pathologist. The twelve discordant cases were then independently reviewed, leading to a modification of diagnosis from initial pathologist assessment for eight cases. Diagnostic discordance was found to be largely caused by perceptual differences in assessing HER2 expression due to high HER2 staining heterogeneity. This study provides evidence that deep learning aided diagnosis can facilitate clinical decision making in breast cancer by identifying cases at high risk of misdiagnosis.
Development and Validation of a Kit to Measure Drink Antioxidant Capacity Using a Novel Colorimeter.
Priftis, Alexandros; Stagos, Dimitrios; Tzioumakis, Nikolaos; Konstantinopoulos, Konstantinos; Patouna, Anastasia; Papadopoulos, Georgios E; Tsatsakis, Aristides; Kouretas, Dimitrios
2016-08-30
Measuring the antioxidant capacity of foods is essential as a means of quality control, to ensure that the final product reaching the consumer is of a high standard. Despite the existing assays for estimating antioxidant activity, new, faster and lower-cost methods are always sought. We have therefore developed a novel colorimeter and combined it with a slightly modified DPPH assay, creating a kit that can assess the antioxidant capacity of liquids (e.g., different types of coffee, beer, wine, juices) in a fast and low-cost manner. The accuracy of the colorimeter was ensured by comparing it to a fully validated Hitachi U-1900 spectrophotometer, and a coefficient was calculated to eliminate the observed differences. In addition, new, user-friendly software was developed to render the procedure as easy as possible while allowing central monitoring of the obtained results. Overall, a novel kit was developed with which the antioxidant activity of liquids can be measured, firstly to ensure their quality and secondly to assess the amount of antioxidants consumed with the respective food.
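The DPPH assay behind the kit is typically expressed as percent radical scavenging; a sketch of that calculation, including a correction coefficient of the kind the authors use to align the colorimeter with the reference spectrophotometer, might look like this (the coefficient value and absorbances are hypothetical):

```python
def dpph_inhibition(a_control, a_sample, coeff=1.0):
    """Percent DPPH radical scavenging from absorbance readings:
    the more antioxidant in the sample, the more the purple DPPH
    radical is reduced and the lower the sample absorbance.
    `coeff` is an empirical factor that reconciles the low-cost
    colorimeter's readings with a reference spectrophotometer."""
    corrected = a_sample * coeff
    return (a_control - corrected) / a_control * 100.0
```

For example, a sample absorbance of 0.4 against a control of 1.0 corresponds to 60% scavenging before correction.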
Rigorous Training of Dogs Leads to High Accuracy in Human Scent Matching-To-Sample Performance
Marchal, Sophie; Bregeras, Olivier; Puaux, Didier; Gervais, Rémi; Ferry, Barbara
2016-01-01
Human scent identification is based on a matching-to-sample task in which trained dogs are required to compare a scent sample collected from an object found at a crime scene to that of a suspect. Based on dogs' superior olfactory ability to detect and process odours, this method has been used in forensic investigations to identify the odour of a suspect at a crime scene. The excellent reliability and reproducibility of the method largely depend on rigour in dog training. The present study describes the various steps of training that lead to high sensitivity scores, with dogs matching samples with 90% efficiency when the complexity of the scent presented in the sample is similar to that presented in the lineups, and specificity reaching a ceiling, with no false alarms in human scent matching-to-sample tasks. This high level of accuracy ensures reliable results in judicial human scent identification tests. Our data should also convince law enforcement authorities to use these results as official forensic evidence when dogs are trained appropriately. PMID:26863620
A fully convolutional network for weed mapping of unmanned aerial vehicle (UAV) imagery.
Huang, Huasheng; Deng, Jizhong; Lan, Yubin; Yang, Aqing; Deng, Xiaoling; Zhang, Lei
2018-01-01
Appropriate Site-Specific Weed Management (SSWM) is crucial to ensuring crop yields. Within SSWM of large-scale areas, remote sensing is a key technology for providing accurate weed distribution information. Compared with satellite and piloted-aircraft remote sensing, an unmanned aerial vehicle (UAV) is capable of capturing high-spatial-resolution imagery, which provides more detailed information for weed mapping. The objective of this paper is to generate an accurate weed cover map based on UAV imagery. The UAV RGB imagery was collected in October 2017 over a rice field located in South China. The Fully Convolutional Network (FCN) method was proposed for weed mapping of the collected imagery. Transfer learning was used to improve generalization capability, and skip architecture was applied to increase prediction accuracy. The performance of the FCN architecture was then compared with the Patch_based CNN algorithm and the Pixel_based CNN method. Experimental results showed that our FCN method outperformed the others in both accuracy and efficiency. The overall accuracy of the FCN approach was up to 0.935 and the accuracy for weed recognition was 0.883, which means that this algorithm is capable of generating accurate weed cover maps for the evaluated UAV imagery.
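The accuracy figures reported above (overall accuracy 0.935, weed-class accuracy 0.883) are standard confusion-matrix metrics; a minimal sketch with made-up pixel counts:

```python
def overall_accuracy(confusion):
    """Overall accuracy from a confusion matrix (rows = reference class,
    columns = predicted class): correctly labelled pixels / all pixels."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

def class_accuracy(confusion, cls):
    """Per-class (producer's) accuracy, e.g. for the weed class:
    pixels correctly labelled as `cls` / reference pixels of `cls`."""
    return confusion[cls][cls] / sum(confusion[cls])

# Hypothetical 2-class counts: row 0 = background, row 1 = weed.
cm = [[90, 10],
      [20, 80]]
```

Whether 0.883 is a producer's or user's accuracy is not stated in the abstract; the sketch shows the producer's variant.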
Application of particle splitting method for both hydrostatic and hydrodynamic cases in SPH
NASA Astrophysics Data System (ADS)
Liu, W. T.; Sun, P. N.; Ming, F. R.; Zhang, A. M.
2018-01-01
The smoothed particle hydrodynamics (SPH) method with numerical diffusive terms shows satisfactory stability and accuracy in some violent fluid-solid interaction problems. However, most simulations use uniform particle distributions, and multi-resolution, which can markedly improve local accuracy and overall computational efficiency, has seldom been applied. In this paper, a dynamic particle splitting method is applied that allows for the simulation of both hydrostatic and hydrodynamic problems. In the splitting algorithm, when a coarse (mother) particle enters the splitting region, it is split into four daughter particles that inherit the physical parameters of the mother particle. In the particle splitting process, conservation of mass, momentum and energy is ensured. Based on an error analysis, the splitting technique is designed to achieve optimal accuracy at the interface between the coarse and refined particles, which is particularly important in the simulation of hydrostatic cases. Finally, the scheme is validated with five basic cases, which demonstrate that the present SPH model with a particle splitting technique achieves high accuracy and efficiency and is capable of simulating a wide range of hydrodynamic problems.
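The 1-to-4 splitting step with mass and momentum conservation can be sketched as follows. The daughter placement and smoothing-length ratio here are illustrative assumptions, not the optimized values from the paper's error analysis:

```python
def split_particle(x, y, mass, vx, vy, h, separation=0.35):
    """Split one coarse 2D SPH particle into four daughters placed on a
    square around the mother. Each daughter takes mass/4 and inherits the
    mother's velocity, so total mass and momentum (and the centre of
    mass) are conserved exactly. `separation` (in units of the smoothing
    length h) and the h/2 daughter smoothing length are assumed
    placement parameters for illustration."""
    d = separation * h
    offsets = [(-d, -d), (-d, d), (d, -d), (d, d)]
    return [{"x": x + ox, "y": y + oy, "mass": mass / 4.0,
             "vx": vx, "vy": vy, "h": h / 2.0} for ox, oy in offsets]
```

Energy conservation follows from the inherited velocities and equal mass shares for kinetic energy; internal-energy bookkeeping depends on the equation of state and is omitted here.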
ERIC Educational Resources Information Center
Akbarzadeh, Roya; Saeidi, Mahnaz; Chehreh, Mahtaj
2014-01-01
The role of teacher-student interaction and collaboration in solving linguistic problems has recently been in the center of SLA research. Accordingly, this study investigated the effect of Oral Interactive Feedback (OIF) on the accuracy and complexity of Iranian intermediate EFL learners' writing. After ensuring the homogeneity using Preliminary…
Wilson, Gary L.; Richards, Joseph M.
2006-01-01
Because of the increasing use and importance of lakes for water supply to communities, a repeatable and reliable procedure to determine lake bathymetry and capacity is needed. A method to determine the accuracy of the procedure will help ensure proper collection and use of the data and resulting products. It is important to clearly define the intended products and desired accuracy before conducting the bathymetric survey to ensure proper data collection. A survey-grade echo sounder and differential global positioning system receivers were used to collect water-depth and position data in December 2003 at Sugar Creek Lake near Moberly, Missouri. Data were collected along planned transects, with an additional set of quality-assurance data collected for use in accuracy computations. All collected data were imported into a geographic information system database. A bathymetric surface model, contour map, and area/capacity tables were created from the geographic information system database. An accuracy assessment was completed on the collected data, bathymetric surface model, area/capacity table, and contour map products. Using established vertical accuracy standards, the accuracy of the collected data, bathymetric surface model, and contour map product was 0.67 foot, 0.91 foot, and 1.51 feet at the 95 percent confidence level. By comparing results from different transect intervals with the quality-assurance transect data, it was determined that a transect interval of 1 percent of the longitudinal length of Sugar Creek Lake produced nearly as good results as 0.5 percent transect interval for the bathymetric surface model, area/capacity table, and contour map products.
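The 95-percent-confidence vertical accuracy cited above follows the usual NSSDA-style computation, 1.96 times the RMSE of the differences between quality-assurance check depths and the tested data or surface; a minimal sketch:

```python
import math

def vertical_accuracy_95(check_errors):
    """NSSDA-style vertical accuracy at the 95% confidence level:
    1.96 times the RMSE of the differences between surveyed
    quality-assurance depths and the tested dataset or model."""
    rmse = math.sqrt(sum(e * e for e in check_errors) / len(check_errors))
    return 1.96 * rmse
```

The same statistic can be computed separately for the raw soundings, the interpolated surface model, and the contour product, which is presumably how the 0.67, 0.91, and 1.51 ft figures were obtained.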
Results of the first complete static calibration of the RSRA rotor-load-measurement system
NASA Technical Reports Server (NTRS)
Acree, C. W., Jr.
1984-01-01
The compound Rotor System Research Aircraft (RSRA) is designed to make high-accuracy, simultaneous measurements of all rotor forces and moments in flight. Physical calibration of the rotor force- and moment-measurement system when installed in the aircraft is required to account for known errors and to ensure that measurement-system accuracy is traceable to the National Bureau of Standards. The first static calibration and associated analysis have been completed with good results. Hysteresis was a potential cause of static calibration errors, but was found to be negligible in flight compared to full-scale loads, and analytical methods have been devised to eliminate hysteresis effects on calibration data. Flight tests confirmed that the calibrated rotor-load-measurement system performs as expected in flight and that it can dependably make direct measurements of fuselage vertical drag in hover.
Radiometric Characterization of the IKONOS, QuickBird, and OrbView-3 Sensors
NASA Technical Reports Server (NTRS)
Holekamp, Kara
2006-01-01
Radiometric calibration of commercial imaging satellite products is required to ensure that science and application communities can better understand their properties. Inaccurate radiometric calibrations can lead to erroneous decisions and invalid conclusions and can limit intercomparisons with other systems. To address this calibration need, satellite at-sensor radiance values were compared to those estimated by each independent team member to determine the sensor's radiometric accuracy. The combined results of this evaluation provide the user community with an independent assessment of these commercially available high spatial resolution sensors' absolute calibration values.
The Influence of Nutrition Labeling and Point-of-Purchase Information on Food Behaviours.
Volkova, Ekaterina; Ni Mhurchu, Cliona
2015-03-01
Point-of-purchase information on packaged food has been a highly debated topic. Various types of nutrition labels and point-of-purchase information have been studied to determine their ability to attract consumers' attention, be well understood and promote healthy food choices. Country-specific regulatory and monitoring frameworks have been implemented to ensure reliability and accuracy of such information. However, the impact of such information on consumers' behaviour remains contentious. This review summarizes recent evidence on the real-world effectiveness of nutrition labels and point-of-purchase information.
Zhang, Nan; Zhou, Juan; Yu, Jinlai; Hua, Ziyu; Li, Yongxue; Wu, Jiangang
2018-05-30
The medical injection pump is a commonly used, high-risk clinical device. Accurate detection of flow is an important aspect of ensuring its reliable operation. In this paper, we carefully studied and analyzed the flow detection methods of the three standards currently used for medical injection pump testing in our country. The three standards were compared in terms of standard devices, flow test point selection, length of test time and accuracy judgment. The advantages and disadvantages of these standards were analyzed, and suggestions for improvement were put forward.
Dictionary-Based Tensor Canonical Polyadic Decomposition
NASA Astrophysics Data System (ADS)
Cohen, Jeremy Emile; Gillis, Nicolas
2018-04-01
To ensure interpretability of the sources extracted by tensor decomposition, we introduce in this paper a dictionary-based tensor canonical polyadic decomposition which enforces one factor to belong exactly to a known dictionary. A new formulation of sparse coding is proposed which enables dictionary-based canonical polyadic decomposition of high-dimensional tensors. The benefits of using a dictionary in tensor decomposition models are explored in terms of both parameter identifiability and estimation accuracy. The performance of the proposed algorithms is evaluated on the decomposition of simulated data and the unmixing of hyperspectral images.
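The dictionary constraint itself can be sketched as snapping each column of an estimated factor to its best-matching atom; this is only the constraint the decomposition enforces, not the authors' full algorithm.

```python
import numpy as np

def project_factor_to_dictionary(A, D):
    """Replace each column of factor A by its most correlated dictionary atom.

    A: (n, r) estimated factor; D: (n, m) dictionary.
    Returns (A_proj, idx) with A_proj[:, j] = D[:, idx[j]].
    """
    Dn = D / np.linalg.norm(D, axis=0, keepdims=True)  # unit-norm atoms for scoring
    corr = np.abs(Dn.T @ A)                            # (m, r) correlation scores
    idx = np.argmax(corr, axis=0)
    return D[:, idx], idx
```

In a full decomposition this projection would alternate with least-squares updates of the unconstrained factors.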
Murphy, S F; Lenihan, L; Orefuwa, F; Colohan, G; Hynes, I; Collins, C G
2017-05-01
The discharge letter is a key component of the communication pathway between the hospital and primary care. Accuracy and timeliness of delivery are crucial to ensure continuity of patient care. Electronic discharge summaries (EDS) and prescriptions have been shown to improve quality of discharge information for general practitioners (GPs). The aim of this study was to evaluate the effect of a new EDS on GP satisfaction levels and accuracy of discharge diagnosis. A GP survey was carried out whereby semi-structured interviews were conducted with 13 GPs from three primary care centres who receive a high volume of discharge letters from the hospital. A chart review was carried out on 90 charts to compare the accuracy of ICD-10 coding by Non-Consultant Hospital Doctors (NCHDs) with that of trained Hospital In-Patient Enquiry (HIPE) coders. GP satisfaction levels were over 90 % with most aspects of the EDS, including amount of information (97 %), accuracy (95 %), GP information and follow-up (97 %) and medications (91 %). 70 % of GPs received the EDS within 2 weeks. ICD-10 coding of discharge diagnosis by NCHDs had an accuracy of 33 %, compared with 95.6 % when done by trained coders (p < 0.00001). The introduction of the EDS and prescription has led to improved quality and timeliness of communication with primary care, and to very high satisfaction ratings among GPs. ICD-10 coding was found to be grossly inaccurate when carried out by NCHDs, and this task is more appropriately carried out by trained coders.
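The reported gap (33 % vs. 95.6 %, p < 0.00001) can be checked with a standard two-proportion z-test; this sketch assumes two groups of 90 charts each, which may not match the study's exact design.

```python
from math import sqrt, erf

def two_proportion_z(p1, n1, p2, n2):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    x1, x2 = p1 * n1, p2 * n2
    p_pool = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2)) # pooled standard error
    z = (p2 - p1) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

With 0.33 vs. 0.956 at n = 90 each, z exceeds 8, consistent with the tiny reported p-value.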
A method which can enhance the optical-centering accuracy
NASA Astrophysics Data System (ADS)
Zhang, Xue-min; Zhang, Xue-jun; Dai, Yi-dan; Yu, Tao; Duan, Jia-you; Li, Hua
2014-09-01
Optical alignment machining is an effective method to ensure the co-axiality of an optical system. The co-axiality accuracy is determined by the optical-centering accuracy of each single optical unit, which in turn depends on the rotating accuracy of the lathe and on the optical-centering judgment accuracy. When a rotating accuracy of 0.2 μm is achieved, the leading error can be ignored. An axis-determination tool based on the principle of auto-collimation is designed to determine the unique position of the centerscope, that is, the position at which the optical axis of the centerscope coincides with the rotating axis of the lathe. A new optical-centering judgment method is also presented. A system combining the axis-determination tool with the new optical-centering judgment method can enhance the optical-centering accuracy to 0.003 mm.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Anthony; Ravi, Ananth
2014-08-15
High dose rate (HDR) remote afterloading brachytherapy involves sending a small, high-activity radioactive source attached to a cable to different positions within a hollow applicator implanted in the patient. It is critical that the source position within the applicator and the dwell time of the source are accurate. Daily quality assurance (QA) tests of the positional and dwell time accuracy are essential to ensure that the accuracy of the remote afterloader is not compromised prior to patient treatment. Our centre has developed an automated, video-based QA system for HDR brachytherapy that is dramatically superior to existing diode or film QA solutions in terms of cost, objectivity, and positional accuracy, with additional functionalities such as being able to determine source dwell time and transit time of the source. In our system, a video is taken of the brachytherapy source as it is sent out through a position check ruler, with the source visible through a clear window. Using a proprietary image analysis algorithm, the source position is determined with respect to time as it moves to different positions along the check ruler. The total material cost of the video-based system was under $20, consisting of a commercial webcam and adjustable stand. The accuracy of the position measurement is ±0.2 mm, and the time resolution is 30 msec. Additionally, our system is capable of robustly verifying the source transit time and velocity (a test required by the AAPM and CPQR recommendations), which is currently difficult to perform accurately.
How automated access verification can help organizations demonstrate HIPAA compliance: A case study.
Hill, Linda
2006-01-01
This case study of Sharp HealthCare takes an in-depth look at how the organization has embedded security policies into its business processes and automated workflow to ensure users are granted only the IT access that is necessary for them to perform their jobs, and to ensure patient privacy. Some of the most pressing audit and compliance concerns in healthcare organizations today revolve around the need to constantly review and give an account for users' IT access. The need for this at Sharp is exacerbated because of the high rate of change within the organization and the large percentage of non-employee staff such as traveling nurses moving throughout hospital departments on their rotations. By implementing a software solution to verify the accuracy of user access rights or automatically initiate appropriate corrective actions, Sharp is now able to extend the responsibility and accountability for compliance to the most appropriate resources.
NASA Technical Reports Server (NTRS)
Mathews, Kenneth W.; Wachter, James R.
2018-01-01
The purpose of this NASA Technical Standard is to ensure the accuracy of measurements affecting safety and mission success through the proper selection, calibration, and use of Measuring and Test Equipment (MTE).
Performance of the fiber-optic low-coherent ground settlement sensor: From lab to field
NASA Astrophysics Data System (ADS)
Guo, Jingjing; Tan, Yanbin; Peng, Li; Chen, Jisong; Wei, Chuanjun; Zhang, Pinglei; Zhang, Tianhang; Alrabeei, Salah; Zhang, Zhe; Sun, Changsen
2018-04-01
A fiber-optic low-coherent interferometry sensor was developed to measure ground settlement (GS) with micrometer accuracy. The sensor combined optical techniques with liquid-containing chambers that were hydraulically connected at the bottom by a water-filled tube, so that the liquid surfaces inside all chambers were initially at the same level. Optical interferometry was employed to read out the liquid-level changes that follow the GS at the location where each chamber is placed, and the GS was calculated from these readings. Laboratory tests had demonstrated the sensor's potential for practical application. Here, denoising algorithms tailored to the specific measurement environment were applied to ensure the accuracy and stability of the system in the field. We then extended the technique to a high-speed railway. A five-day continuous measurement proved that the designed system can monitor the GS of high-speed railway piers, reaching an accuracy of ±70 μm in the field with a reference compensation sensor. The sensor's performance is therefore suitable for GS monitoring in high-speed railways, where the difficulty lies in meeting the monitoring requirements of both a large spatial span and quite tiny, slow changes.
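The reference-compensation step described for field deployment can be sketched as subtracting the compensation sensor's signal and applying simple moving-average denoising; this is an illustrative stand-in for the paper's environment-specific algorithms.

```python
import numpy as np

def settlement(chamber_um, reference_um, window=5):
    """Reference-compensated, moving-average-denoised settlement signal (μm)."""
    compensated = np.asarray(chamber_um, float) - np.asarray(reference_um, float)
    kernel = np.ones(window) / window          # simple boxcar denoiser
    return np.convolve(compensated, kernel, mode="valid")
```

Common-mode drifts (temperature, barometric effects) seen by both the chamber and the reference cancel in the subtraction, leaving the true settlement.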
Reweighted mass center based object-oriented sparse subspace clustering for hyperspectral images
NASA Astrophysics Data System (ADS)
Zhai, Han; Zhang, Hongyan; Zhang, Liangpei; Li, Pingxiang
2016-10-01
Considering the inevitable obstacles faced by the pixel-based clustering methods, such as salt-and-pepper noise, high computational complexity, and the lack of spatial information, a reweighted mass center based object-oriented sparse subspace clustering (RMC-OOSSC) algorithm for hyperspectral images (HSIs) is proposed. First, the mean-shift segmentation method is utilized to oversegment the HSI to obtain meaningful objects. Second, a distance reweighted mass center learning model is presented to extract the representative and discriminative features for each object. Third, assuming that all the objects are sampled from a union of subspaces, it is natural to apply the SSC algorithm to the HSI. Faced with the high correlation among the hyperspectral objects, a weighting scheme is adopted to ensure that the highly correlated objects are preferred in the procedure of sparse representation, to reduce the representation errors. Two widely used hyperspectral datasets were utilized to test the performance of the proposed RMC-OOSSC algorithm, obtaining high clustering accuracies (overall accuracy) of 71.98% and 89.57%, respectively. The experimental results show that the proposed method clearly improves the clustering performance with respect to the other state-of-the-art clustering methods, and it significantly reduces the computational time.
NASA Astrophysics Data System (ADS)
Balsara, Dinshaw S.; Amano, Takanobu; Garain, Sudip; Kim, Jinho
2016-08-01
In various astrophysics settings it is common to have a two-fluid relativistic plasma that interacts with the electromagnetic field. While it is common to ignore the displacement current in the ideal, classical magnetohydrodynamic limit, when the flows become relativistic this approximation is less than absolutely well-justified. In such a situation, it is more natural to consider a positively charged fluid made up of positrons or protons interacting with a negatively charged fluid made up of electrons. The two fluids interact collectively with the full set of Maxwell's equations. As a result, a solution strategy for that coupled system of equations is sought and found here. Our strategy extends to higher orders, providing increasing accuracy. The primary variables in the Maxwell solver are taken to be the facially-collocated components of the electric and magnetic fields. Consistent with such a collocation, three important innovations are reported here. The first two pertain to the Maxwell solver. In our first innovation, the magnetic field within each zone is reconstructed in a divergence-free fashion while the electric field within each zone is reconstructed in a form that is consistent with Gauss' law. In our second innovation, a multidimensionally upwinded strategy is presented which ensures that the magnetic field can be updated via a discrete interpretation of Faraday's law and the electric field can be updated via a discrete interpretation of the generalized Ampere's law. This multidimensional upwinding is achieved via a multidimensional Riemann solver. The multidimensional Riemann solver automatically provides edge-centered electric field components for the Stokes law-based update of the magnetic field. It also provides edge-centered magnetic field components for the Stokes law-based update of the electric field. The update strategy ensures that the electric field is always consistent with Gauss' law and the magnetic field is always divergence-free. 
This collocation also ensures that electromagnetic radiation that is propagating in a vacuum has both electric and magnetic fields that are exactly divergence-free. Coupled relativistic fluid dynamic equations are solved for the positively and negatively charged fluids. The fluids' numerical fluxes also provide a self-consistent current density for the update of the electric field. Our reconstruction strategy ensures that fluid velocities always remain sub-luminal. Our third innovation consists of an efficient design for several popular IMEX schemes so that they provide strong coupling between the finite-volume-based fluid solver and the electromagnetic fields at high order. This innovation makes it possible to efficiently utilize high order IMEX time update methods for stiff source terms in the update of high order finite-volume methods for hyperbolic conservation laws. We also show that this very general innovation should extend seamlessly to Runge-Kutta discontinuous Galerkin methods. The IMEX schemes enable us to use large CFL numbers even in the presence of stiff source terms. Several accuracy analyses are presented showing that our method meets its design accuracy in the MHD limit as well as in the limit of electromagnetic wave propagation. Several stringent test problems are also presented. We also present a relativistic version of the GEM problem, which shows that our algorithm can successfully adapt to challenging problems in high energy astrophysics.
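The divergence-free property of the face-collocated magnetic field can be illustrated with a small constrained-transport-style check in 2-D: face fields built from a node-centered potential have exactly zero discrete divergence. This sketches the collocation idea only, not the authors' solver.

```python
import numpy as np

def face_B_from_potential(Az, dx, dy):
    """Face-centered (Bx, By) from a node-centered potential Az (2-D)."""
    Bx = (Az[:, 1:] - Az[:, :-1]) / dy   # Bx lives on x-faces
    By = -(Az[1:, :] - Az[:-1, :]) / dx  # By lives on y-faces
    return Bx, By

def cell_divergence(Bx, By, dx, dy):
    """Discrete div B per cell; telescoping differences cancel exactly."""
    return (Bx[1:, :] - Bx[:-1, :]) / dx + (By[:, 1:] - By[:, :-1]) / dy
```

Because each cell's divergence is a telescoping sum of potential differences, it vanishes to machine precision for any Az, mirroring the update strategy's guarantee.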
NASA Astrophysics Data System (ADS)
Chiron, L.; Oger, G.; de Leffe, M.; Le Touzé, D.
2018-02-01
While smoothed-particle hydrodynamics (SPH) simulations are usually performed using uniform particle distributions, local particle refinement techniques have been developed to concentrate fine spatial resolutions in identified areas of interest. Although the formalism of this method is relatively easy to implement, its robustness at coarse/fine interfaces can be problematic. Analysis performed in [16] shows that the radius of refined particles should be greater than half the radius of unrefined particles to ensure robustness. In this article, the basics of an Adaptive Particle Refinement (APR) technique, inspired by AMR in mesh-based methods, are presented. This approach ensures robustness with alleviated constraints. Simulations applying the new formalism proposed achieve accuracy comparable to fully refined spatial resolutions, together with robustness, low CPU times and maintained parallel efficiency.
Assessing Videogrammetry for Static Aeroelastic Testing of a Wind-Tunnel Model
NASA Technical Reports Server (NTRS)
Spain, Charles V.; Heeg, Jennifer; Ivanco, Thomas G.; Barrows, Danny A.; Florance, James R.; Burner, Alpheus W.; DeMoss, Joshua; Lively, Peter S.
2004-01-01
The Videogrammetric Model Deformation (VMD) technique, developed at NASA Langley Research Center, was recently used to measure displacements and local surface angle changes on a static aeroelastic wind-tunnel model. The results were assessed for consistency, accuracy and usefulness. Vertical displacement measurements and surface angular deflections (derived from vertical displacements) taken at no-wind/no-load conditions were analyzed. For accuracy assessment, angular measurements were compared to those from a highly accurate accelerometer. Shewhart's Variables Control Charts were used in the assessment of consistency and uncertainty. Some bad data points were discovered, and it is shown that the measurement results at certain targets were more consistent than at other targets. Physical explanations for this lack of consistency have not been determined. However, overall the measurements were sufficiently accurate to be very useful in monitoring wind-tunnel model aeroelastic deformation and determining flexible stability and control derivatives. After a structural model component failed during a highly loaded condition, analysis of VMD data clearly indicated progressive structural deterioration as the wind-tunnel condition where failure occurred was approached. As a result, subsequent testing successfully incorporated near-real-time monitoring of VMD data in order to ensure structural integrity. The potential for higher levels of consistency and accuracy through the use of statistical quality control practices is discussed and recommended for future applications.
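The Shewhart chart used for the consistency assessment can be sketched as mean ±3σ limits with out-of-control flags; this is a generic individuals chart, not necessarily the exact variant used in the study.

```python
import numpy as np

def shewhart_limits(samples):
    """Individuals control chart: mean ±3 sigma limits and out-of-control flags."""
    x = np.asarray(samples, float)
    mu, sigma = x.mean(), x.std(ddof=1)
    lcl, ucl = mu - 3 * sigma, mu + 3 * sigma
    flags = (x < lcl) | (x > ucl)   # points outside the control limits
    return lcl, ucl, flags
```

Points flagged outside the limits correspond to the "bad data points" the assessment identified at inconsistent targets.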
Highly accurate fast lung CT registration
NASA Astrophysics Data System (ADS)
Rühaak, Jan; Heldmann, Stefan; Kipshagen, Till; Fischer, Bernd
2013-03-01
Lung registration in thoracic CT scans has received much attention in the medical imaging community. Possible applications range from follow-up analysis, motion correction for radiation therapy, monitoring of air flow and pulmonary function to lung elasticity analysis. In a clinical environment, runtime is always a critical issue, ruling out quite a few excellent registration approaches. In this paper, a highly efficient variational lung registration method based on minimizing the normalized gradient fields distance measure with curvature regularization is presented. The method ensures diffeomorphic deformations by an additional volume regularization. Supplemental user knowledge, like a segmentation of the lungs, may be incorporated as well. The accuracy of our method was evaluated on 40 test cases from clinical routine. In the EMPIRE10 lung registration challenge, our scheme ranks third, with respect to various validation criteria, out of 28 algorithms with an average landmark distance of 0.72 mm. The average runtime is about 1:50 min on a standard PC, making it by far the fastest approach of the top-ranking algorithms. Additionally, the ten publicly available DIR-Lab inhale-exhale scan pairs were registered to subvoxel accuracy at computation times of only 20 seconds. Our method thus combines very attractive runtimes with state-of-the-art accuracy in a unique way.
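The normalized gradient fields (NGF) distance the method minimizes can be sketched for 2-D images as below; the edge parameter `eta` and the 1 − (dot product)² form are one common formulation, assumed here rather than taken from the paper.

```python
import numpy as np

def ngf_distance(a, b, eta=0.1):
    """Normalized gradient field distance between two 2-D images.

    Smaller values mean better-aligned edges; eta damps noise gradients.
    """
    def ngf(img):
        gx, gy = np.gradient(np.asarray(img, float))
        norm = np.sqrt(gx**2 + gy**2 + eta**2)
        return gx / norm, gy / norm
    ax, ay = ngf(a)
    bx, by = ngf(b)
    # 1 - (cosine of gradient angle)^2: zero where edges are parallel
    return float(np.mean(1.0 - (ax * bx + ay * by) ** 2))
```

An image compared with itself yields a near-zero distance, while one compared with its transpose (edges rotated 90°) yields a large one.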
Ranging error analysis of single photon satellite laser altimetry under different terrain conditions
NASA Astrophysics Data System (ADS)
Huang, Jiapeng; Li, Guoyuan; Gao, Xiaoming; Wang, Jianmin; Fan, Wenfeng; Zhou, Shihong
2018-02-01
Single-photon satellite laser altimetry is based on the Geiger mode and is characterized by a small spot and a high repetition rate. In this paper, the ranging-error formula for sloped terrain is derived and evaluated numerically, and the Monte Carlo method is used to simulate measurements over different terrains. The experimental results show that ranging accuracy is not affected by spot size over flat terrain, but inclined terrain can increase the ranging error dramatically: with a satellite pointing angle of 0.001° and a terrain slope of about 12°, the ranging error reaches 0.5 m, and the accuracy requirement can no longer be met once the slope exceeds 70°. The Monte Carlo simulation results show that a single-photon laser altimetry satellite with a high repetition rate can improve ranging accuracy over complex terrain. To ensure 25 repeated observations of the same point, we derive, from the parameters of ICESat-2, the quantitative relation between footprint size, footprint spacing, and repetition frequency. These conclusions can provide a reference for the design and demonstration of a domestic single-photon laser altimetry satellite.
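The slope effect on single-photon ranging can be reproduced with a toy Monte Carlo: photons land uniformly in a circular footprint on a plane of slope θ, so the spread of returned ranges grows roughly as (r/2)·tan θ. This is an illustrative model, not the paper's simulation.

```python
import numpy as np

def range_spread(footprint_radius, slope_deg, n=100000, seed=0):
    """Monte Carlo spread (std, in m) of single-photon ranges over a sloped footprint."""
    rng = np.random.default_rng(seed)
    # uniform sampling inside a disc of the given radius
    r = footprint_radius * np.sqrt(rng.random(n))
    phi = 2 * np.pi * rng.random(n)
    x = r * np.cos(phi)                          # coordinate along the slope direction
    dz = x * np.tan(np.radians(slope_deg))       # elevation offset across the slope
    return dz.std()
```

Flat terrain gives zero spread regardless of spot size, while the spread grows with slope, matching the qualitative result reported above.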
Black, Bryan A; Griffin, Daniel; van der Sleen, Peter; Wanamaker, Alan D; Speer, James H; Frank, David C; Stahle, David W; Pederson, Neil; Copenheaver, Carolyn A; Trouet, Valerie; Griffin, Shelly; Gillanders, Bronwyn M
2016-07-01
High-resolution biogenic and geologic proxies in which one increment or layer is formed per year are crucial to describing natural ranges of environmental variability in Earth's physical and biological systems. However, dating controls are necessary to ensure temporal precision and accuracy; simple counts cannot ensure that all layers are placed correctly in time. Originally developed for tree-ring data, crossdating is the only such procedure that ensures all increments have been assigned the correct calendar year of formation. Here, we use growth-increment data from two tree species, two marine bivalve species, and a marine fish species to illustrate sensitivity of environmental signals to modest dating error rates. When falsely added or missed increments are induced at one and five percent rates, errors propagate back through time and eliminate high-frequency variability, climate signals, and evidence of extreme events while incorrectly dating and distorting major disturbances or other low-frequency processes. Our consecutive Monte Carlo experiments show that inaccuracies begin to accumulate in as little as two decades and can remove all but decadal-scale processes after as little as two centuries. Real-world scenarios may have even greater consequence in the absence of crossdating. Given this sensitivity to signal loss, the fundamental tenets of crossdating must be applied to fully resolve environmental signals, a point we underscore as the frontiers of growth-increment analysis continue to expand into tropical, freshwater, and marine environments. © 2016 John Wiley & Sons Ltd.
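The error-propagation experiment can be sketched as follows: dropping increments at a given rate shifts all later values back in time, and the correlation with the true annual series collapses. This is a toy version of the Monte Carlo design described above.

```python
import numpy as np

def misdated_correlation(series, error_rate, seed=0):
    """Correlation between a true annual series and a copy with missed increments.

    Each year is independently dropped with probability error_rate; all later
    values shift back in time, as with an uncrossdated layer count.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(series, float)
    keep = rng.random(x.size) >= error_rate
    shifted = x[keep]
    n = min(x.size, shifted.size)
    return np.corrcoef(x[:n], shifted[:n])[0, 1]
```

With no dating errors the correlation is 1 by construction; even a 5% miss rate misaligns most of the record and destroys high-frequency agreement.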
Knowledge discovery by accuracy maximization
Cacciatore, Stefano; Luchinat, Claudio; Tenori, Leonardo
2014-01-01
Here we describe KODAMA (knowledge discovery by accuracy maximization), an unsupervised and semisupervised learning algorithm that performs feature extraction from noisy and high-dimensional data. Unlike other data mining methods, the peculiarity of KODAMA is that it is driven by an integrated procedure of cross-validation of the results. The discovery of a local manifold’s topology is led by a classifier through a Monte Carlo procedure of maximization of cross-validated predictive accuracy. Briefly, our approach differs from previous methods in that it has an integrated procedure of validation of the results. In this way, the method ensures the highest robustness of the obtained solution. This robustness is demonstrated on experimental datasets of gene expression and metabolomics, where KODAMA compares favorably with other existing feature extraction methods. KODAMA is then applied to an astronomical dataset, revealing unexpected features. Interesting and not easily predictable features are also found in the analysis of the State of the Union speeches by American presidents: KODAMA reveals an abrupt linguistic transition sharply separating all post-Reagan from all pre-Reagan speeches. The transition occurs during Reagan’s presidency and not from its beginning. PMID:24706821
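The cross-validated predictive accuracy that KODAMA maximizes can be sketched with its simplest ingredient, leave-one-out 1-nearest-neighbour accuracy; this shows only the flavour of the objective, not the published algorithm.

```python
import numpy as np

def loo_1nn_accuracy(X, labels):
    """Leave-one-out 1-NN accuracy: the kind of cross-validated score a
    KODAMA-style procedure maximizes over candidate label assignments."""
    X = np.asarray(X, float)
    labels = np.asarray(labels)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # exclude each point from its own neighbourhood
    nearest = np.argmin(d, axis=1)
    return float(np.mean(labels[nearest] == labels))
```

Label assignments consistent with the data's local manifold score near 1, while assignments that cut across clusters score near 0, which is what drives the Monte Carlo maximization.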
Multi-Component Diffusion with Application To Computational Aerothermodynamics
NASA Technical Reports Server (NTRS)
Sutton, Kenneth; Gnoffo, Peter A.
1998-01-01
The accuracy and complexity of solving multicomponent gaseous diffusion using the detailed multicomponent equations, the Stefan-Maxwell equations, and two commonly used approximate equations have been examined in a two part study. Part I examined the equations in a basic study with specified inputs in which the results are applicable for many applications. Part II addressed the application of the equations in the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA) computational code for high-speed entries in Earth's atmosphere. The results showed that the presented iterative scheme for solving the Stefan-Maxwell equations is an accurate and effective method as compared with solutions of the detailed equations. In general, good accuracy with the approximate equations cannot be guaranteed for a species or all species in a multi-component mixture. 'Corrected' forms of the approximate equations that ensured the diffusion mass fluxes sum to zero, as required, were more accurate than the uncorrected forms. Good accuracy, as compared with the Stefan-Maxwell results, was obtained with the 'corrected' approximate equations in defining the heating rates for the three Earth entries considered in Part II.
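The zero-sum constraint behind the 'corrected' forms can be sketched directly: subtract the mass-fraction-weighted residual from the approximate fluxes so that they sum to zero. This is a minimal version of that constraint; the exact correction used in the study may differ.

```python
import numpy as np

def corrected_fluxes(J_approx, mass_fractions):
    """'Corrected' approximate diffusion mass fluxes: remove the mass-fraction-
    weighted residual so the species fluxes sum to zero, as required."""
    J = np.asarray(J_approx, float)
    Y = np.asarray(mass_fractions, float)
    return J - Y * J.sum()    # residual J.sum() redistributed by mass fraction
```

Because the mass fractions sum to one, the corrected fluxes sum to zero identically, whatever the raw approximation produced.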
Certified reference materials (GBW09170 and 09171) of creatinine in human serum.
Dai, Xinhua; Fang, Xiang; Shao, Mingwu; Li, Ming; Huang, Zejian; Li, Hongmei; Jiang, You; Song, Dewei; He, Yajuan
2011-02-15
Creatinine is the most widely used clinical marker for assessing renal function. Concentrations of creatinine in human serum need to be carefully checked in order to ensure accurate diagnosis of renal function, so the development of certified reference materials (CRMs) of creatinine in serum is of increasing importance. In this study, two new CRMs (Nos. GBW09170 and 09171) for creatinine in human serum have been developed. They were prepared from pooled serum of several dozen healthy people and kidney disease patients, respectively. The certified values of 8.10 and 34.1 mg/kg for these two CRMs were assigned by a liquid chromatography-isotope dilution mass spectrometry (LC-IDMS) method, which was validated using standard reference material SRM909b (obtained from the National Institute of Standards and Technology, NIST). The expanded uncertainties of the certified values for the low and high concentrations were estimated to be 1.2 and 1.1%, respectively. The certified values were further confirmed by an international intercomparison on the determination of creatinine in human serum, CCQM-K80 (Consultative Committee for Amount of Substance). These new CRMs of pooled human serum are fully native, with no additional creatinine spiked for enrichment. They are capable of validating routine clinical methods to ensure the accuracy, reliability and comparability of analytical results from different clinical laboratories, and can also be used for instrument validation, development of secondary reference materials, and evaluation of the accuracy of higher-order clinical methods for the determination of creatinine in human serum. Copyright © 2011 Elsevier B.V. All rights reserved.
Effect of high altitude on blood glucose meter performance.
Fink, Kenneth S; Christensen, Dale B; Ellsworth, Allan
2002-01-01
Participation in high-altitude wilderness activities may expose persons to extreme environmental conditions, and for those with diabetes mellitus, euglycemia is important to ensure safe travel. We conducted a field assessment of the precision and accuracy of seven commonly used blood glucose meters while mountaineering on Mount Rainier, located in Washington State (elevation 14,410 ft). At various elevations each climber-subject used the randomly assigned device to measure the glucose level of capillary blood and three different concentrations of standardized control solutions, and a venous sample was also collected for later glucose analysis. Ordinary least squares regression was used to assess the effect of elevation and of other environmental potential covariates on the precision and accuracy of blood glucose meters. Elevation affects glucometer precision (p = 0.08), but becomes less significant (p = 0.21) when adjusted for temperature and relative humidity. The overall effect of elevation was to underestimate glucose levels by approximately 1-2% (unadjusted) for each 1,000 ft gain in elevation. Blood glucose meter accuracy was affected by elevation (p = 0.03), temperature (p < 0.01), and relative humidity (p = 0.04) after adjustment for the other variables. The interaction between elevation and relative humidity had a meaningful but not statistically significant effect on accuracy (p = 0.07). Thus, elevation, temperature, and relative humidity affect blood glucose meter performance, and elevated glucose levels are more greatly underestimated at higher elevations. Further research will help to identify which blood glucose meters are best suited for specific environments.
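The ordinary-least-squares relationship between meter error and elevation can be sketched with a synthetic example at a 1.5 % underestimate per 1,000 ft (a midpoint of the reported 1-2 % range; the data points here are illustrative, not the study's).

```python
import numpy as np

def error_vs_elevation_slope(elevation_ft, error_percent):
    """OLS slope (% glucose error per 1,000 ft) of meter error against elevation."""
    slope, _intercept = np.polyfit(np.asarray(elevation_ft) / 1000.0,
                                   np.asarray(error_percent), 1)
    return slope
```

In the real analysis, temperature and relative humidity would enter as additional regressors, which is what weakened the unadjusted elevation effect.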
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roring, J; Saenz, D; Cruz, W
2015-06-15
Purpose: The commissioning criteria of water tank phantoms are essential for proper accuracy and reproducibility in a clinical setting. This study outlines the results of mechanical and dosimetric testing between PTW MP3-M water tank system and the Standard Imaging Doseview 3D water tank system. Methods: Measurements were taken of each axis of movement on the tank using 30 cm calipers at 1, 5, 10, 50, 100, and 200 mm for accuracy and reproducibility of tank movement. Dosimetric quantities such as percent depth dose and dose profiles were compared between tanks using a 6 MV beam from a Varian 23EX LINAC. Properties such as scanning speed effects, central axis depth dose agreement with static measurements, reproducibility of measurements, symmetry and flatness, and scan time between tanks were also investigated. Results: Results showed high geometric accuracy within 0.2 mm. Central axis PDD and in-field profiles agreed within 0.75% between the tanks. These outcomes test many possible discrepancies in dose measurements across the two tanks and form a basis for comparison on a broader range of tanks in the future. Conclusion: Both 3D water scanning phantoms possess a high degree of spatial accuracy, allowing for equivalence in measurements regardless of the phantom used. A commissioning procedure when changing water tanks or upon receipt of a new tank is nevertheless critical to ensure consistent operation before and after the arrival of new hardware.
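The caliper-based geometric check can be sketched as a worst-case position-error comparison against a tolerance; the 0.2 mm figure comes from the reported result, while the sample readings below are made up for illustration.

```python
def max_position_error(commanded_mm, measured_mm):
    """Worst-case absolute positioning error (mm) over the caliper checkpoints."""
    return max(abs(c - m) for c, m in zip(commanded_mm, measured_mm))
```

A tank passes the geometric commissioning check when this worst-case error stays within the stated 0.2 mm.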
Freckmann, Guido; Baumstark, Annette; Schmid, Christina; Pleus, Stefan; Link, Manuela; Haug, Cornelia
2014-02-01
Systems for self-monitoring of blood glucose (SMBG) have to provide accurate and reproducible blood glucose (BG) values in order to ensure adequate therapeutic decisions by people with diabetes. Twelve SMBG systems were compared in a standardized manner under controlled laboratory conditions: nine systems were available on the German market and were purchased from a local pharmacy, and three systems were obtained from the manufacturer (two systems were available on the U.S. market, and one system was not yet introduced to the German market). System accuracy was evaluated following DIN EN ISO (International Organization for Standardization) 15197:2003. In addition, measurement reproducibility was assessed following a modified TNO (Netherlands Organization for Applied Scientific Research) procedure. Comparison measurements were performed with either the glucose oxidase method (YSI 2300 STAT Plus™ glucose analyzer; YSI Life Sciences, Yellow Springs, OH) or the hexokinase method (cobas(®) c111; Roche Diagnostics GmbH, Mannheim, Germany) according to the manufacturer's measurement procedure. The 12 evaluated systems showed between 71.5% and 100% of the measurement results within the required system accuracy limits. With the evaluated test strip lot, ten systems fulfilled the minimum accuracy requirements specified by DIN EN ISO 15197:2003. In addition, accuracy limits of the recently published revision ISO 15197:2013 were applied and showed between 54.5% and 100% of the systems' measurement results within the required accuracy limits. Regarding measurement reproducibility, each of the 12 tested systems met the applied performance criteria. In summary, with the evaluated test strip lot, 83% of the systems fulfilled the minimum system accuracy requirements of DIN EN ISO 15197:2003. Each of the tested systems showed acceptable measurement reproducibility. In order to ensure sufficient measurement quality of each distributed test strip lot, regular evaluations are required.
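The two accuracy criteria applied in the abstract can be sketched as follows (illustrative code, not the study's). The thresholds follow the published standards: ISO 15197:2003 requires results within ±15 mg/dL when the reference is below 75 mg/dL, otherwise within ±20%; ISO 15197:2013 tightens this to ±15 mg/dL below 100 mg/dL, otherwise ±15%.

```python
def within_iso_2003(meter, reference):
    # ISO 15197:2003: +/-15 mg/dL below 75 mg/dL, else +/-20%
    if reference < 75:
        return abs(meter - reference) <= 15
    return abs(meter - reference) <= 0.20 * reference

def within_iso_2013(meter, reference):
    # ISO 15197:2013: +/-15 mg/dL below 100 mg/dL, else +/-15%
    if reference < 100:
        return abs(meter - reference) <= 15
    return abs(meter - reference) <= 0.15 * reference

def pct_within(pairs, criterion):
    """Percent of (meter, reference) pairs meeting a criterion."""
    return 100.0 * sum(criterion(m, r) for m, r in pairs) / len(pairs)

# Synthetic meter/reference pairs in mg/dL (not the study's data):
pairs = [(98, 100), (130, 112), (65, 58), (210, 186), (150, 129)]
print(pct_within(pairs, within_iso_2003), pct_within(pairs, within_iso_2013))
```

The same data set can pass the 2003 limits yet fail the 2013 limits, which is exactly the pattern the abstract reports (ten systems vs. a lower pass rate under the revision).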
Accuracy of external cause-of-injury coding in VA polytrauma patient discharge records.
Carlson, Kathleen F; Nugent, Sean M; Grill, Joseph; Sayer, Nina A
2010-01-01
Valid and efficient methods of identifying the etiology of treated injuries are critical for characterizing patient populations and developing prevention and rehabilitation strategies. We examined the accuracy of external cause-of-injury codes (E-codes) in Veterans Health Administration (VHA) administrative data for a population of injured patients. Chart notes and E-codes were extracted for 566 patients treated at any one of four VHA Polytrauma Rehabilitation Center sites between 2001 and 2006. Two expert coders, blinded to VHA E-codes, used chart notes to assign "gold standard" E-codes to injured patients. The accuracy of VHA E-coding was examined based on these gold standard E-codes. Only 382 of 517 (74%) injured patients were assigned E-codes in VHA records. Sensitivity of VHA E-codes varied significantly by site (range: 59%-91%, p < 0.001). Sensitivity was highest for combat-related injuries (81%) and lowest for fall-related injuries (60%). Overall specificity of E-codes was high (92%). E-coding accuracy was markedly higher when we restricted analyses to records that had been assigned VHA E-codes. E-codes may not be valid for ascertaining source-of-injury data for all injuries among VHA rehabilitation inpatients at this time. Enhanced training and policies may ensure more widespread, standardized use and accuracy of E-codes for injured veterans treated in the VHA.
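The sensitivity and specificity figures quoted above are simple ratios against the gold-standard E-codes. A sketch with illustrative counts (not the study's raw data; the 60%/92% values are chosen to echo the reported fall-injury sensitivity and overall specificity):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)   # gold-standard cases correctly E-coded
    specificity = tn / (tn + fp)   # non-cases correctly left uncoded
    return sensitivity, specificity

# Illustrative counts for one injury category:
sens, spec = sens_spec(tp=60, fn=40, tn=92, fp=8)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")
```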
Xue, Ligang; Mikkelsen, Kristian Handberg
2013-03-01
The objective of this study was to assess the dose accuracy of NovoPen® 5 in delivering low, medium and high doses of insulin before and after simulated lifetime use. A secondary objective was to evaluate the durability of the pen and its memory function under various stress conditions designed to simulate conditions that may be encountered in everyday use of an insulin pen. All testing was conducted according to International Organization for Standardization guideline 11608-1, 2000 for pen injectors. Dose accuracy was measured for the delivery of 1 unit (U) (10 mg), 30 U (300 mg) and 60 U (600 mg) test medium in standard, cool and hot conditions and before and after simulated lifetime use. Dose accuracy was also tested after preconditioning in dry heat storage; cold storage; damp cyclical heat; shock, bump and vibration; free fall and after electrostatic charge and radiated field test. Memory function was tested under all temperature and physical conditions. NovoPen 5 maintained dosing accuracy and memory function at minimum, medium and maximum doses in standard, cool and hot conditions, stress tests and simulated lifetime use. The pens remained intact and retained dosing accuracy and a working memory function at all doses after exposure to variations in temperature and after physical challenge. NovoPen 5 was accurate at all doses tested and under various functionality tests. Its durable design ensured that the dose accuracy and memory function were retained under conditions of stress likely to be encountered in everyday use.
NASA Astrophysics Data System (ADS)
Liang, Q.; Wu, W.; Zhang, D.; Wei, B.; Sun, W.; Wang, Y.; Ge, Y.
2015-10-01
Roughness, which can represent the trade-off between manufacturing cost and performance of mechanical components, is a critical predictor of cracks, corrosion and fatigue damage. In order to measure polished or super-finished surfaces, a novel touch probe based on three-component force sensor for characterizing and quantifying surface roughness is proposed by using silicon micromachining technology. The sensor design is based on a cross-beam structure, which ensures that the system possesses high sensitivity and low coupling. The results show that the proposed sensor possesses high sensitivity, low coupling error, and temperature compensation function. The proposed system can be used to investigate micromechanical structures with nanometer accuracy.
Accurate, robust and reliable calculations of Poisson-Boltzmann binding energies
Nguyen, Duc D.; Wang, Bao
2017-01-01
The Poisson-Boltzmann (PB) model is one of the most popular implicit solvent models in biophysical modeling and computation. The ability to provide accurate and reliable PB estimation of the electrostatic solvation free energy, ΔGel, and binding free energy, ΔΔGel, is important to computational biophysics and biochemistry. In this work, we investigate the grid dependence of our PB solver (MIBPB) with solvent-excluded surfaces (SESs) for estimating both electrostatic solvation free energies and electrostatic binding free energies. It is found that the relative absolute error of ΔGel obtained at a grid spacing of 1.0 Å, compared to ΔGel at 0.2 Å and averaged over 153 molecules, is less than 0.2%. Our results indicate that the use of a grid spacing of 0.6 Å ensures accuracy and reliability in ΔΔGel calculation. In fact, a grid spacing of 1.1 Å appears to deliver adequate accuracy for high-throughput screening. PMID:28211071
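The grid-dependence check described above amounts to comparing energies at coarser spacings against the finest-grid value. A minimal sketch, with illustrative energies rather than MIBPB output:

```python
def rel_abs_error(value, reference):
    """Relative absolute error of a coarse-grid result vs. the finest grid."""
    return abs(value - reference) / abs(reference)

# Hypothetical solvation free energies (kcal/mol) at decreasing grid spacing:
energies = {1.1: -85.6, 1.0: -85.3, 0.6: -85.18, 0.2: -85.15}
reference = energies[0.2]          # finest grid taken as the reference
errors = {h: rel_abs_error(e, reference)
          for h, e in energies.items() if h != 0.2}
for h, err in sorted(errors.items()):
    print(f"h = {h} A: {100 * err:.2f}%")
```

The monotone shrinking of the error as the spacing decreases is what justifies using a coarser (cheaper) grid once the error falls below the tolerance of the application, e.g. high-throughput screening.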
Revealing the glass transition in shape memory polymers using Brillouin spectroscopy.
Steelman, Zachary A; Weems, Andrew C; Traverso, Andrew J; Szafron, Jason M; Maitland, Duncan J; Yakovlev, Vladislav V
2017-12-11
Emerging medical devices which employ shape memory polymers (SMPs) require precise measurements of the glass transition temperature (T g ) to ensure highly controlled shape recovery kinetics. Conventional techniques like differential scanning calorimetry (DSC) and dynamic mechanical analysis (DMA) have limitations that prevent utilization for certain devices, including limited accuracy and the need for sacrificial samples. In this report, we employ an approach based on Brillouin spectroscopy to probe the glass transition of SMPs rapidly, remotely, and nondestructively. Further, we compare the T g obtained from Brillouin scattering with DMA- and DSC-measured T g to demonstrate the accuracy of Brillouin scattering for this application. We conclude that Brillouin spectroscopy is an accurate technique for obtaining the glass transition temperature of SMPs, aligning closely with the most common laboratory standards while providing a rapid, remote, and nondestructive method for the analysis of unique polymeric medical devices.
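In Brillouin measurements of this kind, the glass transition typically appears as a change of slope in the Brillouin frequency shift versus temperature. One common way to extract Tg, sketched here with synthetic data (not the paper's measurements), is to fit straight lines to the glassy and rubbery branches and take their intersection:

```python
def fit_line(pts):
    """Least-squares line through (x, y) points: returns (slope, intercept)."""
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    slope = sxy / sxx
    return slope, my - slope * mx

# Synthetic (temperature in C, Brillouin shift in GHz), kink near 57 C:
glassy  = [(20, 15.0), (30, 14.8), (40, 14.6), (50, 14.4)]
rubbery = [(70, 13.6), (80, 13.1), (90, 12.6), (100, 12.1)]

m1, b1 = fit_line(glassy)
m2, b2 = fit_line(rubbery)
tg = (b2 - b1) / (m1 - m2)   # intersection of the two branches
print(f"Tg ~ {tg:.1f} C")
```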
Evolution of Wikipedia’s medical content: past, present and future
Kipersztok, Lisa
2017-01-01
As one of the most commonly read online sources of medical information, Wikipedia is an influential public health platform. Its medical content, community, collaborations and challenges have been evolving since its creation in 2001, and engagement by the medical community is vital for ensuring its accuracy and completeness. Both the encyclopaedia’s internal metrics as well as external assessments of its quality indicate that its articles are highly variable, but improving. Although content can be edited by anyone, medical articles are primarily written by a core group of medical professionals. Diverse collaborative ventures have enhanced medical article quality and reach, and opportunities for partnerships are more available than ever. Nevertheless, Wikipedia’s medical content and community still face significant challenges, and a socioecological model is used to structure specific recommendations. We propose that the medical community should prioritise the accuracy of biomedical information in the world’s most consulted encyclopaedia. PMID:28847845
Chernyak, Dimitri A; Campbell, Charles E
2003-11-01
Now that excimer laser systems can be programmed to correct complex aberrations of the eye on the basis of wave-front measurements, a method is needed to test the accuracy of the system from measurement through treatment. A closed-loop test method was developed to ensure that treatment plans generated by a wavefront measuring system were accurately transferred to and executed by the excimer laser. A surface was analytically defined, and a Shack-Hartmann-based wave-front system was used to formulate a treatment plan, which was downloaded to an excimer laser system. A plastic lens was ablated by the laser and then returned to the wave-front device, where it was measured and compared with the analytically defined wave-front surface. The two surfaces agreed up to 6th-order Zernike terms, validating the accuracy of the system.
Validation of minicams for measuring concentrations of chemical agent in environmental air
DOE Office of Scientific and Technical Information (OSTI.GOV)
Menton, R.G.; Hayes, T.L.; Chou, Y.L.
1993-05-13
Environmental monitoring for chemical agents is necessary to ensure that notification and appropriate action will be taken in the event that there is a release exceeding control limits of such agents into the workplace outside of engineering controls. Prior to implementing new analytical procedures for environmental monitoring, precision and accuracy (PA) tests are conducted to ensure that an agent monitoring system performs according to specified accuracy, precision, and sensitivity requirements. This testing not only establishes the accuracy and precision of the method, but also determines what factors can affect the method's performance. Performance measures that are particularly important in agent monitoring include the Detection Limit (DL), Decision Limit (DC), Found Action Level (FAL), and the Target Action Level (TAL). PA experiments were performed at Battelle's Medical Research and Evaluation Facility (MREF) to validate the use of the miniature chemical agent monitoring system (MINICAMS) for measuring environmental air concentrations of sulfur mustard (HD). This presentation discusses the experimental and statistical approaches for characterizing the performance of MINICAMS for measuring HD in air.
Calibration of PMIS pavement performance prediction models.
DOT National Transportation Integrated Search
2012-02-01
Improve the accuracy of TxDOT's existing pavement performance prediction models through calibrating these models using actual field data obtained from the Pavement Management Information System (PMIS). : Ensure logical performance superiority patte...
Leduc, Nicolas; Atallah, Vincent; Escarmant, Patrick; Vinh-Hung, Vincent
2016-09-08
Monitoring and controlling respiratory motion is a challenge for the accuracy and safety of therapeutic irradiation of thoracic tumors. Various commercial systems based on the monitoring of internal or external surrogates have been developed but remain costly. In this article we describe and validate Madibreast, an in-house-made respiratory monitoring and processing device based on optical tracking of external markers. We designed an optical apparatus to ensure real-time submillimetric image resolution at 4 m. Using OpenCv libraries, we optically tracked high-contrast markers set on patients' breasts. Validation of spatial and time accuracy was performed on a mechanical phantom and on human breast. Madibreast was able to track motion of markers up to a 5 cm/s speed, at a frame rate of 30 fps, with submillimetric accuracy on mechanical phantom and human breasts. Latency was below 100 ms. Concomitant monitoring of three different locations on the breast showed discrepancies in axial motion up to 4 mm for deep-breathing patterns. This low-cost, computer-vision system for real-time motion monitoring of the irradiation of breast cancer patients showed submillimetric accuracy and acceptable latency. It allowed the authors to highlight differences in surface motion that may be correlated to tumor motion. © 2016 The Authors.
Developing and implementing a high precision setup system
NASA Astrophysics Data System (ADS)
Peng, Lee-Cheng
High-precision radiotherapy (HPRT) was first implemented in stereotactic radiosurgery using a rigid, invasive stereotactic head frame. Fractionated stereotactic radiotherapy (SRT) with a frameless device was developed alongside a growing interest in sophisticated treatment with tight margins and high dose gradients. This dissertation establishes the complete management for HPRT in the process of frameless SRT, including image-guided localization, immobilization, and dose evaluation. The ideal precise positioning system allows for ease of relocation, real-time patient movement assessment, high accuracy, and no additional dose in daily use. A new image-guided stereotactic positioning system (IGSPS), the Align RT3C 3D surface camera system (ART, VisionRT), which combines 3D surface images and uses a real-time tracking technique, was developed to ensure accurate positioning in the first place. Uncertainties were found in the current optical tracking system, which causes patient discomfort due to the additional bite plates using the dental impression technique and external markers. The accuracy and feasibility of ART are validated by comparisons with the optical tracking and cone-beam computed tomography (CBCT) systems. Additionally, an effective daily quality assurance (QA) program for the linear accelerator and multiple IGSPSs is the most important factor in ensuring system performance in daily use. Systematic errors arising from the variety of phantoms, and long measurement times caused by switching between phantoms, were discovered. We investigated the use of a commercially available daily QA device to improve efficiency and thoroughness. A reasonable action level has been established by considering dosimetric relevance and clinic flow. As for intricate treatments, the effect of dose deviations caused by setup errors on tumor coverage and toxicity to organs at risk (OARs) remains uncertain.
The lack of adequate dosimetric simulations based on the true treatment coordinates from the treatment planning system (TPS) has limited adaptive treatments. A reliable and accurate dosimetric simulation of uncorrected errors, using the TPS and in-house software, has been developed. In SRT, the calculated dose deviation is compared to the original treatment dose using the dose-volume histogram to investigate the dose effect of rotational errors. In summary, this work performed a quality assessment to investigate the overall accuracy of current setup systems. To achieve the ideal HPRT, a reliable dosimetric simulation, an effective daily QA program, and precise setup systems were developed and validated.
Strep Test: Rapid (For Parents)
... third of negative rapid strep test results are false (meaning someone actually has a strep throat infection even though the rapid strep results were negative). A throat culture may then be done to ensure accuracy. Risks ...
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Ensure the accuracy and currency of the safety data; (ii) Identify factors that affect the priority of...(a)(1) shall be used: (1) For developing basic source data in the planning process in accordance with...
Field programmable gate array-assigned complex-valued computation and its limits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernard-Schwarz, Maria, E-mail: maria.bernardschwarz@ni.com; Institute of Applied Physics, TU Wien, Wiedner Hauptstrasse 8, 1040 Wien; Zwick, Wolfgang
We discuss how leveraging Field Programmable Gate Array (FPGA) technology as part of a high-performance computing platform reduces latency to meet the demanding real-time constraints of a quantum optics simulation. Implementations of complex-valued operations using fixed-point numerics on a Virtex-5 FPGA compare favorably to more conventional solutions on a central processing unit. Our investigation explores the performance of multiple fixed-point options along with a traditional 64-bit floating-point version. With this information, the lowest execution times can be estimated. Relative error is examined to ensure simulation accuracy is maintained.
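The accuracy side of that trade-off can be illustrated by emulating a fixed-point complex multiply and measuring its relative error against the floating-point result as the number of fractional bits grows. This is an illustrative sketch, not the FPGA implementation:

```python
def to_fixed(x, frac_bits):
    """Round a real number to a fixed-point integer with frac_bits fractional bits."""
    return round(x * (1 << frac_bits))

def fixed_complex_mul(a, b, frac_bits):
    """Complex multiply using integer fixed-point arithmetic, then rescale."""
    ar, ai = to_fixed(a.real, frac_bits), to_fixed(a.imag, frac_bits)
    br, bi = to_fixed(b.real, frac_bits), to_fixed(b.imag, frac_bits)
    scale = 1 << (2 * frac_bits)   # products carry 2*frac_bits fractional bits
    return complex((ar * br - ai * bi) / scale, (ar * bi + ai * br) / scale)

a, b = 0.3 + 0.7j, -0.5 + 0.2j
exact = a * b
errs = []
for frac_bits in (8, 16, 24):
    approx = fixed_complex_mul(a, b, frac_bits)
    errs.append(abs(approx - exact) / abs(exact))
    print(f"{frac_bits} fractional bits: relative error {errs[-1]:.1e}")
```

Each extra fractional bit roughly halves the quantization error, which is the information needed to pick the narrowest format that still meets the simulation's accuracy budget.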
Reducing Bolt Preload Variation with Angle-of-Twist Bolt Loading
NASA Technical Reports Server (NTRS)
Thompson, Bryce; Nayate, Pramod; Smith, Doug; McCool, Alex (Technical Monitor)
2001-01-01
Critical high-pressure sealing joints on the Space Shuttle reusable solid rocket motor require precise control of bolt preload to ensure proper joint function. As the reusable solid rocket motor experiences rapid internal pressurization, correct bolt preloads maintain the sealing capability and structural integrity of the hardware. The angle-of-twist process provides the right combination of preload accuracy, reliability, process control, and assembly-friendly design. It improves significantly over previous methods. The sophisticated angle-of-twist process controls have yielded answers to all discrepancies encountered while the simplicity of the root process has assured joint preload reliability.
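The mechanics behind angle-of-twist (turn-of-nut) preloading can be sketched generically: past the snug point, each degree of rotation advances the nut by pitch/360, and that stretch is shared by the bolt and the clamped joint acting as springs in series. All stiffness and pitch values below are illustrative, not reusable solid rocket motor values.

```python
def preload_from_angle(angle_deg, pitch_mm, k_bolt, k_joint):
    """Preload (N) developed by turning the nut angle_deg past snug.

    k_bolt and k_joint are the bolt and clamped-joint stiffnesses in N/mm;
    the series stiffness converts the thread advance into clamping force.
    """
    advance = (angle_deg / 360.0) * pitch_mm
    k_eff = (k_bolt * k_joint) / (k_bolt + k_joint)
    return advance * k_eff

# Illustrative numbers: 90 degrees past snug, 1.5 mm thread pitch
f = preload_from_angle(angle_deg=90, pitch_mm=1.5, k_bolt=4.0e5, k_joint=1.2e6)
print(f"preload ~ {f / 1000:.1f} kN")
```

Because the angle is measured rather than the torque, the result is insensitive to thread friction, which is one reason the method reduces preload variation compared with torque control.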
Williams, John; Bialer, Meir; Johannessen, Svein I; Krämer, Günther; Levy, René; Mattson, Richard H; Perucca, Emilio; Patsalos, Philip N; Wilson, John F
2003-01-01
To assess interlaboratory variability in the determination of serum levels of new antiepileptic drugs (AEDs). Lyophilised serum samples containing clinically relevant concentrations of felbamate (FBM), gabapentin (GBP), lamotrigine (LTG), the monohydroxy derivative of oxcarbazepine (OCBZ; MHD), tiagabine (TGB), topiramate (TPM), and vigabatrin (VGB) were distributed monthly among 70 laboratories participating in the international Heathcontrol External Quality Assessment Scheme (EQAS). Assay results returned over a 15-month period were evaluated for precision and accuracy. The most frequently measured compound was LTG (65), followed by MHD (39), GBP (19), TPM (18), VGB (15), FBM (16), and TGB (8). High-performance liquid chromatography was the most commonly used assay technique for all drugs except for TPM, for which two thirds of laboratories used a commercial immunoassay. For all assay methods combined, precision was <11% for MHD, FBM, TPM, and LTG, close to 15% for GBP and VGB, and as high as 54% for TGB (p < 0.001). Mean accuracy values were <10% for all drugs other than TGB, for which measured values were on average 13.9% higher than spiked values, with a high variability around the mean (45%). No differences in precision and accuracy were found between methods, except for TPM, for which gas chromatography showed poorer accuracy compared with immunoassay and gas chromatography-mass spectrometry. With the notable exception of TGB, interlaboratory variability in the determination of new AEDs was comparable to that reported with older-generation agents. Poor assay performance is related more to individual operators than to the intrinsic characteristics of the method applied. Participation in an EQAS scheme is recommended to ensure adequate control of assay variability in therapeutic drug monitoring.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, M; Feigenberg, S
Purpose To evaluate the effectiveness of using 3D-surface images to guide breath-holding (BH) left-side breast treatment. Methods Two 3D-surface-image-guided BH procedures were implemented and evaluated: normal-BH, taking a BH at a comfortable level, and deep-inspiration breath-holding (DIBH). A total of 20 patients (10 normal-BH and 10 DIBH) were recruited. Patients received a BH evaluation using a commercialized 3D-surface-tracking system (VisionRT, London, UK) to quantify the reproducibility of BH positions prior to CT scan. Tangential 3D/IMRT plans were conducted. Patients were initially set up under free-breathing (FB) conditions using the FB surface obtained from the untagged CT to ensure a correct patient position. Patients were then guided to reach the planned BH position using the BH surface obtained from the BH CT. Action levels were set at each phase of the treatment process based on the information provided by the 3D-surface-tracking system for proper interventions (eliminate/re-setup/re-coaching). We reviewed the frequency of interventions to evaluate its effectiveness. The FB-CBCT and port film were utilized to evaluate the accuracy of 3D-surface-guided setups. Results 25% of BH candidates with BH positioning uncertainty > 2 mm were eliminated prior to CT scan. For > 90% of fractions, based on the setup deltas from the 3D-surface-tracking system, adjustments of the patient setup were needed after the initial laser-based setup. 3D-surface-guided setup accuracy is comparable to that of CBCT. For the BH guidance, the frequency of interventions (re-coaching/re-setup) was 40% (normal-BH)/91% (DIBH) of treatments for the first 5 fractions and then dropped to 16% (normal-BH)/46% (DIBH). The necessity of re-setup was highly patient-specific for normal-BH but highly random among patients for DIBH. Overall, a −0.8 ± 2.4 mm accuracy of the anterior pericardial shadow position was achieved.
Conclusion 3D-surface-image technology provides effective intervention to the treatment process and ensures favorable day-to-day setup accuracy. DIBH setup appears to be more uncertain, and this would be the patient group that will benefit most from the extra information of a 3D surface setup.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jan Hesthaven
2012-02-06
Final report for DOE Contract DE-FG02-98ER25346 entitled Parallel High Order Accuracy Methods Applied to Non-Linear Hyperbolic Equations and to Problems in Materials Sciences. Principal Investigator Jan S. Hesthaven Division of Applied Mathematics Brown University, Box F Providence, RI 02912 Jan.Hesthaven@Brown.edu February 6, 2012 Note: This grant was originally awarded to Professor David Gottlieb and the majority of the work envisioned reflects his original ideas. However, when Prof Gottlieb passed away in December 2008, Professor Hesthaven took over as PI to ensure proper mentoring of students and postdoctoral researchers already involved in the project. This unusual circumstance has naturally impacted the project and its timeline. However, as the report reflects, the planned work has been accomplished and some activities beyond the original scope have been pursued with success. Project overview and main results The effort in this project focuses on the development of high order accurate computational methods for the solution of hyperbolic equations with application to problems with strong shocks. While the methods are general, emphasis is on applications to gas dynamics with strong shocks.
Village Building Identification Based on Ensemble Convolutional Neural Networks
Guo, Zhiling; Chen, Qi; Xu, Yongwei; Shibasaki, Ryosuke; Shao, Xiaowei
2017-01-01
In this study, we present the Ensemble Convolutional Neural Network (ECNN), an elaborate CNN framework formulated by ensembling state-of-the-art CNN models, to identify village buildings from open high-resolution remote sensing (HRRS) images. First, to optimize and mine the capability of CNN for village mapping and to ensure compatibility with our classification targets, a few state-of-the-art models were carefully optimized and enhanced based on a series of rigorous analyses and evaluations. Second, rather than directly implementing building identification by using these models, we exploited most of their advantages by ensembling their feature extractor parts into a stronger model called ECNN based on the multiscale feature learning method. Finally, the generated ECNN was applied to a pixel-level classification frame to implement object identification. The proposed method can serve as a viable tool for village building identification with high accuracy and efficiency. The experimental results obtained from the test area in Savannakhet province, Laos, prove that the proposed ECNN model significantly outperforms existing methods, improving overall accuracy from 96.64% to 99.26%, and kappa from 0.57 to 0.86. PMID:29084154
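The two evaluation metrics reported above, overall accuracy and Cohen's kappa, can be computed from a binary (building / non-building) confusion matrix. The counts below are illustrative, not the study's:

```python
def accuracy_and_kappa(tp, fp, fn, tn):
    """Overall accuracy and Cohen's kappa from a 2x2 confusion matrix."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n                                # observed agreement
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)         # chance "building" agreement
    p_no = ((fn + tn) / n) * ((fp + tn) / n)          # chance "non-building" agreement
    pe = p_yes + p_no                                 # expected chance agreement
    return po, (po - pe) / (1 - pe)

# Illustrative counts on a class-imbalanced scene (mostly non-building pixels):
acc, kappa = accuracy_and_kappa(tp=88, fp=5, fn=5, tn=902)
print(f"accuracy={acc:.2%}, kappa={kappa:.2f}")
```

Kappa discounts the agreement expected by chance, which is why, on imbalanced scenes like village mapping, it can move much more than raw accuracy (0.57 to 0.86 vs. 96.64% to 99.26% in the abstract).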
Relevance of deep learning to facilitate the diagnosis of HER2 status in breast cancer
Vandenberghe, Michel E.; Scott, Marietta L. J.; Scorer, Paul W.; Söderberg, Magnus; Balcerzak, Denis; Barker, Craig
2017-01-01
Tissue biomarker scoring by pathologists is central to defining the appropriate therapy for patients with cancer. Yet, inter-pathologist variability in the interpretation of ambiguous cases can affect diagnostic accuracy. Modern artificial intelligence methods such as deep learning have the potential to supplement pathologist expertise to ensure constant diagnostic accuracy. We developed a computational approach based on deep learning that automatically scores HER2, a biomarker that defines patient eligibility for anti-HER2 targeted therapies in breast cancer. In a cohort of 71 breast tumour resection samples, automated scoring showed a concordance of 83% with a pathologist. The twelve discordant cases were then independently reviewed, leading to a modification of diagnosis from initial pathologist assessment for eight cases. Diagnostic discordance was found to be largely caused by perceptual differences in assessing HER2 expression due to high HER2 staining heterogeneity. This study provides evidence that deep learning aided diagnosis can facilitate clinical decision making in breast cancer by identifying cases at high risk of misdiagnosis. PMID:28378829
Defense Acquisitions: How and Where DOD Spends Its Contracting Dollars
2015-04-30
process. GSA is undertaking a multi-year effort to improve the reliability and usefulness of the information contained in FPDS and other federal... Improve FPDS According to GSA, a number of data systems, including FPDS, are undergoing a significant overhaul. This overhaul is a multi-year process ...data accuracy and completeness, then initiating a process to ensure that these standards are met, would improve data accuracy and completeness.” U.S
Horsley, Alex; Macleod, Kenneth; Gupta, Ruchi; Goddard, Nick; Bell, Nicholas
2014-01-01
Background The Innocor device contains a highly sensitive photoacoustic gas analyser that has been used to perform multiple breath washout (MBW) measurements using very low concentrations of the tracer gas SF6. Use in smaller subjects has been restricted by the requirement for a gas analyser response time of <100 ms, in order to ensure accurate estimation of lung volumes at rapid ventilation rates. Methods A series of previously reported and novel enhancements were made to the gas analyser to produce a clinically practical system with a reduced response time. An enhanced lung model system, capable of delivering highly accurate ventilation rates and volumes, was used to assess the in vitro accuracy of functional residual capacity (FRC) volume calculation and the effects of flow and gas signal alignment on this. Results 10-90% rise time was reduced from 154 to 88 ms. In an adult/child lung model, accuracy of volume calculation was −0.9 to 2.9% for all measurements, including those with a ventilation rate of 30/min and an FRC of 0.5 L; for the un-enhanced system, accuracy deteriorated at higher ventilation rates and smaller FRC. In a separate smaller lung model (ventilation rate 60/min, FRC 250 ml, tidal volume 100 ml), mean accuracy of FRC measurement for the enhanced system was −0.95% (range −3.8 to 2.0%). Error sensitivity to flow and gas signal alignment was increased by ventilation rate, smaller FRC and slower analyser response time. Conclusion The Innocor analyser can be enhanced to reliably generate highly accurate FRC measurements down to volumes as low as those simulating infant lung settings. Signal alignment is a critical factor. With these enhancements, the Innocor analyser exceeds key technical component recommendations for MBW apparatus. PMID:24892522
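The FRC calculation that the lung model validates rests on a tracer-gas mass balance: the lung volume equals the net volume of tracer washed out divided by the drop in end-tidal tracer concentration. A minimal sketch with illustrative numbers (not Innocor data):

```python
def frc_from_washout(expired_tracer_volume_l, c_start, c_end):
    """FRC (L) from an MBW mass balance.

    expired_tracer_volume_l: cumulative volume of tracer gas exhaled (L),
    i.e. the integral of flow * tracer concentration over the washout;
    c_start, c_end: end-tidal tracer concentrations (fractions) at the
    start and end of the washout.
    """
    return expired_tracer_volume_l / (c_start - c_end)

# Illustrative: 0.2% SF6 washed out to near zero from a 500 mL lung model
frc = frc_from_washout(expired_tracer_volume_l=0.00098,
                       c_start=0.0020, c_end=0.00004)
print(f"FRC = {frc * 1000:.0f} mL")
```

Because the expired tracer volume is an integral of flow multiplied by concentration, any misalignment between the flow and gas signals biases the numerator, which is why the abstract identifies signal alignment as a critical factor at small volumes and fast rates.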
Pang, Guo-Fang; Fan, Chun-Lin; Chang, Qiao-Ying; Li, Jian-Xun; Kang, Jian; Lu, Mei-Ling
2018-03-22
This paper uses the LC-quadrupole-time-of-flight MS technique to evaluate the MS behavioral characteristics of 485 pesticides under different conditions and has developed an accurate mass database and spectral library. A high-throughput screening and confirmation method has been developed for the 485 pesticides in fruits and vegetables. Through the optimization of parameters such as accurate mass number, retention time window, ionization forms, etc., the method has improved the accuracy of pesticide screening, thus avoiding the occurrence of false-positive and false-negative results. The method features a full scan of fragments, with 80% of pesticides having more than 10 qualitative points, which helps increase pesticide qualitative accuracy. The abundance differences of fragment categories help realize the effective separation and qualitative identification of isomeric pesticides. Four different fruits and vegetables (apples, grapes, celery, and tomatoes) were chosen to evaluate the efficiency of the method at three fortification levels of 5, 10, and 20 μg/kg, and satisfactory results were obtained. With this method, a national survey of pesticide residues was conducted between 2012 and 2015 on 12,551 samples of 146 different fruits and vegetables collected from 638 sampling points in 284 counties across 31 provincial capitals/cities directly under the central government, which provided scientific data backup for ensuring the pesticide residue safety of the fruits and vegetables consumed daily by the public. Meanwhile, big data statistical analysis of the new technique further proves it to be of high speed, high throughput, high accuracy, high reliability, and high informatization.
Code of Federal Regulations, 2011 CFR
2011-01-01
... signed by an officer of the air carrier with the requisite authority over the collection of data and preparation of reports to ensure the validity and accuracy of the reported data. [53 FR 46294, Nov. 16, 1988...
A high-voltage supply used on miniaturized RLG
NASA Astrophysics Data System (ADS)
Miao, Zhifei; Fan, Mingming; Wang, Yuepeng; Yin, Yan; Wang, Dongmei
2016-01-01
A high-voltage power supply for use in a ring laser gyro (RLG) is proposed in this paper. The main circuit adopts a flyback topology with a single 15 V DC input, and the output reaches 3.3 kV in order to ignite the RLG. A PFM control method realizes rapid switching between the high-voltage state and the sustain state. The resonant controller chip L6565 is used to achieve zero-voltage switching (ZVS), so losses are reduced and the power efficiency is improved to more than 80%. A dedicated circuit in the control section ensures the symmetry of the currents in the two arms of the RLG. The measured current accuracy is better than 5‰, and the current symmetry of the two arms reaches 99.2%.
Retrieval of high-spectral-resolution lidar for atmospheric aerosol optical properties profiling
NASA Astrophysics Data System (ADS)
Liu, Dong; Luo, Jing; Yang, Yongying; Cheng, Zhongtao; Zhang, Yupeng; Zhou, Yudi; Duan, Lulin; Su, Lin
2015-10-01
High-spectral-resolution lidars (HSRLs) are increasingly being developed for atmospheric aerosol remote sensing because they retrieve aerosol optical properties straightforwardly and independently, without relying on assumptions about the lidar ratio. In the HSRL technique, spectral discrimination between scattering from molecules and from aerosol particles is one of the most critical processes, and it is accomplished by means of a narrowband spectroscopic filter. To ensure high retrieval accuracy of an HSRL system, its spectral discrimination filter must be carefully designed. This paper reviews the available retrieval algorithms proposed for HSRLs and presents a general accuracy analysis of the HSRL technique focused on spectral discrimination, in order to provide heuristic guidelines for the design of the spectral discrimination filter. We introduce a theoretical model for the retrieval error of an HSRL instrument with a general three-channel configuration. Monte Carlo (MC) simulations are performed to validate the theoretical model. Results from the model and the MC simulations agree very well, and they illustrate one important, although not widely appreciated, fact: a large molecular transmittance and a large spectral discrimination ratio (SDR, i.e., the ratio of the molecular transmittance to the aerosol transmittance) are beneficial to the retrieval accuracy. The application of these conclusions to the design of a new type of spectroscopic filter, the field-widened Michelson interferometer, is illustrated in detail. These results have a degree of universality and are expected to serve as useful guidelines for the HSRL community, especially when choosing or designing the spectral discrimination filter.
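The role of the spectral discrimination ratio can be illustrated with a toy two-equation channel model (an assumed simplification, not the paper's full three-channel error model): the retrieval reduces to solving a small linear system whose conditioning degrades as the SDR shrinks.

```python
import numpy as np

# Toy sketch: a combined channel sees both backscatter components, while the
# molecular channel sees them attenuated by the filter's aerosol transmittance
# Ta and molecular transmittance Tm. Calibration constants are set to 1 for
# illustration; this is not the paper's instrument model.
def invert_hsrl(S_total, S_mol, Tm, Ta):
    # S_total =      Ba +    Bm   (combined channel)
    # S_mol   = Ta * Ba + Tm * Bm (behind the spectral discrimination filter)
    A = np.array([[1.0, 1.0], [Ta, Tm]])
    Ba, Bm = np.linalg.solve(A, np.array([S_total, S_mol]))
    return Ba, Bm

# Forward-generate signals and invert them.
Ba_true, Bm_true = 2e-6, 5e-6
Tm, Ta = 0.6, 0.01  # high-SDR filter
Ba, Bm = invert_hsrl(Ba_true + Bm_true, Ta * Ba_true + Tm * Bm_true, Tm, Ta)

# Conditioning comparison: a low-SDR filter (Tm/Ta = 1.5) yields a much
# worse-conditioned system than a high-SDR filter (Tm/Ta = 60), so measurement
# noise is amplified more in the retrieval.
cond_high_sdr = np.linalg.cond(np.array([[1.0, 1.0], [0.01, 0.6]]))
cond_low_sdr = np.linalg.cond(np.array([[1.0, 1.0], [0.2, 0.3]]))
```

The condition numbers quantify exactly the qualitative conclusion of the paper: a large molecular transmittance together with a large SDR makes the inversion better posed.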
Comparison of Phase-Based 3D Near-Field Source Localization Techniques for UHF RFID.
Parr, Andreas; Miesen, Robert; Vossiek, Martin
2016-06-25
In this paper, we present multiple techniques for phase-based narrowband backscatter tag localization in three-dimensional space with planar antenna arrays or synthetic apertures. Beamformer and MUSIC localization algorithms, known from near-field source localization and direction-of-arrival estimation, are applied to the 3D backscatter scenario, and their performance in terms of localization accuracy is evaluated. We discuss the impact of different transceiver modes known from the literature, which evaluate different send and receive antenna path combinations for a single localization, as in multiple-input multiple-output (MIMO) systems. Furthermore, we propose a new single-dimensional MIMO (S-MIMO) transceiver mode, which is especially suited for use with mobile robot systems. Monte Carlo simulations based on a realistic multipath error model ensure spatial correlation of the simulated signals and serve to critically appraise the accuracies of the different localization approaches. A synthetic uniform rectangular array created by a robotic arm is used to evaluate selected localization techniques. We use an Ultra High Frequency (UHF) Radio-Frequency Identification (RFID) setup to compare measurements with theory and simulation. The results show how a mean localization accuracy of better than 30 cm can be reached in an indoor environment. Further simulations demonstrate how the distance between aperture and tag affects the localization accuracy, and how the size and grid spacing of the rectangular array need to be adapted to improve the localization accuracy down to the centimeter range and to maximize array efficiency in terms of localization accuracy per number of elements.
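A minimal sketch of the phase-based near-field beamformer idea, under assumed free-space, single-tag, noise-free conditions (the paper's multipath model, S-MIMO mode and MUSIC variant are not reproduced): the round-trip backscatter phase at each array element is correlated against the steering phases of candidate 3D positions, and the position with maximum coherent power wins.

```python
import numpy as np

# Assumed setup: 5x5 planar array in the z=0 plane, one backscatter tag.
c = 3e8
f = 900e6                       # UHF RFID carrier (illustrative)
k = 2 * np.pi * f / c
ants = np.array([[x, y, 0.0]
                 for x in np.linspace(-0.5, 0.5, 5)
                 for y in np.linspace(-0.5, 0.5, 5)])
tag = np.array([0.1, -0.2, 1.0])

# Backscatter travels antenna -> tag -> antenna, hence the factor of 2.
d = np.linalg.norm(ants - tag, axis=1)
s = np.exp(-1j * 2 * k * d)     # measured per-antenna phases (noise-free)

def beamform(point):
    """Coherent power of the array steered to a candidate 3D point."""
    dd = np.linalg.norm(ants - point, axis=1)
    return np.abs(np.sum(s * np.exp(1j * 2 * k * dd)))

# Coarse 3D grid search for the power maximum.
grid = [np.array([x, y, z])
        for x in np.linspace(-0.3, 0.3, 13)
        for y in np.linspace(-0.3, 0.3, 13)
        for z in np.linspace(0.6, 1.4, 9)]
est = max(grid, key=beamform)
```

Because the wavefront curvature differs across the aperture in the near field, the grid search resolves range as well as direction, which is what distinguishes this from a far-field direction-of-arrival estimator.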
NASA Astrophysics Data System (ADS)
Galavís, M. E.; Mendoza, C.; Zeippen, C. J.
1998-12-01
Since Burgess et al. (1997) have recently questioned the accuracy of the effective collision strength calculated in the IRON Project for the electron-impact excitation of the 3s²3p⁴ ¹D–¹S quadrupole transition in Ar III, an extended R-matrix calculation has been performed for this transition. The original 24-state target model was maintained, but the energy range was increased to 100 Ryd. It is shown that, in order to ensure convergence of the partial wave expansion at such energies, it is necessary to take into account partial collision strengths up to L=30 and to "top up" with a geometric series procedure. By comparing effective collision strengths, it is found that the differences from the original calculation are not greater than 25% around the upper end of the common temperature range and are much smaller than 20% over most of it. This is consistent with the accuracy rating (20%) previously assigned to transitions in this low-ionisation system. The present high-temperature limit also agrees fairly well (15%) with the Coulomb-Born limit estimated by Burgess et al., thus confirming our previous accuracy rating. It appears that Burgess et al., in their data assessment, overextended the low-energy behaviour of our reduced effective collision strength to obtain an extrapolated high-temperature limit that appeared to be in error by a factor of 2.
NASA Astrophysics Data System (ADS)
Mao, Heng; Wang, Xiao; Zhao, Dazun
2007-07-01
The Baseline algorithm, a tool in wavefront sensing (WFS), incorporates the phase-diverse phase retrieval (PDPR) method with a hybrid unwrapping approach to ensure a unique pupil phase estimate with high WFS accuracy, even for high-dynamic-range aberrations, as long as the pupil shape is a convex set. However, for a complicated pupil, such as that of obstructed-pupil optics, this unwrapping approach fails owing to the fake values at points located in obstructed areas of the pupil. A modified unwrapping approach that minimizes the negative effects of the obstructed areas is therefore proposed. Simulations have shown the validity of this unwrapping approach when it is embedded in the Baseline algorithm.
NASA Technical Reports Server (NTRS)
Shu, Chi-Wang
1998-01-01
This project concerns the development of high-order, non-oscillatory schemes for computational fluid dynamics. Algorithm analysis, implementation, and applications are performed. Collaborations with NASA scientists have been carried out to ensure that the research is relevant to NASA objectives. The combination of the ENO finite difference method with a spectral method in two space dimensions is considered, jointly with Cai [3]. The resulting scheme behaves well for the two-dimensional test problems with or without shocks. Jointly with Cai and Gottlieb, we have also considered one-sided filters for spectral approximations to discontinuous functions [2]. We proved theoretically the existence of filters that recover spectral accuracy up to the discontinuity. We also constructed such filters for practical calculations.
NASA Astrophysics Data System (ADS)
Moroni, Giovanni; Syam, Wahyudin P.; Petrò, Stefano
2014-08-01
Product quality is a main concern in manufacturing today; it drives competition between companies. To ensure high quality, a dimensional inspection to verify the geometric properties of a product must be carried out. High-speed non-contact scanners help with this task, both by speeding up acquisition and by increasing accuracy through a more complete description of the surface. The algorithms for the management of the measurement data play a critical role in ensuring both the measurement accuracy and the speed of the device. One of the most fundamental parts of such algorithms is the procedure for fitting a substitute geometry to a cloud of points. This article addresses this challenge. Three relevant geometries are selected as case studies: non-linear least-squares fitting of a circle, a sphere and a cylinder. These geometries are chosen in consideration of their common use in practice; for example, the sphere is often adopted as a reference artifact for performance verification of a coordinate measuring machine (CMM), and the cylinder is the most relevant geometry for a pin-hole relation as an assembly feature in a complete functioning product. In this article, an improvement of the initial point guess for the Levenberg-Marquardt (LM) algorithm by employing a chaos optimization (CO) method is proposed. This improves the performance of the optimization when fitting non-linear functions to the three geometries. The results show that, with this combination, higher-quality fitting results, i.e. a smaller norm of the residuals, can be obtained while preserving the computational cost. Fitting an 'incomplete point cloud', a situation in which the points do not cover a complete feature, e.g. only half of the total part surface, is also investigated. Finally, a case study of fitting a hemisphere is presented.
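The core non-linear least-squares fit is straightforward to sketch. Below is a plain Levenberg-Marquardt sphere fit with a centroid-based initial guess (the paper's chaos-optimization seeding is not reproduced), exercised on the hemisphere "incomplete point cloud" case the article closes with:

```python
import numpy as np
from scipy.optimize import least_squares

# Sketch of a non-linear least-squares sphere fit. The residual of each point
# is its radial distance from the candidate centre minus the candidate radius.
def fit_sphere(pts):
    def residuals(p):                       # p = (cx, cy, cz, r)
        return np.linalg.norm(pts - p[:3], axis=1) - p[3]
    c0 = pts.mean(axis=0)                   # centroid-based initial guess
    r0 = np.linalg.norm(pts - c0, axis=1).mean()
    sol = least_squares(residuals, np.r_[c0, r0], method="lm")
    return sol.x[:3], sol.x[3]

# Synthetic hemisphere: an "incomplete point cloud" covering only the upper
# half of the sphere, as in the paper's last case study.
rng = np.random.default_rng(0)
theta = rng.uniform(0, np.pi / 2, 200)      # polar angle, upper hemisphere only
phi = rng.uniform(0, 2 * np.pi, 200)
true_c, true_r = np.array([1.0, -2.0, 0.5]), 3.0
pts = true_c + true_r * np.c_[np.sin(theta) * np.cos(phi),
                              np.sin(theta) * np.sin(phi),
                              np.cos(theta)]
c_est, r_est = fit_sphere(pts)
```

On noise-free data LM converges to the exact parameters even from the biased centroid guess; with noisy or very sparse partial clouds the initial guess matters much more, which is the gap the paper's CO seeding targets.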
Self-Calibrating Respiratory-Flowmeter Combination
NASA Technical Reports Server (NTRS)
Westenskow, Dwayne R.; Orr, Joseph A.
1990-01-01
Dual flowmeters ensure accuracy over full range of human respiratory flow rates. System for measurement of respiratory flow employs two flowmeters; one compensates for deficiencies of other. Combination yields easily calibrated system accurate over wide range of gas flow.
NASA Astrophysics Data System (ADS)
de Laborderie, J.; Duchaine, F.; Gicquel, L.; Vermorel, O.; Wang, G.; Moreau, S.
2018-06-01
Large-Eddy Simulation (LES) is recognized as a promising method for high-fidelity flow predictions in turbomachinery applications. The presented approach consists of coupling several instances of the same unstructured LES solver through an overset grid method. A high-order interpolation, implemented within this coupling method, is introduced and evaluated on several test cases. It is shown to be third-order accurate, to preserve the accuracy of various second- and third-order convective schemes, and to ensure the continuity of diffusive fluxes and subgrid-scale tensors even in detrimental interface configurations. In this analysis, three types of spurious waves generated at the interface are identified. They are significantly reduced by the high-order interpolation at the interface. Since the high-order interpolation has the same cost as the original lower-order method, the high-order overset grid method appears to be a promising alternative for all these applications.
Ionospheric Mapping Software Ensures Accuracy of Pilots' GPS
NASA Technical Reports Server (NTRS)
2015-01-01
IonoSTAGE and SuperTruth software are part of a suite created at the Jet Propulsion Laboratory to enable the Federal Aviation Administration's Wide Area Augmentation System, which provides pinpoint accuracy in aircraft GPS units. The system, used by more than 73,000 planes, facilitates landings under adverse conditions at small airports. In 2013, IonoSTAGE and SuperTruth found their first commercial license when NEC, based in Japan, with US headquarters in Irving, Texas, licensed the entire suite.
Guidance to Achieve Accurate Aggregate Quantitation in Biopharmaceuticals by SV-AUC.
Arthur, Kelly K; Kendrick, Brent S; Gabrielson, John P
2015-01-01
The levels and types of aggregates present in protein biopharmaceuticals must be assessed during all stages of product development, manufacturing, and storage of the finished product. Routine monitoring of aggregate levels in biopharmaceuticals is typically achieved by size exclusion chromatography (SEC) due to its high precision, speed, robustness, and simplicity to operate. However, SEC is error prone and requires careful method development to ensure accuracy of reported aggregate levels. Sedimentation velocity analytical ultracentrifugation (SV-AUC) is an orthogonal technique that can be used to measure protein aggregation without many of the potential inaccuracies of SEC. In this chapter, we discuss applications of SV-AUC during biopharmaceutical development and how characteristics of the technique make it better suited for some applications than others. We then discuss the elements of a comprehensive analytical control strategy for SV-AUC. Successful implementation of these analytical control elements ensures that SV-AUC provides continued value over the long time frames necessary to bring biopharmaceuticals to market. © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Jiang, Jiamin; Younis, Rami M.
2017-06-01
The first-order methods commonly employed in reservoir simulation for computing the convective fluxes introduce excessive numerical diffusion, leading to severe smoothing of displacement fronts. We present a fully-implicit cell-centered finite-volume (CCFV) framework that can achieve second-order spatial accuracy on smooth solutions while maintaining robustness and nonlinear convergence performance. A novel multislope MUSCL method is proposed to construct the required values at edge centroids in a straightforward and effective way by taking advantage of the triangular mesh geometry. In contrast to monoslope methods, in which a single limited gradient is used, the multislope concept constructs specific scalar slopes for the interpolations on each edge of a given element. Through the edge centroids, the numerical diffusion caused by mesh skewness is reduced, and optimal second-order accuracy can be achieved. Moreover, an improved smooth flux-limiter is introduced to ensure monotonicity on non-uniform meshes. The flux-limiter provides high accuracy without degrading nonlinear convergence performance. The CCFV framework is adapted to accommodate a lower-dimensional discrete fracture-matrix (DFM) model. Several numerical tests with discrete fractured systems are carried out to demonstrate the efficiency and robustness of the numerical model.
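In one dimension the limited MUSCL reconstruction underlying such schemes reduces to a slope-limited linear extrapolation from the cell centre to the face; the sketch below uses a minmod limiter (the paper's multislope variant on triangular meshes generalizes this idea with one slope per edge):

```python
# 1D analogue of a limited MUSCL reconstruction: second order on smooth data,
# slopes limited to zero at extrema so no new over/undershoots are created.
def minmod(a, b):
    """Pick the smaller-magnitude slope; return 0 if the slopes disagree in sign."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def face_values(u, dx):
    """Limited left states at the right faces of the interior cells."""
    out = []
    for i in range(1, len(u) - 1):
        slope = minmod((u[i] - u[i - 1]) / dx, (u[i + 1] - u[i]) / dx)
        out.append(u[i] + 0.5 * dx * slope)   # limited linear extrapolation
    return out
```

On linear data the reconstruction is exact (second order); at a local extremum the minmod slope collapses to zero and the scheme falls back to the monotone first-order value, which is precisely the monotonicity property the paper's smooth flux-limiter is designed to retain without degrading nonlinear convergence.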
NASA Technical Reports Server (NTRS)
Smith, N. S. A.; Frolov, S. M.; Bowman, C. T.
1996-01-01
Two types of mixing sub-models are evaluated in connection with a joint-scalar probability density function method for turbulent nonpremixed combustion. Model calculations are made and compared to simulation results for homogeneously distributed methane-air reaction zones mixing and reacting in decaying turbulence within a two-dimensional enclosed domain. The comparison is arranged to ensure that both the simulation and model calculations a) make use of exactly the same chemical mechanism, b) do not involve non-unity Lewis number transport of species, and c) are free from radiation loss. The modified Curl mixing sub-model was found to provide superior predictive accuracy over the simple relaxation-to-mean submodel in the case studied. Accuracy to within 10-20% was found for global means of major species and temperature; however, nitric oxide prediction accuracy was lower and highly dependent on the choice of mixing sub-model. Both mixing submodels were found to produce non-physical mixing behavior for mixture fractions removed from the immediate reaction zone. A suggestion for a further modified Curl mixing sub-model is made in connection with earlier work done in the field.
NASA Astrophysics Data System (ADS)
Sokolova, N.; Morrison, A.; Haakonsen, T. A.
2015-04-01
Recent advancements in land-based mobile mapping enable rapid and cost-effective collection of high-quality road-related spatial information. Mobile Mapping Systems (MMS) can provide spatial information with sub-decimeter accuracy in nominal operating environments. However, performance in challenging environments such as tunnels is not well characterized. The Norwegian Public Roads Administration (NPRA) manages the country's public road network and its infrastructure, a large segment of which consists of road tunnels (there are about 1,000 road tunnels in Norway, with a combined length of 800 km). In order to adopt mobile mapping technology for streamlining road network and infrastructure management and maintenance tasks, it is important to ensure that the technology is mature enough to meet existing requirements for object positioning accuracy in all types of environments, and that it provides homogeneous accuracy over the mapping perimeter. This paper presents results of a testing campaign performed within a project funded by the NPRA as part of the SMarter road traffic with Intelligent Transport Systems (ITS) (SMITS) program. The objective of the campaign was performance evaluation of high-end commercial MMSs for the inventory of public areas, focusing on Global Navigation Satellite System (GNSS) signal-degraded environments.
Bessemans, Laurent; Jully, Vanessa; de Raikem, Caroline; Albanese, Mathieu; Moniotte, Nicolas; Silversmet, Pascal; Lemoine, Dominique
2016-01-01
High-throughput screening technologies are increasingly integrated into the formulation development process of biopharmaceuticals. The performance of liquid handling systems is dependent on the ability to deliver accurate and precise volumes of specific reagents to ensure process quality. We have developed an automated gravimetric calibration procedure to adjust the accuracy and evaluate the precision of the TECAN Freedom EVO liquid handling system. Volumes from 3 to 900 µL using calibrated syringes and fixed tips were evaluated with various solutions, including aluminum hydroxide and phosphate adjuvants, β-casein, sucrose, sodium chloride, and phosphate-buffered saline. The methodology to set up liquid class pipetting parameters for each solution was to split the process in three steps: (1) screening of predefined liquid class, including different pipetting parameters; (2) adjustment of accuracy parameters based on a calibration curve; and (3) confirmation of the adjustment. The run of appropriate pipetting scripts, data acquisition, and reports until the creation of a new liquid class in EVOware was fully automated. The calibration and confirmation of the robotic system was simple, efficient, and precise and could accelerate data acquisition for a wide range of biopharmaceutical applications. PMID:26905719
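Step (2) of the procedure above, the accuracy adjustment based on a calibration curve, can be sketched as a linear fit of gravimetrically measured volume against programmed volume, inverted to obtain corrected set-points. The function name, numbers and density value below are illustrative, not TECAN data:

```python
import numpy as np

# Hypothetical sketch of a gravimetric accuracy adjustment: dispensed mass is
# converted to volume via the liquid density, a linear calibration curve is
# fitted, and the curve is inverted to compute corrected pipetting set-points.
def calibrate(programmed_uL, measured_mass_mg, density_mg_per_uL=0.998):
    measured_uL = np.asarray(measured_mass_mg) / density_mg_per_uL
    slope, intercept = np.polyfit(programmed_uL, measured_uL, 1)

    def corrected_setpoint(target_uL):
        # volume to program so that the delivered volume hits the target
        return (target_uL - intercept) / slope

    return slope, intercept, corrected_setpoint

# Illustrative data: the handler systematically delivers 95% of the programmed
# volume plus a 2 uL offset.
prog = [10.0, 50.0, 100.0, 500.0]
meas_uL = [0.95 * v + 2.0 for v in prog]
mass_mg = [v * 0.998 for v in meas_uL]
slope, intercept, corr = calibrate(prog, mass_mg)
```

Programming `corr(100.0)` then delivers 100 µL to within the fit error, which is the sense in which the calibration "adjusts the accuracy" before the confirmation run in step (3).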
Xue, Y.; Liu, S.; Hu, Y.; Yang, J.; Chen, Q.
2007-01-01
To improve prediction accuracy, a Genetic Algorithm based Adaptive Neural Network Ensemble (GA-ANNE) is presented. Intersections are allowed between different training sets based on fuzzy clustering analysis, which ensures the diversity as well as the accuracy of the individual Neural Networks (NNs). Moreover, to improve the accuracy of the adaptive weights of the individual NNs, a GA is used to optimize the cluster centers. Empirical results in predicting the carbon flux of Duke Forest reveal that GA-ANNE can predict the carbon flux more accurately than a Radial Basis Function Neural Network (RBFNN), a Bagging NN ensemble, and ANNE. © 2007 IEEE.
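The adaptive-weighting idea behind such ensembles can be sketched as weighting each member inversely to its validation error (a toy illustration; the fuzzy clustering and GA optimization of cluster centers described above are not reproduced):

```python
import numpy as np

# Toy sketch of adaptive ensemble weighting: members with smaller validation
# error receive proportionally larger weights in the combined prediction.
def ensemble_predict(member_preds, val_errors):
    """member_preds: list of per-member prediction vectors (same length)
    val_errors:   one positive validation error per member"""
    w = 1.0 / np.asarray(val_errors)
    w = w / w.sum()                       # normalize weights to sum to 1
    return np.asarray(member_preds).T @ w # weighted average per sample

# Two members predicting a single sample:
equal = ensemble_predict([[1.0], [3.0]], [1.0, 1.0])    # equal errors -> mean
skewed = ensemble_predict([[1.0], [3.0]], [1e-9, 1.0])  # near-perfect member dominates
```

In GA-ANNE the weights are additionally adapted per input region via the optimized cluster centers, so a member that is accurate only in part of the input space still contributes where it is strong.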
Bridges, Daniel J; Pollard, Derek; Winters, Anna M; Winters, Benjamin; Sikaala, Chadwick; Renn, Silvia; Larsen, David A
2018-02-23
Indoor residual spraying (IRS) is a key tool in the fight to control, eliminate and ultimately eradicate malaria. IRS protection is based on a communal effect, such that an individual's protection relies primarily on the community-level coverage of IRS, with limited protection being provided by household-level coverage. To ensure a communal effect is achieved through IRS, achieving high and uniform community-level coverage should be the ultimate priority of an IRS campaign. Ensuring high community-level coverage of IRS in malaria-endemic areas is challenging given the lack of information available about both the location and the number of households needing IRS in any given area. A process termed 'mSpray' has been developed and implemented; it involves the use of satellite imagery to enumerate households for IRS planning and a mobile application to guide IRS implementation. This study assessed (1) the accuracy of the satellite enumeration and (2) how various degrees of spatial aid provided through the mSpray process affected community-level IRS coverage during the 2015 spray campaign in Zambia. A 2-stage sampling process was applied to assess the accuracy of satellite enumeration in determining the number and location of sprayable structures. Results indicated an overall sensitivity of 94% for satellite enumeration compared to finding structures on the ground. After adjusting for structure size, roof and wall type, households in Nchelenge District, where all types of satellite-based spatial aids (paper-based maps plus the mobile mSpray application) were used, were more likely to have received IRS than in Kasama District, where the maps used were not based on satellite enumeration. The probability of a household being sprayed in Nchelenge District, where tablet-based maps were used, did not differ statistically from that of a household in Samfya District, where detailed paper-based spatial aids based on satellite enumeration were provided.
IRS coverage from the 2015 spray season benefited from the use of spatial aids based upon satellite enumeration. These spatial aids can guide costly IRS planning and implementation leading to attainment of higher spatial coverage, and likely improve disease impact.
NASA Astrophysics Data System (ADS)
Maity, Arnab; Padhi, Radhakant; Mallaram, Sanjeev; Mallikarjuna Rao, G.; Manickavasagam, M.
2016-10-01
A new nonlinear optimal and explicit guidance law is presented in this paper for launch vehicles propelled by solid motors. It can ensure very high terminal precision despite not having exact a priori knowledge of the thrust-time curve. This work was motivated by a carrier launch vehicle in a hypersonic mission, which demands an extremely narrow terminal accuracy window for successful initiation of operation of the hypersonic vehicle. The proposed explicit guidance scheme, which computes the optimal guidance command online, ensures the required stringent final conditions with high precision at the injection point. A key feature of the proposed guidance law is an innovative extension of the recently developed model predictive static programming guidance with flexible final time. A penalty function approach is also followed to meet the input and output inequality constraints throughout the vehicle trajectory. In this paper, the guidance law has been successfully validated in nonlinear six degree-of-freedom simulation studies, with an inner-loop autopilot designed as well, which significantly enhances confidence in its usefulness. In addition to excellent nominal results, the proposed guidance has been found to be robust for perturbed cases as well.
Application of Intra-Oral Dental Scanners in the Digital Workflow of Implantology
van der Meer, Wicher J.; Andriessen, Frank S.; Wismeijer, Daniel; Ren, Yijin
2012-01-01
Intra-oral scanners will play a central role in digital dentistry in the near future. In this study the accuracy of three intra-oral scanners was compared. Materials and methods: A stone master model was fitted with three high-precision manufactured PEEK cylinders and scanned with three intra-oral scanners: the CEREC (Sirona), the iTero (Cadent) and the Lava COS (3M). The digital files were imported into software, and the distance between the centres of the cylinders and the angulation between the cylinders were assessed. These values were compared to measurements made on a high-accuracy 3D scan of the master model. Results: The distance errors were the smallest and most consistent for the Lava COS; the distance errors for the CEREC were the largest and least consistent. All the angulation errors were small. Conclusions: The Lava COS, in combination with a high-accuracy scanning protocol, resulted in the smallest and most consistent errors of the three scanners tested when considering mean distance errors in full-arch impressions, both in absolute values and in consistency for both measured distances. For the mean angulation errors, the Lava COS had the smallest errors between cylinders 1–2 and the largest errors between cylinders 1–3, although the absolute difference from the smallest mean value (iTero) was very small (0.0529°). An expected increase in distance and/or angular errors over the length of the arch, due to an accumulation of registration errors of the patched 3D surfaces, could be observed in this study design, but the effects were not statistically significant. Clinical relevance: For making impressions of implant cases for digital workflows, the most accurate scanner, with the scanning protocol that will ensure the most accurate digital impression, should be used. In our study model, that was the Lava COS with the high-accuracy scanning protocol. PMID:22937030
NASA Technical Reports Server (NTRS)
Adell, Philippe C.; Mojarradi, Mohammad; DelCastillo, Linda Y.; Vo, Tuan A.
2011-01-01
A paper discusses the successful development of a miniaturized radiation hardened high-voltage switching module operating at 2.5 kV suitable for space application. The high-voltage architecture was designed, fabricated, and tested using a commercial process that uses a unique combination of 0.25 micrometer CMOS (complementary metal oxide semiconductor) transistors and high-voltage lateral DMOS (diffusion metal oxide semiconductor) device with high breakdown voltage (greater than 650 V). The high-voltage requirements are achieved by stacking a number of DMOS devices within one module, while two modules can be placed in series to achieve higher voltages. Besides the high-voltage requirements, a second generation prototype is currently being developed to provide improved switching capabilities (rise time and fall time for full range of target voltages and currents), the ability to scale the output voltage to a desired value with good accuracy (few percent) up to 10 kV, to cover a wide range of high-voltage applications. In addition, to ensure miniaturization, long life, and high reliability, the assemblies will require intensive high-voltage electrostatic modeling (optimized E-field distribution throughout the module) to complete the proposed packaging approach and test the applicability of using advanced materials in a space-like environment (temperature and pressure) to help prevent potential arcing and corona due to high field regions. Finally, a single-event effect evaluation would have to be performed and single-event mitigation methods implemented at the design and system level or developed to ensure complete radiation hardness of the module.
Kusumaningrum, Dewi; Lee, Hoonsoo; Lohumi, Santosh; Mo, Changyeun; Kim, Moon S; Cho, Byoung-Kwan
2018-03-01
The viability of seeds is important for determining their quality. A high-quality seed is one with a high germination capability, which is necessary to ensure high productivity. Hence, developing technology for the detection of seed viability is a high priority in agriculture. Fourier transform near-infrared (FT-NIR) spectroscopy is one of the most popular vibrational spectroscopy techniques. This study uses FT-NIR spectroscopy to determine the viability of soybean seeds. Viable seeds and artificially aged seeds (as non-viable soybeans) were used in this research. The FT-NIR spectra of soybean seeds were collected and analysed using a partial least-squares discriminant analysis (PLS-DA) to classify viable and non-viable soybean seeds. Moreover, the variable importance in projection (VIP) method for variable selection was combined with the PLS-DA. The most effective wavelengths were selected by the VIP method, which chose 146 optimal variables from the full set of 1557 variables. The results demonstrated that the FT-NIR spectral analysis with the PLS-DA method, using either all variables or the selected variables, showed good performance, with prediction accuracy for soybean viability close to 100%. Hence, FT-NIR techniques with chemometric analysis have the potential for rapidly measuring soybean seed viability. © 2017 Society of Chemical Industry.
A novel adaptive finite time controller for bilateral teleoperation system
NASA Astrophysics Data System (ADS)
Wang, Ziwei; Chen, Zhang; Liang, Bin; Zhang, Bo
2018-03-01
Most bilateral teleoperation research focuses on system stability under time delays. However, practical teleoperation tasks require high performance besides system stability, such as convergence rate and accuracy. This paper investigates bilateral teleoperation controller design with transient performance guarantees. To ensure the transient performance and system stability simultaneously, an adaptive non-singular fast terminal sliding mode controller is proposed to achieve practical finite-time stability in the presence of system uncertainties and time delays. In addition, a novel switching scheme is introduced, whereby the singularity problem of the conventional terminal sliding manifold is avoided. Finally, numerical simulations demonstrate the effectiveness and validity of the proposed method.
Ultra-low power high-dynamic range color pixel embedding RGB to r-g chromaticity transformation
NASA Astrophysics Data System (ADS)
Lecca, Michela; Gasparini, Leonardo; Gottardi, Massimo
2014-05-01
This work describes a novel color pixel topology that converts the three chromatic components from the standard RGB space into the normalized r-g chromaticity space. The conversion is implemented with a high dynamic range and no DC power consumption, and the auto-exposure capability of the sensor ensures the capture of a high-quality chromatic signal, even in the presence of very bright illuminants or in darkness. The pixel is intended to become the basic building block of a CMOS color vision sensor targeted at ultra-low-power applications for mobile devices, such as human-machine interfaces, gesture recognition and face detection. The experiments show significant improvements of the proposed pixel with respect to standard cameras in terms of energy saving and data-acquisition accuracy. An application to skin-color-based description is presented.
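The RGB to normalized r-g chromaticity mapping that the pixel computes in hardware is, in software form (a minimal sketch; the epsilon guard is an implementation convenience, not part of the sensor):

```python
# Normalized r-g chromaticity: each channel is divided by the total intensity,
# so the result depends on colour but not on overall brightness.
def rg_chromaticity(R, G, B, eps=1e-12):
    s = R + G + B + eps          # eps guards against an all-zero (black) pixel
    return R / s, G / s          # b = 1 - r - g is redundant, so it is dropped

r1 = rg_chromaticity(60, 120, 20)
r2 = rg_chromaticity(120, 240, 40)   # same colour at doubled intensity
```

The intensity invariance shown by `r1 == r2` is what makes the r-g space attractive for skin-colour description under varying illumination, and dropping the redundant b component is why two outputs per pixel suffice.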
Research on Improved Depth Belief Network-Based Prediction of Cardiovascular Diseases
Zhang, Hongpo
2018-01-01
Quantitative analysis and prediction can help to reduce the risk of cardiovascular disease. Quantitative prediction based on traditional models has low accuracy, and the prediction variance of shallow neural network models is large. In this paper, a cardiovascular disease prediction model based on an improved deep belief network (DBN) is proposed. Using the reconstruction error, the network depth is determined automatically, and unsupervised training and supervised fine-tuning are combined. This ensures the accuracy of model prediction while guaranteeing stability. Thirty experiments were performed independently on the Statlog (Heart) and Heart Disease Database data sets from the UCI repository. Experimental results showed that the mean prediction accuracy was 91.26% and 89.78%, respectively, with variances of 5.78 and 4.46. PMID:29854369
Spectrographs and Large Telescopes: A Study of Instrumentation
NASA Astrophysics Data System (ADS)
Fica, Haley Diane; Crane, Jeffrey D.; Uomoto, Alan K.; Hare, Tyson
2017-01-01
It is a truth universally acknowledged, that a telescope in possession of a large aperture, must be in want of a high resolution spectrograph. Subsystems of these instruments require testing and upgrading to ensure that they can continue to be scientifically productive and usher in a new era of astronomical research. The Planet Finder Spectrograph (PFS) and Magellan Inamori Kyocera Echelle (MIKE), both on the Magellan II Clay telescope at Las Campanas Observatory, and the Giant Magellan Telescope (GMT) Consortium Large Earth Finder (G-CLEF) are examples of such instruments. Bluer flat field lamps were designed for PFS and MIKE to replace lamps no longer available, in order to ensure continued, efficient functionality. These newly designed lamps will result in better flat fielding and calibration of data, and thus better removal of instrument noise. When it is built and installed in 2022, G-CLEF will be fed by a tertiary mirror on the GMT. Stepper motors attached to the back of this mirror will be used to correct misalignments in the optical relay system. These motors were characterized to ensure that they function as expected to an accuracy of a few microns. These projects incorporate several key aspects of astronomical instrumentation: designing, building, and testing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marutzky, Sam; Farnham, Irene
The purpose of the Nevada National Security Site (NNSS) Integrated Sampling Plan (referred to herein as the Plan) is to provide a comprehensive, integrated approach for collecting and analyzing groundwater samples to meet the needs and objectives of the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Field Office (NNSA/NFO) Underground Test Area (UGTA) Activity. Implementation of this Plan will provide high-quality data required by the UGTA Activity for ensuring public protection in an efficient and cost-effective manner. The Plan is designed to ensure compliance with the UGTA Quality Assurance Plan (QAP). The Plan's scope comprises sample collection and analysis requirements relevant to assessing the extent of groundwater contamination from underground nuclear testing. This Plan identifies locations to be sampled by corrective action unit (CAU) and location type, sampling frequencies, sample collection methodologies, and the constituents to be analyzed. In addition, the Plan defines data collection criteria such as well-purging requirements, detection levels, and accuracy requirements; identifies reporting and data management requirements; and provides a process to ensure coordination between NNSS groundwater sampling programs for sampling of interest to UGTA. This Plan does not address compliance with requirements for wells that supply the NNSS public water system or wells involved in a permitted activity.
Sensor Systems for Vehicle Environment Perception in a Highway Intelligent Space System
Tang, Xiaofeng; Gao, Feng; Xu, Guoyan; Ding, Nenggen; Cai, Yao; Ma, Mingming; Liu, Jianxing
2014-01-01
A Highway Intelligent Space System (HISS) is proposed in this paper to study vehicle environment perception. The essence of HISS is that a space sensor system using laser, ultrasonic, or radar sensors is installed in the highway environment, and communication technology is used to exchange information between the HISS server and vehicles, providing vehicles with information about the surrounding road. Considering the high speeds of vehicles on highways, when a vehicle is about to pass a section of road ahead that is prone to accidents, its driving state should be predicted so that the driver has road environment perception information in advance, thereby ensuring driving safety and stability. To verify the accuracy and feasibility of the HISS, a traditional vehicle-mounted sensor system for environment perception is used to obtain the relative driving state. Furthermore, an inter-vehicle dynamics model is built and a model predictive control approach is used to predict the driving state over the following period. Finally, the simulation results show that using the HISS for environment perception yields the same results as those detected by a traditional vehicle-mounted sensor system. We can further conclude that using HISS for vehicle environment perception can ensure system stability, demonstrating the method's feasibility. PMID:24834907
Brain collection, standardized neuropathologic assessment, and comorbidity in ADNI participants
Franklin, Erin E.; Perrin, Richard J.; Vincent, Benjamin; Baxter, Michael; Morris, John C.; Cairns, Nigel J.
2015-01-01
Introduction The Alzheimer’s Disease Neuroimaging Initiative Neuropathology Core (ADNI-NPC) facilitates brain donation, ensures standardized neuropathologic assessments, and maintains a tissue resource for research. Methods The ADNI-NPC coordinates with performance sites to promote autopsy consent, facilitate tissue collection and autopsy administration, and arrange sample delivery to the NPC, for assessment using NIA-AA neuropathologic diagnostic criteria. Results The ADNI-NPC has obtained 45 participant specimens and neuropathologic assessments have been completed in 36 to date. Challenges in obtaining consent at some sites have limited the voluntary autopsy rate to 58%. Among assessed cases, clinical diagnostic accuracy for Alzheimer disease (AD) is 97%; however, 58% show neuropathologic comorbidities. Discussion Challenges facing autopsy consent and coordination are largely resource-related. The neuropathologic assessments indicate that ADNI’s clinical diagnostic accuracy for AD is high; however, many AD cases have comorbidities that may impact the clinical presentation, course, and imaging and biomarker results. These neuropathologic data permit multimodal and genetic studies of these comorbidities to improve diagnosis and provide etiologic insights. PMID:26194314
NASA Astrophysics Data System (ADS)
Wijesingha, J. S. J.; Deshapriya, N. L.; Samarakoon, L.
2015-04-01
Billions of people in the world depend on rice as a staple food and as an income-generating crop. Asia is the leader in rice cultivation, and it is necessary to maintain an up-to-date rice-related database to ensure food security as well as economic development. This study investigates the general applicability of the high temporal resolution Moderate Resolution Imaging Spectroradiometer (MODIS) 250 m gridded vegetation product for monitoring rice crop growth, mapping rice crop acreage, and analyzing crop yield at the province level. The MODIS 250 m Normalized Difference Vegetation Index (NDVI) and Enhanced Vegetation Index (EVI) time series data, field data, and crop calendar information were utilized in this research in Sa Kaeo Province, Thailand. The following methodology was used: (1) data pre-processing and rice plant growth analysis using Vegetation Indices (VI); (2) extraction of rice acreage and start-of-season dates from VI time series data; (3) accuracy assessment; and (4) yield analysis with MODIS VI. The results show a direct relationship between rice plant height and MODIS VI. Combining the crop calendar information with the NDVI time series smoothed by a Whittaker smoother gave good rice acreage estimates (86% area accuracy and 75% classification accuracy). Point-level yield analysis showed that MODIS EVI is highly correlated with rice yield, and prediction using the maximum EVI in the rice cycle estimated yield with an average prediction error of 4.2%. This study shows the immense potential of the MODIS gridded vegetation product for keeping an up-to-date Geographic Information System of rice cultivation.
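The Whittaker smoother applied to the NDVI series minimizes a fidelity term plus a roughness penalty. Below is a compact dense-matrix sketch; the smoothing parameter lambda is an assumed value, and production code would use sparse matrices for long series:

```python
import numpy as np

def whittaker_smooth(y, lmbda=100.0, d=2):
    """Whittaker smoother: find z minimizing ||y - z||^2 + lmbda * ||D z||^2,
    where D is the d-th order difference matrix. Larger lmbda gives a
    smoother series; d = 2 penalizes curvature. Commonly used to de-noise
    NDVI/EVI time series before phenology extraction."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    D = np.diff(np.eye(n), n=d, axis=0)   # d-th order difference operator
    # Normal equations: (I + lmbda * D'D) z = y
    return np.linalg.solve(np.eye(n) + lmbda * (D.T @ D), y)

# De-noise a synthetic seasonal signal.
t = np.linspace(0, 2 * np.pi, 50)
noisy = np.sin(t) + 0.2 * np.random.default_rng(1).standard_normal(50)
smooth = whittaker_smooth(noisy, lmbda=50.0)
```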
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nierman, William C.
At TIGR, human Bacterial Artificial Chromosome (BAC) end sequencing and trimming achieved an overall sequencing success rate of 65%. CalTech human BAC libraries A, B, C, and D as well as Roswell Park Cancer Institute's library RPCI-11 were used. To date, we have generated >300,000 end sequences from >186,000 human BAC clones with an average read length of ~460 bp, for a total of 141 Mb covering ~4.7% of the genome. Over sixty percent of the clones have BAC end sequences (BESs) from both ends, representing over five-fold coverage of the genome by the paired-end clones. The average phred Q20 length is ~400 bp. This high accuracy makes our BESs match the human finished sequences with an average identity of 99%, a match length of 450 bp, and a frequency of one match per 12.8 kb of contig sequence. Our sample tracking has ensured a clone tracking accuracy of >90%, which gives researchers high confidence in (1) retrieving the right clone from the BAC libraries based on the sequence matches; and (2) building a minimum tiling path of sequence-ready clones across the genome and genome assembly scaffolds.
Bartram, Jack; Mountjoy, Edward; Brooks, Tony; Hancock, Jeremy; Williamson, Helen; Wright, Gary; Moppett, John; Goulden, Nick; Hubank, Mike
2016-07-01
High-throughput sequencing (HTS) (next-generation sequencing) of the rearranged Ig and T-cell receptor genes promises to be less expensive and more sensitive than current methods of monitoring minimal residual disease (MRD) in patients with acute lymphoblastic leukemia. However, the adoption of new approaches by clinical laboratories requires careful evaluation of all potential sources of error and the development of strategies to ensure the highest accuracy. Timely and efficient clinical use of HTS platforms will depend on combining multiple samples (multiplexing) in each sequencing run. Here we examine Ig heavy-chain gene HTS on the Illumina MiSeq platform for MRD. We identify errors associated with multiplexing that could potentially impact the accuracy of MRD analysis. We optimize a strategy that combines high-purity, sequence-optimized oligonucleotides, dual indexing, and an error-aware demultiplexing approach to minimize errors and maximize sensitivity. We present a probability-based demultiplexing pipeline, Error-Aware Demultiplexer, that is suitable for all MiSeq strategies and accurately assigns samples to the correct identifier without excessive loss of data. Finally, using controls quantified by digital PCR, we show that HTS-MRD can accurately detect as few as 1 in 10^6 copies of specific leukemic MRD. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
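The central safeguard in demultiplexing, assigning a read only when exactly one barcode is unambiguously nearest, can be illustrated with a plain Hamming-distance rule. This is a simplified stand-in for, not a reimplementation of, the paper's probability-based Error-Aware Demultiplexer:

```python
def hamming(a, b):
    """Number of mismatched positions between two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def demultiplex(read_index, barcodes, max_mismatch=1):
    """Assign a read to the sample whose barcode is closest in Hamming
    distance, rejecting assignments that are too distant or ambiguous.
    Discarding uncertain reads is what protects MRD calls from
    cross-sample misassignment."""
    dists = sorted((hamming(read_index, b), b) for b in barcodes)
    best_d, best_b = dists[0]
    if best_d > max_mismatch:
        return None  # too many index errors: discard rather than misassign
    if len(dists) > 1 and dists[1][0] == best_d:
        return None  # two barcodes equally close: ambiguous, discard
    return best_b
```

With dual indexing, the same rule is applied to the concatenation of both index reads, which sharply reduces the chance that sequencing errors convert one sample's index into another's.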
Wang, Feng-Fei; Luo, A-Li; Zhao, Yong-Heng
2014-02-01
The radial velocity of a star is very important for the study of the dynamical structure and chemical evolution of the Milky Way, and is also a useful tool for finding variable or peculiar objects. In the present work, we focus on calculating the radial velocities of low-resolution stellar spectra of different spectral types by adopting a template matching method, so as to provide an effective and reliable reference for different aspects of scientific research. We choose high signal-to-noise ratio (SNR) spectra of stars of different spectral types from the Sloan Digital Sky Survey (SDSS) and add different levels of noise to simulate stellar spectra with different SNRs. We then obtain the radial velocity measurement accuracy for each spectral type at each SNR by employing the template matching method. The radial velocity measurement accuracy of white dwarfs is analyzed as well. We conclude that the accuracy of radial velocity measurements of late-type stars is much higher than that of early-type ones: for example, the 1-sigma standard error of radial velocity measurements of A-type stars is 5-8 times as large as that of K-type and M-type stars. We discuss the reason and suggest that the very narrow lines of late-type stars ensure accurate measurement of radial velocities, while early-type stars with very wide Balmer lines, such as A-type stars, are sensitive to noise and yield low radial velocity accuracy. For the spectra of white dwarfs, the standard error of radial velocity measurement can exceed 50 km/s because of their extremely wide Balmer lines. These conclusions provide a good reference for stellar scientific study.
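Template matching for radial velocity is commonly done by cross-correlating the observed and template spectra on a logarithmic wavelength grid, where a Doppler shift becomes a uniform translation: d(ln lambda) = v/c. The following sketch assumes this standard approach; the grid, line profile, and search range are illustrative:

```python
import numpy as np

C = 299792.458  # speed of light, km/s

def radial_velocity(wave, flux, template_flux, max_shift=50):
    """Estimate radial velocity by cross-correlating an observed spectrum
    against a rest-frame template sampled on a uniform log-wavelength grid.
    The best integer pixel shift maps to velocity via v = shift * dlnlambda * c."""
    log_step = np.log(wave[1] / wave[0])  # assumes uniform log-lambda spacing
    f = flux - flux.mean()
    t = template_flux - template_flux.mean()
    shifts = np.arange(-max_shift, max_shift + 1)
    ccf = [np.dot(f, np.roll(t, s)) for s in shifts]
    best = shifts[int(np.argmax(ccf))]
    return best * log_step * C

# Synthetic test: one Gaussian absorption line, shifted by 3 log-lambda pixels.
n = 1000
wave = np.exp(np.linspace(np.log(4000.0), np.log(7000.0), n))
pix = np.arange(n)
template = 1.0 - 0.5 * np.exp(-0.5 * ((pix - 500) / 5.0) ** 2)
observed = np.roll(template, 3)
v = radial_velocity(wave, observed, template)
```

A narrow line (as in late-type stars) gives a sharp cross-correlation peak, while a broad Balmer line (as in A-type stars or white dwarfs) broadens the peak and makes the located maximum noise-sensitive, which is the effect the abstract describes.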
Xiao, Bo; Huang, Chewei; Imel, Zac E; Atkins, David C; Georgiou, Panayiotis; Narayanan, Shrikanth S
2016-04-01
Scaling up psychotherapy services such as for addiction counseling is a critical societal need. One challenge is ensuring quality of therapy, due to the heavy cost of manual observational assessment. This work proposes a speech technology-based system to automate the assessment of therapist empathy-a key therapy quality index-from audio recordings of the psychotherapy interactions. We designed a speech processing system that includes voice activity detection and diarization modules, and an automatic speech recognizer plus a speaker role matching module to extract the therapist's language cues. We employed Maximum Entropy models, Maximum Likelihood language models, and a Lattice Rescoring method to characterize high vs. low empathic language. We estimated therapy-session level empathy codes using utterance level evidence obtained from these models. Our experiments showed that the fully automated system achieved a correlation of 0.643 between expert annotated empathy codes and machine-derived estimations, and an accuracy of 81% in classifying high vs. low empathy, in comparison to a 0.721 correlation and 86% accuracy in the oracle setting using manual transcripts. The results show that the system provides useful information that can contribute to automatic quality assurance and therapist training.
Xiao, Bo; Huang, Chewei; Imel, Zac E.; Atkins, David C.; Georgiou, Panayiotis; Narayanan, Shrikanth S.
2016-01-01
Scaling up psychotherapy services such as for addiction counseling is a critical societal need. One challenge is ensuring quality of therapy, due to the heavy cost of manual observational assessment. This work proposes a speech technology-based system to automate the assessment of therapist empathy—a key therapy quality index—from audio recordings of the psychotherapy interactions. We designed a speech processing system that includes voice activity detection and diarization modules, and an automatic speech recognizer plus a speaker role matching module to extract the therapist's language cues. We employed Maximum Entropy models, Maximum Likelihood language models, and a Lattice Rescoring method to characterize high vs. low empathic language. We estimated therapy-session level empathy codes using utterance level evidence obtained from these models. Our experiments showed that the fully automated system achieved a correlation of 0.643 between expert annotated empathy codes and machine-derived estimations, and an accuracy of 81% in classifying high vs. low empathy, in comparison to a 0.721 correlation and 86% accuracy in the oracle setting using manual transcripts. The results show that the system provides useful information that can contribute to automatic quality assurance and therapist training. PMID:28286867
Obtaining high-resolution velocity spectra using weighted semblance
NASA Astrophysics Data System (ADS)
Ebrahimi, Saleh; Kahoo, Amin Roshandel; Porsani, Milton J.; Kalateh, Ali Nejati
2017-02-01
Velocity analysis employs a coherency measurement along a hyperbolic or non-hyperbolic trajectory time window to build velocity spectra. Accuracy and resolution depend strictly on the method of coherency measurement. Semblance, the most common coherence measure, has poor velocity resolution, which limits one's ability to distinguish and pick distinct peaks. Increasing the resolution of the semblance velocity spectra improves the accuracy of the velocities estimated for normal moveout correction and stacking. The low resolution of semblance spectra stems from its low sensitivity to velocity changes. In this paper, we present a new weighted semblance method that ensures high-resolution velocity spectra. To increase the resolution of the semblance spectra, we introduce into the semblance equation two weighting functions, one based on the ratio of the first to the second singular value of the time window and one based on the position of the seismic wavelet in the time window. We test the method on both synthetic and real field data to compare the resolution of the weighted and conventional semblance methods. Numerical examples with synthetic and real seismic data indicate that the proposed weighted semblance method provides higher resolution than conventional semblance and can separate reflectors that are mixed in the conventional semblance spectrum.
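Conventional semblance, the baseline that the proposed weights are designed to sharpen, is the ratio of stacked energy to total energy along a hyperbolic trajectory. A minimal sketch follows; the paper's SVD-based and wavelet-position-based weighting functions are not reproduced here, and the gather parameters are illustrative:

```python
import numpy as np

def semblance(gather, dt, offsets, t0, velocity, win=5):
    """Conventional semblance: stacked energy over total energy along the
    hyperbolic moveout trajectory t(x) = sqrt(t0^2 + (x / v)^2), summed
    over a vertical window of 2*win + 1 samples. Returns a value in [0, 1]."""
    n_samples, n_traces = gather.shape
    num = den = 0.0
    for k in range(-win, win + 1):
        stack = energy = 0.0
        for j, x in enumerate(offsets):
            t = np.sqrt(t0 ** 2 + (x / velocity) ** 2)
            i = int(round(t / dt)) + k
            if 0 <= i < n_samples:
                a = gather[i, j]
                stack += a
                energy += a * a
        num += stack ** 2
        den += energy
    return num / (n_traces * den) if den > 0 else 0.0

# Synthetic gather: one hyperbolic event at t0 = 1.0 s, v = 2000 m/s.
dt, offsets = 0.004, np.arange(0.0, 1000.0, 100.0)
gather = np.zeros((500, len(offsets)))
wavelet = [0.2, 0.6, 1.0, 0.6, 0.2]
for j, x in enumerate(offsets):
    i = int(round(np.sqrt(1.0 + (x / 2000.0) ** 2) / dt))
    gather[i - 2:i + 3, j] = wavelet

s_true = semblance(gather, dt, offsets, 1.0, 2000.0)   # correct velocity
s_wrong = semblance(gather, dt, offsets, 1.0, 3000.0)  # too-fast velocity
```

The flat peak of this measure around the true velocity is the low sensitivity the paper addresses: the weights multiply the semblance so the spectrum falls off faster away from the correct velocity.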
Automatic Measuring System for Oil Stream Paraffin Deposits Parameters
NASA Astrophysics Data System (ADS)
Kopteva, A. V.; Koptev, V. Yu
2018-03-01
This paper describes a new method for monitoring oil pipelines, as well as a highly efficient and automated paraffin deposit monitoring method. When operating oil pipelines, paraffin, resin, and salts carried by the oil stream are deposited on the pipeline walls. This ultimately forces frequent transportation shutdowns to clean or even replace pipes and other equipment, shortening operation periods between repairs, creating emergency situations, and increasing production expenses; it also harms the environment, spoiling groundwater and killing animals and birds, while oil spills contaminate rivers, lakes, and ground waters. Oil transportation monitoring is still a subject for further study, so there is a need for a radically new automated process control and management system, together with intelligent measurement means. The measurement principle is based on the Lambert-Beer law, which describes the dependence of the detected gamma-radiation intensity on the density and linear attenuation coefficient of a substance. Using a measuring system with high accuracy (±0.2%), one can measure the thickness of paraffin deposits with an absolute accuracy of ±5 mm, which is sufficient to ensure reliable operation of the pipeline system. Safety is a key advantage of the proposed control system.
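The thickness measurement follows directly from the Lambert-Beer law: with the pipe wall's contribution known, the residual attenuation is attributed to the paraffin layer. The sketch below uses assumed attenuation coefficients purely for illustration; a real system must also calibrate geometry and account for the oil itself:

```python
import math

def deposit_thickness(I0, I, mu_pipe, d_pipe, mu_paraffin):
    """Infer paraffin deposit thickness from gamma-ray attenuation via the
    Lambert-Beer law: I = I0 * exp(-(mu_pipe * d_pipe + mu_paraffin * x)).
    mu values are linear attenuation coefficients (1/cm); lengths in cm."""
    total_attenuation = math.log(I0 / I)               # = sum of mu_i * d_i
    return (total_attenuation - mu_pipe * d_pipe) / mu_paraffin

# Illustrative numbers (not from the paper): 1 cm steel wall, 2 cm deposit.
I0 = 1000.0
mu_pipe, d_pipe, mu_paraffin = 0.5, 1.0, 0.2
I = I0 * math.exp(-(mu_pipe * d_pipe + mu_paraffin * 2.0))  # simulated count rate
x = deposit_thickness(I0, I, mu_pipe, d_pipe, mu_paraffin)
```

Because the relation is logarithmic, a small relative error in the measured intensity maps to a small absolute error in thickness, which is consistent with the ±0.2% intensity accuracy supporting a ±5 mm thickness accuracy.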
Zandvakili, Arya; Campbell, Ian; Weirauch, Matthew T.
2018-01-01
Cells use thousands of regulatory sequences to recruit transcription factors (TFs) and produce specific transcriptional outcomes. Since TFs bind degenerate DNA sequences, discriminating functional TF binding sites (TFBSs) from background sequences represents a significant challenge. Here, we show that a Drosophila regulatory element that activates Epidermal Growth Factor signaling requires overlapping, low-affinity TFBSs for competing TFs (Pax2 and Senseless) to ensure cell- and segment-specific activity. Testing available TF binding models for Pax2 and Senseless, however, revealed variable accuracy in predicting such low-affinity TFBSs. To better define parameters that increase accuracy, we developed a method that systematically selects subsets of TFBSs based on predicted affinity to generate hundreds of position-weight matrices (PWMs). Counterintuitively, we found that degenerate PWMs produced from datasets depleted of high-affinity sequences were more accurate in identifying both low- and high-affinity TFBSs for the Pax2 and Senseless TFs. Taken together, these findings reveal how TFBS arrangement can be constrained by competition rather than cooperativity and that degenerate models of TF binding preferences can improve identification of biologically relevant low affinity TFBSs. PMID:29617378
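Position-weight matrices like those generated in this study score a candidate site by summing per-position log-odds of each base against a background model; sites built from affinity-restricted subsets differ only in the counts that go into the matrix. A minimal sketch with an assumed pseudocount and uniform background:

```python
import numpy as np

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

def pwm_from_sites(sites, pseudocount=0.5):
    """Build a log-odds position-weight matrix (4 x L) from aligned binding
    sites, assuming a uniform 0.25 background; pseudocounts avoid log(0).
    Depleting the input of high-affinity sites (as in the paper) changes
    these counts and yields a more degenerate matrix."""
    L = len(sites[0])
    counts = np.full((4, L), pseudocount)
    for s in sites:
        for i, b in enumerate(s):
            counts[BASES[b], i] += 1
    probs = counts / counts.sum(axis=0)
    return np.log2(probs / 0.25)

def score(pwm, seq):
    """Sum of per-position log-odds; higher means a better match."""
    return sum(pwm[BASES[b], i] for i, b in enumerate(seq))

# Toy aligned sites (hypothetical, not Pax2 or Senseless data).
pwm = pwm_from_sites(["TGACGT", "TGACGT", "TGATGT", "TGACGA"])
```

Low-affinity functional sites score only modestly above background, which is why model construction choices matter so much for detecting them.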
Calibration of the venµs super-spectral camera
NASA Astrophysics Data System (ADS)
Topaz, Jeremy; Sprecher, Tuvia; Tinto, Francesc; Echeto, Pierre; Hagolle, Olivier
2017-11-01
A high-resolution super-spectral camera is being developed by Elbit Systems in Israel for the joint CNES-Israel Space Agency satellite VENμS (Vegetation and Environment monitoring on a new Micro-Satellite). This camera will have 12 narrow spectral bands in the visible/NIR region and will give images with 5.3 m resolution from an altitude of 720 km, with an orbit that allows a two-day revisit interval for a number of selected sites distributed over some two-thirds of the earth's surface. The swath width will be 27 km at this altitude. To ensure the high radiometric and geometric accuracy needed to fully exploit such multiple data sampling, careful attention is given in the design to maximizing characteristics such as signal-to-noise ratio (SNR), spectral band accuracy, stray light rejection, and inter-band pixel-to-pixel registration. For the same reasons, accurate calibration of all the principal characteristics is essential, and this presents some major challenges. The methods planned to achieve the required level of calibration are presented following a brief description of the system design. A fuller description of the system design is given in [2], [3] and [4].
Manikandan, A.; Biplab, Sarkar; David, Perianayagam A.; Holla, R.; Vivek, T. R.; Sujatha, N.
2011-01-01
For high dose rate (HDR) brachytherapy, independent treatment verification is needed to ensure that the treatment is performed as per prescription. This study demonstrates dosimetric quality assurance of the HDR brachytherapy using a commercially available two-dimensional ion chamber array called IMatriXX, which has a detector separation of 0.7619 cm. The reference isodose length, step size, and source dwell positional accuracy were verified. A total of 24 dwell positions, which were verified for positional accuracy gave a total error (systematic and random) of –0.45 mm, with a standard deviation of 1.01 mm and maximum error of 1.8 mm. Using a step size of 5 mm, reference isodose length (the length of 100% isodose line) was verified for single and multiple catheters of same and different source loadings. An error ≤1 mm was measured in 57% of tests analyzed. Step size verification for 2, 3, 4, and 5 cm was performed and 70% of the step size errors were below 1 mm, with maximum of 1.2 mm. The step size ≤1 cm could not be verified by the IMatriXX as it could not resolve the peaks in dose profile. PMID:21897562
Code of Federal Regulations, 2012 CFR
2012-10-01
... Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION RESEARCH MISCONDUCT § 689.3... period that an institutional official other than those guilty of misconduct certify the accuracy of... individual or institution to ensure that steps have been taken to prevent repetition of the misconduct. (iii...
Huang, Yu-Ting; Huang, Chao-Ya; Su, Hsiu-Ya; Ma, Chen-Te
2018-06-01
Ventilator-associated pneumonia (VAP) is a common healthcare-associated infection in the neonatal intensive care unit (NICU). The average VAP infection density in our unit was 4.7‰ between June and August 2015. A status survey indicated that in-service education lacked specialization, leading to inadequate awareness among staff regarding the proper care of newborns with VAP and a lack of related care guides; this, in turn, resulted in inconsistencies in care measures for newborns with VAP. Our aims were to improve the accuracy of implementation of VAP preventive measures by medical staff and to reduce the density of VAP infections in the NICU. We conducted a literature search and adopted medical team resource management methods; established effective team communication; established monitoring mechanisms and incentives; established mandatory in-service specialization education content and a VAP preventive care guide exclusively for newborns as a reference for medical staff during care execution; and installed additional equipment and aids and set reminders to ensure the implementation of VAP preventive measures. The accuracy of preventive measure execution by medical staff improved from 70.1% to 97.9%, and the VAP infection density in the NICU decreased from 4.7‰ to 0.52‰. Team integration effectively improved the accuracy of implementation of VAP-prevention measures, reduced the density of VAP infections, enhanced quality of care, and ensured that newborns received care more in line with specialization needs.
NASA Astrophysics Data System (ADS)
Wu, Jie; Yan, Quan-sheng; Li, Jian; Hu, Min-yi
2016-04-01
In bridge construction, geometry control is critical to ensure that the final constructed bridge has the same shape as the design. A common method is to predict the deflections of the bridge during each construction phase using the associated finite element models, so that the cambers of the bridge during different construction phases can be determined beforehand. These finite element models are mostly based on the design drawings and nominal material properties. However, the errors of these bridge models can be large due to significant uncertainties in the actual properties of the materials used in construction, so the predicted cambers may not be accurate enough to ensure agreement of the bridge geometry with the design, especially for long-span bridges. In this paper, an improved geometry control method is described, which incorporates finite element (FE) model updating during the construction process based on measured bridge deflections. A method based on the Kriging model and Latin hypercube sampling is proposed to perform the FE model updating, owing to its simplicity and efficiency. The proposed method has been applied to a long-span continuous concrete girder bridge during its construction. Results show that the method is effective in reducing construction error and ensuring the accuracy of the geometry of the final constructed bridge.
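Latin hypercube sampling, used here to generate candidate parameter sets for the Kriging surrogate, stratifies each parameter's range so that every one-dimensional projection is evenly covered. A sketch with illustrative parameter bounds (the bound values are assumptions, not the bridge's actual properties):

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Latin hypercube sample: each parameter's range is split into
    n_samples equal strata, one point is drawn per stratum, and the strata
    are shuffled independently per dimension. This covers each 1-D
    projection evenly with far fewer points than a full grid."""
    if rng is None:
        rng = np.random.default_rng()
    bounds = np.asarray(bounds, dtype=float)   # shape (n_dims, 2): [low, high]
    n_dims = len(bounds)
    # Row p holds a random permutation of stratum indices for dimension p.
    strata = rng.permuted(np.tile(np.arange(n_samples), (n_dims, 1)), axis=1).T
    u = (strata + rng.random((n_samples, n_dims))) / n_samples   # in [0, 1)
    return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

# E.g. sample candidate concrete elastic moduli (GPa) and densities (kg/m^3).
samples = latin_hypercube(20, [[30.0, 40.0], [2300.0, 2500.0]],
                          rng=np.random.default_rng(42))
```

Each sampled parameter set would be run through the FE model once, and the Kriging surrogate fitted to those runs is then cheap to evaluate inside the updating loop.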
Ma, Ye; Xie, Shengquan; Zhang, Yanxin
2016-03-01
A patient-specific electromyography (EMG)-driven neuromuscular model (PENm) is developed for potential use in human-inspired gait rehabilitation robots. The PENm modifies current EMG-driven models to decrease calculation time while ensuring good prediction accuracy. To ensure calculation efficiency, the PENm is simplified to two EMG channels around one joint with minimal physiological parameters, and a dynamic computation model is developed to achieve real-time calculation. To ensure calculation accuracy, patient-specific muscle kinematics information, such as the musculotendon lengths and muscle moment arms during the entire gait cycle, is employed based on a patient-specific musculoskeletal model, and an improved force-length-velocity relationship is implemented to generate accurate muscle forces. Gait analysis data including kinematics, ground reaction forces, and raw EMG signals from six adolescents at three different speeds were used to evaluate the PENm. The simulation results show that the PENm has the potential to predict accurate joint moments in real time. The design of advanced human-robot interaction control strategies and human-inspired gait rehabilitation robots can benefit from the human internal state provided by the PENm. Copyright © 2016 Elsevier Ltd. All rights reserved.
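A force-length-velocity relationship of the general Hill-type form underlies EMG-driven models like the PENm. The sketch below uses generic textbook curve shapes with assumed constants, not the paper's improved relationship:

```python
import math

def hill_muscle_force(a, l_norm, v_norm, f_max):
    """Hill-type active muscle force: F = a * f_max * f_l(l) * f_v(v).
    a: activation in [0, 1] (derived from EMG in EMG-driven models);
    l_norm: fiber length / optimal length; v_norm: velocity / max
    shortening velocity (negative = shortening). Curve constants are
    illustrative assumptions."""
    f_l = math.exp(-((l_norm - 1.0) ** 2) / 0.45)   # Gaussian force-length
    if v_norm < 0:                                   # shortening branch
        f_v = (1.0 + v_norm) / (1.0 - v_norm / 0.25)
    else:                                            # lengthening branch
        f_v = 1.5 - 0.5 * (1.0 - v_norm) / (1.0 + 7.56 * v_norm)
    return a * f_max * f_l * f_v

# Fully activated muscle at optimal length, isometric: force equals f_max.
f_iso = hill_muscle_force(1.0, 1.0, 0.0, 100.0)
```

Joint moment then follows as the sum over muscles of force times the patient-specific moment arm, which is why accurate musculotendon geometry matters for the model's accuracy.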
Mathematics of Information Processing and the Internet
ERIC Educational Resources Information Center
Hart, Eric W.
2010-01-01
The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…
14 CFR 1275.106 - Administrative actions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 1275.106 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION RESEARCH MISCONDUCT § 1275... that an institutional official other than those guilty of research misconduct certify the accuracy of... institution to ensure that steps have been taken to prevent repetition of the research misconduct. (3) Group...
40 CFR 98.314 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... line dioxide using plant instruments used for accounting purposes including direct measurement weighing... used to ensure the accuracy of monthly calcined petroleum coke consumption measurements. (c) You must...
Code of Federal Regulations, 2011 CFR
2011-04-01
...) Ensure the accuracy and currency of the safety data; (ii) Identify factors that affect the priority of... HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION HIGHWAY SAFETY HIGHWAY SAFETY IMPROVEMENT PROGRAM... reaching the performance goals identified in § 924.9(a)(3)(ii)(G). (2) Include a process to evaluate the...
76 FR 71432 - Proposed Agency Information Collection Activities; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-17
... comments received will advance three objectives: (i) Reduce reporting burdens; (ii) ensure that it... any correspondence submitted. FRA will summarize comments received in response to this notice in a... functions, including whether the activities will have practical utility; (ii) the accuracy of FRA's...
[Project to enhance bone bank tissue storage and distribution procedures].
Huang, Jui-Chen; Wu, Chiung-Lan; Chen, Chun-Chuan; Chen, Shu-Hua
2011-10-01
Organ and tissue transplantation are now commonly performed procedures. Improper organ bank handling procedures may increase infection risks. Execution accuracy in terms of tissue storage and distribution at our bone bank was 80%. We thus proposed an improvement project to enhance procedures in order to fulfill the intent of donors and ensure recipient safety. This project was designed to raise nurse professionalism and ensure patient safety through enhanced tissue storage and distribution procedures. Education programs developed for this project focus on teaching standard operating procedures for bone and ligament storage and distribution, bone bank facility maintenance, troubleshooting and solutions, and periodic inspection systems. Cognition of proper storage and distribution procedures rose from 81% to 100%, and execution accuracy rose from 80% to 100%. The project successfully conveyed concepts essential to the correct execution of organ storage and distribution procedures and proper organ bank facility management. Achieving and maintaining procedural and management standards is crucial to continued organ donations and recipient safety.
Discuss the testing problems of ultraviolet irradiance meters
NASA Astrophysics Data System (ADS)
Ye, Jun'an; Lin, Fangsheng
2014-09-01
Ultraviolet irradiance meters are widely used to measure ultraviolet irradiance intensity in many areas, such as medical treatment, epidemic prevention, energy conservation and environmental protection, computing, manufacturing, electronics, material ageing, and photoelectric applications. The accuracy of their readings therefore directly affects sterility control and treatment in hospitals, the prevention capabilities of disease control centers, and the accuracy of curing and ageing control in manufacturing. Because the readings of ultraviolet irradiance meters drift easily, the instruments must be recalibrated after a period of use to ensure accuracy. By comparison with standard ultraviolet irradiance meters traceable to national benchmarks, a correction factor can be acquired that keeps the instruments operating accurately and producing reliable measurements. This raises an important question: what kind of testing device is more accurate and reliable? This article introduces the testing method and the problems of the current testing device for ultraviolet irradiance meters. To solve these problems, we have developed a new three-dimensional automatic testing device. We introduce the structure and working principle of this system, compare the advantages and disadvantages of the two devices, and analyse the errors in the testing of ultraviolet irradiance meters.
Pressure profiles of the BRing based on the simulation used in the CSRm
NASA Astrophysics Data System (ADS)
Wang, J. C.; Li, P.; Yang, J. C.; Yuan, Y. J.; Wu, B.; Chai, Z.; Luo, C.; Dong, Z. Q.; Zheng, W. H.; Zhao, H.; Ruan, S.; Wang, G.; Liu, J.; Chen, X.; Wang, K. D.; Qin, Z. M.; Yin, B.
2017-07-01
HIAF-BRing, a new multipurpose accelerator of the High Intensity heavy-ion Accelerator Facility project, requires an extremely high vacuum, lower than 10⁻¹¹ mbar, to fulfill the requirements of radioactive beam physics and high energy density physics. To achieve the required process pressure, the benchmarked codes VAKTRAK and Molflow+ are used to simulate the pressure profiles of the BRing system. To ensure the accuracy of the VAKTRAK implementation, the computational results are verified against measured pressure data and compared with a new simulation code, BOLIDE, on the current synchrotron CSRm. With VAKTRAK verified, the pressure profiles of the BRing are calculated with different parameters such as conductance, out-gassing rates, and pumping speeds. According to the computational results, the optimal parameters are selected to achieve the required pressure for the BRing.
Data-driven forecasting of high-dimensional chaotic systems with long short-term memory networks.
Vlachas, Pantelis R; Byeon, Wonmin; Wan, Zhong Y; Sapsis, Themistoklis P; Koumoutsakos, Petros
2018-05-01
We introduce a data-driven forecasting method for high-dimensional chaotic systems using long short-term memory (LSTM) recurrent neural networks. The proposed LSTM neural networks perform inference of high-dimensional dynamical systems in their reduced order space and are shown to be an effective set of nonlinear approximators of their attractor. We demonstrate the forecasting performance of the LSTM and compare it with Gaussian processes (GPs) in time series obtained from the Lorenz 96 system, the Kuramoto-Sivashinsky equation and a prototype climate model. The LSTM networks outperform the GPs in short-term forecasting accuracy in all applications considered. A hybrid architecture, extending the LSTM with a mean stochastic model (MSM-LSTM), is proposed to ensure convergence to the invariant measure. This novel hybrid method is fully data-driven and extends the forecasting capabilities of LSTM networks.
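The recurrent forecasting scheme described in this abstract can be sketched at its smallest scale. The following is an illustrative toy, not the paper's implementation: a single scalar LSTM cell with hand-supplied (untrained) weights, run over an observed series and then iterated on its own one-step outputs, mirroring how the trained high-dimensional networks are iterated on their own predictions in the reduced-order space.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step for scalar input and state.

    Gates (input i, forget f, output o) and a candidate value g update the
    cell state c and hidden state h.  Weights are dicts keyed by gate name;
    everything is scalar to keep the sketch minimal.
    """
    gates = {}
    for k in ("i", "f", "o", "g"):
        pre = W[k] * x + U[k] * h + b[k]
        gates[k] = math.tanh(pre) if k == "g" else sigmoid(pre)
    c_new = gates["f"] * c + gates["i"] * gates["g"]
    h_new = gates["o"] * math.tanh(c_new)
    return h_new, c_new

def forecast(series, W, U, b, horizon):
    """Run the cell over an observed series, then iterate: each predicted
    value (here simply the hidden state) is fed back as the next input."""
    h = c = 0.0
    for x in series:
        h, c = lstm_step(x, h, c, W, U, b)
    preds = []
    for _ in range(horizon):
        preds.append(h)
        h, c = lstm_step(h, h, c, W, U, b)
    return preds
```

In the paper the cell state is vector-valued, the weights are learned from data, and the MSM-LSTM hybrid additionally blends in a mean stochastic model; none of that is attempted here.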
NASA Technical Reports Server (NTRS)
Allton, J. H.
2017-01-01
There is widespread agreement among planetary scientists that much of what we know about the workings of the solar system comes from accurate, high precision measurements on returned samples. Precision is a function of the number of atoms the instrumentation is able to count. Accuracy depends on the calibration or standardization technique. For Genesis, the solar wind sample return mission, acquiring enough atoms to ensure precise SW measurements and then accurately quantifying those measurements were steps known pre-flight to be non-trivial. The difficulty of precise and accurate measurements on returned samples, and why they cannot be made remotely, is not communicated well to the public, in part because "high precision" is abstract and error bars are not very exciting topics. This paper explores ideas for collecting and compiling compelling metaphors and colorful examples as a resource for planetary science public speakers.
Increasing the speed of tumour diagnosis during surgery with selective scanning Raman microscopy
NASA Astrophysics Data System (ADS)
Kong, Kenny; Rowlands, Christopher J.; Varma, Sandeep; Perkins, William; Leach, Iain H.; Koloydenko, Alexey A.; Pitiot, Alain; Williams, Hywel C.; Notingher, Ioan
2014-09-01
One of the main challenges in cancer surgery is ensuring that all tumour cells are removed during surgery, while sparing as much healthy tissue as possible. Histopathology, the gold-standard technique for cancer diagnosis, is often impractical for intra-operative use because of the time-consuming tissue preparation procedures (sectioning and staining). Raman micro-spectroscopy is a powerful technique that can discriminate between tumours and healthy tissues with high accuracy, based entirely on intrinsic chemical differences. However, raster-scanning Raman micro-spectroscopy is a slow imaging technique that typically requires data acquisition times as long as several days for typical tissue samples obtained during surgery (1 × 1 cm2) - in particular when high signal-to-noise ratio spectra are required to ensure accurate diagnosis. In this paper we present two techniques based on selective sampling Raman micro-spectroscopy that can overcome these limitations. In selective sampling, information regarding the spatial features of the tissue, either measured by an alternative optical technique or estimated in real-time from the Raman spectra, can be used to drastically reduce the number of Raman spectra required for diagnosis. These sampling strategies allowed diagnosis of basal cell carcinoma in skin tissue samples excised during Mohs micrographic surgery faster than frozen section histopathology, and two orders of magnitude faster than previous techniques based on raster-scanning Raman microscopy. Further development of these techniques may help during cancer surgery by providing a fast and objective way for surgeons to ensure the complete removal of tumour cells while sparing as much healthy tissue as possible.
Relativistic theory for picosecond time transfer in the vicinity of Earth
NASA Technical Reports Server (NTRS)
Petit, G.; Wolf, P.
1994-01-01
The problem of light propagation is treated in a geocentric reference system with the goal of ensuring picosecond accuracy for time transfer techniques using electromagnetic signals in the vicinity of the Earth. We give an explicit formula for a one way time transfer, to be applied when the spatial coordinates of the time transfer stations are known in a geocentric reference system rotating with the Earth. This expression is extended, at the same accuracy level of one picosecond, to the special cases of two way and LASSO time transfers via geostationary satellites.
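At the picosecond level, the one-way geocentric propagation time between two stations is commonly written in the following standard form (this is the textbook expression, not necessarily the paper's exact formula):

```latex
\Delta t \;=\; \frac{R_{ab}}{c}
\;+\; \frac{2\,\omega_\oplus}{c^{2}}\,\mathcal{A}_{\mathrm{eq}}
\;+\; \frac{2\,G M_\oplus}{c^{3}}
\ln\!\left(\frac{r_a + r_b + R_{ab}}{r_a + r_b - R_{ab}}\right)
```

where $R_{ab}$ is the Euclidean distance between stations $a$ and $b$, $r_a$ and $r_b$ are their geocentric radii, $\omega_\oplus$ is the Earth's rotation rate, $\mathcal{A}_{\mathrm{eq}}$ is the equatorial projection of the area swept by the geocentric position vector of the signal (the Sagnac correction, needed when coordinates are given in the rotating frame), and the last term is the Shapiro gravitational delay.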
Note: A three-dimensional calibration device for the confocal microscope.
Jensen, K E; Weitz, D A; Spaepen, F
2013-01-01
Modern confocal microscopes enable high-precision measurement in three dimensions by collecting stacks of 2D (x-y) images that can be assembled digitally into a 3D image. It is difficult, however, to ensure position accuracy, particularly along the optical (z) axis where scanning is performed by a different physical mechanism than in x-y. We describe a simple device to calibrate simultaneously the x, y, and z pixel-to-micrometer conversion factors for a confocal microscope. By taking a known 2D pattern and positioning it at a precise angle with respect to the microscope axes, we created a 3D reference standard. The device is straightforward to construct and easy to use.
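The calibration geometry reduces to simple trigonometry: tilting a 2D grid of known spacing by a known angle places successive features at known z offsets. A minimal sketch of the z-axis conversion this enables (function name and argument conventions are illustrative, not taken from the paper):

```python
import math

def z_calibration_factor(pattern_spacing_um, tilt_deg, measured_z_px_per_feature):
    """Micrometers per z-pixel implied by a 2D grid tilted at a known angle.

    A pattern with in-plane feature spacing `pattern_spacing_um`, tilted by
    `tilt_deg` about an in-plane axis, places successive features at physical
    z offsets of spacing * sin(tilt).  Dividing that known offset by the
    measured per-feature displacement in z-pixels gives the z conversion
    factor, calibrated against the same physical standard as x and y.
    """
    dz_um = pattern_spacing_um * math.sin(math.radians(tilt_deg))
    return dz_um / measured_z_px_per_feature
```

For example, a 10 µm grid tilted by 30° yields 5 µm of physical z offset per feature; if the confocal stack shows 25 z-pixels between features, the factor is 0.2 µm per z-pixel.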
NASA Technical Reports Server (NTRS)
Wallace, William T.; Limero, Thomas F.; Gazda, Daniel B.; Minton, John M.; Macatangay, Ariel V.; Dwivedi, Prabha; Fernandez, Facundo M.
2014-01-01
Real-time environmental monitoring on ISS is necessary to provide data in a timely fashion and to help ensure astronaut health. Current real-time water TOC monitoring provides high-quality trending information, but compound-specific data are needed. The combination of ETV with the AQM showed that compounds of interest could be liberated from water and analyzed in the same manner as in air sampling. Calibration of the AQM using water samples allowed for the quantitative analysis of ISS archival samples. Some calibration issues remain, but the excellent accuracy for DMSD indicates that ETV holds promise as a sample introduction method for water analysis in spaceflight.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaftan, V. I.; Ustinov, A. V.
The feasibility of using global radio-navigation satellite systems (GNSS) to improve functional safety of high-liability water-development works - dams at hydroelectric power plants - and, consequently, the safety of the population in the surrounding areas is examined on the basis of analysis of modern publications. Characteristics for determination of displacements and deformations with use of GNSS, alone and in combination with other types of measurements, are compared. It is demonstrated that combined monitoring of deformations of the ground surface of the region and of engineering and technical structures is required to ensure the functional safety of HPP, and that reliable metrologic assurance of measurements is also required to obtain actual characteristics of the accuracy and effectiveness of GNSS observations.
Mental workload prediction based on attentional resource allocation and information processing.
Xiao, Xu; Wanyan, Xiaoru; Zhuang, Damin
2015-01-01
Mental workload is an important component in complex human-machine systems. The limited applicability of empirical workload measures produces the need for workload modeling and prediction methods. In the present study, a mental workload prediction model is built on the basis of attentional resource allocation and information processing to ensure pilots' accuracy and speed in understanding large amounts of flight information on the cockpit display interface. Validation with an empirical study of an abnormal attitude recovery task showed that this model's prediction of mental workload highly correlated with experimental results. This mental workload prediction model provides a new tool for optimizing human factors interface design and reducing human errors.
Review on the Traction System Sensor Technology of a Rail Transit Train.
Feng, Jianghua; Xu, Junfeng; Liao, Wu; Liu, Yong
2017-06-11
The development of high-speed intelligent rail transit has increased the number of sensors applied on trains. These play an important role in train state control and monitoring. These sensors generally work in a severe environment, so the key problem for sensor data acquisition is to ensure data accuracy and reliability. In this paper, we follow the sequence of sensor signal flow, present sensor signal sensing technology, sensor data acquisition, and processing technology, as well as sensor fault diagnosis technology based on the voltage, current, speed, and temperature sensors which are commonly used in train traction systems. Finally, intelligent sensors and future research directions of rail transit train sensors are discussed.
NASA Astrophysics Data System (ADS)
Herceg, M.; Jørgensen, P. S.; Jørgensen, J. L.
2017-08-01
Launched into orbit on November 22, 2013, the Swarm constellation of three satellites precisely measures the magnetic signal of the Earth. To ensure the high accuracy of magnetic observation by the vector magnetometer (VFM), its inertial attitude is precisely determined by the μASC (micro Advanced Stellar Compass). Each of the three Swarm satellites is equipped with three μASC Camera Head Units (CHU) mounted on a common optical bench (OB), whose purpose is to transfer the attitude from the star trackers to the magnetometer measurements. Although substantial pre-launch analyses were made to maximize the thermal and mechanical stability of the OB, a significant signal with a thermal signature was discovered when comparing the relative attitude between the three CHUs (Inter Boresight Angle, IBA). These misalignments between the CHUs, and consequently the geomagnetic reference frame, are found to be correlated with the period of the angle between the Swarm orbital plane and the Sun (ca. 267 days), which suggests sensitivity of the optical bench system to temperature variation. In this paper, we investigate the propagation of thermal effects into the μASC attitude observations and demonstrate how thermally induced attitude variation can be predicted and corrected in the Swarm data processing. Applying the thermal corrections decreases the IBA RMS from 6.41″ to 2.58″. The model significantly improves attitude determination which, after correction, meets the requirements of the Swarm satellite mission. This study demonstrates the importance of OB pre-launch analysis to ensure a minimum thermal gradient on the satellite optical system and therefore maximum attitude accuracy.
40 CFR 98.314 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... requirements. (a) You must measure your consumption of calcined petroleum coke using plant instruments used for accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... used to ensure the accuracy of monthly calcined petroleum coke consumption measurements. (c) You must...
40 CFR 98.314 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... requirements. (a) You must measure your consumption of calcined petroleum coke using plant instruments used for accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... used to ensure the accuracy of monthly calcined petroleum coke consumption measurements. (c) You must...
40 CFR 98.314 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... requirements. (a) You must measure your consumption of calcined petroleum coke using plant instruments used for accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... used to ensure the accuracy of monthly calcined petroleum coke consumption measurements. (c) You must...
40 CFR 98.314 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... requirements. (a) You must measure your consumption of calcined petroleum coke using plant instruments used for accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... used to ensure the accuracy of monthly calcined petroleum coke consumption measurements. (c) You must...
ERIC Educational Resources Information Center
Simkin, Mark G.
2008-01-01
Data-validation routines enable computer applications to test data to ensure their accuracy, completeness, and conformance to industry or proprietary standards. This paper presents five programming cases that require students to validate five different types of data: (1) simple user data entries, (2) UPC codes, (3) passwords, (4) ISBN numbers, and…
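Two of the listed cases, UPC codes and ISBN numbers, have well-known public check-digit rules that make compact classroom examples. A sketch of both validators (function names are illustrative, not from the paper's cases):

```python
def valid_upc_a(code: str) -> bool:
    """UPC-A check: 12 digits; three times the odd-position digits plus the
    even-position digits (1-indexed, check digit included) must sum to a
    multiple of 10."""
    if len(code) != 12 or not code.isdigit():
        return False
    digits = [int(c) for c in code]
    total = 3 * sum(digits[0::2]) + sum(digits[1::2])
    return total % 10 == 0

def valid_isbn10(code: str) -> bool:
    """ISBN-10 check: the weighted sum of the digits (weights 10 down to 1,
    with 'X' standing for 10 in the final position) must be divisible by 11."""
    if len(code) != 10:
        return False
    total = 0
    for weight, ch in zip(range(10, 0, -1), code):
        if ch == "X" and weight == 1:
            value = 10
        elif ch.isdigit():
            value = int(ch)
        else:
            return False
        total += weight * value
    return total % 11 == 0
```

Student exercises of the kind described would wrap routines like these around user input, reporting which rule a rejected entry violated.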
Automatic Flushing Unit With Cleanliness Monitor
NASA Technical Reports Server (NTRS)
Hildebrandt, N. E.
1982-01-01
Liquid-level probe kept clean, and therefore at peak accuracy, by unit that flushes probe with solvent, monitors effluent for contamination, and determines when probe is particle-free. Approach may be adaptable to industrial cleaning tasks such as flushing filters and pipes and ensuring that manufactured parts have been adequately cleaned.
Comparison of Varied Precipitation and Soil Data Types for Use in Watershed Modeling.
The accuracy of water quality and quantity models depends on calibration to ensure reliable simulations of streamflow, which in turn requires accurate climatic forcing data. Precipitation is widely acknowledged to be the largest source of uncertainty in watershed modeling, and so...
Emission estimates are important for ensuring the accuracy of atmospheric chemical transport models. Estimates of biogenic and wildland fire emissions, because of their sensitivity to meteorological conditions, need to be carefully constructed and closely linked with a meteorolo...
76 FR 70532 - Proposed Agency Information Collection Activities; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-14
... comments received will advance three objectives: (i) Reduce reporting burdens; (ii) ensure that it... correspondence submitted. FRA will summarize comments received in response to this notice in a subsequent notice... functions, including whether the activities will have practical utility; (ii) the accuracy of FRA's...
78 FR 76190 - Proposed Agency Information Collection Activities; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-16
... received will advance three objectives: (i) reduce reporting burdens; (ii) ensure that it organizes.... FRA will summarize comments received in response to this notice in a subsequent notice and include... functions, including whether the activities will have practical utility; (ii) the accuracy of FRA's...
75 FR 3275 - Proposed Agency Information Collection Activities; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-20
... comments received will advance three objectives: (i) Reduce reporting burdens; (ii) ensure that it... submitted. FRA will summarize comments received in response to this notice in a subsequent notice and... functions, including whether the activities will have practical utility; (ii) the accuracy of FRA's...
78 FR 59086 - Proposed Agency Information Collection Activities; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-25
... comments received will advance three objectives: (i) Reduce reporting burdens; (ii) ensure that it... correspondence submitted. FRA will summarize comments received in response to this notice in a subsequent notice... functions, including whether the activities will have practical utility; (ii) the accuracy of FRA's...
Mugasa, Claire M.; Adams, Emily R.; Boer, Kimberly R.; Dyserinck, Heleen C.; Büscher, Philippe; Schallig, Henk D. H. F.; Leeflang, Mariska M. G.
2012-01-01
Background: A range of molecular amplification techniques have been developed for the diagnosis of Human African Trypanosomiasis (HAT); however, careful evaluation of these tests must precede implementation to ensure their high clinical accuracy. Here, we investigated the diagnostic accuracy of molecular amplification tests for HAT, the quality of articles and reasons for variation in accuracy. Methodology: Data from studies assessing diagnostic molecular amplification tests were extracted and pooled to calculate accuracy. Articles were included if they reported sensitivity and specificity or data whereby values could be calculated. Study quality was assessed using QUADAS and selected studies were analysed using the bivariate random effects model. Results: 16 articles evaluating molecular amplification tests fulfilled the inclusion criteria: PCR (n = 12), NASBA (n = 2), LAMP (n = 1) and a study comparing PCR and NASBA (n = 1). Fourteen articles, including 19 different studies, were included in the meta-analysis. Summary sensitivity for PCR on blood was 99.0% (95% CI 92.8 to 99.9) and the specificity was 97.7% (95% CI 93.0 to 99.3). Differences in study design and readout method did not significantly change estimates, although use of satellite DNA as a target significantly lowers specificity. Sensitivity and specificity of PCR on CSF for staging varied from 87.6% to 100%, and 55.6% to 82.9%, respectively. Conclusion: Here, PCR seems to have sufficient accuracy to replace microscopy where facilities allow, although this conclusion is based on multiple reference standards and a patient population that was not always representative. Future studies should, therefore, include patients for which PCR may become the test of choice and consider well designed diagnostic accuracy studies to provide extra evidence on the value of PCR in practice.
Another use of PCR for control of disease could be to screen samples collected from rural areas and test in reference laboratories, to spot epidemics quickly and direct resources appropriately.
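The per-study sensitivity and specificity values pooled in this meta-analysis come straight from a 2x2 table. A minimal sketch of that building block (illustrative counts, Wald intervals for brevity; the actual pooling uses a bivariate random-effects model, which this does not attempt):

```python
import math

def diagnostic_accuracy(tp, fp, fn, tn, z=1.96):
    """Sensitivity and specificity with Wald 95% confidence intervals
    from the counts of a single study's 2x2 table.

    Returns a dict mapping each measure to (estimate, ci_low, ci_high).
    """
    def prop_ci(successes, n):
        p = successes / n
        half = z * math.sqrt(p * (1 - p) / n)
        return p, max(0.0, p - half), min(1.0, p + half)

    return {
        "sensitivity": prop_ci(tp, tp + fn),  # true positives / all diseased
        "specificity": prop_ci(tn, tn + fp),  # true negatives / all healthy
    }
```

For example, a study with 90 true positives, 10 false negatives, 95 true negatives and 5 false positives gives a sensitivity of 0.90 and a specificity of 0.95, each with an interval reflecting its sample size.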
[Clinical trial data management and quality metrics system].
Chen, Zhao-hua; Huang, Qin; Deng, Ya-zhong; Zhang, Yue; Xu, Yu; Yu, Hao; Liu, Zong-fan
2015-11-01
A data quality management system is essential to ensure accurate, complete, consistent, and reliable data collection in clinical research. This paper is devoted to various choices of data quality metrics. They are categorized by study status, e.g. study start-up, conduct, and close-out. In each category, metrics for different purposes are listed according to ALCOA+ principles such as completeness, accuracy, timeliness, and traceability. Some frequently used general quality metrics are also introduced. The paper provides as much detail as possible for each metric, including definition, purpose, evaluation, referenced benchmark, and recommended targets, in support of real practice. It is important that sponsors and data management service providers establish a robust, integrated clinical trial data quality management system to ensure sustainably high quality of clinical trial deliverables. Such a system will also support enterprise-level data evaluation and benchmarking of data quality across projects, sponsors, and data management service providers by using objective metrics from real clinical trials. We hope this will be a significant input to accelerate the improvement of clinical trial data quality in the industry.
NASA Astrophysics Data System (ADS)
Kosiel, Kamil; Koba, Marcin; Masiewicz, Marcin; Śmietana, Mateusz
2018-06-01
The paper shows the application of the atomic layer deposition (ALD) technique as a tool for tailoring the sensorial properties of lossy-mode-resonance (LMR)-based optical fiber sensors. Hafnium dioxide (HfO2), zirconium dioxide (ZrO2), and tantalum oxide (TaxOy), high-refractive-index dielectrics that are particularly convenient for LMR-sensor fabrication, were deposited by low-temperature (100 °C) ALD, ensuring safe conditions for thermally vulnerable fibers. The applicability of HfO2 and ZrO2 overlays, deposited with ALD's atomic-level thickness accuracy, for the fabrication of LMR sensors with controlled sensorial properties is presented. Additionally, for the first time to the best of our knowledge, a double-layer overlay composed of two different materials - silicon nitride (SixNy) and TaxOy - is presented for LMR fiber sensors. The thin films of this overlay were deposited by two different techniques: PECVD (the SixNy) and ALD (the TaxOy). This approach ensures fast overlay fabrication and, at the same time, flexibility in resonant wavelength tuning, yielding devices with satisfactory sensorial properties.
Accuracy of electromyography needle placement in cadavers: non-guided vs. ultrasound guided.
Boon, Andrea J; Oney-Marlow, Theresa M; Murthy, Naveen S; Harper, Charles M; McNamara, Terrence R; Smith, Jay
2011-07-01
Accuracy of needle electromyography is typically ensured by use of anatomical landmarks and auditory feedback related to voluntary activation of the targeted muscle; however, in certain clinical situations, landmarks may not be palpable, auditory feedback may be limited or not present, and targeting a specific muscle may be more critical. In such settings, image guidance might significantly enhance accuracy. Two electromyographers with different levels of experience examined 14 muscles in each of 4 fresh-frozen cadaver lower limbs. Each muscle was tested a total of eight times; four fine wires were inserted without ultrasound (US) guidance and four were inserted under US guidance. Overall accuracy as well as accuracy rates for the individual electromyographers were calculated. Non-guided needle placement was significantly less accurate than US-guided needle placement, particularly in the hands of less experienced electromyographers, supporting the use of real-time US guidance in certain challenging situations in the electromyography laboratory. Copyright © 2011 Wiley Periodicals, Inc.
Blower, Sally; Go, Myong-Hyun
2011-07-19
Mathematical models are useful tools for understanding and predicting epidemics. A recent innovative modeling study by Stehle and colleagues addressed the issue of how complex models need to be to ensure accuracy. The authors collected data on face-to-face contacts during a two-day conference. They then constructed a series of dynamic social contact networks, each of which was used to model an epidemic generated by a fast-spreading airborne pathogen. Intriguingly, Stehle and colleagues found that increasing model complexity did not always increase accuracy. Specifically, the most detailed contact network and a simplified version of this network generated very similar results. These results are extremely interesting and require further exploration to determine their generalizability.
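The class of model under discussion, an epidemic spreading over an empirical contact network, can be sketched in simplified discrete-time SIR form (the adjacency structure, parameters, and function name are illustrative, not Stehle and colleagues' conference data or model):

```python
import random

def sir_on_network(adjacency, seed_node, beta, steps, rng_seed=0):
    """Discrete-time SIR epidemic on a contact network.

    adjacency: dict mapping node -> list of neighbours.  Each step, every
    infected node transmits independently to each susceptible neighbour
    with probability beta, then recovers.  Nodes are visited in sorted
    order so a fixed rng_seed gives reproducible runs.  Returns the set
    of nodes ever infected.
    """
    rng = random.Random(rng_seed)
    susceptible = set(adjacency) - {seed_node}
    infected = {seed_node}
    recovered = set()
    for _ in range(steps):
        newly = set()
        for node in sorted(infected):
            for nb in sorted(adjacency[node]):
                if nb in susceptible and rng.random() < beta:
                    newly.add(nb)
        susceptible -= newly
        recovered |= infected
        infected = newly
        if not infected:
            break
    return recovered | infected
```

The modeling question the study raises maps onto the `adjacency` argument: a fully detailed time-resolved contact network and a simplified static summary of it can be plugged into the same dynamics and their epidemic outcomes compared.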
Strain gage based determination of mixed mode SIFs
NASA Astrophysics Data System (ADS)
Murthy, K. S. R. K.; Sarangi, H.; Chakraborty, D.
2018-05-01
Accurate determination of mixed mode stress intensity factors (SIFs) is essential in understanding and analyzing mixed mode fracture of engineering components. Only a few strain gage determinations of mixed mode SIFs are reported in the literature, and those do not provide any prescription for the radial locations of strain gages to ensure measurement accuracy. The present investigation experimentally demonstrates the efficacy of a proposed methodology for the accurate determination of mixed mode I/II SIFs using strain gages. The proposed approach is based on the modified Dally and Berger mixed mode technique. Using the proposed methodology, appropriate gage locations (optimal locations) for a given configuration have also been suggested, ensuring accurate determination of mixed mode SIFs. Experiments have been conducted with gages at optimal and non-optimal locations to study the efficacy of the proposed approach. The experimental results show that highly accurate SIFs (error of 0.064%) can be determined using the proposed approach if the gages are located at the suggested optimal locations. On the other hand, the results also show that very high errors (up to 212.22%) in measured SIFs are possible if the gages are located at non-optimal locations. The present work thus clearly substantiates the importance of knowing the optimal locations of the strain gages a priori for accurate determination of SIFs.
STARD 2015 guidelines for reporting diagnostic accuracy studies: explanation and elaboration
Cohen, Jérémie F; Korevaar, Daniël A; Altman, Douglas G; Bruns, David E; Gatsonis, Constantine A; Hooft, Lotty; Irwig, Les; Levine, Deborah; Reitsma, Johannes B; de Vet, Henrica C W; Bossuyt, Patrick M M
2016-01-01
Diagnostic accuracy studies are, like other clinical studies, at risk of bias due to shortcomings in design and conduct, and the results of a diagnostic accuracy study may not apply to other patient groups and settings. Readers of study reports need to be informed about study design and conduct, in sufficient detail to judge the trustworthiness and applicability of the study findings. The STARD statement (Standards for Reporting of Diagnostic Accuracy Studies) was developed to improve the completeness and transparency of reports of diagnostic accuracy studies. STARD contains a list of essential items that can be used as a checklist, by authors, reviewers and other readers, to ensure that a report of a diagnostic accuracy study contains the necessary information. STARD was recently updated. All updated STARD materials, including the checklist, are available at http://www.equator-network.org/reporting-guidelines/stard. Here, we present the STARD 2015 explanation and elaboration document. Through commented examples of appropriate reporting, we clarify the rationale for each of the 30 items on the STARD 2015 checklist, and describe what is expected from authors in developing sufficiently informative study reports.
Farzandipour, Mehrdad; Sheikhtaheri, Abbas
2009-01-01
To evaluate the accuracy of procedural coding and the factors that influence it, 246 records were randomly selected from four teaching hospitals in Kashan, Iran. “Recodes” were assigned blindly and then compared to the original codes. Furthermore, the coders' professional behaviors were carefully observed during the coding process. Coding errors were classified as major or minor. The relations between coding accuracy and possible influencing factors were analyzed by χ² or Fisher exact tests, as well as the odds ratio (OR) and the 95 percent confidence interval for the OR. The results showed that using a tabular index for rechecking codes reduces errors (83 percent vs. 72 percent accuracy). Further, more thorough documentation by the clinician positively affected coding accuracy, though this relation was not significant. Readability of records decreased errors overall (p = .003), including major ones (p = .012). Moreover, records with no abbreviations had fewer major errors (p = .021). In conclusion, not using abbreviations, ensuring more readable documentation, and paying more attention to available information increased coding accuracy and the quality of procedure databases. PMID:19471647
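The odds-ratio analysis described above can be sketched as follows. This is a minimal illustration of the standard OR with a Wald 95% confidence interval; the 2×2 counts below are invented for the example, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a/b = factor present, with/without coding error;
    c/d = factor absent, with/without coding error."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# hypothetical counts: 30/70 records with errors among abbreviated records,
# 15/85 among records without abbreviations
or_, lo, hi = odds_ratio_ci(30, 70, 15, 85)
```

An OR whose interval excludes 1 suggests an association between the factor and coding errors.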
Innovative monitoring of 3D warp interlock fabric during forming process
NASA Astrophysics Data System (ADS)
Dufour, C.; Jerkovic, I.; Wang, P.; Boussu, F.; Koncar, V.; Soulat, D.; Grancaric, A. M.; Pineau, P.
2017-10-01
The final geometry of a 3D warp interlock fabric needs to be checked during the 3D forming step to ensure the correct locations of warp and weft yarns inside the final structure. Thus, a new monitoring approach has been proposed, based on sensor yarns located in the fabric thickness. To ensure the accuracy of the measurements, observation of the surface deformation of the 3D warp interlock fabric was combined with the sensor yarn measurements. In the end, a good correlation was revealed between the strain measured globally by the camera and locally by the sensor yarns.
O'Brien, M J; Takahashi, M; Brugal, G; Christen, H; Gahm, T; Goodell, R M; Karakitsos, P; Knesel, E A; Kobler, T; Kyrkou, K A; Labbe, S; Long, E L; Mango, L J; McGoogan, E; Oberholzer, M; Reith, A; Winkler, C
1998-01-01
Optical digital imaging and its related technologies have applications in cytopathology that encompass training and education, image analysis, diagnosis, report documentation and archiving, and telecommunications. Telecytology involves the use of telecommunications to transmit cytology images for the purposes of diagnosis, consultation or education. This working paper provides a mainly informational overview of optical digital imaging and summarizes current technologic resources and applications and some of the ethical and legal implications of the use of these new technologies in cytopathology. Computer hardware standards for optical digital imagery will continue to be driven mainly by commercial interests and nonmedical imperatives, but professional organizations can play a valuable role in developing recommendations or standards for digital image sampling, documentation, archiving, authenticity safeguards and teleconsultation protocols; in addressing patient confidentiality and ethical, legal and informed consent issues; and in providing support for quality assurance and standardization of digital image-based testing. There is some evidence that high levels of accuracy for telepathology diagnosis can be achieved using existing dynamic systems, which may also be applicable to telecytology consultation. Static systems for both telepathology and telecytology, which have the advantage of considerably lower cost, appear to have lower levels of accuracy. Laboratories that maintain digital image databases should adopt practices and protocols that ensure patient confidentiality. Individuals participating in telecommunication of digital images for diagnosis should be properly qualified, meet licensing requirements and use procedures that protect patient confidentiality. Such individuals should be cognizant of the limitations of the technology and employ quality assurance practices that ensure the validity and accuracy of each consultation. 
Even in an informal teleconsultation setting one should define the extent of participation and be mindful of potential malpractice liability. Digital imagery applications will continue to present new opportunities and challenges. Position papers such as this are directed toward assisting the profession to stay informed and in control of these applications in the laboratory. Telecytology is an area in particular need of studies of good quality to provide data on factors affecting accuracy. New technologic approaches to addressing the issue of selective sampling in static image consultation are needed. The use of artificial intelligence software as an adjunct to enhance the accuracy and reproducibility of cytologic diagnosis of digital images in routine and consultation settings deserves to be pursued. Other telecytology-related issues that require clarification and the adoption of workable guidelines include interstate licensure and protocols to define malpractice liability.
A laboratory assessment of the measurement accuracy of weighing type rainfall intensity gauges
NASA Astrophysics Data System (ADS)
Colli, M.; Chan, P. W.; Lanza, L. G.; La Barbera, P.
2012-04-01
In recent years the WMO Commission for Instruments and Methods of Observation (CIMO) fostered noticeable advancements on the issue of precipitation measurement accuracy by providing recommendations on the standardization of equipment and exposure, instrument calibration and data correction, following various comparative campaigns involving manufacturers and national meteorological services from the participating countries (Lanza et al., 2005; Vuerich et al., 2009). Extreme events analysis has been shown to be highly affected by the on-site rainfall intensity (RI) measurement accuracy (see e.g. Molini et al., 2004), and the time resolution of the available RI series certainly constitutes another key factor in constructing hyetographs that are representative of real rain events. The OTT Pluvio2 weighing gauge (WG) and the GEONOR T-200 vibrating-wire precipitation gauge demonstrated very good performance under previous constant flow rate calibration efforts (Lanza et al., 2005). Although WGs do provide better performance than more traditional tipping-bucket rain gauges (TBR) under continuous and constant reference intensity, dynamic effects seem to affect the accuracy of WG measurements under real-world, time-varying rainfall conditions (Vuerich et al., 2009). The most relevant is due to the response time of the acquisition system and the resulting systematic delay of the instrument in assessing the exact weight of the bin containing cumulated precipitation. This delay assumes a relevant role when high-resolution rain intensity time series are sought from the instrument, as is the case in many hydrologic and meteo-climatic applications. This work reports the laboratory evaluation of Pluvio2 and T-200 rainfall intensity measurement accuracy. Tests are carried out by simulating different artificial precipitation events, namely non-stationary rainfall intensities, using a highly accurate dynamic rainfall generator.
Time series measured by an Ogawa drop counter (DC) at a field test site located within the Hong Kong International Airport (HKIA) were aggregated at a 1-minute scale and used as reference for the artificial rain generation (Colli et al., 2012). The preliminary development and validation of the rainfall simulator for the generation of variable time step reference intensities are also shown. The generator is characterized by a sufficiently short time response with respect to the expected weighing gauge behavior, in order to ensure an effective comparison of the measured and reference intensities at very high resolution in time.
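The systematic delay described above can be illustrated with a first-order lag, a deliberate simplification (the actual Pluvio2/T-200 acquisition dynamics are more complex, and the time constant below is an assumed value, not a specification):

```python
def gauge_response(reference, dt=1.0, tau=6.0):
    """Filter a reference rain-intensity series (mm/h) through a
    first-order lag with assumed time constant tau (s), sampled
    every dt (s) -- a toy model of a weighing gauge's response delay."""
    alpha = dt / (tau + dt)
    out, y = [], reference[0]
    for r in reference:
        y += alpha * (r - y)   # exponential approach to the true intensity
        out.append(y)
    return out

# step from 0 to 100 mm/h: the simulated gauge lags the true intensity
step = [0.0] * 5 + [100.0] * 25
measured = gauge_response(step)
```

The lag is exactly the effect that matters when 1-minute intensities are derived by differencing cumulative weights.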
High-Throughput Platform for Synthesis of Melamine-Formaldehyde Microcapsules.
Çakir, Seda; Bauters, Erwin; Rivero, Guadalupe; Parasote, Tom; Paul, Johan; Du Prez, Filip E
2017-07-10
The synthesis of microcapsules via in situ polymerization is a labor-intensive and time-consuming process, in which many composition and process factors affect microcapsule formation and morphology. Herein, we report a novel combinatorial technique for the preparation of melamine-formaldehyde microcapsules, using a custom-made and automated high-throughput platform (HTP). After performing validation experiments to ensure the accuracy and reproducibility of the novel platform, a design-of-experiment study was performed. The influence of different encapsulation parameters was investigated, such as the effects of surfactant type, surfactant concentration and core/shell ratio. As a result, this HTP platform is suitable for the synthesis of different types of microcapsules in an automated and controlled way, allowing the screening of different reaction parameters in a shorter time than manual synthetic techniques.
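A full-factorial screening plan of the kind used in such a design-of-experiment study can be enumerated as below; the factor names and levels are hypothetical placeholders, not the paper's actual values.

```python
from itertools import product

# Hypothetical encapsulation factors and levels for a full-factorial plan
factors = {
    "surfactant_type": ["SDS", "PVA"],
    "surfactant_wt_pct": [0.5, 1.0, 2.0],
    "core_shell_ratio": [1.0, 2.0],
}
names = list(factors)
# one run per combination of levels: 2 * 3 * 2 = 12 runs
runs = [dict(zip(names, levels)) for levels in product(*factors.values())]
```

Each `runs` entry is a recipe the automated platform could execute, which is what makes such screening fast compared with manual synthesis.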
NASA Astrophysics Data System (ADS)
Talib, Imran; Belgacem, Fethi Bin Muhammad; Asif, Naseer Ahmad; Khalil, Hammad
2017-01-01
In this research article, we derive and analyze an efficient spectral method, based on the operational matrices of three-dimensional orthogonal Jacobi polynomials, to numerically solve a generalized class of multi-term, high-dimensional fractional-order partial differential equations with mixed partial derivatives. With the aid of the operational matrices, we transform the considered fractional-order problem into an easily solvable system of algebraic equations; solving this system yields the solution of the problem. Some test problems are considered to confirm the accuracy and validity of the proposed numerical method. The convergence of the method is ensured by comparing the results of our Matlab simulations with the exact solutions in the literature, yielding negligible errors. Moreover, comparative results discussed in the literature are extended and improved in this study.
NASA Astrophysics Data System (ADS)
Wang, Yue; Yu, Jingjun; Pei, Xu
2018-06-01
A new forward kinematics algorithm for 3-RPS (R: revolute; P: prismatic; S: spherical) parallel manipulators is proposed in this study. The algorithm is primarily based on the special geometric conditions of the 3-RPS parallel mechanism, and it eliminates the errors produced by parasitic motions to improve and ensure accuracy; specifically, the errors can be less than 10^-6. With this method, only the group of solutions consistent with the actual pose of the platform is obtained, and it is obtained rapidly. The algorithm substantially improves calculation efficiency because the selected initial values are reasonable and all the formulas in the calculation are analytical. This novel forward kinematics algorithm is well suited for real-time, high-precision control of the 3-RPS parallel mechanism.
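The abstract does not give the algorithm's equations, but the underlying idea (iterating from a good analytical initial guess until residuals drop below 10^-6) can be sketched generically with Newton-Raphson on a toy two-equation system standing in for the leg-length constraints; the constraints below are illustrative, not the 3-RPS equations.

```python
def newton2(f, jac, x0, tol=1e-6, max_iter=50):
    """Newton-Raphson for a 2-equation nonlinear system; a reasonable
    initial guess gives fast convergence, as exploited in such
    forward-kinematics solvers."""
    x, y = x0
    for _ in range(max_iter):
        f1, f2 = f(x, y)
        if abs(f1) < tol and abs(f2) < tol:
            return x, y
        a, b, c, d = jac(x, y)            # Jacobian [[a, b], [c, d]]
        det = a * d - b * c
        # solve J * delta = F analytically (2x2 inverse)
        x, y = x - (d * f1 - b * f2) / det, y - (a * f2 - c * f1) / det
    return x, y

# toy constraints standing in for the leg constraints: circle + symmetry
f = lambda x, y: (x * x + y * y - 4.0, x - y)
jac = lambda x, y: (2 * x, 2 * y, 1.0, -1.0)
x, y = newton2(f, jac, (1.0, 1.0))        # converges to (sqrt(2), sqrt(2))
```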
Calculating the Financial Impact of Population Growth on Education.
ERIC Educational Resources Information Center
Cline, Daniel H.
It is particularly difficult to make accurate enrollment projections for areas that are experiencing a rapid expansion in their population. The traditional method of calculating cohort survival ratios must be modified and supplemented with additional information to ensure accuracy; cost projection methods require detailed analyses of current costs…
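The cohort-survival method mentioned above can be sketched in a few lines: grade-to-grade survival ratios are averaged from historical enrollments, then applied to the latest year. The enrollment counts below are invented for illustration.

```python
def survival_ratios(hist):
    """Average grade-to-grade survival ratios from historical counts
    hist[year][grade] (illustrative data, not a real district's)."""
    grades = len(hist[0])
    ratios = []
    for g in range(grades - 1):
        rs = [hist[t + 1][g + 1] / hist[t][g] for t in range(len(hist) - 1)]
        ratios.append(sum(rs) / len(rs))
    return ratios

def project_next(last_year, entering, ratios):
    """Next year's enrollment: an assumed entering class size, then each
    higher grade from the grade below times its survival ratio."""
    return [entering] + [last_year[g] * r for g, r in enumerate(ratios)]

hist = [[100, 95, 90], [110, 104, 93], [120, 114, 101]]
proj = project_next(hist[-1], entering=130, ratios=survival_ratios(hist))
```

In a rapidly growing area the ratios exceed 1 (in-migration), which is exactly why simple counts or stale ratios under-project enrollment.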
27 CFR 19.276 - Package scales.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Package scales. 19.276... Package scales. Proprietors shall ensure the accuracy of scales used for weighing packages of spirits through tests conducted at intervals of not more than 6 months or whenever scales are adjusted or repaired...
Code of Federal Regulations, 2011 CFR
2011-07-01
... the COPS MPRS and a systems administrator to ensure that the system is properly functioning. Reporting... System (DIBRS). The Army inputs its data into DIBRS utilizing COPS. Any data reported to DIBRS is only as good as the data reported into COPS, so the need for accuracy in reporting incidents and utilizing...
Code of Federal Regulations, 2014 CFR
2014-07-01
... the COPS MPRS and a systems administrator to ensure that the system is properly functioning. Reporting... System (DIBRS). The Army inputs its data into DIBRS utilizing COPS. Any data reported to DIBRS is only as good as the data reported into COPS, so the need for accuracy in reporting incidents and utilizing...
Code of Federal Regulations, 2012 CFR
2012-07-01
... the COPS MPRS and a systems administrator to ensure that the system is properly functioning. Reporting... System (DIBRS). The Army inputs its data into DIBRS utilizing COPS. Any data reported to DIBRS is only as good as the data reported into COPS, so the need for accuracy in reporting incidents and utilizing...
Code of Federal Regulations, 2013 CFR
2013-07-01
... the COPS MPRS and a systems administrator to ensure that the system is properly functioning. Reporting... System (DIBRS). The Army inputs its data into DIBRS utilizing COPS. Any data reported to DIBRS is only as good as the data reported into COPS, so the need for accuracy in reporting incidents and utilizing...
7 CFR 210.9 - Agreement with State agency.
Code of Federal Regulations, 2011 CFR
2011-01-01
... eligible for such meals under 7 CFR part 245; (8) Claim reimbursement at the assigned rates only for... analyzing meal counts to ensure accuracy as specified in § 210.8 governing claims for reimbursement... free, reduced price and paid reimbursable meals served to eligible children at the point of service, or...
75 FR 70604 - Wireless E911 Location Accuracy Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-18
... carriers are unable to recover the substantial cost of constructing a large number of additional cell sites... characteristics, cell site density, overall system technology requirements, etc.) while, in either case, ensuring... the satellites and the handset. The more extensive the tree cover, the greater the difficulty the...
It's a People Thing: Demystifying College Information.
ERIC Educational Resources Information Center
Owen, Jane
This booklet, which is intended for United Kingdom further education (FE) college staff at all levels, illustrates ways FE colleges have used information technology (IT) to manage their college information and ensure its accuracy. Section 1 provides an overview of the information-related problems encountered by FE colleges and summarizes key…
40 CFR 98.284 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... used to ensure the accuracy of monthly petroleum coke consumption measurements. (c) For CO2 process... quality assurance and quality control of the supplier data, you must conduct an annual measurement of the...
40 CFR 98.57 - Records that must be retained.
Code of Federal Regulations, 2010 CFR
2010-07-01
... calendar year. (d) Documentation of how accounting procedures were used to estimate production rate. (e...) Performance test reports of N2O emissions. (g) Measurements, records and calculations used to determine reported parameters. (h) Documentation of the procedures used to ensure the accuracy of the measurements of...
SOFIA: an R package for enhancing genetic visualization with Circos
USDA-ARS?s Scientific Manuscript database
Visualization of data from any stage of genetic and genomic research is one of the most useful approaches for detecting potential errors, ensuring accuracy and reproducibility, and presentation of the resulting data. Currently software such as Circos, ClicO FS, and RCircos, among others, provide too...
3 CFR - Enhancing Payment Accuracy Through a “Do Not Pay List”
Code of Federal Regulations, 2011 CFR
2011-01-01
... are not made. Agencies maintain many databases containing information on a recipient's eligibility to... databases before making payments or awards, agencies can identify ineligible recipients and prevent certain... pre-payment and pre-award procedures and ensure that a thorough review of available databases with...
40 CFR 98.364 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... or operator shall document the procedures used to ensure the accuracy of gas flow rate, gas... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements... fraction of total manure managed in each system component. (c) The CH4 concentration of gas from digesters...
40 CFR 98.364 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... or operator shall document the procedures used to ensure the accuracy of gas flow rate, gas... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements... fraction of total manure managed in each system component. (c) The CH4 concentration of gas from digesters...
40 CFR 98.364 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... or operator shall document the procedures used to ensure the accuracy of gas flow rate, gas... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements... fraction of total manure managed in each system component. (c) The CH4 concentration of gas from digesters...
12 CFR 1235.4 - Minimum requirements of a record retention program.
Code of Federal Regulations, 2014 CFR
2014-01-01
... appropriate to support administrative, business, external and internal audit functions, and litigation of the... for appropriate back-up and recovery of electronic records to ensure the same accuracy as the primary... records, preferably searchable, must be maintained on immutable, non-rewritable storage in a manner that...
12 CFR 1235.4 - Minimum requirements of a record retention program.
Code of Federal Regulations, 2012 CFR
2012-01-01
... appropriate to support administrative, business, external and internal audit functions, and litigation of the... for appropriate back-up and recovery of electronic records to ensure the same accuracy as the primary... records, preferably searchable, must be maintained on immutable, non-rewritable storage in a manner that...
12 CFR 1235.4 - Minimum requirements of a record retention program.
Code of Federal Regulations, 2013 CFR
2013-01-01
... appropriate to support administrative, business, external and internal audit functions, and litigation of the... for appropriate back-up and recovery of electronic records to ensure the same accuracy as the primary... records, preferably searchable, must be maintained on immutable, non-rewritable storage in a manner that...
76 FR 52581 - Automated Data Processing and Information Retrieval System Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-23
..., however, conduct pre and/or post implementation reviews. These reviews would be intended to: Evaluate... payment accuracy. This proposed rule would also specify the requirements for submission of a test plan... eligibility systems are adequately reviewed and tested. The law requires accountability for ensuring test...
76 FR 63990 - Proposed Agency Information Collection Activities; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-14
... will advance three objectives: (i) Reduce reporting burdens; (ii) ensure that it organizes information... will summarize comments received in response to this notice in a subsequent notice and include them in..., including whether the activities will have practical utility; (ii) the accuracy of FRA's estimates of the...
75 FR 32981 - Proposed Agency Information Collection Activities; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-10
... will advance three objectives: (i) Reduce reporting burdens; (ii) ensure that it organizes information.... FRA will summarize comments received in response to this notice in a subsequent notice and include..., including whether the activities will have practical utility; (ii) the accuracy of FRA's estimates of the...
NASA Astrophysics Data System (ADS)
Sheng, Yicheng; Jin, Weiqi; Dun, Xiong; Zhou, Feng; Xiao, Si
2017-10-01
With the demand for quantitative remote sensing technology growing, high-reliability and high-accuracy radiometric calibration technology, especially on-orbit radiometric calibration devices, has become an essential direction in quantitative remote sensing. In recent years, newly launched remote sensing satellites have been equipped with innovative on-orbit radiometric calibration devices. In order to meet the requirements of covering a very wide dynamic range with a non-shielding radiometric calibration system, we designed a projection-type radiometric calibration device for high-dynamic-range sensors based on the Schmidt telescope system. In this internal radiometric calibration device, we selected the EF-8530 light source as the calibration blackbody. The EF-8530 is a high-emittance Nichrome (Ni-Cr) reference source that can operate in steady or pulsed mode at a peak temperature of 973 K. The irradiance from the source was projected onto the IRFPA. The irradiance must ensure that the IRFPA can obtain different amplitudes of uniform irradiance through the narrow IR passbands and cover the very wide dynamic range. Combining the internal on-orbit radiometric calibration device with specially designed adaptive radiometric calibration algorithms, an on-orbit dynamic non-uniformity correction can be accomplished without blocking the optical beam from outside the telescope. The design optimizes the optics, source design, and power supply electronics for irradiance accuracy and uniformity. The internal on-orbit radiometric calibration device not only satisfies a series of requirements such as stability, accuracy, large dynamic range and uniformity of irradiance, but also has the advantages of short heating and cooling times, small volume, light weight, low power consumption and many other features. It can realize fast and efficient relative radiometric calibration without shielding the field of view.
The device can be applied to the design and manufacture of scanning infrared imaging systems, infrared remote sensing systems, infrared early-warning satellites, and so on.
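The adaptive correction algorithm is not detailed in the abstract; a classic two-point non-uniformity correction, which derives a per-pixel gain and offset from two uniform calibration irradiance levels, can serve as a hedged sketch of the general idea (the actual on-orbit algorithm may differ substantially).

```python
def two_point_nuc(low_raw, high_raw, low_level, high_level):
    """Per-pixel gain/offset from two uniform calibration irradiances
    (classic two-point NUC; the paper's adaptive algorithm is assumed
    to be more sophisticated). Inputs are per-pixel raw responses."""
    gains, offsets = [], []
    for lo, hi in zip(low_raw, high_raw):
        g = (high_level - low_level) / (hi - lo)
        gains.append(g)
        offsets.append(low_level - g * lo)
    return gains, offsets

def correct(raw, gains, offsets):
    """Apply the per-pixel linear correction to a raw frame."""
    return [g * r + o for r, g, o in zip(raw, gains, offsets)]

# three pixels with unequal response to the same two uniform levels
low_raw, high_raw = [10.0, 12.0, 8.0], [110.0, 132.0, 96.0]
gains, offsets = two_point_nuc(low_raw, high_raw, 0.0, 100.0)
```

After correction, every pixel maps the two calibration levels to 0 and 100 exactly, removing the fixed-pattern non-uniformity.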
High-resolution tree canopy mapping for New York City using LIDAR and object-based image analysis
NASA Astrophysics Data System (ADS)
MacFaden, Sean W.; O'Neil-Dunne, Jarlath P. M.; Royar, Anna R.; Lu, Jacqueline W. T.; Rundle, Andrew G.
2012-01-01
Urban tree canopy is widely believed to have myriad environmental, social, and human-health benefits, but a lack of precise canopy estimates has hindered quantification of these benefits in many municipalities. This problem was addressed for New York City using object-based image analysis (OBIA) to develop a comprehensive land-cover map, including tree canopy to the scale of individual trees. Mapping was performed using a rule-based expert system that relied primarily on high-resolution LIDAR, specifically its capacity for evaluating the height and texture of aboveground features. Multispectral imagery was also used, but shadowing and varying temporal conditions limited its utility. Contextual analysis was a key part of classification, distinguishing trees according to their physical and spectral properties as well as their relationships to adjacent, nonvegetated features. The automated product was extensively reviewed and edited via manual interpretation, and overall per-pixel accuracy of the final map was 96%. Although manual editing had only a marginal effect on accuracy despite requiring a majority of project effort, it maximized aesthetic quality and ensured the capture of small, isolated trees. Converting high-resolution LIDAR and imagery into usable information is a nontrivial exercise, requiring significant processing time and labor, but an expert system-based combination of OBIA and manual review was an effective method for fine-scale canopy mapping in a complex urban environment.
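The expert-system logic described above combines LIDAR height with spectral vegetation evidence; a toy per-pixel rule set in that spirit looks like the following. The thresholds and class names are illustrative assumptions, not the study's rules, which operated on objects with contextual analysis rather than isolated pixels.

```python
def classify_pixel(height_m, ndvi):
    """Toy OBIA-style rule set: vegetated vs not (NDVI), then tall vs
    low (LIDAR height). Thresholds (0.3 NDVI, 2 m) are assumptions."""
    if ndvi >= 0.3:                       # vegetated surface
        return "tree canopy" if height_m >= 2.0 else "grass/shrub"
    return "building" if height_m >= 2.0 else "pavement/soil"

# (height above ground in m, NDVI) for four sample pixels
pixels = [(8.5, 0.6), (0.3, 0.5), (12.0, 0.1), (0.1, 0.05)]
labels = [classify_pixel(h, n) for h, n in pixels]
```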
NASA Astrophysics Data System (ADS)
Ganguli, Anurag; Saha, Bhaskar; Raghavan, Ajay; Kiesel, Peter; Arakaki, Kyle; Schuh, Andreas; Schwartz, Julian; Hegyi, Alex; Sommer, Lars Wilko; Lochbaum, Alexander; Sahu, Saroj; Alamgir, Mohamed
2017-02-01
A key challenge hindering the mass adoption of Lithium-ion and other next-gen chemistries in advanced battery applications such as hybrid/electric vehicles (xEVs) has been management of their functional performance for more effective battery utilization and control over their life. Contemporary battery management systems (BMS) reliant on monitoring external parameters such as voltage and current to ensure safe battery operation with the required performance usually result in overdesign and inefficient use of capacity. More informative embedded sensors are desirable for internal cell state monitoring, which could provide accurate state-of-charge (SOC) and state-of-health (SOH) estimates and early failure indicators. Here we present a promising new embedded sensing option developed by our team for cell monitoring, fiber-optic (FO) sensors. High-performance large-format pouch cells with embedded FO sensors were fabricated. This second part of the paper focuses on the internal signals obtained from these FO sensors. The details of the method to isolate intercalation strain and temperature signals are discussed. Data collected under various xEV operational conditions are presented. An algorithm employing dynamic time warping and Kalman filtering was used to estimate state-of-charge with high accuracy from these internal FO signals. Their utility for high-accuracy, predictive state-of-health estimation is also explored.
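The SOC estimator described above fuses model predictions with internal sensor signals; as a hedged sketch (a scalar Kalman filter with coulomb-counting dynamics and placeholder noise values, not the paper's DTW-based algorithm), the correction step can be written as:

```python
def kalman_soc(soc0, currents, meas, dt=1.0, cap_s=3600.0, q=1e-6, r=1e-2):
    """Scalar Kalman filter: predict SOC by coulomb counting, then
    correct with a (noisy) sensor-derived SOC estimate. The process
    and measurement noise values q, r are placeholders."""
    soc, p, out = soc0, 1.0, []
    for i_amp, z in zip(currents, meas):
        soc -= i_amp * dt / cap_s        # predict (discharge positive)
        p += q
        k = p / (p + r)                  # Kalman gain
        soc += k * (z - soc)             # correct with sensor-derived SOC
        p *= (1 - k)
        out.append(soc)
    return out

# 1 Ah cell discharged at 3.6 A for 100 s, with consistent measurements
est = kalman_soc(1.0, [3.6] * 100, [1.0 - 0.001 * (i + 1) for i in range(100)])
```

In the real system the measurement `z` would come from the strain/temperature signals of the embedded fiber-optic sensors.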
Laser light-section sensor automating the production of textile-reinforced composites
NASA Astrophysics Data System (ADS)
Schmitt, R.; Niggemann, C.; Mersmann, C.
2009-05-01
Due to their advanced weight-specific mechanical properties, fibre-reinforced plastics (FRP) have been established as a key technology in several engineering areas. Textile-based reinforcement structures (preforms) in particular achieve a high structural integrity due to the multi-dimensional build-up of dry-fibre layers combined with 3D sewing and further textile processes. The final composite parts provide enhanced damage tolerance through excellent crash-energy-absorbing characteristics. For these reasons, structural parts (e.g. frames) will be integrated in next-generation airplanes. However, many manufacturing processes for FRP still involve manual production steps without integrated quality control. The non-automated production implies considerable process dispersion and a high rework rate, and before the final inspection there is no reliable information about the production status. This work sets metrology as the key to automation, and thus to an economically feasible production, applying a laser light-section sensor system (LLSS) to measure process quality and feed the results back to close the control loops of the production system. The developed method derives 3D measurements from height profiles acquired by the LLSS. To assure the textile's quality, a full surface scan is conducted, detecting defects or misalignment by comparing the measurement results with a CAD model of the lay-up. The method focuses on signal processing of the height profiles to ensure sub-pixel accuracy, using a novel algorithm based on non-linear least-squares fitting to a set of sigmoid functions. To compare the measured surface points to the CAD model, material characteristics are incorporated into the method. This ensures that only the fibre layer of the textile's surface is included, and gaps between the fibres or overlaying seams are neglected. Finally, determining the uncertainty in measurement according to the GUM standard proved the sensor system's accuracy.
First tests under industrial conditions showed that applying this sensor after the draping of each textile layer reduces the scrap quota by approximately 30%.
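Sub-pixel localization by sigmoid fitting can be sketched as follows. This is a deliberately simplified one-parameter version (brute-force least squares over the edge centre only, with fixed slope and amplitude), standing in for the paper's full non-linear least-squares fit to a set of sigmoid functions.

```python
import math

def sigmoid(x, x0, k=2.0, lo=0.0, hi=1.0):
    """Logistic edge model with centre x0 and assumed slope/amplitude."""
    return lo + (hi - lo) / (1.0 + math.exp(-k * (x - x0)))

def fit_edge_center(samples, search=(0.0, 15.0), step=0.001):
    """Brute-force least-squares fit of the sigmoid centre to a sampled
    height profile, recovering a sub-pixel edge position."""
    best_x0, best_err = search[0], float("inf")
    n = int((search[1] - search[0]) / step)
    for i in range(n + 1):
        x0 = search[0] + i * step
        err = sum((sigmoid(x, x0) - y) ** 2 for x, y in samples)
        if err < best_err:
            best_x0, best_err = x0, err
    return best_x0

true_x0 = 7.37                                            # sub-pixel ground truth
profile = [(x, sigmoid(x, true_x0)) for x in range(15)]   # integer-pixel samples
found = fit_edge_center(profile)
```

Even though the profile is sampled only at whole pixels, the fitted centre recovers the edge to millipixel precision on this noiseless example.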
Integrated photovoltaic (PV) monitoring system
NASA Astrophysics Data System (ADS)
Mahinder Singh, Balbir Singh; Husain, NurSyahidah; Mohamed, Norani Muti
2012-09-01
The main aim of this research work is to design an accurate and reliable monitoring system to be integrated with a solar electricity generating system. The performance monitoring system is required to ensure that the PVEGS operates at an optimum level. The PV monitoring system is able to measure all the important parameters that determine optimum performance. The measured values are recorded continuously, as the data acquisition system is connected to a computer, and data are stored at fixed intervals. The data can be used locally and can also be transmitted via the internet. The data that appear directly on the local monitoring system are displayed via a graphical user interface created in Visual Basic, and Apache software was used for data transmission. The accuracy and reliability of the developed monitoring system were tested against data captured simultaneously by a standard power quality analyzer. The high correlation (97%) indicates the level of accuracy of the monitoring system. The aim of a system for continuous monitoring is thus achieved, both locally and, simultaneously, at a remote system.
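A validation of this kind reduces to correlating the system's readings against the reference instrument; a minimal sketch (with made-up power readings, not the study's data) is:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between monitoring-system readings and a
    reference power-quality analyzer (illustrative values below)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

monitor = [210.0, 305.0, 398.0, 502.0, 595.0]    # hypothetical W readings
reference = [200.0, 300.0, 400.0, 500.0, 600.0]  # reference analyzer
r = pearson_r(monitor, reference)
```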
Reliability of the Walker Cranial Nonmetric Method and Implications for Sex Estimation.
Lewis, Cheyenne J; Garvin, Heather M
2016-05-01
The cranial trait scoring method presented in Buikstra and Ubelaker (Standards for Data Collection from Human Skeletal Remains. Fayetteville, AR: Arkansas Archeological Survey Research Series No. 44, 1994) and Walker (Am J Phys Anthropol 136:39-50, 2008) is the most common nonmetric cranial sex estimation method utilized by physical and forensic anthropologists. As such, the reliability and accuracy of the method are vital to ensure its validity in forensic applications. In this study, inter- and intra-observer error rates for the Walker scoring method were calculated using a sample of U.S. White and Black individuals (n = 135). Cohen's weighted kappas, intraclass correlation coefficients, and percentage agreements indicate good agreement between trials and observers for all traits except the mental eminence. Slight disagreements in scoring, however, were found to impact sex classifications, leading to lower accuracy rates than those published by Walker. Furthermore, experience does appear to impact trait scoring and sex classification. The use of revised population-specific equations that avoid the mental eminence is highly recommended to minimize the potential for misclassifications. © 2016 American Academy of Forensic Sciences.
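The weighted kappa statistic used above credits near-misses on the ordinal 1-5 trait scores; a self-contained implementation with linear weights (the score vectors below are invented for illustration) is:

```python
def weighted_kappa(r1, r2, k=5):
    """Linearly weighted Cohen's kappa for two observers' ordinal
    scores in 1..k; off-by-one disagreements are partially credited."""
    n = len(r1)
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[a - 1][b - 1] += 1.0 / n
    p0 = [sum(row) for row in obs]                       # observer 1 marginals
    p1 = [sum(row[j] for row in obs) for j in range(k)]  # observer 2 marginals
    w = [[1 - abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    po = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    pe = sum(w[i][j] * p0[i] * p1[j] for i in range(k) for j in range(k))
    return (po - pe) / (1 - pe)

# hypothetical trait scores from two observers (not the study's data)
scores_a = [1, 2, 2, 3, 4, 5, 3, 2, 4, 5]
scores_b = [1, 2, 3, 3, 4, 5, 3, 2, 5, 5]
kw = weighted_kappa(scores_a, scores_b)
```

Values above roughly 0.8 are conventionally read as good agreement, which is the standard the study applied per trait.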
Head rice rate measurement based on concave point matching
Yao, Yuan; Wu, Wei; Yang, Tianle; Liu, Tao; Chen, Wen; Chen, Chen; Li, Rui; Zhou, Tong; Sun, Chengming; Zhou, Yue; Li, Xinlu
2017-01-01
Head rice rate is an important factor affecting rice quality. In this study, an inflection point detection-based technology was applied to measure the head rice rate, combining a vibrator and a conveyor belt for bulk grain image acquisition. The edge center mode proportion method (ECMP) was applied for concave point matching, in which concave matching and separation were performed with collaborative constraint conditions, followed by rice length calculation with a minimum enclosing rectangle (MER) to identify the head rice. Finally, the head rice rate was calculated as the ratio of the total area of head rice to the overall coverage of rice. Results showed that bulk grain image acquisition can be realized with the test equipment, and the accuracy rate of separation for both indica rice and japonica rice exceeded 95%. An increase in the number of rice kernels did not significantly affect ECMP and MER. High accuracy can be ensured with the MER-based head rice rate calculation, whose relative error with respect to real values is less than 3%. The test results show that the method is reliable as a reference for head rice rate calculation studies. PMID:28128315
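Once kernels are separated and measured, the final area-ratio computation is straightforward; here is a minimal sketch. The 3/4-of-full-length threshold for "head rice" is a common milling convention assumed for illustration, and the kernel measurements are invented.

```python
def head_rice_rate(grains, full_length):
    """Area-weighted head rice rate: a kernel counts as head rice when
    its MER length is at least 3/4 of the full-grain length (an assumed
    convention); the rate is head-rice area over total rice area."""
    head = sum(area for length, area in grains if length >= 0.75 * full_length)
    total = sum(area for _, area in grains)
    return head / total

# hypothetical (MER length mm, projected area mm^2) per separated kernel
grains = [(7.0, 14.0), (6.8, 13.5), (3.1, 6.0), (7.1, 14.2), (2.4, 4.8)]
rate = head_rice_rate(grains, full_length=7.0)
```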
Reliability of system for precise cold forging
NASA Astrophysics Data System (ADS)
Krušič, Vid; Rodič, Tomaž
2017-07-01
The influence of the scatter of the principal input parameters of the forging system on the dimensional accuracy of the product and on tool life for the closed-die forging process is presented in this paper. The scatter of the essential input parameters for the closed-die upsetting process was adjusted to the maximal values that enabled reliable production of a dimensionally accurate product at optimal tool life. An operating window was created within which the maximal scatter of the principal input parameters for the closed-die upsetting process still ensures the desired dimensional accuracy of the product and optimal tool life. Application of this adjustment of the process input parameters is shown on the example of an inner race of a homokinetic joint from mass production. High productivity in the manufacture of elements by cold massive extrusion is often achieved by multiple forming operations performed simultaneously on the same press. By redesigning the time sequence of the forming operations in the multistage forming of a starter barrel, the course of the resultant force during the working stroke is optimized.
Dong, Jian-Jun; Li, Qing-Liang; Yin, Hua; Zhong, Cheng; Hao, Jun-Guang; Yang, Pan-Fei; Tian, Yu-Hong; Jia, Shi-Ru
2014-10-15
Sensory evaluation is regarded as a necessary procedure to ensure a reproducible quality of beer. Meanwhile, high-throughput analytical methods provide a powerful tool to analyse various flavour compounds, such as higher alcohols and esters. In this study, the relationship between flavour compounds and sensory evaluation was established by non-linear models such as partial least squares (PLS), genetic algorithm back-propagation neural network (GA-BP), and support vector machine (SVM). SVM with a radial basis function (RBF) kernel achieved better prediction accuracy for both the calibration set (94.3%) and the validation set (96.2%) than the other models. Relatively lower prediction abilities were observed for GA-BP (52.1%) and PLS (31.7%). In addition, the choice of kernel function was essential to model training: with a polynomial kernel, the prediction accuracy of SVM fell to 32.9%. As a powerful multivariate statistics method, SVM holds great potential to assess beer quality. Copyright © 2014 Elsevier Ltd. All rights reserved.
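The best-performing model above, an SVM with an RBF kernel, can be sketched with scikit-learn on synthetic stand-in data. The flavour-compound features and class labels here are invented for illustration, not the study's dataset:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy stand-in for flavour-compound measurements: two compounds
# (e.g. a higher alcohol and an ester), two sensory-quality classes.
good = rng.normal(loc=[1.0, 2.0], scale=0.1, size=(30, 2))
poor = rng.normal(loc=[3.0, 0.5], scale=0.1, size=(30, 2))
X = np.vstack([good, poor])
y = np.array([1] * 30 + [0] * 30)

# RBF-kernel SVM, as in the study; C and gamma here are defaults, not tuned.
clf = SVC(kernel="rbf", gamma="scale", C=1.0).fit(X, y)
train_acc = clf.score(X, y)
```

In practice the kernel choice matters, as the abstract notes: swapping `kernel="rbf"` for `kernel="poly"` on poorly separable data can collapse accuracy.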
A method for soil moisture probes calibration and validation of satellite estimates.
Holzman, Mauro; Rivas, Raúl; Carmona, Facundo; Niclòs, Raquel
2017-01-01
Optimization of field techniques is crucial to ensure high-quality soil moisture data. The aim of this work is to present a sampling method for undisturbed soil and soil water content to calibrate soil moisture probes, in the context of validating the SMOS (Soil Moisture and Ocean Salinity) mission MIRAS Level 2 soil moisture product in the Pampean Region of Argentina. The method avoids soil alteration and is recommended for calibrating probes by soil type under a free drying process at ambient temperature. A detailed explanation of the field and laboratory procedures used to obtain reference soil moisture is given. The calibration results reflected accurate operation of the Delta-T ThetaProbe ML2x probes in most of the analyzed cases (RMSE and bias ≤ 0.05 m³/m³). Post-calibration results indicated that accuracy improves significantly when applying calibration adjustments based on soil type (RMSE ≤ 0.022 m³/m³, bias ≤ -0.010 m³/m³). • A sampling method that provides high-quality soil water content data for probe calibration is described. • Calibration based on soil type is important. • A single calibration for similar soil types could be suitable in practical terms, depending on the required accuracy level.
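The RMSE and bias figures quoted above are the standard calibration-quality metrics. A minimal sketch of how they would be computed from paired probe and gravimetric reference readings (variable names are illustrative):

```python
import numpy as np

def rmse_and_bias(probe, reference):
    """Calibration-quality metrics as used in the study: RMSE and bias
    (probe minus reference), both in the units of the inputs (m^3/m^3)."""
    d = np.asarray(probe) - np.asarray(reference)
    return float(np.sqrt(np.mean(d ** 2))), float(np.mean(d))
```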
Research on a high-precision calibration method for tunable lasers
NASA Astrophysics Data System (ADS)
Xiang, Na; Li, Zhengying; Gui, Xin; Wang, Fan; Hou, Yarong; Wang, Honghai
2018-03-01
Tunable lasers are widely used in the field of optical fiber sensing, but nonlinear tuning exists even under zero external disturbance and limits the accuracy of demodulation. In this paper, a high-precision calibration method for tunable lasers is proposed. A comb filter is introduced, and the real-time output wavelength and scanning rate of the laser are calibrated by linearly fitting several time-frequency reference points obtained from it. The beat signal generated by the auxiliary interferometer is interpolated and frequency-multiplied to find more accurate zero-crossing points; these points are then used as wavelength counters to resample the comb signal and correct the nonlinear effect, which ensures that the time-frequency reference points of the comb filter are linear. A stability experiment and a strain sensing experiment verify the calibration precision of this method. The experimental results show that the stability and wavelength resolution of the FBG demodulation can reach 0.088 pm and 0.030 pm, respectively, using a tunable laser calibrated by the proposed method. We have also compared the demodulation accuracy in the presence and absence of the comb filter; the result shows that the introduction of the comb filter results in a 15-fold wavelength resolution enhancement.
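The core resampling step, using beat-signal zero crossings as equally spaced optical-frequency markers at which the comb signal is re-read, can be sketched as follows. This is a simplified illustration under stated assumptions (linear interpolation for sub-sample crossing times, no frequency multiplication of the beat signal), not the authors' implementation:

```python
import numpy as np

def zero_crossing_resample(beat, comb, t):
    """Find sub-sample zero-crossing times of the beat signal by linear
    interpolation, then resample the comb-filter signal at those instants.
    beat, comb: sampled signals on the common time axis t."""
    s = np.sign(beat)
    idx = np.where(np.diff(s) != 0)[0]               # samples bracketing a crossing
    frac = beat[idx] / (beat[idx] - beat[idx + 1])   # linear sub-sample position
    t_cross = t[idx] + frac * (t[idx + 1] - t[idx])
    return t_cross, np.interp(t_cross, t, comb)
```

Because consecutive zero crossings of the beat signal are equally spaced in optical frequency, the comb signal sampled at `t_cross` is linearized in frequency even when the laser sweep is nonlinear in time.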
High-precision measurement of magnetic penetration depth in superconducting films
He, X.; Gozar, A.; Sundling, R.; ...
2016-11-01
The magnetic penetration depth (λ) in thin superconducting films is usually measured by the mutual inductance technique. The accuracy of this method has been limited by uncertainties in the geometry of the solenoids and in the film position and thickness, by parasitic coupling between the coils, etc. Here, we present several improvements in the apparatus and the method. To ensure a precise thickness of the superconducting layer, we engineer the films at the atomic level using atomic-layer-by-layer molecular beam epitaxy. In this way, we also eliminate secondary-phase precipitates, grain boundaries, and pinholes that are common with other deposition methods and that artificially increase the field transmission and thus the apparent λ. For better reproducibility, the thermal stability of the closed-cycle cryocooler used to control the temperature of the mutual inductance measurement has been significantly improved by inserting a custom-built thermal conductivity damper. Next, to minimize the uncertainties in the geometry, we fused a pair of small yet precisely wound coils into a single sapphire block machined to high precision. Lastly, the sample is spring-loaded to exactly the same position with respect to the solenoids. Altogether, we can measure the absolute value of λ with an accuracy better than ±1%.
A Study on Performance and Safety Tests of Defibrillator Equipment.
Tavakoli Golpaygani, A; Movahedi, M M; Reza, M
2017-12-01
Nowadays, more than 10,000 different types of medical devices can be found in hospitals. Medical electrical equipment is employed in a wide variety of fields in the medical sciences, with different physiological effects and measurements. Hospitals and medical centers must ensure that their critical medical devices are safe, accurate, reliable, and operational at the required level of performance. Defibrillators are critical resuscitation devices, and the use of reliable defibrillators has led to more effective treatments and improved patient safety through better control and management of complications during cardiopulmonary resuscitation (CPR). The metrological reliability of twenty frequently used manual defibrillators in use in ten hospitals (4 private and 6 public) in one of the provinces of Iran was evaluated according to international and national standards. Quantitative analysis of control and instrument accuracy showed that the results obtained in many units were critical, falling outside the standard limits, especially in devices with poor batteries. In the analysis of delivered-energy accuracy, only twelve units delivered acceptable output values, and the precision of the output energy measurements, especially in weak-battery condition after activation of the discharge alarm, was low. The obtained results indicate a need for new and strict regulations on periodic performance verification and medical equipment quality control programs, especially for high-risk instruments. It is also necessary to provide training courses for medical staff on the fundamentals of operation and performance parameters in the field of metrology in medicine, and on how to obtain accurate results, especially with high-risk medical devices.
A Study on Performance and Safety Tests of Defibrillator Equipment
Tavakoli Golpaygani, A.; Movahedi, M.M.; Reza, M.
2017-01-01
Introduction: Nowadays, more than 10,000 different types of medical devices can be found in hospitals. Medical electrical equipment is employed in a wide variety of fields in the medical sciences, with different physiological effects and measurements. Hospitals and medical centers must ensure that their critical medical devices are safe, accurate, reliable, and operational at the required level of performance. Defibrillators are critical resuscitation devices, and the use of reliable defibrillators has led to more effective treatments and improved patient safety through better control and management of complications during Cardiopulmonary Resuscitation (CPR). Materials and Methods: The metrological reliability of twenty frequently used manual defibrillators in use in ten hospitals (4 private and 6 public) in one of the provinces of Iran was evaluated according to international and national standards. Results: Quantitative analysis of control and instrument accuracy showed that the results obtained in many units were critical, falling outside the standard limits, especially in devices with poor batteries. In the analysis of delivered-energy accuracy, only twelve units delivered acceptable output values, and the precision of the output energy measurements, especially in weak-battery condition after activation of the discharge alarm, was low. Conclusion: The obtained results indicate a need for new and strict regulations on periodic performance verification and medical equipment quality control programs, especially for high-risk instruments. It is also necessary to provide training courses for medical staff on the fundamentals of operation and performance parameters in the field of metrology in medicine, and on how to obtain accurate results, especially with high-risk medical devices. PMID:29445716
Minimizing Bias When Assessing Student Work
ERIC Educational Resources Information Center
Steinke, Pamela; Fitch, Peggy
2017-01-01
Bias is part of the human condition and becoming aware of how to avoid bias will help to ensure greater accuracy in the work of assessment. In this paper the authors discuss three different theoretical frameworks that can be applied when assessing student work for cognitive skills such as critical thinking and problem solving. Each of the…
Accuracy in risk assessment, which is desirable in order to ensure protection of the public health while avoiding over-regulation of economically-important substances, requires quantitatively accurate, in vivo descriptions of dose-response and time-course behaviors. This level of...
40 CFR 98.267 - Records that must be retained.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) Monthly mass of phosphate rock consumed by origin (as listed in Table Z-1 of this subpart) (tons). (b) Records of all phosphate rock purchases and/or deliveries (if vertically integrated with a mine). (c) Documentation of the procedures used to ensure the accuracy of monthly phosphate rock consumption by origin, (as...
40 CFR 98.267 - Records that must be retained.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) Monthly mass of phosphate rock consumed by origin (as listed in Table Z-1 of this subpart) (tons). (b) Records of all phosphate rock purchases and/or deliveries (if vertically integrated with a mine). (c) Documentation of the procedures used to ensure the accuracy of monthly phosphate rock consumption by origin, (as...
40 CFR 98.267 - Records that must be retained.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Monthly mass of phosphate rock consumed by origin (as listed in Table Z-1 of this subpart) (tons). (b) Records of all phosphate rock purchases and/or deliveries (if vertically integrated with a mine). (c) Documentation of the procedures used to ensure the accuracy of monthly phosphate rock consumption by origin, (as...
40 CFR 98.267 - Records that must be retained.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) Monthly mass of phosphate rock consumed by origin (as listed in Table Z-1 of this subpart) (tons). (b) Records of all phosphate rock purchases and/or deliveries (if vertically integrated with a mine). (c) Documentation of the procedures used to ensure the accuracy of monthly phosphate rock consumption by origin, (as...
Approvals, Submission, and Important Labeling Changes for US Marketed Pharmaceuticals
Baker, Danial E.
2014-01-01
This monthly feature will help readers keep current on new drugs, new indications, dosage forms, and safety-related changes in labeling or use. Efforts have been made to ensure the accuracy of this information; however, if there are any questions, please let me know at danial.baker@wsu.edu. PMID:25477618
Approvals, Submission, and Important Labeling Changes for US Marketed Pharmaceuticals
Baker, Danial E.
2014-01-01
This monthly feature will help readers keep current on new drugs, new indications, dosage forms, and safety-related changes in labeling or use. Efforts have been made to ensure the accuracy of this information; however, if there are any questions, please let me know at danial.baker@wsu.edu. PMID:24421565
Approvals, Submission, and Important Labeling Changes for US Marketed Pharmaceuticals
Baker, Danial E.
2015-01-01
This monthly feature will help readers keep current on new drugs, new indications, dosage forms, and safety-related changes in labeling or use. Efforts have been made to ensure the accuracy of this information; however, if there are any questions, please let me know at danial.baker@wsu.edu. PMID:27621511
ERIC Educational Resources Information Center
Van Norman, Ethan R.; Nelson, Peter M.; Shin, Jae-Eun; Christ, Theodore J.
2013-01-01
Educators, school psychologists, and other professionals must evaluate student progress and decide to continue, modify, or terminate instructional programs to ensure student success. For this purpose, progress-monitoring data are often collected, plotted graphically, and visually analyzed. The current study evaluated the impact of three common…
Phase-Locking and Coherent Power Combining of Broadband Linearly Chirped Optical Waves
2012-11-05
ensure path-length matching, and we estimate an accuracy of ±2 cm. Fiber-coupled acousto-optic modulators (Brimrose Corporation) with a nominal...was performed using the VCSEL-based SFL with a chirp rate of ±2×10¹⁴ Hz/s, polarization-maintaining fiber-optic components, and an AOFS (Brimrose
Approvals, Submission, and Important Labeling Changes for US Marketed Pharmaceuticals
Baker, Danial E.
2014-01-01
This monthly feature will help readers keep current on new drugs, new indications, dosage forms, and safety-related changes in labeling or use. Efforts have been made to ensure the accuracy of this information; however, if there are any questions, please let me know at danial.baker@wsu.edu. PMID:24958946
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-05
... prospective accredited agency to complete the form. Total Burden Hours: 190. Estimated Cost (Operation and...)). This program ensures that information is in the desired format, reporting burden (time and costs) is...; The accuracy of OSHA's estimate of the burden (time and costs) of the information collection...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-29
... processing time and immediate data validation to ensure accuracy of the respondent's personal information... respondent based on varying factors in the respondent's personal history. The burden on the respondent is reduced when the respondent's personal history is not relevant to a particular question, since the...
Zhou, Caigen; Zeng, Xiaoqin; Luo, Chaomin; Zhang, Huaguang
In this paper, local bipolar auto-associative memories are presented based on discrete recurrent neural networks with a class of gain type activation function. The weight parameters of neural networks are acquired by a set of inequalities without the learning procedure. The global exponential stability criteria are established to ensure the accuracy of the restored patterns by considering time delays and external inputs. The proposed methodology is capable of effectively overcoming spurious memory patterns and achieving memory capacity. The effectiveness, robustness, and fault-tolerant capability are validated by simulated experiments.In this paper, local bipolar auto-associative memories are presented based on discrete recurrent neural networks with a class of gain type activation function. The weight parameters of neural networks are acquired by a set of inequalities without the learning procedure. The global exponential stability criteria are established to ensure the accuracy of the restored patterns by considering time delays and external inputs. The proposed methodology is capable of effectively overcoming spurious memory patterns and achieving memory capacity. The effectiveness, robustness, and fault-tolerant capability are validated by simulated experiments.
Emulating DC constant power load: a robust sliding mode control approach
NASA Astrophysics Data System (ADS)
Singh, Suresh; Fulwani, Deepak; Kumar, Vinod
2017-09-01
This article presents the emulation of a programmable power-electronic constant power load (CPL) using a dc/dc step-up (boost) converter. The converter is controlled by a robust sliding mode controller (SMC). A novel switching surface is proposed to ensure that the converter sinks the required power. The proposed dc CPL is simple in design, has fast dynamic response and high accuracy, and offers an inexpensive alternative for studying converters in cascaded dc distribution power system applications. Furthermore, the proposed CPL is sufficiently robust against input voltage variations. A laboratory prototype of the proposed dc CPL has been developed and validated with the SMC realised through the OPAL-RT platform. The capability of the proposed dc CPL is confirmed via experiments in varied scenarios.
Transition to high rate aerospace NDI processes
NASA Astrophysics Data System (ADS)
Vanderheiden, Bert; Thomson, Clint; Ivakhnenko, Igor; Garner, Chuck
2018-04-01
With the rapidly expanding use of carbon fiber composite materials in military and commercial aircraft, the processes used to manufacture and inspect structural components must evolve to ensure economic viability. Inspection techniques developed to inspect products produced at a rate of one or two structures a month are not fast or flexible enough to inspect more than 8500 parts per month. This presentation describes the evolution of phased array ultrasonic inspection systems to provide increased rate capacity, the flexibility to accommodate multiple unique designs, and the ability to rapidly adjust to product design changes. The paper describes how system developments made in response to new programs resulted in inspections that are much less expensive, more accurate, more flexible, and faster in cycle time.
Soni, Jalpa; Purwar, Harsh; Lakhotia, Harshit; Chandel, Shubham; Banerjee, Chitram; Kumar, Uday; Ghosh, Nirmalya
2013-07-01
A novel spectroscopic Mueller matrix system has been developed and explored for both fluorescence and elastic scattering polarimetric measurements from biological tissues. The 4 × 4 Mueller matrix measurement strategy is based on sixteen spectrally resolved (λ = 400 - 800 nm) measurements performed by sequentially generating and analyzing four elliptical polarization states. Eigenvalue calibration of the system ensured high accuracy of Mueller matrix measurement over a broad wavelength range, either for forward or backscattering geometry. The system was explored for quantitative fluorescence and elastic scattering spectroscopic polarimetric studies on normal and precancerous tissue sections from human uterine cervix. The fluorescence spectroscopic Mueller matrices yielded an interesting diattenuation parameter, exhibiting differences between normal and precancerous tissues.
Zaĭtsev, A A; Khodashinskiĭ, I A; Plotnikov, O O
2011-01-01
The importance of having the most efficacious tools and methods for the prevention and treatment of various diseases and for the rehabilitation of patients dictates the search for new means of optimal correction of the individual reserves of the organism. One approach to addressing this problem is modeling the prognosis of the curative effects of non-drug therapy. It is proposed to choose the therapeutic program using an ensemble of classifiers. Two types are considered: one based on decision trees, the other on a fuzzy rule base. Software was developed that ensures high accuracy of the prognosis of the efficiency of the two programs of spa and resort treatment.
Correlation of analytical and experimental hot structure vibration results
NASA Technical Reports Server (NTRS)
Kehoe, Michael W.; Deaton, Vivian C.
1993-01-01
High surface temperatures and temperature gradients can affect the vibratory characteristics and stability of aircraft structures. Aircraft designers are relying more on finite-element model analysis methods to ensure sufficient vehicle structural dynamic stability throughout the desired flight envelope. Analysis codes that predict these thermal effects must be correlated and verified with experimental data. Experimental modal data for aluminum, titanium, and fiberglass plates heated at uniform, nonuniform, and transient heating conditions are presented. The data show the effect of heat on each plate's modal characteristics, a comparison of predicted and measured plate vibration frequencies, the measured modal damping, and the effect of modeling material property changes and thermal stresses on the accuracy of the analytical results at nonuniform and transient heating conditions.
Tropospheric Wind Monitoring During Day-of-Launch Operations for NASA's Space Shuttle Program
NASA Technical Reports Server (NTRS)
Decker, Ryan; Leach, Richard
2004-01-01
The Environments Group at the National Aeronautics and Space Administration's Marshall Space Flight Center monitors the winds aloft above Kennedy Space Center (KSC) in support of Space Shuttle Program day-of-launch operations. Assessment of tropospheric winds is used to support the ascent phase of launch. Three systems at KSC are used to generate independent tropospheric wind profiles prior to launch: 1) the high-resolution Jimsphere balloon system, 2) the 50-MHz Doppler Radar Wind Profiler (DRWP), and 3) the low-resolution radiosonde system. All independent sources are compared against each other for accuracy. To assess spatial and temporal wind variability during the launch countdown, each Jimsphere profile is compared against a design wind database to ensure wind changes do not violate wind change criteria.
Design of oil pipeline leak detection and communication system based on optical fiber technology
NASA Astrophysics Data System (ADS)
Tu, Yaqing; Chen, Huabo
1999-08-01
The integrity of oil pipelines is always a major concern of operators. Pipeline leaks not only lead to loss of oil but also pollute the environment. A new pipeline leak detection and communication system based on optical fiber technology to ensure pipeline reliability is presented. Combining a direct leak detection method with an indirect one, the system will greatly reduce the rate of false alarms. According to the practical features of oil pipelines, the communication system is designed employing state-of-the-art optical fiber communication technology. The system has features such as high leak-location accuracy and good real-time performance, which overcome the disadvantages of traditional leak detection methods and communication systems.
Smart catheter flow sensor for real-time continuous regional cerebral blood flow monitoring
NASA Astrophysics Data System (ADS)
Li, Chunyan; Wu, Pei-Ming; Hartings, Jed A.; Wu, Zhizhen; Ahn, Chong H.; LeDoux, David; Shutter, Lori A.; Narayan, Raj K.
2011-12-01
We present a smart catheter flow sensor for real-time, continuous, and quantitative measurement of regional cerebral blood flow using in situ temperature and thermal conductivity compensation. The flow sensor operates in a constant-temperature mode and employs a periodic heating and cooling technique. This approach ensures zero drift and provides highly reliable data with microelectromechanical-system-based thin film sensors. The developed flow sensor has a sensitivity of 0.973 mV/ml/100 g/min in the range from 0 to 160 ml/100 g/min with a linear correlation coefficient of R² = 0.9953. It achieves a resolution of 0.25 ml/100 g/min and an accuracy better than 5 ml/100 g/min.
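With the reported sensitivity, converting the sensor's output voltage back to a flow value is a one-line linear inversion. A sketch; the zero-flow offset voltage is an assumption, since the abstract reports only the slope:

```python
SENSITIVITY_MV = 0.973  # mV per ml/100 g/min, from the reported calibration

def flow_from_voltage(v_mv, offset_mv=0.0):
    """Invert the linear calibration: flow in ml/100 g/min from output
    voltage in mV. A zero offset at zero flow is assumed here."""
    return (v_mv - offset_mv) / SENSITIVITY_MV
```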
[Basic considerations during outsourcing of clinical data management services].
Shen, Tong; Liu, Yan
2015-11-01
With worldwide improvements in the regulation of the conduct of international and domestic clinical trials, the quality of clinical trials and of trial data management is receiving a great deal of attention. To ensure the quality of clinical trials, maintain business flexibility, and effectively utilize internal and external resources, the outsourcing model is used in the management of clinical data in the operations of pharmaceutical companies. The essential criteria for a successful outsourcing model in clinical trials are the selection of a qualified contract research organization (CRO), the establishment of an appropriate outsourcing model, and the creation of effective quality control systems to ensure the authenticity, integrity, and accuracy of the clinical trial data.
Sensor-Based Electromagnetic Navigation (Mediguide®): How Accurate Is It? A Phantom Model Study.
Bourier, Felix; Reents, Tilko; Ammar-Busch, Sonia; Buiatti, Alessandra; Grebmer, Christian; Telishevska, Marta; Brkic, Amir; Semmler, Verena; Lennerz, Carsten; Kaess, Bernhard; Kottmaier, Marc; Kolb, Christof; Deisenhofer, Isabel; Hessling, Gabriele
2015-10-01
Data about localization reproducibility as well as spatial and visual accuracy of the new MediGuide® sensor-based electroanatomic navigation technology are scarce. We therefore sought to quantify these parameters based on phantom experiments. A realistic heart phantom was generated in a 3D-Printer. A CT scan was performed on the phantom. The phantom itself served as ground-truth reference to ensure exact and reproducible catheter placement. A MediGuide® catheter was repeatedly tagged at selected positions to assess accuracy of point localization. The catheter was also used to acquire a MediGuide®-scaled geometry in the EnSite Velocity® electroanatomic mapping system. The acquired geometries (MediGuide®-scaled and EnSite Velocity®-scaled) were compared to a CT segmentation of the phantom to quantify concordance. Distances between landmarks were measured in the EnSite Velocity®- and MediGuide®-scaled geometry and the CT dataset for Bland-Altman comparison. The visualization of virtual MediGuide® catheter tips was compared to their corresponding representation on fluoroscopic cine-loops. Point localization accuracy was 0.5 ± 0.3 mm for MediGuide® and 1.4 ± 0.7 mm for EnSite Velocity®. The 3D accuracy of the geometries was 1.1 ± 1.4 mm (MediGuide®-scaled) and 3.2 ± 1.6 mm (not MediGuide®-scaled). The offset between virtual MediGuide® catheter visualization and catheter representation on corresponding fluoroscopic cine-loops was 0.4 ± 0.1 mm. The MediGuide® system shows a very high level of accuracy regarding localization reproducibility as well as spatial and visual accuracy, which can be ascribed to the magnetic field localization technology. The observed offsets between the geometry visualization and the real phantom are below a clinically relevant threshold. © 2015 Wiley Periodicals, Inc.
Performance of two updated blood glucose monitoring systems: an evaluation following ISO 15197:2013.
Pleus, Stefan; Baumstark, Annette; Rittmeyer, Delia; Jendrike, Nina; Haug, Cornelia; Freckmann, Guido
2016-05-01
Objective For patients with diabetes, regular self-monitoring of blood glucose (SMBG) is essential to ensure adequate glycemic control. Therefore, accurate and reliable blood glucose measurements with SMBG systems are necessary. The international standard ISO 15197 describes requirements for SMBG systems, such as limits within which 95% of glucose results have to fall to reach acceptable system accuracy. The 2013 version of this standard sets higher demands, especially regarding system accuracy, than the currently still valid edition. ISO 15197 can be applied by manufacturers to receive a CE mark for their system. Research design and methods This study was an accuracy evaluation following ISO 15197:2013 section 6.3 of two recently updated SMBG systems (Contour and Contour TS; Bayer Consumer Care AG, Basel, Switzerland) with an improved algorithm to investigate whether the systems fulfill the requirements of the new standard. For this purpose, capillary blood samples of approximately 100 participants were measured with three test strip lots of both systems, and deviations from glucose values obtained with a hexokinase-based comparison method (Cobas Integra 400 plus; Roche Instrument Center, Rotkreuz, Switzerland) were determined. Percentages of values within the acceptance criteria of ISO 15197:2013 were calculated. This study was registered at clinicaltrials.gov (NCT02358408). Main outcome Both updated systems fulfilled the system accuracy requirements of ISO 15197:2013, as 98.5% to 100% of the results were within the stipulated limits. Furthermore, all results were within the clinically non-critical zones A and B of the consensus error grid for type 1 diabetes. Conclusions The technical improvement of the systems ensured compliance with ISO 15197 in the hands of healthcare professionals even in its more stringent 2013 version. Alternative presentation of system accuracy results in radar plots provides additional information with certain advantages.
In addition, the surveillance error grid offers a modern tool to assess a system's clinical performance.
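The ISO 15197:2013 system-accuracy criterion referred to above is concrete enough to check in code: at least 95% of results must fall within ±15 mg/dL of the reference below 100 mg/dL, or within ±15% at or above 100 mg/dL. A sketch of that check:

```python
def within_iso15197_2013(measured, reference):
    """True if a single result meets the ISO 15197:2013 accuracy limit:
    within +/-15 mg/dL of the reference below 100 mg/dL, or within
    +/-15% at or above 100 mg/dL. Both values in mg/dL."""
    if reference < 100.0:
        return abs(measured - reference) <= 15.0
    return abs(measured - reference) <= 0.15 * reference

def system_accuracy_ok(pairs):
    """ISO 15197:2013 requires at least 95% of (measured, reference)
    pairs to fall within the limits above."""
    ok = sum(within_iso15197_2013(m, r) for m, r in pairs)
    return ok / len(pairs) >= 0.95
```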
In-die photomask registration and overlay metrology with PROVE using 2D correlation methods
NASA Astrophysics Data System (ADS)
Seidel, D.; Arnz, M.; Beyer, D.
2011-11-01
According to the ITRS roadmap, the semiconductor industry is driving 193nm lithography to its limits, using techniques like double exposure, double patterning, mask-source optimization, and inverse lithography. For photomask metrology this translates to full in-die measurement capability for registration and critical dimension, together with challenging specifications for repeatability and accuracy. In particular, overlay becomes more and more critical and must be ensured on every die. For this, Carl Zeiss SMS has developed the next-generation photomask registration and overlay metrology tool PROVE®, which serves the 32nm node and below and is already well established in the market. PROVE® features highly stable hardware components for the stage and environmental control. To ensure in-die measurement capability, sophisticated image analysis methods based on 2D correlations have been developed. In this paper we demonstrate the in-die capability of PROVE® and present corresponding measurement results for short-term and long-term measurements, as well as the attainable accuracy for feature sizes down to 85nm using different illumination modes and mask types. Standard measurement methods based on threshold criteria are compared with the new 2D correlation methods to demonstrate the performance gain of the latter. In addition, mask-to-mask overlay results of typical box-in-frame structures down to 200nm feature size are presented. It is shown that from overlay measurements a reproducibility budget can be derived that takes into account stage, image analysis, and global effects such as mask loading and environmental control. The parts of the budget are quantified from measurement results to identify critical error contributions and to focus on the corresponding improvement strategies.
Applicability of Unmanned Aerial Vehicles in Research on Aeolian Processes
NASA Astrophysics Data System (ADS)
Algimantas, Česnulevičius; Artūras, Bautrėnas; Linas, Bevainis; Donatas, Ovodas; Kęstutis, Papšys
2018-02-01
Surface dynamics and instability are characteristic of aeolian landforms. Surface comparison is regarded as the most appropriate method for evaluating the intensity of aeolian processes and the amount of transported sand. The data for surface comparison can be collected by topographic survey or by unmanned aerial vehicles. The time cost of fixing and measuring relief microforms by topographic survey is very high. Aerial photography from unmanned aircraft also encounters difficulties, because there are no stable, clearly defined objects and contours that would make it possible to link aerial photographs, delineate the boundaries of the captured territory and ensure the accuracy of surface measurements. Creating stationary anchor points is impractical owing to intense sand accumulation and deflation in different seasons. In September 2015 and April 2016 a combined methodology was applied to evaluate the intensity of aeolian processes on the Curonian Spit: temporary marks were installed on the surface, their coordinates were fixed by GPS, and a flight of the unmanned aircraft was then conducted. The fixed coordinates of the marks ensure the accuracy of measurements from the aerial imagery and allow possible corrections to be calculated. This method was used to track and measure very small (micro-scale) relief forms (5-10 cm in height and 10-20 cm in length). Using this method, morphometric indicators of micro-terraces caused by the pressure of sand dunes on the gyttja layer were measured in a non-contact way. An additional advantage of the method is the ability to accurately link repeated measurements. The comparison of 3D terrain models showed sand deflation and accumulation areas and quantitative changes in the terrain very clearly.
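The surface-comparison step the abstract describes amounts to differencing two co-registered elevation models. A minimal sketch, with an invented grid, cell size and noise threshold (none of these values come from the study):

```python
import numpy as np

def surface_change(dem_before, dem_after, cell_area, threshold=0.02):
    """Per-cell elevation difference between two co-registered DEMs (metres),
    split into accumulated and deflated sand volumes.  The 2 cm threshold is
    an illustrative noise floor, not a value from the study."""
    dz = dem_after - dem_before
    gained = np.where(dz > threshold, dz, 0.0).sum() * cell_area
    lost = np.where(dz < -threshold, -dz, 0.0).sum() * cell_area
    return gained, lost

before = np.zeros((3, 3))
after = np.array([[0.10, 0.00, 0.00],
                  [0.00, -0.05, 0.00],
                  [0.00, 0.00, 0.01]])   # +10 cm, -5 cm, +1 cm (sub-threshold)
gain, loss = surface_change(before, after, cell_area=0.25)   # 0.5 m grid cells
print(gain, loss)  # -> 0.025 0.0125
```

The GPS-fixed temporary marks in the study serve exactly to make the two DEMs co-registered, which is the precondition for this kind of cell-by-cell differencing.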
Antonelli, Giorgia; Padoan, Andrea; Aita, Ada; Sciacovelli, Laura; Plebani, Mario
2017-08-28
Background: The International Standard ISO 15189 is recognized as a valuable guide to ensuring high-quality clinical laboratory services and promoting the harmonization of accreditation programmes in laboratory medicine. Examination procedures must be verified in order to guarantee that their performance characteristics are congruent with the intended scope of the test. The aim of the present study was to propose a practical model for implementing procedures employed for the verification of validated examination procedures already used for at least 2 years in our laboratory, in agreement with the ISO 15189 requirement in Section 5.5.1.2. Methods: In order to identify the operative procedure to be used, approved guidance documents were identified and the performance characteristics to be evaluated for the different methods were defined; the examination procedures used in the laboratory were analyzed and checked against the performance specifications reported by the manufacturers. Operative flow charts were then devised to compare the laboratory's performance characteristics with those declared by the manufacturers. Results: The choice of performance characteristics for verification was based on the approved documents used as guidance and on the specific purpose of the tests undertaken, considering: imprecision and trueness for quantitative methods; diagnostic accuracy for qualitative methods; and imprecision together with diagnostic accuracy for semi-quantitative methods. Conclusions: The described approach, which balances technological possibilities, risks and costs while assuring the fundamental component of result accuracy, appears promising as an easily applicable and flexible procedure that helps laboratories comply with the ISO 15189 requirements.
Detached Eddy Simulation of the UH-60 Rotor Wake Using Adaptive Mesh Refinement
NASA Technical Reports Server (NTRS)
Chaderjian, Neal M.; Ahmad, Jasim U.
2012-01-01
Time-dependent Navier-Stokes flow simulations have been carried out for a UH-60 rotor with a simplified hub in forward-flight and hover conditions. Flexible rotor blades and flight-trim conditions are modeled and established by loosely coupling the OVERFLOW Computational Fluid Dynamics (CFD) code with the CAMRAD II helicopter comprehensive code. High-order spatial differencing, Adaptive Mesh Refinement (AMR), and Detached Eddy Simulation (DES) are used to obtain highly resolved vortex wakes in which the largest turbulent structures are captured. Special attention is directed towards ensuring that the dual-time-stepping accuracy is within the asymptotic range, and towards verifying the loose-coupling convergence process under AMR. The AMR/DES simulation produced vortical worms for forward-flight and hover conditions, similar to previous results obtained for the TRAM rotor in hover. AMR proved to be an efficient means of capturing a rotor wake without a priori knowledge of the wake shape.
Detection of vancomycin resistances in enterococci within 3 1/2 hours
NASA Astrophysics Data System (ADS)
Schröder, U. -Ch.; Beleites, C.; Assmann, C.; Glaser, U.; Hübner, U.; Pfister, W.; Fritzsche, W.; Popp, J.; Neugebauer, U.
2015-02-01
Vancomycin-resistant enterococci (VRE) constitute a challenging problem in health care institutions worldwide. Novel methods to identify resistance rapidly are urgently needed to ensure an early start of tailored therapy and to prevent further spread of the bacteria. Here, a spectroscopy-based rapid test is presented that reveals resistance of enterococci to vancomycin within 3.5 hours. Without any specific knowledge of the strain, VRE can be recognized with high accuracy in two different enterococci species. By means of dielectrophoresis, bacteria are captured directly from dilute suspensions, making sample preparation very easy. Raman spectroscopic analysis of the trapped bacteria over a time span of two hours in the absence and presence of antibiotics reveals characteristic differences in the molecular response of sensitive and resistant Enterococcus faecalis and Enterococcus faecium. Furthermore, the spectroscopic fingerprints provide an indication of the mechanisms of induced resistance in VRE.
Langley 8-foot high-temperature tunnel oxygen measurement system
NASA Technical Reports Server (NTRS)
Sprinkle, Danny R.; Chen, Tony D.; Chaturvedi, Sushil K.
1991-01-01
To ensure the proper amount of oxygen for sustaining test-engine operation during hypersonic propulsion testing at the NASA Langley 8-foot high-temperature tunnel, a fast-response, real-time system for measuring test-section oxygen concentration has been designed and tested at Langley. It is built around a zirconium oxide-based sensor that develops a voltage proportional to the oxygen partial pressure of the test gas. The voltage signal is used to control the amount of oxygen being injected into the combustor air. The physical operation of the oxygen sensor is described, as well as the sampling system used to extract the test gas from the tunnel test section. Results of laboratory tests conducted to verify sensor accuracy and response-time performance are discussed, as well as the final configuration of the system to be installed in the tunnel.
CARMENES science preparation. High-resolution spectroscopy of M dwarfs
NASA Astrophysics Data System (ADS)
Montes, D.; Caballero, J. A.; Jeffers, S.; Alonso-Floriano, F. J.; Mundt, R.; CARMENES Consortium
2015-05-01
To ensure an efficient use of CARMENES observing time, and the highest chances of success, it is necessary first to select the most promising targets. To achieve this, we are observing 500 M dwarfs at high resolution (R = 30,000-48,000), from which we determine the projected rotational velocity v sin i with an accuracy better than 0.5-0.2 km/s and radial-velocity stability better than 0.2-0.1 km/s. Our aim is to have at least two spectra at different epochs of the final 300 CARMENES targets. Our observations with FEROS at the ESO/MPG 2.2 m at La Silla, CAFE at the 2.2 m at Calar Alto and HRS at the Hobby-Eberly Telescope allow us to identify single- and double-lined spectroscopic binaries and, especially, fast rotators, which should be discarded from the target list for exoplanet searches. Here we present preliminary results.
CARMENES at PPVI. High-Resolution Spectroscopy of M Dwarfs with FEROS, CAFE and HRS
NASA Astrophysics Data System (ADS)
Alonso-Floriano, F. J.; Montes, D.; Jeffers, S.; Caballero, J. A.; Zechmeister, M.; Mundt, R.; Reiners, A.; Amado, P. J.; Casal, E.; Cortés-Contreras, M.; Modroño, Z.; Ribas, I.; Rodríguez-López, C.; Quirrenbach, A.
2013-07-01
To ensure an efficient use of CARMENES observing time, and the highest chances of success, it is necessary first to select the most promising targets. To achieve this, we are observing ~500 M dwarfs at high resolution (R = 30,000-48,000), from which we determine the projected rotational velocity v sin i with an accuracy better than 0.5-0.2 km/s and radial-velocity stability better than 0.2-0.1 km/s. Our aim is to have at least two spectra at different epochs of the final 300 CARMENES targets. Our observations with FEROS at the ESO/MPG 2.2 m at La Silla, CAFE at the 2.2 m at Calar Alto and HRS at the Hobby-Eberly Telescope allow us to identify single- and double-lined spectroscopic binaries and, especially, fast rotators, which should be discarded from the target list for exoplanet searches. Here we present preliminary results.
Using machine learning to accelerate sampling-based inversion
NASA Astrophysics Data System (ADS)
Valentine, A. P.; Sambridge, M.
2017-12-01
In most cases, a complete solution to a geophysical inverse problem (including robust understanding of the uncertainties associated with the result) requires a sampling-based approach. However, the computational burden is high, and proves intractable for many problems of interest. There is therefore considerable value in developing techniques that can accelerate sampling procedures. The main computational cost lies in evaluation of the forward operator (e.g. calculation of synthetic seismograms) for each candidate model. Modern machine learning techniques, such as Gaussian Processes, offer a route for constructing a computationally cheap approximation to this calculation, which can replace the accurate solution during sampling. Importantly, the accuracy of the approximation can be refined as inversion proceeds, to ensure high-quality results. In this presentation, we describe and demonstrate this approach, which can be seen as an extension of popular current methods, such as the Neighbourhood Algorithm, and which bridges the gap between prior- and posterior-sampling frameworks.
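The core idea, a Gaussian-process surrogate standing in for the expensive forward operator, can be sketched in plain numpy. The toy forward operator and all parameters here are illustrative stand-ins, not the authors' implementation:

```python
import numpy as np

def rbf_kernel(a, b, length=0.3):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_surrogate(x_train, y_train, x_query, jitter=1e-6):
    """Posterior mean and variance of a plain Gaussian-process surrogate,
    conditioned on exact (expensive) forward-model evaluations."""
    K = rbf_kernel(x_train, x_train) + jitter * np.eye(len(x_train))
    Ks = rbf_kernel(x_query, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, var

# Stand-in for an expensive forward operator (e.g. a synthetic-seismogram solve).
forward = lambda m: np.sin(3.0 * m) + 0.5 * m

models = np.linspace(0.0, 2.0, 15)        # candidate models already evaluated
queries = np.array([0.8, 1.3])
surrogate_mean, surrogate_var = gp_surrogate(models, forward(models), queries)
# The cheap surrogate tracks the exact operator between training models, and
# its posterior variance flags where further exact evaluations are needed,
# which is how the approximation can be refined as inversion proceeds.
print(np.abs(surrogate_mean - forward(queries)).max() < 0.05)
```

During sampling, candidate models with high surrogate variance would be sent to the exact solver and added to the training set, tightening the approximation exactly where the sampler explores.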
Panuwet, Parinya; Hunter, Ronald E.; D’Souza, Priya E.; Chen, Xianyu; Radford, Samantha A.; Cohen, Jordan R.; Marder, M. Elizabeth; Kartavenka, Kostya; Ryan, P. Barry; Barr, Dana Boyd
2015-01-01
In biomonitoring, the ability to quantify levels of target analytes in biological samples accurately and precisely requires the use of highly sensitive and selective instrumentation, such as tandem mass spectrometers, and a thorough understanding of highly variable matrix effects. Typically, matrix effects are caused by co-eluting matrix components that alter the ionization of target analytes as well as their chromatographic response, leading to reduced or increased sensitivity of the analysis. Thus, before the desired accuracy and precision standards of laboratory data are achieved, these effects must be characterized and controlled. Here we present our review and observations of matrix effects encountered during the validation and implementation of tandem mass spectrometry-based analytical methods. We also provide systematic, comprehensive laboratory strategies needed to control the challenges posed by matrix effects in order to ensure delivery of the most accurate data for biomonitoring studies assessing exposure to environmental toxicants. PMID:25562585
Wireless GPS-based phase-locked synchronization system for outdoor environment.
Meyer, Frédéric; Bahr, Alexander; Lochmatter, Thomas; Borrani, Fabio
2012-01-03
Synchronization of data coming from different sources is of high importance in biomechanics to ensure reliable analyses. This synchronization can either be performed through hardware to obtain perfect matching of data, or post-processed digitally. Hardware synchronization can be achieved using trigger cables connecting different devices in many situations; however, this is often impractical, and sometimes impossible, outdoors. The aim of this paper is to describe a wireless system for outdoor use, allowing synchronization of different types of devices, including embedded and moving ones. In this system, each synchronization device is composed of (i) a GPS receiver (used as the time reference), (ii) a radio transmitter, and (iii) a microcontroller. These components are used to provide synchronized trigger signals, at the desired frequency, to the connected measurement device. The synchronization devices communicate wirelessly, are very lightweight and battery-operated, and are thus very easy to set up. They are adaptable to any measurement device equipped with either a trigger input or a recording channel. The accuracy of the system was validated using an oscilloscope. The mean synchronization error was found to be 0.39 μs, and pulses are generated with an accuracy of <2 μs. The system provides synchronization accuracy about two orders of magnitude better than commonly used post-processing methods, and does not suffer from any drift in trigger generation. Copyright © 2011 Elsevier Ltd. All rights reserved.
Accurate Radiometry from Space: An Essential Tool for Climate Studies
NASA Technical Reports Server (NTRS)
Fox, Nigel; Kaiser-Weiss, Andrea; Schmutz, Werner; Thome, Kurtis; Young, Dave; Wielicki, Bruce; Winkler, Rainer; Woolliams, Emma
2011-01-01
The Earth's climate is undoubtedly changing; however, the time scale, consequences and causal attribution remain the subject of significant debate and uncertainty. Detection of subtle indicators from a background of natural variability requires measurements over a time base of decades. This places severe demands on the instrumentation used, requiring measurements of sufficient accuracy and sensitivity to allow reliable judgements to be made decades apart. The International System of Units (SI) and the network of National Metrology Institutes were developed to address such requirements. However, ensuring and maintaining SI traceability of sufficient accuracy in instruments orbiting the Earth presents a significant new challenge to the metrology community. This paper highlights some key measurands and applications driving the uncertainty demands of the climate community in the solar-reflective domain, e.g. solar irradiances and reflectances/radiances of the Earth. It discusses how meeting these uncertainties facilitates significant improvement in the forecasting abilities of climate models. After discussing the current state of the art, it describes a new satellite mission, called TRUTHS, which enables, for the first time, high-accuracy SI traceability to be established in orbit. The direct use of a primary standard and replication of the terrestrial traceability chain extends the SI into space, in effect realizing a 'metrology laboratory in space'. Keywords: climate change; Earth observation; satellites; radiometry; solar irradiance
Weather forecasting based on hybrid neural model
NASA Astrophysics Data System (ADS)
Saba, Tanzila; Rehman, Amjad; AlGhamdi, Jarallah S.
2017-11-01
Making deductions and predictions about the weather has been a challenge throughout human history. Accurate meteorological guidance helps to foresee and handle problems well in time. Various strategies based on machine learning techniques have been investigated in reported forecasting systems. The present research treats weather forecasting as a major challenge for machine learning and inference. Accordingly, this paper presents a hybrid neural model (MLP and RBF) to enhance the accuracy of weather forecasting, designed to deliver precise forecasts despite the peculiarities of weather-forecasting systems. The study concentrates on data representing the weather of Saudi Arabia. The main input features employed to train the individual and hybrid neural networks include average dew point, minimum temperature, maximum temperature, mean temperature, average relative humidity, precipitation, average wind speed, maximum wind speed and average cloudiness. The output layer is composed of two neurons representing rainy and dry weather. A trial-and-error approach is adopted to select an appropriate number of inputs to the hybrid neural network. Correlation coefficient, RMSE and scatter index are the standard yardsticks adopted for measuring forecast accuracy. Individually, the MLP's forecasting results are better than the RBF's; however, the proposed simplified hybrid neural model achieves better forecasting accuracy than both individual networks. Additionally, the results are better than those reported in the state of the art, using a simple neural structure that reduces training time and complexity.
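The three yardsticks the abstract names are standard and easy to state precisely. A sketch with invented observation/forecast pairs (the scatter-index definition, RMSE normalised by the observed mean, is the common convention; the paper may differ):

```python
import numpy as np

def forecast_scores(obs, pred):
    """Pearson correlation, RMSE, and scatter index (RMSE normalised by the
    mean of the observations)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    r = np.corrcoef(obs, pred)[0, 1]
    rmse = float(np.sqrt(np.mean((pred - obs) ** 2)))
    return r, rmse, rmse / obs.mean()

# Illustrative observed vs. forecast mean temperatures (deg C), not study data.
obs  = [20.0, 22.0, 25.0, 19.0, 30.0]
pred = [21.0, 21.5, 24.0, 20.0, 29.0]
r, rmse, si = forecast_scores(obs, pred)
print(round(r, 3), round(rmse, 3), round(si, 3))  # -> 0.989 0.922 0.04
```

A dimensionless scatter index allows forecast skill to be compared across variables with very different magnitudes, which is presumably why it accompanies the raw RMSE.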
High-accuracy 3-D modeling of cultural heritage: the digitizing of Donatello's "Maddalena".
Guidi, Gabriele; Beraldin, J Angelo; Atzeni, Carlo
2004-03-01
Three-dimensional digital modeling of heritage works of art with optical scanners has produced results of exceptional interest in recent years. However, the routine application of three-dimensional (3-D) modeling to heritage conservation still requires the systematic investigation of a number of technical problems. This paper describes the acquisition of the 3-D digital model of the Maddalena by Donatello, a wooden statue that is one of the major masterpieces of the Italian Renaissance, which was swept away by the Florence flood of 1966 and subsequently restored. The paper reports all the steps of the acquisition procedure, from project planning to the solution of the various problems arising from range-camera calibration and from non-optically-cooperative material. Since the scientific focus is on the overall dimensional accuracy of the 3-D model, a methodology for its quality control is described. This control demonstrated that, in some situations, ICP-based alignment can lead to incorrect results. To circumvent this difficulty, we propose an alignment technique based on the fusion of ICP with close-range digital photogrammetry, a non-invasive procedure that generates a final accurate model. Finally, detailed results are presented, demonstrating the improvement of the final model and how the proposed sensor fusion ensures a pre-specified level of accuracy.
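At the heart of the ICP alignment the abstract discusses is the closed-form least-squares rigid transform between corresponded point sets (the SVD-based Kabsch solution). The sketch below shows that single inner step on synthetic data with known correspondences; a full ICP would re-estimate correspondences and iterate, and the paper's photogrammetric fusion is not reproduced here:

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Least-squares rotation R and translation t mapping point set P onto
    its corresponding points Q (one ICP alignment step, via SVD)."""
    cP, cQ = P.mean(0), Q.mean(0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cQ - R @ cP

# Recover a known rotation (10 degrees about z) and translation.
a = np.deg2rad(10.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 1.0])
P = np.random.default_rng(2).random((30, 3))
Q = P @ R_true.T + t_true
R, t = best_rigid_transform(P, Q)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # -> True True
```

When correspondences are wrong or the geometry is ambiguous, this least-squares step converges to a locally optimal but globally incorrect pose, which is exactly the failure mode the paper's photogrammetric constraints are meant to catch.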
Bay, Christiane; Kristensen, Peter Lommer; Pedersen-Bjergaard, Ulrik; Tarnow, Lise; Thorsteinsson, Birger
2013-05-01
A reliable method to detect biochemical nocturnal hypoglycemia is highly needed, especially in patients with recurrent severe hypoglycemia. We evaluated the reliability of nocturnal continuous glucose monitoring (CGM) in patients with type 1 diabetes at high risk of severe hypoglycemia. Seventy-two type 1 diabetes patients with recurrent severe hypoglycemia (two or more events within the last year) participated for 4 nights in blinded CGM recordings (Guardian® REAL-Time CGMS and Sof-Sensor®; Medtronic MiniMed, Northridge, CA). Blood was drawn hourly from 23:00 to 07:00 h for plasma glucose (PG) measurements (gold standard). Valid data were obtained in 217 nights. The sensitivity of CGM was 65% (95% confidence interval, 53-77%) below 4 mmol/L, 40% (24-56%) below 3 mmol/L, and 17% (0-47%) below 2.2 mmol/L. PG and CGM readings correlated in the total measurement range (Spearman's ρ=0.82; P<0.001). In the normo- and hyperglycemic ranges CGM underestimated PG by 1.1 mmol/L (0.9-1.2 mmol/L) (P<0.001); in contrast, in the hypoglycemic range (PG<4 mmol/L) CGM overestimated PG levels by 1.0 mmol/L (P<0.001). The mean absolute relative differences in the hypo- (≤3.9 mmol/L), normo- (4-9.9 mmol/L), and hyperglycemic (≥10 mmol/L) ranges were 45% (37-53%), 23% (22-25%), and 20% (19-21%), respectively. Continuous glucose error grid analysis indicated a clinical accuracy of 56%, 99%, and 93% in the hypo-, normo-, and hyperglycemic ranges, respectively. The accuracy in the hypoglycemic range of nocturnal CGM data using the Sof-Sensor is suboptimal in type 1 diabetes patients at high risk of severe hypoglycemia. To ensure clinically useful sensitivity in the detection of nocturnal hypoglycemic episodes, an alarm threshold should not be lower than 4 mmol/L.
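The two headline statistics, sensitivity below a glucose threshold and mean absolute relative difference (MARD), can be computed from paired sensor/reference readings as sketched below. The paired values are invented for illustration; they are not data from the study:

```python
import numpy as np

def hypo_sensitivity(pg, cgm, threshold=4.0):
    """Fraction of reference plasma-glucose values below the threshold (mmol/L)
    that the sensor also reported as below it."""
    pg, cgm = np.asarray(pg, float), np.asarray(cgm, float)
    return float(np.mean(cgm[pg < threshold] < threshold))

def mard(pg, cgm):
    """Mean absolute relative difference between sensor and reference, in %."""
    pg, cgm = np.asarray(pg, float), np.asarray(cgm, float)
    return float(100.0 * np.mean(np.abs(cgm - pg) / pg))

# Illustrative paired readings (mmol/L); these are not data from the study.
pg  = [3.0, 3.5, 3.8, 5.0, 7.2, 10.5]
cgm = [4.1, 3.4, 4.2, 4.1, 6.0, 10.0]
print(round(hypo_sensitivity(pg, cgm), 3), round(mard(pg, cgm), 1))  # -> 0.333 14.9
```

Note how the toy example reproduces the study's qualitative finding: relative errors are largest exactly where the reference values are smallest, so MARD computed only over the hypoglycemic range would be far worse than the overall figure.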
Fassett, J D; MacDonald, B S
2001-08-01
The National Institute of Standards and Technology (NIST) has had a major quality-assurance role in the federal effort to reduce lead poisoning of children in the United States through its mission of ensuring the accuracy of chemical measurements. NIST certifies reference materials (standard reference materials--SRMs) that are used to benchmark measurements by secondary and field methods of analysis--to ensure that decisions of great health and economic impact are soundly based on good measurement science. Over the past 10 years, in cooperation with the US Environmental Protection Agency (EPA), US Department of Housing and Urban Development (HUD), and the United States Geological Survey (USGS), NIST has prepared and certified SRMs for lead content in soil, indoor dust, and paint. The role of these materials in meeting regulatory and abatement needs is described and their certified values are summarized.
Characterizing DebriSat Fragments: So Many Fragments, So Much Data, and So Little Time
NASA Technical Reports Server (NTRS)
Shiotani, B.; Rivero, M.; Carrasquilla, M.; Allen, S.; Fitz-Coy, N.; Liou, J.-C.; Huynh, T.; Sorge, M.; Cowardin, H.; Opiela, J.
2017-01-01
To improve prediction accuracy, the DebriSat project was conceived by NASA and DoD to update existing standard break-up models. Updating these models requires detailed fragment characteristics such as physical size, material properties, bulk density, and ballistic coefficient. For the DebriSat project, a representative modern LEO spacecraft was developed and subjected to a laboratory hypervelocity impact test, and all generated fragments with at least one dimension greater than 2 mm are collected, characterized and archived. Since the beginning of the characterization phase, over 130,000 fragments have been collected, and approximately 250,000 fragments are expected in total, a three-fold increase over the 85,000 fragments predicted by the current break-up model. The challenge throughout the project has been to ensure the integrity and accuracy of the characteristics of each fragment. To this end, the post-impact activities, which include fragment collection, extraction, and characterization, have been designed to minimize handling of the fragments and to maintain their post-impact state, thus ensuring the integrity and accuracy of the characterization data. Each process is designed to expedite the accumulation of data; however, the need for speed is restrained by the need to protect the fragments. Methods to expedite the work, such as parallel processing, have been explored and implemented while maintaining the integrity and value of the data. To minimize fragment handling, and to reduce errors due to human input, automated systems have been developed and implemented.
This paper discusses the processes and challenges involved in the collection, extraction, and characterization of the fragments as well as the time required to complete the processes. The objective is to provide the orbital debris community an understanding of the scale of the effort required to generate and archive high quality data and metadata for each debris fragment 2 mm or larger generated by the DebriSat project.
Accuracy of clinical coding for procedures in oral and maxillofacial surgery.
Khurram, S A; Warner, C; Henry, A M; Kumar, A; Mohammed-Ali, R I
2016-10-01
Clinical coding has important financial implications, and discrepancies in the assigned codes can directly affect the funding of a department and hospital. Over the last few years, numerous oversights have been noticed in the coding of oral and maxillofacial (OMF) procedures. To establish the accuracy and completeness of coding, we retrospectively analysed the records of patients during two time periods: March to May 2009 (324 patients) and January to March 2014 (200 patients). Two investigators independently collected and analysed the data to ensure accuracy and remove bias. A large proportion of operations were not assigned all the relevant codes, and only 32%-33% were correct in both cycles. To our knowledge, this is the first reported audit of clinical coding in OMFS, and it highlights serious shortcomings that have substantial financial implications. Better input by the surgical team and improved communication between the surgical and coding departments would improve accuracy. Copyright © 2016 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Sommella, Eduardo; Pepe, Giacomo; Pagano, Francesco; Tenore, Gian Carlo; Dugo, Paola; Manfra, Michele; Campiglia, Pietro
2013-10-01
We have developed a fast ultra-HPLC method with ion-trap TOF-MS for the analysis of flavonoids in Citrus bergamia juice. Compared with typical methods for the analysis of these matrices based on conventional HPLC techniques, a tenfold faster separation was attained. The use of a core-shell particle column ensured high resolution within a fast analysis time of only 5 min. Unambiguous determination of flavonoid identity was obtained by employing a hybrid ion-trap TOF mass spectrometer with high mass accuracy (average error 1.69 ppm). The system showed good retention-time and peak-area repeatability, with maximum RSD% values of 0.36 and 3.86, respectively, as well as good linearity (R² ≥ 0.99). Our results show that ultra-HPLC can be a useful tool for ultra-fast qualitative/quantitative analysis of flavonoid compounds in citrus fruit juices. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
High-precision radius automatic measurement using laser differential confocal technology
NASA Astrophysics Data System (ADS)
Jiang, Hongwei; Zhao, Weiqian; Yang, Jiamiao; Guo, Yongkui; Xiao, Yang
2015-02-01
A high-precision automatic radius measurement method using laser differential confocal technology is proposed. Based on the bipolar axial intensity response, whose null point precisely corresponds to the focus of the objective, the method uses composite PID (proportional-integral-derivative) control to ensure steady motor movement during quick-trigger scanning, and least-squares linear fitting to locate the cat-eye and confocal positions, from which the radius of curvature of the lens is calculated. By setting the number of measurement repetitions, precise automatic repeated measurement of the radius of curvature is achieved. Experiments indicate that the method achieves a measurement accuracy of better than 2 ppm and a repeatability of better than 0.05 μm. In comparison with existing manual single measurements, this method offers high measurement precision, strong immunity to environmental interference, and a repeatability that is only a tenth of the former's.
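The fitting step the abstract describes, a least-squares line through the near-linear part of the bipolar axial response whose root gives each focus position, can be sketched as follows. The stage positions, slope and null locations are invented for illustration:

```python
import numpy as np

def null_position(z, signal):
    """Least-squares line fit over the near-linear segment of the differential
    confocal axial response; the root of the fitted line is the focus position."""
    k, b = np.polyfit(z, signal, 1)
    return -b / k

# Synthetic axial scans (stage positions in mm; slope and nulls are invented).
z_cat = np.linspace(9.9, 10.1, 21)
s_cat = 3.0 * (z_cat - 10.0123)            # null at the cat-eye position
z_con = np.linspace(59.9, 60.1, 21)
s_con = 3.0 * (z_con - 60.0461)            # null at the confocal position

# Radius of curvature = distance between the two null positions.
radius = null_position(z_con, s_con) - null_position(z_cat, s_cat)
print(round(radius, 4))  # -> 50.0338
```

Because the fit uses many sampled points on either side of the zero crossing, photodetector noise averages out, which is one reason this kind of differential null detection can reach sub-micrometre repeatability.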
Food and Feed Safety Assessment: The Importance of Proper Sampling.
Kuiper, Harry A; Paoletti, Claudia
2015-03-24
The general principles for safety and nutritional evaluation of foods and feed and the potential health risks associated with hazardous compounds are described as developed by the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) and further elaborated in the European Union-funded project Safe Foods. We underline the crucial role of sampling in foods/feed safety assessment. High quality sampling should always be applied to ensure the use of adequate and representative samples as test materials for hazard identification, toxicological and nutritional characterization of identified hazards, as well as for estimating quantitative and reliable exposure levels of foods/feed or related compounds of concern for humans and animals. The importance of representative sampling is emphasized through examples of risk analyses in different areas of foods/feed production. The Theory of Sampling (TOS) is recognized as the only framework within which to ensure accuracy and precision of all sampling steps involved in the field-to-fork continuum, which is crucial to monitor foods and feed safety. Therefore, TOS must be integrated in the well-established FAO/WHO risk assessment approach in order to guarantee a transparent and correct frame for the risk assessment and decision making process.
Lahiri, A; Roy, Abhijit Guha; Sheet, Debdoot; Biswas, Prabir Kumar
2016-08-01
Automated segmentation of retinal blood vessels in label-free fundus images plays a pivotal role in computer-aided diagnosis of ophthalmic pathologies, viz. diabetic retinopathy, hypertensive disorders and cardiovascular diseases. The challenge remains active in medical image analysis research owing to the varied distribution of blood vessels, which vary in physical appearance and dimension against a noisy background. In this paper we formulate the segmentation challenge as a classification task. Specifically, we employ unsupervised hierarchical feature learning using an ensemble of two levels of sparsely trained denoising stacked autoencoders. First-level training with bootstrap samples ensures decoupling, and the second-level ensemble, formed from different network architectures, ensures architectural variation. We show that ensemble training of autoencoders fosters diversity in the learned dictionary of visual kernels for vessel segmentation. A SoftMax classifier is used for fine-tuning each member autoencoder, and multiple strategies are explored for two-level fusion of ensemble members. On the DRIVE dataset, we achieve a maximum average accuracy of 95.33% with an impressively low standard deviation of 0.003 and a Kappa agreement coefficient of 0.708. Comparison with other major algorithms substantiates the high efficacy of our model.
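The building block of the ensemble is a denoising autoencoder: a network trained to reconstruct clean input from a corrupted copy, so that its hidden units learn robust visual features. A minimal single-layer member with tied weights and masking noise, on toy data rather than real fundus patches, can be written as:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 25))                 # toy stand-in for 5x5 fundus patches

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
W = rng.normal(0.0, 0.1, (25, 12))        # tied encoder/decoder weights
bh, bo = np.zeros(12), np.zeros(25)

def recon_error(X):
    """Mean squared reconstruction error on clean input."""
    return np.mean((sigmoid(sigmoid(X @ W + bh) @ W.T + bo) - X) ** 2)

before = recon_error(X)
for _ in range(400):                      # plain full-batch gradient descent
    Xn = X * (rng.random(X.shape) > 0.3)  # 30% masking corruption
    H = sigmoid(Xn @ W + bh)              # encode the corrupted patch
    Y = sigmoid(H @ W.T + bo)             # decode; target is the clean patch
    gY = 2.0 * (Y - X) / X.size * Y * (1 - Y)   # grad at output pre-activation
    gH = gY @ W * H * (1 - H)                   # grad at hidden pre-activation
    W -= (H.T @ gY).T + Xn.T @ gH               # tied-weight gradient, lr = 1
    bo -= gY.sum(0)
    bh -= gH.sum(0)

print(recon_error(X) < before)  # training reduces reconstruction error
```

The paper stacks such layers, trains each member on a different bootstrap sample (first level) and with different architectures (second level), then fuses the members' SoftMax outputs; none of those steps are reproduced in this sketch.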
Investigation of varying gray scale levels for remote manipulation
NASA Technical Reports Server (NTRS)
Bierschwale, John M.; Stuart, Mark A.; Sampaio, Carlos E.
1991-01-01
A study was conducted to investigate the effects of varying monitor gray-scale levels and workplace illumination levels on operators' ability to discriminate between different colors on a monochrome monitor. It was determined that 8-gray-scale viewing resulted in significantly worse discrimination performance than 16- and 32-gray-scale viewing, and that there was only a negligible difference between 16 and 32 shades of gray. Therefore, it is recommended that monitors used for remote manipulation tasks have 16 or more shades of gray, since this evaluation found lower levels to be unacceptable for color discrimination tasks. There was no significant performance difference found between high and low workplace illumination conditions. Further analysis was conducted to determine which specific combinations of colors can be used in conjunction with each other to ensure error-free color coding/brightness discrimination performance while viewing a monochrome monitor. It was found that 92 three-color combinations and 9 four-color combinations could be used with 100 percent accuracy. The results can help to determine which gray-scale levels should be provided on monochrome monitors, as well as which colors to use to ensure maximal performance of remotely viewed color discrimination/coding tasks.
Abdelkarim, Noha; Mohamed, Amr E; El-Garhy, Ahmed M; Dorrah, Hassen T
2016-01-01
The two-coupled distillation column process is a physically complicated system in many aspects. Specifically, the nested interrelationship between system inputs and outputs constitutes one of the significant challenges in system control design. Typically, such a process is decoupled into several input/output pairings (loops), so that a single controller can be assigned to each loop. In this research, the Brain Emotional Learning Based Intelligent Controller (BELBIC) forms the control structure for each decoupled loop. The paper's main objective is to develop a parameterization technique for the decoupling and control schemes that ensures robust control behavior. To this end, the novel optimization technique Bacterial Swarm Optimization (BSO) is utilized to minimize the sum of the integral time-weighted squared errors (ITSEs) over all control loops. This optimization technique is a hybrid of the Particle Swarm and Bacterial Foraging algorithms. According to the simulation results, the hybridized technique ensures a low computational burden together with high decoupling and control accuracy. Moreover, the behavior analysis of the proposed BELBIC shows a remarkable improvement in time-domain behavior and robustness over the conventional PID controller.
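To make the cost function concrete: the ITSE of a loop is the integral of t·e(t)², and BSO searches controller parameters to minimize its sum over loops. The sketch below uses only the particle-swarm half of BSO (no bacterial foraging steps) to tune a single proportional gain for a mock first-order plant; the plant, gains, and bounds are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def step_response_error(K, t):
    """Closed-loop error of a toy first-order plant dy/dt = -y + K*(1 - y)
    under proportional gain K and a unit step reference, forward Euler."""
    y = 0.0
    e = np.empty_like(t)
    dt = t[1] - t[0]
    for i in range(len(t)):
        e[i] = 1.0 - y
        y += dt * (-y + K * e[i])
    return e

def itse(t, e):
    """Integral of t * e(t)^2 via the trapezoidal rule."""
    f = t * e ** 2
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t)))

# Minimal particle swarm over the scalar gain K (one loop shown; the
# paper sums ITSE over all decoupled loops and adds bacterial steps).
t = np.linspace(0.0, 10.0, 1001)
pos = rng.uniform(0.5, 20.0, 8)
vel = np.zeros(8)
pbest = pos.copy()
pcost = np.array([itse(t, step_response_error(K, t)) for K in pos])
gbest = pbest[np.argmin(pcost)]
for _ in range(30):
    vel = 0.7 * vel + 1.5 * rng.random(8) * (pbest - pos) \
                    + 1.5 * rng.random(8) * (gbest - pos)
    pos = np.clip(pos + vel, 0.1, 50.0)
    cost = np.array([itse(t, step_response_error(K, t)) for K in pos])
    improved = cost < pcost
    pbest[improved], pcost[improved] = pos[improved], cost[improved]
    gbest = pbest[np.argmin(pcost)]
print(gbest)
```

For this toy plant a higher gain always shrinks the steady-state error, so the swarm drifts toward the upper clip bound; a realistic loop would trade tracking against overshoot and the optimum would be interior.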
NASA Astrophysics Data System (ADS)
Soni, V.; Hadjadj, A.; Roussel, O.
2017-12-01
In this paper, a fully adaptive multiresolution (MR) finite difference scheme with a time-varying tolerance is developed to study compressible fluid flows containing shock waves in interaction with solid obstacles. To ensure adequate resolution near rigid bodies, the MR algorithm is combined with an immersed boundary method based on a direct-forcing approach in which the solid object is represented by a continuous solid-volume fraction. The resulting algorithm forms an efficient tool capable of solving linear and nonlinear waves on arbitrary geometries. For a one-dimensional scalar wave equation, the accuracy of the MR computation is, as expected, seen to decrease in time when a constant MR tolerance is used, owing to the accumulation of error. To overcome this problem, a variable tolerance formulation is proposed and assessed through a new quality criterion, ensuring a time-convergent solution of suitable quality. The newly developed algorithm, coupled with high-resolution spatial and temporal approximations, is successfully applied to shock-bluff body and shock-diffraction problems governed by the Euler and Navier-Stokes equations. Results show excellent agreement with the available numerical and experimental data, demonstrating the efficiency and performance of the proposed method.
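A minimal picture of the time-varying tolerance: in a Haar-type multiresolution analysis, cells whose detail coefficients exceed the tolerance are flagged for refinement, and tightening the tolerance over time counters error accumulation. Everything below (profile, shift, decay law) is an invented one-level toy, not the paper's full MR finite difference scheme.

```python
import numpy as np

def haar_detail(u):
    """One level of Haar multiresolution analysis: coarse means and details."""
    pairs = u.reshape(-1, 2)
    coarse = pairs.mean(axis=1)
    detail = 0.5 * (pairs[:, 0] - pairs[:, 1])
    return coarse, detail

def refine_mask(u, eps):
    """Flag coarse cells whose detail coefficient exceeds the tolerance."""
    _, d = haar_detail(u)
    return np.abs(d) > eps

x = np.linspace(0.0, 1.0, 128, endpoint=False)
u = np.exp(-300.0 * (x - 0.3) ** 2)        # sharp feature standing in for a shock
masks = []
for n in range(5):
    eps_n = 1e-3 / (n + 1)                  # time-varying tolerance: tighten each step
    masks.append(refine_mask(u, eps_n))
    u = np.roll(u, 2)                       # mock transport of the feature
print([int(m.sum()) for m in masks])
```

Only cells near the steep feature are flagged; shrinking eps_n with the step index widens the flagged band, which is the mechanism that keeps the accumulated thresholding error bounded.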
The role of research in the failure of the alcopops excise in Australia: what have we learned?
Shakeshaft, Anthony; Doran, Christopher M; Byrnes, Joshua
2009-08-17
We believe that a lack of adequate alcohol measures research is partly responsible for the failure of the Australian Government to pass legislation to equalise the excise applied to straight spirits and premixed spirits ("alcopops"). Current measures only assess total alcohol consumption rather than patterns of consumption, and do not adequately identify alcohol-related harm at a population level. Possible solutions include making further efforts to develop applied community-level measures and responding to the repeated calls for national collection and analysis of alcohol sales data. With the Australian Government able to retain the alcopops excise raised to date, there is a unique opportunity for greater collaboration between researchers and government to ensure high-quality and publicly relevant research is funded and conducted to address the current lack of adequate measures research. Measures research is a priority, as this is the basis for increasing the accuracy of data with which more cost-effective public policy and initiatives can be formulated and evaluated. The challenge is for researchers and the Australian Government to align their expertise to ensure revenue from public taxes engenders measurable public health benefit.
Mohamed, Amr E.; Dorrah, Hassen T.
2016-01-01
The two-coupled distillation column process is a physically complicated system in many aspects. Specifically, the nested interrelationship between system inputs and outputs constitutes one of the significant challenges in system control design. Typically, such a process is decoupled into several input/output pairings (loops), so that a single controller can be assigned to each loop. In this research, the Brain Emotional Learning Based Intelligent Controller (BELBIC) forms the control structure for each decoupled loop. The paper's main objective is to develop a parameterization technique for the decoupling and control schemes that ensures robust control behavior. To this end, the novel optimization technique Bacterial Swarm Optimization (BSO) is utilized to minimize the sum of the integral time-weighted squared errors (ITSEs) over all control loops. This optimization technique is a hybrid of the Particle Swarm and Bacterial Foraging algorithms. According to the simulation results, the hybridized technique ensures a low computational burden together with high decoupling and control accuracy. Moreover, the behavior analysis of the proposed BELBIC shows a remarkable improvement in time-domain behavior and robustness over the conventional PID controller. PMID:27807444
Food and feed safety assessment: the importance of proper sampling.
Kuiper, Harry A; Paoletti, Claudia
2015-01-01
The general principles for safety and nutritional evaluation of foods and feed and the potential health risks associated with hazardous compounds are described as developed by the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) and further elaborated in the European Union-funded project Safe Foods. We underline the crucial role of sampling in foods/feed safety assessment. High quality sampling should always be applied to ensure the use of adequate and representative samples as test materials for hazard identification, toxicological and nutritional characterization of identified hazards, as well as for estimating quantitative and reliable exposure levels of foods/feed or related compounds of concern for humans and animals. The importance of representative sampling is emphasized through examples of risk analyses in different areas of foods/feed production. The Theory of Sampling (TOS) is recognized as the only framework within which to ensure accuracy and precision of all sampling steps involved in the field-to-fork continuum, which is crucial to monitor foods and feed safety. Therefore, TOS must be integrated in the well-established FAO/WHO risk assessment approach in order to guarantee a transparent and correct frame for the risk assessment and decision making process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ainsworth, Nathan; Hariri, Ali; Prabakar, Kumaraguru
Power hardware-in-the-loop (PHIL) simulation, where actual hardware under test is coupled with a real-time digital model in closed loop, is a powerful tool for analyzing new methods of control for emerging distributed power systems. However, without careful design and compensation of the interface between the simulated and actual systems, PHIL simulations may exhibit instability and modeling inaccuracies. This paper addresses issues that arise in the PHIL simulation of a hardware battery inverter interfaced with a simulated distribution feeder. Both the stability and accuracy issues are modeled and characterized, and a methodology for design of PHIL interface compensation to ensure stability and accuracy is presented. The stability and accuracy of the resulting compensated PHIL simulation is then shown by experiment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prabakar, Kumaraguru; Ainsworth, Nathan; Pratt, Annabelle
Power hardware-in-the-loop (PHIL) simulation, where actual hardware under test is coupled with a real-time digital model in closed loop, is a powerful tool for analyzing new methods of control for emerging distributed power systems. However, without careful design and compensation of the interface between the simulated and actual systems, PHIL simulations may exhibit instability and modeling inaccuracies. This paper addresses issues that arise in the PHIL simulation of a hardware battery inverter interfaced with a simulated distribution feeder. Both the stability and accuracy issues are modeled and characterized, and a methodology for design of PHIL interface compensation to ensure stability and accuracy is presented. The stability and accuracy of the resulting compensated PHIL simulation is then shown by experiment.
Hard Copy to Digital Transfer: 3D Models that Match 2D Maps
ERIC Educational Resources Information Center
Kellie, Andrew C.
2011-01-01
This research describes technical drawing techniques applied in a project involving digitizing of existing hard copy subsurface mapping for the preparation of three dimensional graphic and mathematical models. The intent of this research was to identify work flows that would support the project, ensure the accuracy of the digital data obtained,…
Multi-saline sample distillation apparatus for hydrogen isotope analyses: design and accuracy
Hassan, Afifa Afifi
1981-01-01
A distillation apparatus for saline water samples was designed and tested. Six samples may be distilled simultaneously. The temperature was maintained at 400 °C to ensure complete dehydration of the precipitating salts. Consequently, the error in the measured ratio of stable hydrogen isotopes resulting from incomplete dehydration of hydrated salts during distillation was eliminated. (USGS)
40 CFR 98.344 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... minutes between samples and determine the methane composition of the landfill gas using one of the methods.... ER30OC09.136 Where: CCH4 = Methane concentration in the landfill gas (volume %) for use in Equation HH-4 of... procedures used to ensure the accuracy of the estimates of disposal quantities and, if applicable, gas flow...
40 CFR 98.344 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... minutes between samples and determine the methane composition of the landfill gas using one of the methods.... ER30OC09.136 Where: CCH4 = Methane concentration in the landfill gas (volume %) for use in Equation HH-4 of... procedures used to ensure the accuracy of the estimates of disposal quantities and, if applicable, gas flow...
Preschool Units EMIS Staff Report. EMIS Staff ECE Units 2005. Report Documentation. Version 1.0
ERIC Educational Resources Information Center
Ohio Department of Education, 2004
2004-01-01
The purpose of Preschool Units EMIS Staff Report is twofold. First, it helps School Districts and Educational Service Centers (ESC) ensure accuracy and validity of preschool staff, student and program data submitted to the Ohio Department of Education (ODE) through the Education Management Information System (EMIS). From this report, school…
Evaluation of Diagnostic Systems: The Selection of Students at Risk of Academic Difficulties
ERIC Educational Resources Information Center
Smolkowski, Keith; Cummings, Kelli D.
2015-01-01
Diagnostic tools can help schools more consistently and fairly match instructional resources to the needs of their students. To ensure the best educational outcome for each child, diagnostic decision-making systems seek to balance time, clarity, and accuracy. However, recent research notes that many educational decisions tend to be made using…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-02
... whether ETCs should be required to apply the Lifeline discount on all of their voice and data packages... governmental data sources would both improve the accuracy of eligibility determinations and ensure that only... discount on all of their service plans, including premium plans and packages that contain services other...
ERIC Educational Resources Information Center
Bethune, Keri S.
2017-01-01
Fidelity of implementation of School-Wide Positive Behavioral Interventions and Supports (SWPBIS) procedures within schools is critical to the success of the program. Coaching has been suggested as one approach to helping ensure accuracy of implementation of SWPBIS plans. This study used a multiple baseline across participants design to examine…
Calibration Techniques for Accurate Measurements by Underwater Camera Systems
Shortis, Mark
2015-01-01
Calibration of a camera system is essential to ensure that image measurements result in accurate estimates of locations and dimensions within the object space. In the underwater environment, the calibration must implicitly or explicitly model and compensate for the refractive effects of waterproof housings and the water medium. This paper reviews the different approaches to the calibration of underwater camera systems in theoretical and practical terms. The accuracy, reliability, validation and stability of underwater camera system calibration are also discussed. Samples of results from published reports are provided to demonstrate the range of possible accuracies for the measurements produced by underwater camera systems. PMID:26690172
NASA Astrophysics Data System (ADS)
Wilde, C.; Langehanenberg, P.; Schenk, T.
2017-10-01
For modern production of micro lens systems, such as cementing of doublets or more lenses, precise centering of the lens edge is crucial. Blocking the lens temporarily on a centering arbor ensures that the centers of all optical lens surfaces coincide with the lens edge, while the arbor's axis serves as reference for both the alignment and edging processes. This theoretical assumption of the traditional cementing technology is not applicable to high-end production. In reality, cement wedges between the bottom lens surface and the arbor's ring knife edge may occur, and even expensive arbors with single-micron precision suffer from reduced quality of the ring knife edge after multiple uses and cleaning cycles. Consequently, at least the position of the bottom lens surface is undefined and the optical axis does not coincide with the arbor's reference axis. In order to overcome this basic problem in using centering arbors, we present a novel and efficient technique which can measure and align both surfaces of a lens with respect to the arbor axis with high accuracy, and furthermore align additional lenses to the optical axis of the bottom lens. This is accomplished by aligning the lens without mechanical contact to the arbor. Thus the lens can be positioned in four degrees of freedom, while the centration errors of all lens surfaces are measured and accounted for. Additionally, the arbor's reference axis is not assumed to be aligned with the rotation axis, but is simultaneously measured with high precision.
NASA Astrophysics Data System (ADS)
Wang, Lin; Wu, Wenqi; Wei, Guo; Lian, Junxiang; Yu, Ruihang
2018-05-01
The shipboard redundant rotational inertial navigation system (RINS) configuration, comprising a dual-axis RINS and a single-axis RINS, can satisfy the demand for marine INSs of especially high reliability while achieving a trade-off between position accuracy and cost. Generally, the dual-axis RINS is the master INS, and the single-axis RINS serves as a hot backup for high reliability. An integrity monitoring system performs a fault detection function to ensure sailing safety. However, improving the accuracy of the backup INS in case of master INS failure has not received enough attention. Without the aid of any external information, a systematic bias collaborative measurement method based on an augmented Kalman filter is proposed for the redundant RINSs. Estimates of inertial sensor biases can be used by the built-in integrity monitoring system to monitor the RINS running condition. On the other hand, a position error prediction model is designed for the single-axis RINS to estimate the systematic error caused by its azimuth gyro bias. After position error compensation, the position information provided by the single-axis RINS remains highly accurate even if the integrity monitoring system detects a dual-axis RINS fault. Moreover, use of a grid frame as the navigation frame makes the proposed method applicable in any area, including the polar regions. Semi-physical simulations and experiments, including sea trials, verify the validity of the method.
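The augmented-state idea can be shown with a deliberately tiny Kalman filter: one "position" state plus one constant bias, observed through a redundant pair of outputs (master unit sees x directly, backup sees x plus bias), which is what makes the bias observable. All matrices and noise levels are invented; the paper's RINS error model is far richer.

```python
import numpy as np

rng = np.random.default_rng(2)

# State [x, b]: both modeled as (nearly) constant for this toy.
F = np.eye(2)
H = np.array([[1.0, 0.0],              # master unit:  y1 = x
              [1.0, 1.0]])             # backup unit:  y2 = x + b
Q = np.diag([1e-4, 1e-8])              # small process noise
R = np.diag([0.01, 0.01])              # measurement noise (std 0.1)

true_x, true_b = 3.0, 0.5
xhat = np.zeros(2)
P = np.eye(2)
for _ in range(200):
    # predict
    P = F @ P @ F.T + Q
    # update with the redundant pair of measurements
    y = np.array([true_x, true_x + true_b]) + rng.normal(0.0, 0.1, 2)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    xhat = xhat + K @ (y - H @ xhat)
    P = (np.eye(2) - K @ H) @ P
print(np.round(xhat, 2))
```

The filter separates the common signal from the backup unit's systematic bias; in the paper the same augmentation principle feeds both the integrity monitor and the position-error compensation.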
Enhancing the performance of regional land cover mapping
NASA Astrophysics Data System (ADS)
Wu, Weicheng; Zucca, Claudio; Karam, Fadi; Liu, Guangping
2016-10-01
Different pixel-based, object-based and subpixel-based methods, such as time-series analysis, decision trees, and various supervised approaches, have been proposed to conduct land use/cover classification. However, despite their proven advantages in small dataset tests, their performance is variable and less satisfactory when dealing with large datasets, particularly for regional-scale mapping with high resolution data, due to the complexity and diversity of landscapes and land cover patterns and the unacceptably long processing time. The objective of this paper is to demonstrate the high performance of an operational approach based on the integration of multisource information, ensuring high mapping accuracy in large areas with acceptable processing time. The information used includes phenologically contrasted multiseasonal and multispectral bands, vegetation index, land surface temperature, and topographic features. The performance of different conventional and machine learning classifiers, namely Mahalanobis Distance (MD), Maximum Likelihood (ML), Artificial Neural Networks (ANNs), Support Vector Machines (SVMs) and Random Forests (RFs), was compared using the same datasets in the same IDL (Interactive Data Language) environment. An Eastern Mediterranean area with complex landscape and steep climate gradients was selected to test and develop the operational approach. The results showed that the SVM and RF classifiers produced the most accurate mapping at local scale (up to 96.85% overall accuracy) but were very time-consuming in whole-scene classification (more than five days per scene), whereas ML fulfilled the task rapidly (about 10 min per scene) with satisfactory accuracy (94.2-96.4%). Thus, the approach composed of the integration of seasonally contrasted multisource data and sampling at subclass level, followed by an ML classification, is a suitable candidate to become an operational and effective regional land cover mapping method.
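For reference, the simplest classifier in the comparison, Mahalanobis distance to class means under a shared covariance, can be written in a few lines; the two-band synthetic data below are an invented stand-in for multispectral pixels.

```python
import numpy as np

rng = np.random.default_rng(3)

def fit_md(X, y):
    """Mahalanobis-distance classifier: per-class means, shared covariance."""
    classes = np.unique(y)
    means = {c: X[y == c].mean(axis=0) for c in classes}
    icov = np.linalg.inv(np.cov(X.T))
    return classes, means, icov

def predict_md(X, classes, means, icov):
    # squared Mahalanobis distance of each sample to each class mean
    d = np.stack([np.einsum('ij,jk,ik->i', X - means[c], icov, X - means[c])
                  for c in classes])
    return classes[np.argmin(d, axis=0)]

# Two synthetic "land cover" classes in a 2-band feature space.
X = np.vstack([rng.normal([0.0, 0.0], 0.5, (100, 2)),
               rng.normal([2.0, 2.0], 0.5, (100, 2))])
y = np.repeat([0, 1], 100)
classes, means, icov = fit_md(X, y)
acc = (predict_md(X, classes, means, icov) == y).mean()
print(acc)
```

Such distance-based classifiers are cheap per pixel, which is one reason the simpler ML classifier wins on whole-scene throughput in the study even though SVMs and RFs edge it on accuracy.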
Castell, Nuria; Dauge, Franck R; Schneider, Philipp; Vogt, Matthias; Lerner, Uri; Fishbain, Barak; Broday, David; Bartonova, Alena
2017-02-01
The emergence of low-cost, user-friendly and very compact air pollution platforms enables observations at high spatial resolution in near-real-time, and provides new opportunities to simultaneously enhance existing monitoring systems and engage citizens in active environmental monitoring. This provides a whole new set of capabilities in the assessment of human exposure to air pollution. However, the data generated by these platforms are often of questionable quality. We have conducted an exhaustive evaluation of 24 identical units of a commercial low-cost sensor platform against CEN (European Committee for Standardization) reference analyzers, evaluating their measurement capability over time and a range of environmental conditions. Our results show that performance varies spatially and temporally, as it depends on the atmospheric composition and the meteorological conditions. Performance also varies from unit to unit, which makes it necessary to examine the data quality of each node before its use. In general, guidance is lacking on how to test such sensor nodes and ensure adequate performance prior to marketing these platforms. We have implemented and tested diverse metrics to assess whether a sensor can be employed for applications that require high accuracy (i.e., to meet the Data Quality Objectives defined in air quality legislation, or epidemiological studies) or lower accuracy (i.e., to represent the pollution level on a coarse scale for purposes such as awareness raising). Data quality is a pertinent concern, especially in citizen science applications where citizens collect and interpret the data. In general, while low-cost platforms present low accuracy for regulatory or health purposes, they can provide relative and aggregated information about the observed air quality. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Yang, Bo; Wang, Mi; Xu, Wen; Li, Deren; Gong, Jianya; Pi, Yingdong
2017-12-01
The potential of large-scale block adjustment (BA) without ground control points (GCPs) has long been a concern among photogrammetric researchers, and is of guiding significance for global mapping. However, significant problems with the accuracy and efficiency of this method remain to be solved. In this study, we analyzed the effects of geometric errors on BA and then developed a step-wise BA method for integrated processing of large-scale ZY-3 satellite images without GCPs. We first pre-processed the BA data by adopting a geometric calibration (GC) method based on the viewing-angle model to compensate for systematic errors, so that the BA input images were of good initial geometric quality. The second step was integrated BA without GCPs, in which a series of technical methods were used to solve bottleneck problems and ensure accuracy and efficiency. A BA model based on virtual control points (VCPs) was constructed to address the rank deficiency caused by the lack of absolute constraints. We then developed a parallel matching strategy to improve the efficiency of tie point (TP) matching, and adopted a three-array data structure based on sparsity to relieve the storage and calculation burden of the high-order modified equation. Finally, we used the conjugate gradient method to improve the speed of solving the high-order equations. To evaluate the feasibility of the presented large-scale BA method, we conducted three experiments on real data collected by the ZY-3 satellite. The experimental results indicate that the presented method can effectively improve the geometric accuracy of ZY-3 satellite images. This study demonstrates the feasibility of large-scale mapping without GCPs.
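The final solve can be sketched with a plain conjugate-gradient iteration on a mock normal-equation system (a random Jacobian plus a small invented damping term to make it symmetric positive definite); the paper's sparse three-array storage and the BA-specific structure are omitted here.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    """Plain CG for a symmetric positive-definite system A x = b."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Mock normal equations N = J^T J + damping, as arise in block adjustment.
rng = np.random.default_rng(4)
J = rng.normal(0.0, 1.0, (50, 10))
N = J.T @ J + 1e-3 * np.eye(10)
b = rng.normal(0.0, 1.0, 10)
x = conjugate_gradient(N, b)
print(np.allclose(N @ x, b, atol=1e-6))
```

CG only ever touches the matrix through products A @ p, which is exactly why a sparse (three-array) representation of the high-order normal equations pays off at scale.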
Reliable and valid assessment of point-of-care ultrasonography.
Todsen, Tobias; Tolsgaard, Martin Grønnebæk; Olsen, Beth Härstedt; Henriksen, Birthe Merete; Hillingsø, Jens Georg; Konge, Lars; Jensen, Morten Lind; Ringsted, Charlotte
2015-02-01
To explore the reliability and validity of the Objective Structured Assessment of Ultrasound Skills (OSAUS) scale for point-of-care ultrasonography (POC US) performance. POC US is increasingly used by clinicians and is an essential part of the management of acute surgical conditions. However, the quality of performance is highly operator-dependent. Therefore, reliable and valid assessment of trainees' ultrasonography competence is needed to ensure patient safety. Twenty-four physicians, representing novices, intermediates, and experts in POC US, scanned 4 different surgical patient cases in a controlled set-up. All ultrasound examinations were video-recorded and assessed by 2 blinded radiologists using OSAUS. Reliability was examined using generalizability theory. Construct validity was examined by comparing performance scores between the groups and by correlating physicians' OSAUS scores with diagnostic accuracy. The generalizability coefficient was high (0.81) and a D-study demonstrated that 1 assessor and 5 cases would result in similar reliability. The construct validity of the OSAUS scale was supported by a significant difference in the mean scores between the novice group (17.0; SD 8.4) and the intermediate group (30.0; SD 10.1), P = 0.007, as well as between the intermediate group and the expert group (72.9; SD 4.4), P = 0.04, and by a high correlation between OSAUS scores and diagnostic accuracy (Spearman ρ correlation coefficient = 0.76; P < 0.001). This study demonstrates high reliability as well as evidence of construct validity of the OSAUS scale for assessment of POC US competence. Hence, the OSAUS scale may be suitable for both in-training as well as end-of-training assessment.
Refractive laser beam shaping by means of a functional differential equation based design approach.
Duerr, Fabian; Thienpont, Hugo
2014-04-07
Many laser applications require specific irradiance distributions to ensure optimal performance. Geometric optical design methods based on numerical calculation of two plano-aspheric lenses have been thoroughly studied in the past. In this work, we present an alternative new design approach based on functional differential equations that allows direct calculation of the rotational symmetric lens profiles described by two-point Taylor polynomials. The formalism is used to design a Gaussian to flat-top irradiance beam shaping system but also to generate a more complex dark-hollow Gaussian (donut-like) irradiance distribution with zero intensity in the on-axis region. The presented ray tracing results confirm the high accuracy of both calculated solutions and emphasize the potential of this design approach for refractive beam shaping applications.
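The energy balance underlying Gaussian-to-flat-top shaping fixes the ray mapping directly: each input radius maps to the target radius enclosing the same fraction of energy. The numeric sketch below (arbitrary waist, target radius, and truncation) shows that mapping; the paper's functional-differential approach goes further and yields the two lens profiles themselves as two-point Taylor polynomials.

```python
import numpy as np

# Energy-conserving ray mapping for Gaussian -> flat-top reshaping:
# equal encircled energy defines the target radius for each input ray.
w = 1.0                                          # input beam waist (arbitrary)
R = 2.0                                          # flat-top radius (arbitrary)
r_in = np.linspace(1e-6, 3.0 * w, 500)           # rays, truncated at 3w
enc = 1.0 - np.exp(-2.0 * r_in ** 2 / w ** 2)    # Gaussian encircled energy
total = 1.0 - np.exp(-2.0 * (3.0 * w) ** 2 / w ** 2)
r_out = R * np.sqrt(enc / total)                 # uniform disk: energy ~ r^2
print(float(r_out[-1]))
```

The resulting mapping is strictly monotone, which is the condition that allows the two aspheric surfaces to realize it without ray crossings.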
Position Control of Tendon-Driven Fingers
NASA Technical Reports Server (NTRS)
Abdallah, Muhammad E.; Platt, Robert, Jr.; Hargrave, B.; Pementer, Frank
2011-01-01
Conventionally, tendon-driven manipulators implement some force control scheme based on tension feedback. This feedback allows the system to ensure that the tendons are maintained taut with proper levels of tensioning at all times. Occasionally, whether it is due to the lack of tension feedback or the inability to implement sufficiently high stiffnesses, a position control scheme is needed. This work compares three position controllers for tendon-driven manipulators. A new controller is introduced that achieves the best overall performance with regards to speed, accuracy, and transient behavior. To compensate for the lack of tension feedback, the controller nominally maintains the internal tension on the tendons by implementing a two-tier architecture with a range-space constraint. These control laws are validated experimentally on the Robonaut-2 humanoid hand.
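The "maintain internal tension" step reduces to adding a null-space component to the tension vector, so that every tendon stays above a minimum tension while the joint-torque command is unchanged. The tendon map R, torque values, and minimum tension below are invented for illustration and are not the Robonaut-2 parameters.

```python
import numpy as np

# Joint torque tau = R @ f must be met while every tendon tension f_i
# stays above t_min (taut tendons). The internal-tension direction v
# lies in the null space of R, so adding it leaves tau unchanged.
R = np.array([[1.0, -1.0,  0.5, -0.5],
              [0.5,  0.5, -1.0, -1.0]])    # 2 joints, 4 tendons (made up)
tau = np.array([0.3, -0.2])                # desired joint torques
v = np.array([1.0, 1.0, 0.5, 0.5])         # all-positive null vector: R @ v = 0
t_min = 0.1                                # minimum tendon tension

f_p = np.linalg.pinv(R) @ tau              # minimum-norm particular solution
c = np.max((t_min - f_p) / v)              # smallest scaling keeping all f >= t_min
f = f_p + max(c, 0.0) * v
print(f.min() >= t_min, np.allclose(R @ f, tau))
```

Because v is strictly positive, increasing its coefficient raises every tension at once, so the smallest admissible coefficient pins the slackest tendon exactly at t_min.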
Absorbance and fluorometric sensing with capillary wells microplates.
Tan, Han Yen; Cheong, Brandon Huey-Ping; Neild, Adrian; Liew, Oi Wah; Ng, Tuck Wah
2010-12-01
Detection and readout from small volume assays in microplates are a challenge. The capillary wells microplate approach [Ng et al., Appl. Phys. Lett. 93, 174105 (2008)] offers strong advantages in small liquid volume management. An adapted design is described and shown here to be able to detect, in a nonimaging manner, fluorescence and absorbance assays without the error often associated with the meniscus forming at the air-liquid interface. The presence of bubbles in liquid samples residing in microplate wells can cause inaccuracies. Pipetting errors, if not adequately managed, can result in misleading data and wrong interpretations of assay results, particularly in the context of high-throughput screening. We show that the adapted design is also able to detect bubbles and pipetting errors during actual assay runs to ensure accuracy in screening.
Cross-coupled control for all-terrain rovers.
Reina, Giulio
2013-01-08
Mobile robots are increasingly being used in challenging outdoor environments for applications that include construction, mining, agriculture, military and planetary exploration. In order to accomplish the planned task, it is critical that the motion control system ensure accuracy and robustness. The achievement of high performance on rough terrain is tightly connected with the minimization of vehicle-terrain dynamics effects such as slipping and skidding. This paper presents a cross-coupled controller for a 4-wheel-drive/4-wheel-steer robot, which optimizes the wheel motors' control algorithm to reduce synchronization errors that would otherwise result in wheel slip with conventional controllers. Experimental results, obtained with an all-terrain rover operating on agricultural terrain, are presented to validate the system. It is shown that the proposed approach is effective in reducing slippage and vehicle posture errors.
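The cross-coupling idea in miniature: each wheel loop gets its own tracking term plus a shared correction proportional to the synchronization error between wheels, which independent per-wheel controllers omit. The first-order motor models and all gains below are made up for illustration.

```python
import numpy as np

# Two wheel velocity loops with mismatched motor time constants.
dt, T = 0.001, 2.0
n = int(T / dt)
w = np.zeros(2)                        # wheel speeds
tau = np.array([0.2, 0.3])             # mismatched motor time constants (s)
kp, kc = 5.0, 8.0                      # tracking gain, cross-coupling gain
ref = 1.0                              # common speed reference
sync_err = np.empty(n)
for i in range(n):
    e = ref - w                        # individual tracking errors
    eps = w[0] - w[1]                  # synchronization error
    u = kp * e + kc * np.array([-eps, +eps])   # cross-coupled correction
    w += dt * (u - w) / tau            # first-order motor response, Euler
    sync_err[i] = abs(eps)
print(sync_err.max(), sync_err[-1])
```

With kc = 0 the slower motor lags throughout the transient; the coupling term actively pulls the two speeds together, which is the mechanism that suppresses differential wheel slip on rough terrain.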
PREMIX: PRivacy-preserving EstiMation of Individual admiXture.
Chen, Feng; Dow, Michelle; Ding, Sijie; Lu, Yao; Jiang, Xiaoqian; Tang, Hua; Wang, Shuang
2016-01-01
In this paper we propose a framework, PRivacy-preserving EstiMation of Individual admiXture (PREMIX), using Intel Software Guard Extensions (SGX). SGX is a suite of software and hardware architectures that enable efficient and secure computation over confidential data. PREMIX enables multiple sites to securely collaborate on estimating individual admixture within a secure enclave inside Intel SGX. We implemented a feature selection module to identify the most discriminative Single Nucleotide Polymorphisms (SNPs) based on informativeness, and an Expectation Maximization (EM)-based maximum likelihood estimator to identify individual admixture. Experimental results based on both simulated and 1000 Genomes data demonstrate the efficiency and accuracy of the proposed framework. PREMIX ensures a high level of security, as all operations on sensitive genomic data are conducted within a secure enclave using SGX.
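Stripped of the enclave machinery, the EM estimator can be illustrated for the simplest case: one individual, two reference populations with known per-SNP allele frequencies, and a haploid single-allele model (a simplification of the genotype-level model). All data and frequencies below are simulated.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy admixture: each allele copy originates from population 1 with
# probability q; reference allele frequencies f1, f2 are assumed known.
m = 500
f1 = rng.uniform(0.1, 0.9, m)
f2 = rng.uniform(0.1, 0.9, m)
q_true = 0.7
origin = rng.random(m) < q_true
alleles = (rng.random(m) < np.where(origin, f1, f2)).astype(int)

q = 0.5                                   # initial guess
for _ in range(100):
    # E-step: responsibility that each allele came from population 1
    p1 = np.where(alleles == 1, f1, 1.0 - f1)
    p2 = np.where(alleles == 1, f2, 1.0 - f2)
    r = q * p1 / (q * p1 + (1.0 - q) * p2)
    # M-step: update the admixture proportion
    q = r.mean()
print(q)
```

The informativeness-based SNP selection in the paper corresponds to preferring markers where f1 and f2 differ strongly, since those are the sites that move the responsibilities away from q itself.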
Extended pseudo-screen migration with multiple reference velocities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Lian-Jie; Fehler, M.C.
1997-11-01
The pseudo-screen propagator is a one-way wave propagator based on the local Born approximation. A limitation of the propagator is that the scattered fields are difficult to calculate accurately when the velocity perturbation is large. We develop an extended pseudo-screen propagator by introducing different reference velocities in different regions of a medium to ensure that the small-perturbation condition holds. Exploding-reflector data for a 2D slice of the SEG/EAEG 3D salt model were generated by a finite difference scheme to test the feasibility of the method. The migration result demonstrates that the method can handle severe lateral velocity variations and provides high quality images of complex structures.
Mars Science Laboratory Propulsive Maneuver Design and Execution
NASA Technical Reports Server (NTRS)
Wong, Mau C.; Kangas, Julie A.; Ballard, Christopher G.; Gustafson, Eric D.; Martin-Mur, Tomas J.
2012-01-01
The NASA Mars Science Laboratory (MSL) rover, Curiosity, was launched on November 26, 2011 and successfully landed at the Gale Crater on Mars. For the 8-month interplanetary trajectory from Earth to Mars, five nominal and two contingency trajectory correction maneuvers (TCM) were planned. The goal of these TCMs was to accurately deliver the spacecraft to the desired atmospheric entry aimpoint in Martian atmosphere so as to ensure a high probability of successful landing on the Mars surface. The primary mission requirements on maneuver performance were the total mission propellant usage and the entry flight path angle (EFPA) delivery accuracy. They were comfortably met in this mission. In this paper we will describe the spacecraft propulsion system, TCM constraints and requirements, TCM design processes, and their implementation and verification.
H- photodetachment and radiative attachment for astrophysical applications
NASA Astrophysics Data System (ADS)
McLaughlin, B. M.; Stancil, P. C.; Sadeghpour, H. R.; Forrey, R. C.
2017-06-01
We combine R-matrix calculations, asymptotic relations, and comparison to available experimental data to construct an H- photodetachment cross section reliable over a large range of photon energies and take into account the series of auto-detaching shape and Feshbach resonances between 10.92 and 14.35 eV. The accuracy of the cross section is controlled by ensuring that it satisfies all known oscillator strength sum rules, including contributions from the resonances and single-photon double-electron photodetachment. From the resulting recommended cross section, spontaneous and stimulated radiative attachment rate coefficients are obtained. Photodetachment rates are also computed for the standard interstellar radiation field, in diffuse and dense interstellar clouds, for blackbody radiation, and for high redshift distortion photons in the recombination epoch. Implications are investigated for these astrophysical radiation fields and epochs.
Automated Boundary Conditions for Wind Tunnel Simulations
NASA Technical Reports Server (NTRS)
Carlson, Jan-Renee
2018-01-01
Computational fluid dynamics (CFD) simulations of models tested in wind tunnels require a high level of fidelity and accuracy, particularly for CFD validation efforts. Considerable effort is required to characterize properly both the physical geometry of the wind tunnel and the flow conditions inside it. The typical trial-and-error effort used to determine the boundary condition values for a particular tunnel configuration is time- and computer-resource intensive. This paper describes a method for calculating and updating the back-pressure boundary condition in wind tunnel simulations by using a proportional-integral-derivative (PID) controller. The controller methodology and equations are discussed, and simulations using the controller to set a tunnel Mach number in the NASA Langley 14- by 22-Foot Subsonic Tunnel are demonstrated.
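The PID idea can be sketched as follows: the controller drives a back-pressure value toward a target Mach number. The linear "plant" relating Mach number to back pressure, the gains, and all numbers below are illustrative assumptions, not values from the paper.

```python
# Sketch of a PID loop setting a back-pressure boundary value so that a toy
# tunnel model reaches a target Mach number. Plant, gains, and numbers are
# hypothetical stand-ins for the paper's tunnel simulation.

def pid_step(error, state, kp=0.8, ki=0.2, kd=0.05, dt=1.0):
    """One PID update; state carries (integral, previous error)."""
    integral, prev_error = state
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

def run_controller(mach_target, back_pressure, steps=200):
    state = (0.0, 0.0)
    mach = 0.0
    for _ in range(steps):
        mach = 1.2 - 0.004 * back_pressure   # toy plant: more back pressure, lower Mach
        error = mach_target - mach
        correction, state = pid_step(error, state)
        back_pressure -= correction * 10.0   # negated: raising pressure lowers Mach
    return mach, back_pressure

mach, bp = run_controller(0.8, back_pressure=50.0)
```

The integral term is what removes the steady-state offset that a purely proportional update would leave, which is why the loop settles on the back pressure that exactly meets the Mach target rather than approaching it asymptotically.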
Mc Fadden, Kim; Gillespie, John; Carney, Brian; O'Driscoll, Daniel
2006-07-07
A rapid and selective HPLC method using monolithic columns was developed for the separation and quantification of the principal amphetamines in ecstasy tablets. Three monolithic (Chromolith RP-18e) columns of different lengths (25, 50 and 100 mm) were assessed. Validation studies covering linearity, selectivity, precision, accuracy, and limits of detection and quantification were carried out using the Chromolith SpeedROD RP-18e, 50 mm x 4.6 mm column. Column backpressure and van Deemter plots demonstrated that monolithic columns provide higher efficiency at higher flow rates than particulate columns, without loss of peak resolution. Application of the monolithic column to a large number of ecstasy tablets seized in Ireland confirmed its suitability for the routine analysis of ecstasy tablets.
Rapid and accurate synthesis of TALE genes from synthetic oligonucleotides.
Wang, Fenghua; Zhang, Hefei; Gao, Jingxia; Chen, Fengjiao; Chen, Sijie; Zhang, Cuizhen; Peng, Gang
2016-01-01
Custom synthesis of transcription activator-like effector (TALE) genes has relied upon plasmid libraries of pre-fabricated TALE-repeat monomers or oligomers. Here we describe a novel synthesis method that directly incorporates annealed synthetic oligonucleotides into the TALE-repeat units. Our approach utilizes iterative sets of oligonucleotides and a translational frame check strategy to ensure the high efficiency and accuracy of TALE-gene synthesis. TALE arrays of more than 20 repeats can be constructed, and the majority of the synthesized constructs have perfect sequences. In addition, this novel oligonucleotide-based method can readily accommodate design changes to the TALE repeats. We demonstrated an increased gene targeting efficiency against a genomic site containing a potentially methylated cytosine by incorporating non-conventional repeat variable di-residue (RVD) sequences.
Zhou, Shenghan; Qian, Silin; Chang, Wenbing; Xiao, Yiyong; Cheng, Yang
2018-06-14
Timely and accurate state detection and fault diagnosis of rolling element bearings are critical to ensuring the reliability of rotating machinery. This paper proposes a novel method of rolling bearing fault diagnosis based on a combination of ensemble empirical mode decomposition (EEMD), weighted permutation entropy (WPE) and an improved support vector machine (SVM) ensemble classifier. A hybrid voting (HV) strategy that combines SVM-based classifiers and cloud similarity measurement (CSM) was employed to improve the classification accuracy. First, the WPE value of the bearing vibration signal was calculated to detect the fault. Secondly, if a bearing fault occurred, the vibration signal was decomposed into a set of intrinsic mode functions (IMFs) by EEMD. The WPE values of the first several IMFs were calculated to form the fault feature vectors. Then, the SVM ensemble classifier, composed of binary SVMs and the HV strategy, was used to identify the bearing multi-fault types. Finally, the proposed model was fully evaluated by experiments and comparative studies. The results demonstrate that the proposed method can effectively detect bearing faults and maintain a high accuracy rate of fault recognition when a small number of training samples are available.
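The feature-extraction step can be sketched with a generic weighted permutation entropy: ordinal patterns of embedding windows are tallied, each weighted by the window's variance. This is a standard WPE formulation used as an illustration, not the authors' implementation.

```python
import math
import numpy as np

def weighted_permutation_entropy(x, order=3, delay=1):
    """Generic WPE sketch: ordinal patterns of embedding windows, each
    weighted by the window's variance, normalised to [0, 1]."""
    x = np.asarray(x, dtype=float)
    n_windows = len(x) - (order - 1) * delay
    weights = {}
    for i in range(n_windows):
        window = x[i : i + order * delay : delay]
        pattern = tuple(np.argsort(window))   # ordinal pattern of the window
        weights[pattern] = weights.get(pattern, 0.0) + np.var(window)
    total = sum(weights.values())
    if total == 0.0:                          # constant signal: zero entropy
        return 0.0
    p = np.array(list(weights.values())) / total
    return float(-np.sum(p * np.log(p)) / math.log(math.factorial(order)))
```

A monotone ramp produces a single ordinal pattern (entropy near 0), while broadband noise spreads weight over all order! patterns (entropy near 1); a developing bearing fault shifts the vibration signal between these regimes, which is what makes WPE a usable fault indicator.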
A frequency-domain approach to improve ANNs generalization quality via proper initialization.
Chaari, Majdi; Fekih, Afef; Seibi, Abdennour C; Hmida, Jalel Ben
2018-08-01
The ability to train a network without memorizing the input/output data, thereby allowing a good predictive performance when applied to unseen data, is paramount in ANN applications. In this paper, we propose a frequency-domain approach to evaluate the network initialization in terms of quality of training, i.e., generalization capabilities. As an alternative to the conventional time-domain methods, the proposed approach eliminates the approximate nature of network validation using an excess of unseen data. The benefits of the proposed approach are demonstrated using two numerical examples, where two trained networks performed similarly on the training and the validation data sets, yet they revealed a significant difference in prediction accuracy when tested using a different data set. This observation is of utmost importance in modeling applications requiring a high degree of accuracy. The efficiency of the proposed approach is further demonstrated on a real-world problem, where unlike other initialization methods, a more conclusive assessment of generalization is achieved. On the practical front, subtle methodological and implementational facets are addressed to ensure reproducibility and pinpoint the limitations of the proposed approach. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Molinari, Filippo; Acharya, Rajendra; Zeng, Guang; Suri, Jasjit S.
2011-03-01
The carotid intima-media thickness (IMT) is the most widely used marker for the progression of atherosclerosis and the onset of cardiovascular diseases. Computer-aided measurements improve accuracy, but usually require user interaction. In this paper we characterized a new and completely automated technique for carotid segmentation and IMT measurement based on the merits of two previously developed techniques. We used an integrated approach of intelligent image feature extraction and line fitting for automatically locating the carotid artery in the image frame, followed by wall interface extraction based on a Gaussian edge operator. We call our system CARES. We validated CARES on a multi-institutional database of 300 carotid ultrasound images. IMT measurement bias was 0.032 +/- 0.141 mm, better than other automated techniques and comparable to that of user-driven methodologies. Our novel approach processed 96% of the images, yielding a figure of merit of 95.7%. CARES ensured complete automation and high accuracy in IMT measurement; hence it could be a suitable clinical tool for processing of large datasets in multicenter studies involving atherosclerosis.
Xingling, Shao; Honglun, Wang
2014-11-01
This paper proposes a novel hybrid control framework by combining observer-based sliding mode control (SMC) with trajectory linearization control (TLC) for the attitude tracking problem of hypersonic reentry vehicles (HRV). First, lower control consumption is achieved using a nonlinear tracking differentiator (TD) in the attitude loop. Second, a novel SMC that employs an extended disturbance observer (EDO) to counteract the effect of uncertainties, using a new sliding surface that includes the estimation error, is integrated to address the tracking error stabilization issues in the attitude and angular rate loops, respectively. In addition, new results associated with the EDO are examined in terms of dynamic response and noise-tolerant performance, as well as estimation accuracy. The key feature of the proposed compound control approach is that chattering-free tracking performance with high accuracy can be ensured for HRV in the presence of multiple uncertainties under control constraints. Based on finite-time convergence stability theory, the stability of the resulting closed-loop system is well established. Also, comparisons and extensive simulation results are presented to demonstrate the effectiveness of the control strategy. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hishe, Hadgu; Giday, Kidane; Neka, Mulugeta; Soromessa, Teshome; Van Orshoven, Jos; Muys, Bart
2015-01-01
Comprehensive and less costly forest inventory approaches are required to monitor the spatiotemporal dynamics of key species in forest ecosystems. Subpixel analysis using the earth resources data analysis system imagine subpixel classification procedure was tested to extract Olea europaea subsp. cuspidata and Juniperus procera canopies from Landsat 7 enhanced thematic mapper plus imagery. Control points with various canopy area fractions of the target species were collected to develop signatures for each of the species. With these signatures, the imagine subpixel classification procedure was run for each species independently. The subpixel process enabled the detection of O. europaea subsp. cuspidata and J. procera trees in pure and mixed pixels. A total of 100 pixels each were field verified for both species. An overall accuracy of 85% was achieved for O. europaea subsp. cuspidata and 89% for J. procera. A high overall accuracy of species detection in a natural forest was achieved, which encourages using the algorithm for future species monitoring activities. We recommend that the algorithm be validated in similar environments to better establish its capability and support its wider usage.
Laurinavicius, Arvydas; Plancoulaine, Benoit; Laurinaviciene, Aida; Herlin, Paulette; Meskauskas, Raimundas; Baltrusaityte, Indra; Besusparis, Justinas; Dasevicius, Darius; Elie, Nicolas; Iqbal, Yasir; Bor, Catherine
2014-01-01
Immunohistochemical Ki67 labelling index (Ki67 LI) reflects proliferative activity and is a potential prognostic/predictive marker of breast cancer. However, its clinical utility is hindered by the lack of standardized measurement methodologies. Besides tissue heterogeneity aspects, the key element of methodology remains accurate estimation of Ki67-stained/counterstained tumour cell profiles. We aimed to develop a methodology to ensure and improve accuracy of the digital image analysis (DIA) approach. Tissue microarrays (one 1-mm spot per patient, n = 164) from invasive ductal breast carcinoma were stained for Ki67 and scanned. Criterion standard (Ki67-Count) was obtained by counting positive and negative tumour cell profiles using a stereology grid overlaid on a spot image. DIA was performed with Aperio Genie/Nuclear algorithms. A bias was estimated by ANOVA, correlation and regression analyses. Calibration steps of the DIA by adjusting the algorithm settings were performed: first, by subjective DIA quality assessment (DIA-1), and second, to compensate the bias established (DIA-2). Visual estimate (Ki67-VE) on the same images was performed by five pathologists independently. ANOVA revealed significant underestimation bias (P < 0.05) for DIA-0, DIA-1 and two pathologists' VE, while DIA-2, VE-median and three other VEs were within the same range. Regression analyses revealed best accuracy for the DIA-2 (R-square = 0.90), exceeding that of VE-median, individual VEs and other DIA settings. Bidirectional bias for the DIA-2, with overestimation at the low and underestimation at the high end of the scale, was detected. Measurement error correction by inverse regression was applied to improve DIA-2-based prediction of the Ki67-Count, in particular for the clinically relevant interval of Ki67-Count < 40%. Potential clinical impact of the prediction was tested by dichotomising the cases at the cut-off values of 10, 15, and 20%.
Misclassification rate of 5-7% was achieved, compared to that of 11-18% for the VE-median-based prediction. Our experiments provide methodology to achieve accurate Ki67-LI estimation by DIA, based on proper validation, calibration, and measurement error correction procedures, guided by quantified bias from reference values obtained by stereology grid count. This basic validation step is an important prerequisite for high-throughput automated DIA applications to investigate tissue heterogeneity and clinical utility aspects of Ki67 and other immunohistochemistry (IHC) biomarkers.
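The inverse-regression correction used above can be sketched as classical calibration: regress the biased measurement on the reference value over calibration data, then invert the fitted line to map new readings back to the reference scale. The data below are synthetic stand-ins for the Ki67-Count/DIA pairs, not the study's measurements.

```python
import numpy as np

# Synthetic calibration pairs mimicking the reported bias: the "DIA" reading
# overestimates low reference counts and underestimates high ones.
rng = np.random.default_rng(42)
ki67_count = rng.uniform(2.0, 60.0, size=164)            # stereology reference, %
dia_reading = 5.0 + 0.8 * ki67_count + rng.normal(0.0, 2.0, size=164)

# Classical calibration (inverse regression): fit measurement ~ reference,
# then invert the line to correct future measurements.
slope, intercept = np.polyfit(ki67_count, dia_reading, 1)

def corrected_count(dia):
    """Map a raw DIA reading back to the reference (Ki67-Count) scale."""
    return (dia - intercept) / slope
```

Under this toy bias model a true count of 20% produces a raw reading near 21%, which the inversion maps back to about 20%; it is exactly this correction near the 10/15/20% cut-offs that moves borderline cases to the right side of a dichotomised decision.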
Developing a Data Set and Processing Methodology for Fluid/Structure Interaction Code Validation
2007-06-01
9-Probe Wake Survey Rake Configurations... structural stability and fatigue in test article components and, in general, in facility support structures and rotating machinery blading. Both T&E... blade analysis and simulations. To ensure the accuracy of the U of CO technology, validation using flight-test data and test data from a wind tunnel
The Error Prone Model and the Basic Grants Validation Selection System. Draft Final Report.
ERIC Educational Resources Information Center
System Development Corp., Falls Church, VA.
An evaluation of existing and proposed mechanisms to ensure data accuracy for the Pell Grant program is reported, and recommendations for efficient detection of fraud and error in the program are offered. One study objective was to examine the existing system of pre-established criteria (PEC), which are validation criteria that select students on…
Digital Literacies. A Tale of Two Tasks: Editing in the Era of Digital Literacies
ERIC Educational Resources Information Center
Chandler-Olcott, Kelly
2009-01-01
This article argues that editing in the era of digital literacies is a complex, collaborative endeavor that requires a sophisticated awareness of audience and purpose and a knowledge of multiple conventions for conveying meaning and ensuring accuracy. It compares group editing of an article about the New York Yankees baseball team on Wikipedia,…
NASA Astrophysics Data System (ADS)
Giono, G.; Ishikawa, R.; Narukage, N.; Kano, R.; Katsukawa, Y.; Kubo, M.; Ishikawa, S.; Bando, T.; Hara, H.; Suematsu, Y.; Winebarger, A.; Kobayashi, K.; Auchère, F.; Trujillo Bueno, J.; Tsuneta, S.; Shimizu, T.; Sakao, T.; Cirtain, J.; Champey, P.; Asensio Ramos, A.; Štěpán, J.; Belluzzi, L.; Manso Sainz, R.; De Pontieu, B.; Ichimoto, K.; Carlsson, M.; Casini, R.; Goto, M.
2017-04-01
The Chromospheric Lyman-Alpha SpectroPolarimeter is a sounding rocket instrument designed to measure for the first time the linear polarization of the hydrogen Lyman-α line (121.6 nm). The instrument was successfully launched on 3 September 2015 and observations were conducted at the solar disc center and close to the limb during the five-minute flight. In this article, the disc center observations are used to provide an in-flight calibration of the instrument spurious polarization. The derived in-flight spurious polarization is consistent with the levels determined during the pre-flight calibration, and a statistical analysis of the polarization fluctuations of solar origin is conducted to ensure a 0.014% precision on the spurious polarization. The combination of the pre-flight and in-flight polarization calibrations provides a complete picture of the instrument response matrix, and a proper error transfer method is used to confirm the achieved polarization accuracy. As a result, the unprecedented 0.1% polarization accuracy of the instrument in the vacuum ultraviolet is ensured by the polarization calibration.
NASA Astrophysics Data System (ADS)
Xu, Jiayuan; Yu, Chengtao; Bo, Bin; Xue, Yu; Xu, Changfu; Chaminda, P. R. Dushantha; Hu, Chengbo; Peng, Kai
2018-03-01
Automatic recognition of the high-voltage isolation switch state by remote video monitoring is an effective means to ensure the safety of personnel and equipment. Existing methods mainly take two approaches: improving monitoring accuracy through equipment upgrades, or adopting target detection technology. Such methods are often tied to specific scenarios, with limited applicability and high cost. To solve this problem, a high-voltage isolation switch state recognition method based on background difference and iterative search is proposed in this paper. The initial position of the switch is detected in real time by the background difference method. When the switch starts to open or close, a target tracking algorithm tracks the motion trajectory of the switch, and the opening or closing state is determined from the angle between the switch tracking point and the center line. The effectiveness of the method is verified by experiments on video frames of different switching states. Compared with traditional methods, this method is more robust and effective.
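The background-difference step can be sketched with a running-average background model and a thresholded difference mask; the frame size, intensities, and threshold below are synthetic examples, not the paper's parameters.

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Exponential running average: tolerant to slow lighting drift while
    keeping the static scene as the reference."""
    return (1 - alpha) * background + alpha * frame

def motion_mask(background, frame, threshold=25.0):
    """Flag pixels whose absolute difference from the background model
    exceeds the threshold."""
    return np.abs(frame - background) > threshold

# Synthetic 64x64 scene: uniform background, then a bright "switch arm"
# appears where the switch has moved.
background = np.full((64, 64), 100.0)
frame = background.copy()
frame[20:30, 10:50] = 200.0
mask = motion_mask(background, frame)
```

The connected foreground region in the mask is what a tracker would then follow frame to frame to measure the arm's angle against the center line.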
Improving accuracy of clinical coding in surgery: collaboration is key.
Heywood, Nick A; Gill, Michael D; Charlwood, Natasha; Brindle, Rachel; Kirwan, Cliona C
2016-08-01
Clinical coding data provide the basis for Hospital Episode Statistics and Healthcare Resource Group codes. High accuracy of this information is required for payment by results, allocation of health and research resources, and public health data and planning. We sought to identify the level of accuracy of clinical coding in general surgical admissions across hospitals in the Northwest of England. Clinical coding departments identified a total of 208 emergency general surgical patients discharged between 1st March and 15th August 2013 from seven hospital trusts (median = 20, range = 16-60). Blinded re-coding was performed by a senior clinical coder and clinician, with results compared with the original coding outcome. Recorded codes were generated from OPCS-4 & ICD-10. Of all cases, 194 of 208 (93.3%) had at least one coding error and 9 of 208 (4.3%) had errors in both primary diagnosis and primary procedure. Errors were found in 64 of 208 (30.8%) of primary diagnoses and 30 of 137 (21.9%) of primary procedure codes. Median tariff using original codes was £1411.50 (range, £409-9138). Re-calculation using updated clinical codes showed a median tariff of £1387.50, P = 0.997 (range, £406-10,102). The most frequent reasons for incorrect coding were "coder error" and a requirement for "clinical interpretation of notes". Errors in clinical coding are multifactorial and have significant impact on primary diagnosis, potentially affecting the accuracy of Hospital Episode Statistics data and in turn the allocation of health care resources and public health planning. As we move toward surgeon specific outcomes, surgeons should increase collaboration with coding departments to ensure the system is robust. Copyright © 2016 Elsevier Inc. All rights reserved.
Achievable accuracy of hip screw holding power estimation by insertion torque measurement.
Erani, Paolo; Baleani, Massimiliano
2018-02-01
To ensure stability of proximal femoral fractures, the hip screw must firmly engage into the femoral head. Some studies suggested that screw holding power into trabecular bone could be evaluated, intraoperatively, through measurement of screw insertion torque. However, those studies used synthetic bone, instead of trabecular bone, as host material, or they did not evaluate accuracy of predictions. We determined prediction accuracy, also assessing the impact of screw design and host material. We measured, under highly repeatable experimental conditions, disregarding clinical procedure complexities, insertion torque and pullout strength of four screw designs, in both 120 synthetic and 80 trabecular bone specimens of variable density. For both host materials, we calculated the root-mean-square error and the mean-absolute-percentage error of predictions based on the best-fitting model of torque-pullout data, in both single-screw and merged datasets. Predictions based on screw-specific regression models were the most accurate. Host material impacts prediction accuracy: the replacement of synthetic with trabecular bone decreased both root-mean-square errors, from 0.54–0.76 kN to 0.21–0.40 kN, and mean-absolute-percentage errors, from 14–21% to 10–12%. However, holding power predicted from low insertion torque remained inaccurate, with errors up to 40% for torques below 1 Nm. In poor-quality trabecular bone, tissue inhomogeneities likely affect pullout strength and insertion torque to different extents, limiting the predictive power of the latter. This bias decreases when the screw engages good-quality bone. Under this condition, predictions become more accurate, although this result must be confirmed by close in-vitro simulation of the clinical procedure. Copyright © 2018 Elsevier Ltd. All rights reserved.
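The accuracy assessment can be sketched as follows: fit a screw-specific regression of pullout strength on insertion torque, then compute the two error metrics the study reports. The linear torque-pullout model and the synthetic data are illustrative assumptions, not the study's measurements.

```python
import numpy as np

# Synthetic torque-pullout pairs standing in for one screw design's tests.
rng = np.random.default_rng(1)
torque = rng.uniform(1.0, 6.0, size=80)               # insertion torque, Nm
pullout = 0.9 * torque + rng.normal(0.0, 0.2, 80)     # pullout strength, kN

# Screw-specific linear regression, then the study's two error metrics.
coeffs = np.polyfit(torque, pullout, 1)
predicted = np.polyval(coeffs, torque)

rmse = float(np.sqrt(np.mean((predicted - pullout) ** 2)))            # kN
mape = float(np.mean(np.abs((predicted - pullout) / pullout)) * 100)  # %
```

Note how the percentage error is dominated by the low-torque specimens: a fixed absolute scatter becomes a large relative error when the pullout strength itself is small, mirroring the study's finding that predictions below about 1 Nm remain inaccurate.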
NASA Astrophysics Data System (ADS)
Zhao, Yuchen; Zemmamouche, Redouane; Vandenrijt, Jean-François; Georges, Marc P.
2018-05-01
A combination of digital holographic interferometry (DHI) and digital speckle photography (DSP) allows in-plane and out-of-plane displacement measurement between two states of an object. The former can be determined by correlating the two speckle patterns whereas the latter is given by the phase difference obtained from DHI. We show that the amplitude of numerically reconstructed object wavefront obtained from Fresnel in-line digital holography (DH), in combination with phase shifting techniques, can be used as speckle patterns in DSP. The accuracy of in-plane measurement is improved after correcting the phase errors induced by reference wave during reconstruction process. Furthermore, unlike conventional imaging system, Fresnel DH offers the possibility to resize the pixel size of speckle patterns situated on the reconstruction plane under the same optical configuration simply by zero-padding the hologram. The flexibility of speckle size adjustment in Fresnel DH ensures the accuracy of estimation result using DSP.
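The DSP step can be sketched as peak-finding on an FFT-based cross-correlation of the two speckle patterns; the synthetic speckle field and the known integer-pixel shift below are illustrative assumptions, not reconstructed holographic data.

```python
import numpy as np

# Two speckle patterns related by a known in-plane displacement.
rng = np.random.default_rng(7)
n = 128
speckle = rng.random((n, n))
shifted = np.roll(speckle, shift=(5, -3), axis=(0, 1))   # "deformed" state

# Circular cross-correlation via FFT; its peak sits at the displacement.
spectrum = np.fft.fft2(speckle).conj() * np.fft.fft2(shifted)
corr = np.fft.ifft2(spectrum).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)

# Indices above n/2 correspond to negative shifts (circular wrap-around).
dy = dy - n if dy > n // 2 else dy
dx = dx - n if dx > n // 2 else dx
```

This integer-pixel sketch only shows the correlation principle; practical DSP interpolates around the peak for sub-pixel accuracy, which is where the speckle-size flexibility of Fresnel DH pays off.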
A joint tracking method for NSCC based on WLS algorithm
NASA Astrophysics Data System (ADS)
Luo, Ruidan; Xu, Ying; Yuan, Hong
2017-12-01
Navigation signal based on compound carrier (NSCC) has a flexible multi-carrier scheme with configurable scheme parameters, which gives it significant navigation augmentation efficiency in terms of spectral efficiency, tracking accuracy, multipath mitigation and anti-jamming capability compared with legacy navigation signals. Meanwhile, its characteristic scheme structure can provide auxiliary information for signal synchronization algorithm design. Based on the characteristics of NSCC, this paper proposes a joint tracking method using the Weighted Least Squares (WLS) algorithm. In this method, the WLS algorithm jointly estimates each sub-carrier frequency shift, exploiting the linear relationship between the known sub-carrier frequencies and the Doppler shift. The weighting matrix is set adaptively according to the sub-carrier power to ensure estimation accuracy. Both theoretical analysis and simulation results show that the tracking accuracy and sensitivity of this method outperform the single-carrier algorithm at low SNR.
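The joint estimate can be sketched for the single shared unknown: each sub-carrier's Doppler shift is proportional to its carrier frequency, so all sub-carriers constrain one parameter, and power-proportional weights down-weight the noisier sub-carriers. The frequencies, powers, and noise model below are illustrative assumptions, not the NSCC scheme parameters.

```python
import numpy as np

c = 299_792_458.0                                     # speed of light, m/s
f = np.array([1.561e9, 1.575e9, 1.589e9, 1.602e9])    # sub-carrier frequencies, Hz
power = np.array([1.0, 4.0, 4.0, 1.0])                # relative sub-carrier powers
v_true = 800.0                                        # radial velocity, m/s

# Measured Doppler shifts: shift_i = f_i * v / c, noisier on weak sub-carriers.
rng = np.random.default_rng(3)
shifts = f * v_true / c + rng.normal(0.0, 2.0, size=4) / np.sqrt(power)

# WLS for the single shared unknown x = v / c, with weights w proportional
# to power: x_hat = sum(w * f * y) / sum(w * f**2).
x_hat = np.sum(power * f * shifts) / np.sum(power * f**2)
v_hat = float(x_hat * c)
```

Because the strong sub-carriers carry most of the weight, the joint estimate tracks through noise levels that would defeat any single weak sub-carrier on its own, which is the sensitivity gain the abstract claims over the single-carrier algorithm.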
Jensen, Jo Anne G; Moreno, Elizabeth L; Rice, Tara M
2014-03-01
The Office of Adolescent Health (OAH) developed a systematic approach to review for medical accuracy the educational materials proposed for use in Teen Pregnancy Prevention (TPP) programs. This process is also used by the Administration on Children, Youth, and Families (ACYF) for review of materials used in the Personal Responsibility Education Innovative Strategies (PREIS) Program. This article describes the review process, explaining the methodology, the team implementing the reviews, and the process for distributing review findings and implementing changes. Provided also is the definition of "medically accurate and complete" as used in the programs, and a description of what constitutes "complete" information when discussing sexually transmitted infections and birth control methods. The article is of interest to program providers, curriculum developers and purveyors, and those who are interested in providing medically accurate and complete information to adolescents. Published by Elsevier Inc.
Innovative Technology Transfer Partnerships
NASA Technical Reports Server (NTRS)
Kohler, Jeff
2004-01-01
The National Aeronautics and Space Administration (NASA) seeks to license its Advanced Tire and Strut Pressure Monitor (TSPM) technology. The TSPM is a handheld system to accurately measure tire and strut pressure and temperature over a wide temperature range (20 to 120 °F), as well as improve personnel safety. Sensor accuracy, electronics design, and a simple user interface allow operators quick, easy access to required measurements. The handheld electronics, powered by 12-VAC or by 9-VDC batteries, provide the user with an easy-to-read visual display of pressure/temperature or the streaming of pressure/temperature data via an RS-232 interface. When connected to a laptop computer, this new measurement system can provide users with automated data recording and trending, eliminating the chance for data hand-recording errors. In addition, calibration software allows for calibration data to be automatically utilized for the generation of new data conversion equations, simplifying the calibration processes that are so critical to reliable measurements. The design places a high-accuracy pressure sensor (also used as a temperature sensor) as close to the tire or strut measurement location as possible, allowing the user to make accurate measurements rapidly, minimizing the amount of high-pressure volumes, and allowing reasonable distance between the tire or strut and the operator. The pressure sensor attaches directly to the pressure supply/relief valve on the tire and/or strut, with necessary electronics contained in the handheld enclosure. A software algorithm ensures high accuracy of the device over the wide temperature range. Using the pressure sensor as a temperature sensor permits measurement of the actual temperature of the pressurized gas. This device can be adapted to create a portable calibration standard that does not require thermal conditioning. This allows accurate pressure measurements without disturbing the gas temperature.
In-place calibration can save considerable time and money and is suitable in many process applications throughout industry.
Lasch, Peter; Wahab, Tara; Weil, Sandra; Pályi, Bernadett; Tomaso, Herbert; Zange, Sabine; Kiland Granerud, Beathe; Drevinek, Michal; Kokotovic, Branko; Wittwer, Matthias; Pflüger, Valentin; Di Caro, Antonino; Stämmler, Maren; Grunow, Roland
2015-01-01
In the case of a release of highly pathogenic bacteria (HPB), there is an urgent need for rapid, accurate, and reliable diagnostics. MALDI-TOF mass spectrometry is a rapid, accurate, and relatively inexpensive technique that is becoming increasingly important in microbiological diagnostics to complement classical microbiology, PCR, and genotyping of HPB. In the present study, the results of a joint exercise with 11 partner institutions from nine European countries are presented. In this exercise, 10 distinct microbial samples, among them five HPB, Bacillus anthracis, Brucella canis, Burkholderia mallei, Burkholderia pseudomallei, and Yersinia pestis, were characterized under blinded conditions. Microbial strains were inactivated by high-dose gamma irradiation before shipment. Preparatory investigations ensured that this type of inactivation induced only subtle spectral changes with negligible influence on the quality of the diagnosis. Furthermore, pilot tests on nonpathogenic strains were systematically conducted to ensure the suitability of sample preparation and to optimize and standardize the workflow for microbial identification. The analysis of the microbial mass spectra was carried out by the individual laboratories on the basis of spectral libraries available on site. All mass spectra were also tested against an in-house HPB library at the Robert Koch Institute (RKI). The averaged identification accuracy was 77% in the first case and improved to >93% when the spectral diagnoses were obtained on the basis of the RKI library. The compilation of complete and comprehensive databases with spectra from a broad strain collection is therefore considered of paramount importance for accurate microbial identification. PMID:26063856
New perspectives for high accuracy SLR with second generation geodesic satellites
NASA Technical Reports Server (NTRS)
Lund, Glenn
1993-01-01
This paper reports on the accuracy limitations imposed by geodesic satellite signatures, and on the potential for achieving millimetric performances by means of alternative satellite concepts and an optimized 2-color system tradeoff. Long distance laser ranging, when performed between a ground (emitter/receiver) station and a distant geodesic satellite, is now reputed to enable short arc trajectory determinations to be achieved with an accuracy of 1 to 2 centimeters. This state-of-the-art accuracy is limited principally by the uncertainties inherent to single-color atmospheric path length correction. Motivated by the study of phenomena such as postglacial rebound, and the detailed analysis of small-scale volcanic and strain deformations, the drive towards millimetric accuracies will inevitably be felt. With the advent of short pulse (less than 50 ps) dual wavelength ranging, combined with adequate detection equipment (such as a fast-scanning streak camera or ultra-fast solid-state detectors), the atmospheric uncertainty could potentially be reduced to the level of a few millimeters, thus exposing other less significant error contributions, of which by far the most significant will then be the morphology of the retroreflector satellites themselves. Existing geodesic satellites are simply dense spheres, several tens of centimeters in diameter, encrusted with a large number (426 in the case of LAGEOS) of small cube-corner reflectors. A single incident pulse thus results in a significant number of randomly phased, quasi-simultaneous return pulses. These combine coherently at the receiver to produce a convolved interference waveform which cannot, on a shot-to-shot basis, be accurately and unambiguously correlated to the satellite center of mass.
This paper proposes alternative geodesic satellite concepts, based on the use of a very small number of cube-corner retroreflectors, in which the above difficulties are eliminated while ensuring, for a given emitted pulse, the return of a single clean pulse with an adequate cross-section.
AVHRR composite period selection for land cover classification
Maxwell, S.K.; Hoffer, R.M.; Chapman, P.L.
2002-01-01
Multitemporal satellite image datasets provide valuable information on the phenological characteristics of vegetation, thereby significantly increasing the accuracy of cover type classifications compared to single date classifications. However, the processing of these datasets can become very complex when dealing with multitemporal data combined with multispectral data. Advanced Very High Resolution Radiometer (AVHRR) biweekly composite data are commonly used to classify land cover over large regions. Selecting a subset of these biweekly composite periods may be required to reduce the complexity and cost of land cover mapping. The objective of our research was to evaluate the effect of reducing the number of composite periods and altering the spacing of those composite periods on classification accuracy. Because inter-annual variability can have a major impact on classification results, 5 years of AVHRR data were evaluated. AVHRR biweekly composite images for spectral channels 1-4 (visible, near-infrared and two thermal bands) covering the entire growing season were used to classify 14 cover types over the entire state of Colorado for each of five different years. A supervised classification method was applied to maintain consistent procedures for each case tested. Results indicate that the number of composite periods can be halved (reduced from 14 composite dates to seven) without significantly reducing overall classification accuracy (80.4% Kappa accuracy for the 14-composite dataset as compared to 80.0% for a seven-composite dataset). At least seven composite periods were required to ensure the classification accuracy was not affected by inter-annual variability due to climate fluctuations.
Concentrating more composites near the beginning and end of the growing season, as compared to using evenly spaced time periods, consistently produced slightly higher classification values over the 5 years tested (average Kappa of 80.3% for the heavy early/late case, as compared to 79.0% for the evenly spaced case).
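The Kappa accuracies reported above are computed from a classification confusion matrix. A minimal sketch of Cohen's kappa (the 2-class confusion matrix in the demo is synthetic, not from the AVHRR study):

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix (rows: reference, cols: predicted)."""
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    observed = np.trace(confusion) / total  # overall (diagonal) agreement
    # chance agreement from the marginal totals
    expected = (confusion.sum(axis=0) @ confusion.sum(axis=1)) / total**2
    return (observed - expected) / (1.0 - expected)

print(cohens_kappa([[40, 10], [5, 45]]))  # ≈ 0.70
```

For the 14-class Colorado maps, the same function would simply be applied to a 14 × 14 matrix of reference versus predicted cover types.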
Wang, Diya; Xiao, Mengnan; Hu, Hong; Zhang, Yu; Su, Zhe; Xu, Shanshan; Zong, Yujin; Wan, Mingxi
2018-03-01
This study aimed to develop a focal microvascular contrast-enhanced ultrasonic parametric perfusion imaging (PPI) scheme to overcome the tradeoff between the resolution, contrast, and accuracy of focal PPI in the tumor. Its resolution was limited by the low signal-to-clutter ratio (SCR) of time-intensity-curves (TICs) induced by multiple limitations, which deteriorated the accuracy and contrast of focal PPI. The scheme was verified by the in-vivo perfusion experiments. Single-pixel TICs were first extracted to ensure PPI with the highest resolution. The SCR of focal TICs in the tumor was improved using respiratory motion compensation combined with detrended fluctuation analysis. The entire and focal PPIs of six perfusion parameters were then accurately created after filtrating the valid TICs and targeted perfusion parameters. Compared with those of the conventional PPIs, the axial and lateral resolutions of focal PPIs were improved by 30.29% (p < .05) and 32.77% (p < .05), respectively; the average contrast and accuracy evaluated by SCR improved by 7.24 ± 4.90 dB (p < .05) and 5.18 ± 1.28 dB (p < .05), respectively. The edge, morphostructure, inhomogeneous hyper-enhanced distribution, and ring-like perfusion features in intratumoral microvessel were accurately distinguished and highlighted by the focal PPIs. The developed focal PPI can assist clinicians in making confirmed diagnoses and in providing appropriate therapeutic strategies for liver tumor. Copyright © 2017 Elsevier B.V. All rights reserved.
High-Pressure Measurements of Temperature and CO2 Concentration Using Tunable Diode Lasers at 2 μm.
Cai, Tingdong; Gao, Guangzhen; Wang, Minrui; Wang, Guishi; Liu, Ying; Gao, Xiaoming
2016-03-01
A sensor for simultaneous measurements of temperature and carbon dioxide (CO2) concentration at elevated pressure is developed using tunable diode lasers at 2 µm. Based on several selection rules, a CO2 line pair at 5006.140 and 5010.725 cm(-1) is selected for the TDL sensor. In order to ensure the accuracy and rapidity of the sensor, quasi-fixed-wavelength wavelength modulation spectroscopy (WMS) is employed. Normalization of the 2f signal with the 1f signal magnitude removes the need for calibration and corrects for transmission variation due to beam steering, mechanical misalignments, soot, and window fouling. Temperature is obtained from comparison of the background-subtracted, 1f-normalized WMS-2f signal ratio with a 1f-normalized WMS-2f peak-value ratio model. CO2 concentration is inferred from the 1f-normalized WMS-2f peak value of the CO2 transition at 5006.140 cm(-1). Measurements of temperature and CO2 concentration are carried out in static cell experiments (P = 1-10 atm, T = 500-1200 K) to validate the accuracy and capability of the sensor. The results show that the accuracies of the sensor for temperature and CO2 concentration are 1.66% and 3.1%, respectively. All the measurements show the potential utility of the sensor for combustion diagnostics at elevated pressure. © The Author(s) 2016.
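Two-line thermometry of this kind rests on the Boltzmann scaling of the line-strength ratio with temperature. A minimal sketch of the ratio model and its closed-form inversion (the lower-state energy difference `dE = 700` cm⁻¹ is illustrative, not the HITRAN value for the 5006.140/5010.725 cm⁻¹ pair, and partition-function and WMS line-shape corrections are omitted):

```python
import math

C2 = 1.4388  # second radiation constant hc/k, in cm*K

def signal_ratio(T, dE, T0=296.0, R0=1.0):
    """Line-strength (signal) ratio vs temperature, keeping only the
    Boltzmann factor; dE is the lower-state energy difference in cm^-1."""
    return R0 * math.exp(-C2 * dE * (1.0 / T - 1.0 / T0))

def invert_ratio(R, dE, T0=296.0, R0=1.0):
    """Solve signal_ratio(T) = R for T in closed form."""
    return 1.0 / (1.0 / T0 - math.log(R / R0) / (C2 * dE))
```

A real implementation would take the lower-state energies of the two lines from a spectroscopic database and map the measured 1f-normalized WMS-2f peak ratio onto this curve.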
Ontology Matching with Semantic Verification.
Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R
2009-09-01
ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
Improved and Robust Detection of Cell Nuclei from Four Dimensional Fluorescence Images
Bashar, Md. Khayrul; Yamagata, Kazuo; Kobayashi, Tetsuya J.
2014-01-01
Segmentation-free direct methods are quite efficient for automated nuclei extraction from high-dimensional images. A few such methods do exist, but most of them do not ensure algorithmic robustness to parameter and noise variations. In this research, we propose a method based on multiscale adaptive filtering for efficient and robust detection of nuclei centroids from four-dimensional (4D) fluorescence images. A temporal feedback mechanism is employed between the enhancement and the initial detection steps of a typical direct method. We estimate the minimum and maximum nuclei diameters from the previous frame and feed them back as filter lengths for multiscale enhancement of the current frame. A radial intensity-gradient function is optimized at the positions of the initial centroids to estimate all nuclei diameters. This procedure continues for processing subsequent images in the sequence. The above mechanism thus ensures proper enhancement by automated estimation of the major parameters, bringing robustness and safeguarding the system against additive noise and the effects of wrong parameters. Later, the method and its single-scale variant are simplified to further reduce the number of parameters. The proposed method is then extended to nuclei volume segmentation: the same optimization technique is applied to the final centroid positions of the enhanced image, and the estimated diameters are projected onto the binary candidate regions to segment nuclei volumes. Our method is finally integrated with a simple sequential tracking approach to establish nuclear trajectories in the 4D space. Experimental evaluations with five image sequences (each having 271 3D sequential images) corresponding to five different mouse embryos show promising performances of our methods in terms of nuclear detection, segmentation, and tracking.
A detailed analysis with a sub-sequence of 101 3D images from an embryo reveals that the proposed method can improve the nuclei detection accuracy by 9% over the previous methods, which used inappropriately large-valued parameters. Results also confirm that the proposed method and its variants achieve high detection accuracies (≈98% mean F-measure) irrespective of large variations in filter parameters and noise levels. PMID:25020042
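The radial intensity-gradient step, estimating a nucleus diameter from the steepest intensity falloff around a detected centroid, can be sketched in 2D as follows (a simplified stand-in for the paper's optimization; the 1-pixel annulus binning and the `r_max` cutoff are assumptions):

```python
import numpy as np

def estimate_diameter(image, cy, cx, r_max=30):
    """Estimate a nucleus diameter from the radial intensity profile around a
    detected centroid: the radius of steepest intensity falloff marks the edge."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(yy - cy, xx - cx)
    bins = np.arange(r_max + 1)
    # mean intensity in each 1-pixel-wide radial annulus
    profile = np.array([image[(r >= b) & (r < b + 1)].mean() for b in bins])
    grad = np.gradient(profile)
    edge_r = bins[np.argmin(grad)]  # radius of steepest falloff
    return 2 * edge_r
```

For a bright disk of radius 10 on a dark background, the steepest falloff sits at radius 10 and the estimate is a diameter of 20 pixels; in the paper's setting these estimates are what get fed back as filter lengths for the next frame.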
Building Energy Simulation Test for Existing Homes (BESTEST-EX) (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, R.; Neymark, J.; Polly, B.
2011-12-01
This presentation discusses the goals of NREL Analysis Accuracy R&D; BESTEST-EX goals; what BESTEST-EX is; how it works; 'Building Physics' cases; 'Building Physics' reference results; 'utility bill calibration' cases; and limitations and potential future work. The goals of NREL Analysis Accuracy R&D are: (1) Provide industry with the tools and technical information needed to improve the accuracy and consistency of analysis methods; (2) Reduce the risks associated with purchasing, financing, and selling energy efficiency upgrades; and (3) Enhance software and input collection methods considering impacts on accuracy, cost, and time of energy assessments. The BESTEST-EX goals are: (1) Test software predictions of retrofit energy savings in existing homes; (2) Ensure building physics calculations and utility bill calibration procedures perform up to a minimum standard; and (3) Quantify the impact of uncertainties in input audit data and occupant behavior. BESTEST-EX is a repeatable procedure that tests how well audit software predictions compare to the current state of the art in building energy simulation. There is no direct truth standard; however, the reference software has been subjected to validation testing, including comparisons with empirical data.
Treatment of ice cover and other thin elastic layers with the parabolic equation method.
Collins, Michael D
2015-03-01
The parabolic equation method is extended to handle problems involving ice cover and other thin elastic layers. Parabolic equation solutions are based on rational approximations that are designed using accuracy constraints to ensure that the propagating modes are handled properly and stability constraints to ensure that the non-propagating modes are annihilated. The non-propagating modes are especially problematic for problems involving thin elastic layers. It is demonstrated that stable results may be obtained for such problems by using rotated rational approximations [Milinazzo, Zala, and Brooke, J. Acoust. Soc. Am. 101, 760-766 (1997)] and generalizations of these approximations. The approach is applied to problems involving ice cover with variable thickness and sediment layers that taper to zero thickness.
NASA Astrophysics Data System (ADS)
Tupas, M. E. A.; Dasallas, J. A.; Jiao, B. J. D.; Magallon, B. J. P.; Sempio, J. N. H.; Ramos, M. K. F.; Aranas, R. K. D.; Tamondong, A. M.
2017-10-01
The FAST-SIFT corner detector and descriptor extractor combination was used to automatically georeference DIWATA-1 Spaceborne Multispectral Imager (SMI) images. The Features from Accelerated Segment Test (FAST) algorithm detects corners or keypoints in an image, and these robustly detected keypoints have well-defined positions. Descriptors were computed at the keypoints using the Scale-Invariant Feature Transform (SIFT) extractor. The FAST-SIFT method effectively matched SMI same-subscene images detected by the NIR sensor, and was also tested in stitching NIR images with varying subscenes swept by the camera. The slave images were matched to the master image, with the keypoints serving as ground control points. Keypoints are matched based on a metric distance between their descriptor vectors using nearest-neighbor matching; the metrics include Euclidean and city-block distance, among others. Rough matching outputs not only correct matches but also faulty ones, so random sample consensus (RANSAC) was used to eliminate fall-out matches and ensure the accuracy of the feature points from which the transformation parameters were derived: RANSAC identifies whether a point fits the transformation function and returns the inlier matches. A previous work in automatic georeferencing incorporated a geometric restriction; in this work, we applied a simplified version of that learning method. The transformation matrix was solved with affine, projective, and polynomial models. The accuracy of the automatic georeferencing method was determined by calculating the RMSE of randomly selected interest points between the master image and the transformed slave image.
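The RANSAC inlier-selection and affine-solution steps described above can be sketched in isolation (pure NumPy; the inlier threshold `tol` and iteration count are assumptions, and a real pipeline would operate on FAST-SIFT descriptor matches rather than synthetic point pairs):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine fit: [x y 1] @ P = [x' y'], P is 3x2."""
    X = np.hstack([src, np.ones((len(src), 1))])
    P, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return P

def ransac_affine(src, dst, n_iter=500, tol=2.0, seed=0):
    """RANSAC over putative matches: fit on random minimal samples (3 pairs),
    keep the model with the most inliers, then refit on all its inliers."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(src), dtype=bool)
    ones = np.ones((len(src), 1))
    for _ in range(n_iter):
        idx = rng.choice(len(src), size=3, replace=False)
        P = fit_affine(src[idx], dst[idx])
        err = np.linalg.norm(np.hstack([src, ones]) @ P - dst, axis=1)
        inliers = err < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_affine(src[best_inliers], dst[best_inliers]), best_inliers
```

With synthetic matches in which 8 of 40 points are corrupted, the corrupted matches are rejected and the true affine parameters are recovered from the surviving inliers.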
Johnson, Jeremy L; O'Neal, Katherine S; Pack, Christopher C; Carter, Sandra M
2017-05-01
An important factor in controlling diabetes is self-monitoring of blood glucose. Manufacturers of glucose meters recommend routine use of control solution to ensure accuracy. Previous studies have demonstrated that glucose meters vary in accuracy and that patients are not using control solution as recommended. The purpose of this study was to identify potential barriers to control solution use from multiple perspectives, including patient, pharmacist, and provider. This study was a prospective, observational survey design. First, 25 randomly selected chain and independent pharmacies in the Tulsa metropolitan area were audited for control solution accessibility. These pharmacies were then used to survey pharmacists, via telephone, regarding control solution inventory and perception of importance of use. Next, providers were electronically surveyed on their routine practice recommendations, while 60 patients with diabetes were randomly selected for a telephone survey on use and perceptions of control solution. Twenty-five pharmacies were audited, and 23 pharmacists, 60 patients, and 29 providers were surveyed. Only 39% of pharmacies stated they supplied control solution; however, only 1 pharmacy visibly stocked it. The only patient factor that appeared to have an impact on control solution usage was having type 1 versus type 2 diabetes (38% vs 15%). Providers are aware of what control solution is (62%), but only half felt it should be routine practice, with 44% of those never recommending it. This study raises awareness of the need to educate patients, providers, and pharmacists about use of control solution to ensure glucose meter accuracy.
Evaluating the accuracy of orthophotos and 3D models from UAV photogrammetry
NASA Astrophysics Data System (ADS)
Julge, Kalev; Ellmann, Artu
2015-04-01
Rapid development of unmanned aerial vehicles (UAV) in recent years has made their use for various applications more feasible. This contribution evaluates the accuracy and quality of different UAV remote sensing products (i.e. orthorectified image, point cloud and 3D model). Two different autonomous fixed-wing UAV systems were used to collect the aerial photographs. One is a mass-produced commercial UAV system; the other is a similar state-of-the-art UAV system. Three different study areas with varying sizes and characteristics (including urban areas, forests, fields, etc.) were surveyed. The UAV point clouds, 3D models and orthophotos were generated with three different commercial and freeware software packages, and the performance of each of these was evaluated. The effect of flying height on the accuracy of the results was explored, as well as the optimum number and placement of ground control points. The results achieved when the only georeferencing data originate from the UAV system's on-board GNSS and inertial measurement unit are also investigated. Problems regarding the alignment of certain types of aerial photos (e.g. captured over forested areas) are discussed. The quality and accuracy of UAV photogrammetry products are evaluated by comparing them with control measurements made with GNSS on the ground, as well as with high-resolution airborne laser scanning data and other available orthophotos (e.g. those acquired for large-scale national mapping). Vertical comparisons are made on surfaces that have remained unchanged in all campaigns, e.g. paved roads. Planar comparisons are performed by control surveys of objects that are clearly identifiable on orthophotos. The statistics of these differences are used to evaluate the accuracy of UAV remote sensing. Some recommendations are given on how to conduct UAV mapping campaigns cost-effectively and with minimal time consumption while still ensuring the quality and accuracy of the UAV data products.
Also the benefits and drawbacks of UAV remote sensing compared to more traditional methods (e.g. national mapping from airplanes or direct measurements on the ground with GNSS devices or total stations) are outlined.
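The planimetric accuracy statistic used in such comparisons is typically the RMSE over matched check points. A minimal sketch (the coordinate pairs in the test are synthetic, not survey data):

```python
import numpy as np

def georef_rmse(reference_pts, test_pts):
    """Planimetric RMSE between matched check points (N x 2 arrays, map units):
    the root of the mean squared point-to-point distance."""
    d = np.asarray(test_pts, float) - np.asarray(reference_pts, float)
    return float(np.sqrt(np.mean(np.sum(d**2, axis=1))))
```

The same function applies whether the reference coordinates come from ground GNSS control surveys or from an existing national orthophoto.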
ERIC Educational Resources Information Center
Çikirikçi Demirtasli, Nükhet; Ulutas, Seher
2015-01-01
Problem Statement: Item bias occurs when individuals from different groups (different gender, cultural background, etc.) have different probabilities of responding correctly to a test item despite having the same skill levels. It is important that tests or items do not have bias in order to ensure the accuracy of decisions taken according to test…
Ground-Based Calibration Of A Microwave Landing System
NASA Technical Reports Server (NTRS)
Kiriazes, John J.; Scott, Marshall M., Jr.; Willis, Alfred D.; Erdogan, Temel; Reyes, Rolando
1996-01-01
System of microwave instrumentation and data-processing equipment developed to enable ground-based calibration of microwave scanning-beam landing system (MSBLS) at distances of about 500 to 1,000 ft from MSBLS transmitting antenna. Ensures accuracy of MSBLS near touchdown point, without having to resort to expense and complex logistics of aircraft-based testing. Modified versions prove useful in calibrating aircraft instrument landing systems.
The management of external marketing communication instruments in health care services.
Bobocea, L; Spiridon, St; Petrescu, L; Gheorghe, C M; Purcarea, V L
2016-01-01
In order to become known and attract consumers, a health care organization has to develop suitable external communication campaigns. Consequently, management instruments are employed to effectively evaluate the success of a campaign. The BCG Matrix, SWOT analysis and the Gantt Diagram were used in this paper to ensure the consistency and accuracy of the external communication process at an empirical level.
ERIC Educational Resources Information Center
Compton, Donald L.; Gilbert, Jennifer K.; Jenkins, Joseph R.; Fuchs, Douglas; Fuchs, Lynn S.; Cho, Eunsoo; Barquero, Laura A.; Bouton, Bobette
2012-01-01
Response-to-intervention (RTI) approaches to disability identification are meant to put an end to the so-called wait-to-fail requirement associated with IQ discrepancy. However, in an unfortunate irony, there is a group of children who wait to fail in RTI frameworks. That is, they must fail both general classroom instruction (Tier 1) and…
Griendling, Kathy K.; Touyz, Rhian M.; Zweier, Jay L.; Dikalov, Sergey; Chilian, William; Chen, Yeong-Renn; Harrison, David G.; Bhatnagar, Aruni
2017-01-01
Reactive oxygen species and reactive nitrogen species are biological molecules that play important roles in cardiovascular physiology and contribute to disease initiation, progression, and severity. Because of their ephemeral nature and rapid reactivity, these species are difficult to measure directly with high accuracy and precision. In this statement, we review current methods for measuring these species and the secondary products they generate and suggest approaches for measuring redox status, oxidative stress, and the production of individual reactive oxygen and nitrogen species. We discuss the strengths and limitations of different methods and the relative specificity and suitability of these methods for measuring the concentrations of reactive oxygen and reactive nitrogen species in cells, tissues, and biological fluids. We provide specific guidelines, through expert opinion, for choosing reliable and reproducible assays for different experimental and clinical situations. These guidelines are intended to help investigators and clinical researchers avoid experimental error and ensure high-quality measurements of these important biological species. PMID:27418630
Satellite Antenna Pointing Procedure Driven by the Ground Service Quality
NASA Astrophysics Data System (ADS)
Yasui, Yoshitsugu
A satellite antenna alignment technique is proposed to ensure terrestrial service quality for users. The antenna bore sight orientation is calculated directly from measured data acquired from general ground receivers, which intercept the communication radio waves from any position on the earth's surface. The method coordinates the satellite pointing parameters with signal strength at the receivers while considering location-specific geographical and antenna radiation characteristics and control accuracy. The theoretical development and its validity are examined in the course of equation derivation. Actual measured data of an existing satellite at the maneuver was applied to the method, and the capability was demonstrated and verified. With the wide diversity of satellite usage, such as for mobile communications, temporary network deployment or post-launch positioning accommodations, the proposed method provides a direct evaluation of satellite communication performance at the service level, in conjunction with using high frequency spot beam antennas, which are highly susceptible to pointing gain. This can facilitate swift and flexible satellite service planning and deployment for operators.
Design of a new type synchronous focusing mechanism
NASA Astrophysics Data System (ADS)
Zhang, Jintao; Tan, Ruijun; Chen, Zhou; Zhang, Yongqi; Fu, Panlong; Qu, Yachen
2018-05-01
This paper addresses a dual-channel telescopic imaging system composed of an infrared imaging system, a low-light-level imaging system, and an image fusion module. In the fusion of low-light-level and infrared images, clear source images make it markedly easier to obtain high-definition fused images. Since the target is imaged at distances from 15 m to infinity, focusing is needed to ensure the imaging quality of the dual-channel imaging system; therefore, a new type of synchronous focusing mechanism is designed. The mechanism realizes the focusing function through synchronous translation of the imaging devices, mainly comprising a screw-rod-and-nut structure, a shaft-hole fit structure, and a spring-and-steel-ball backlash-elimination structure. Starting from the synchronous focusing function of the two imaging devices, the structural characteristics of the mechanism are introduced in detail, and the focusing range is analyzed. Experimental results show that the synchronous focusing mechanism has an ingenious design, high focusing accuracy, and stable, reliable operation.
Failure and penetration response of borosilicate glass during short-rod impact
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, C. E. Jr.; Orphal, D. L.; Behner, Th.
2007-12-12
The failure characterization of brittle materials like glass is of fundamental importance in describing the penetration resistance against projectiles. A critical question is whether this failure front remains 'steady' after the driving stress is removed. A test series with short gold rods (D = 1 mm, L/D ≈ 5-11) impacting borosilicate glass at ~1 to 2 km/s was carried out to investigate this question. The reverse ballistic method was used for the experiments, and the impact and penetration process was observed simultaneously with five flash X-rays and a 16-frame high-speed optical camera. Very high measurement accuracy was established to ensure reliable results. Results show that the failure front induced by rod impact and penetration does arrest (ceases to propagate) after the rod is totally eroded inside the glass. The impact of a second rod after a short time delay reinitiates the failure front at about the same speed.
Huang, Shuai; Li, Jing; Ye, Jieping; Fleisher, Adam; Chen, Kewei; Wu, Teresa; Reiman, Eric
2013-06-01
Structure learning of Bayesian Networks (BNs) is an important topic in machine learning. Driven by modern applications in genetics and brain sciences, accurate and efficient learning of large-scale BN structures from high-dimensional data becomes a challenging problem. To tackle this challenge, we propose a Sparse Bayesian Network (SBN) structure learning algorithm that employs a novel formulation involving one L1-norm penalty term to impose sparsity and another penalty term to ensure that the learned BN is a Directed Acyclic Graph--a required property of BNs. Through both theoretical analysis and extensive experiments on 11 moderate and large benchmark networks with various sample sizes, we show that SBN leads to improved learning accuracy, scalability, and efficiency as compared with 10 existing popular BN learning algorithms. We apply SBN to a real-world application of brain connectivity modeling for Alzheimer's disease (AD) and reveal findings that could lead to advancements in AD research.
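The DAG requirement interacts with the sparsity penalty in any score-based structure learner. As an illustrative sketch only (greedy edge selection with a reachability-based cycle check; this is not SBN's actual L1-regularized continuous optimization, and the edge scores below are synthetic):

```python
def creates_cycle(adj, u, v):
    """Would adding edge u -> v create a cycle? True iff v already reaches u."""
    stack, seen = [v], set()
    while stack:
        node = stack.pop()
        if node == u:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(adj.get(node, ()))
    return False

def greedy_sparse_structure(scores, n, penalty=1.0):
    """Greedy sketch: add candidate edges in decreasing score order while each
    edge's score exceeds an L1-style penalty and the graph stays acyclic."""
    adj = {i: [] for i in range(n)}
    edges = sorted(((s, u, v) for (u, v), s in scores.items()), reverse=True)
    for s, u, v in edges:
        if s > penalty and not creates_cycle(adj, u, v):
            adj[u].append(v)
    return adj
```

The cycle check is what enforces the "Directed Acyclic Graph" property the abstract refers to; the penalty threshold plays the role of the sparsity term by pruning weak edges.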
Fast rail corrugation detection based on texture filtering
NASA Astrophysics Data System (ADS)
Xiao, Jie; Lu, Kaixia
2018-02-01
Condition monitoring of rails in high-speed railways is one of the important means of ensuring the safety of railway transportation. In order to replace traditional manual inspection, save manpower and material resources, and improve detection speed and accuracy, it is of great significance to develop a machine vision system for locating and identifying defects on rails automatically. Rail defects exhibit different properties and are divided into various categories related to the type and position of flaws on the rail. Several interrelated factors cause rail defects, such as the type of rail, construction conditions, and the speed and/or frequency of trains using the rail. Rail corrugation is a particular kind of defect that produces an undulatory deformation on the rail heads. In high-speed trains, corrugation induces harmful vibrations in the wheels and their components and reduces the lifetime of rails. This type of defect should be detected to avoid rail fractures. In this paper, a novel method for fast rail corrugation detection based on texture filtering is proposed.
Lugauer, Felix; Wetzl, Jens; Forman, Christoph; Schneider, Manuel; Kiefer, Berthold; Hornegger, Joachim; Nickel, Dominik; Maier, Andreas
2018-06-01
Our aim was to develop and validate a 3D Cartesian Look-Locker T1 mapping technique that achieves high accuracy and whole-liver coverage within a single breath-hold. The proposed method combines sparse Cartesian sampling based on a spatiotemporally incoherent Poisson pattern and k-space segmentation, dedicated to high-temporal-resolution imaging. This combination allows capturing tissue with short relaxation times with volumetric coverage. A joint reconstruction of the 3D + inversion time (TI) data via compressed sensing exploits the spatiotemporal sparsity and ensures consistent quality for the subsequent multistep T1 mapping. Data from the National Institute of Standards and Technology (NIST) phantom and 11 volunteers, along with reference 2D Look-Locker acquisitions, are used for validation. 2D and 3D methods are compared based on T1 values in different abdominal tissues at 1.5 and 3 T. T1 maps obtained from the proposed 3D method compare favorably with those from the 2D reference and additionally allow for reformatting or volumetric analysis. Excellent agreement is shown in phantom data (T1 bias < 2% and < 5% at T1 values of 120 and 2000 ms, respectively) and volunteer data (3D and 2D deviation < 4% for liver, muscle, and spleen) for clinically acceptable scan (20 s) and reconstruction times (< 4 min). Whole-liver T1 mapping with high accuracy and precision is feasible in one breath-hold using spatiotemporally incoherent, sparse 3D Cartesian sampling.
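Look-Locker mapping of this kind fits, per voxel, the recovery model S(TI) = A - B·exp(-TI/T1*) and applies the standard correction T1 = T1*·(B/A - 1). A minimal per-voxel sketch (grid-search fit; the grid range and the simulated signal are assumptions, not the paper's compressed-sensing reconstruction):

```python
import numpy as np

def fit_t1_look_locker(ti, signal, t1star_grid=None):
    """Fit S(TI) = A - B*exp(-TI/T1star) by grid search over T1star, with a
    linear least-squares solve for (A, B) at each candidate, then apply the
    Look-Locker correction T1 = T1star * (B/A - 1)."""
    ti = np.asarray(ti, float)
    signal = np.asarray(signal, float)
    if t1star_grid is None:
        t1star_grid = np.linspace(50.0, 3000.0, 2951)  # 1 ms steps
    best_r, best = np.inf, None
    for t1s in t1star_grid:
        basis = np.column_stack([np.ones_like(ti), -np.exp(-ti / t1s)])
        sol, *_ = np.linalg.lstsq(basis, signal, rcond=None)
        r = float(np.sum((basis @ sol - signal) ** 2))
        if r < best_r:
            best_r, best = r, (sol[0], sol[1], t1s)
    a, b, t1s = best
    return t1s * (b / a - 1.0)
```

For noise-free simulated data with A = 1, B = 1.8, and T1* = 800 ms, the fit recovers the corrected T1 of 640 ms; real reconstructions would run this over every voxel of the jointly reconstructed 3D + TI volume.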
Possibilities of CT Scanning as Analysis Method in Laser Additive Manufacturing
NASA Astrophysics Data System (ADS)
Karme, Aleksis; Kallonen, Aki; Matilainen, Ville-Pekka; Piili, Heidi; Salminen, Antti
Laser additive manufacturing (LAM) is an established and constantly developing technique. Structural assessment should be a key component in ensuring directed evolution towards a higher level of manufacturing. The macroscopic properties of metallic structures are determined by their internal microscopic features, which are difficult to assess with conventional surface-measuring methodologies. X-ray microtomography (CT) is a promising technique for three-dimensional, non-destructive probing of the internal composition and build of various materials. The aim of this study is to define the possibilities of using CT scanning as a quality control method for LAM-fabricated parts. Since parts fabricated with LAM are very often used in applications demanding high quality and accuracy in various industries, such as medical and aerospace, it is important to be able to define the accuracy of the built parts. Tubular stainless steel test specimens were 3D modelled, manufactured with modified research AM equipment and imaged after manufacturing with a high-power, high-resolution CT scanner. 3D properties, such as surface texture and the amount and distribution of internal pores, were also evaluated in this study. Surface roughness was higher on the interior wall of the tube, and deviation from the model was systematically directed towards the central axis. The pore distribution showed clear organization and divided into two populations: one following the polygon model seams along both rims, and the other associated with the concentric and equidistant movement path of the laser. Assessment of samples can enhance fabrication by guiding the improvement of both the modelling and the manufacturing process.
Pilcher, Janine; Shirtcliffe, Philippa; Patel, Mitesh; McKinstry, Steve; Cripps, Terrianne; Weatherall, Mark; Beasley, Richard
2015-01-01
Electronic monitoring of inhaled asthma therapy is suggested as the 'gold standard' for measuring patterns of medication use in clinical trials. The SmartTurbo (Adherium (NZ) Ltd, Auckland, New Zealand) is an electronic monitor for use with a turbuhaler device (AstraZeneca, UK). The aim of this study was to determine the accuracy of the SmartTurbo in recording Symbicort actuations over a 12-week period of use. Twenty SmartTurbo monitors were attached to the base of 20 Symbicort turbuhalers. Bench testing in a research facility was undertaken on days 0, 5, 6, 7, 8, 9, 14, 21, 28, 56 and 84. Patterns of 'low-use' (2 sets of 2 actuations on the same day) and 'high-use' (2 sets of 8 actuations on the same day) were performed. The date and time of actuations were recorded in a paper diary and compared with data uploaded from the SmartTurbo monitors. 2800 actuations were performed. Monitor sensitivity was 99.9% with a lower 97.5% confidence bound of 99.6%. The positive predictive value was 99.9% with a 97.5% lower confidence bound of 99.7%. Accuracy was not affected by whether the pattern of inhaler use was low or high, or whether there was a delay in uploading the actuation data. The SmartTurbo monitor is highly accurate in recording and retaining electronic data in this 12-week bench study. It can be recommended for use in clinical trial settings, in which quality control systems are incorporated into study protocols to ensure accurate data acquisition.
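The abstract does not state which interval method produced the reported 97.5% lower confidence bounds. As a hedged illustration, the sketch below computes a Wilson score lower bound, assuming 2797 of the 2800 bench actuations were detected (a hypothetical count consistent with the reported 99.9% sensitivity); it lands close to the reported 99.6%.

```python
import math

def wilson_lower(successes, n, z=1.96):
    """One-sided lower confidence bound for a binomial proportion
    (Wilson score interval; z = 1.96 gives a 97.5% lower bound)."""
    p = successes / n
    denom = 1 + z * z / n
    centre = p + z * z / (2 * n)
    margin = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (centre - margin) / denom

# Hypothetical counts consistent with the reported 99.9% sensitivity
lower = wilson_lower(2797, 2800)
print(f"sensitivity lower bound: {lower:.4f}")  # close to the reported 99.6%
```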
Dimitriadis, Stavros I; Salis, Christos; Linden, David
2018-04-01
Limitations of the manual scoring of polysomnograms, which include data from electroencephalogram (EEG), electro-oculogram (EOG), electrocardiogram (ECG) and electromyogram (EMG) channels have long been recognized. Manual staging is resource intensive and time consuming, and thus considerable effort must be spent to ensure inter-rater reliability. As a result, there is a great interest in techniques based on signal processing and machine learning for a completely Automatic Sleep Stage Classification (ASSC). In this paper, we present a single-EEG-sensor ASSC technique based on the dynamic reconfiguration of different aspects of cross-frequency coupling (CFC) estimated between predefined frequency pairs over 5 s epoch lengths. The proposed analytic scheme is demonstrated using the PhysioNet Sleep European Data Format (EDF) Database with repeat recordings from 20 healthy young adults. We validate our methodology in a second sleep dataset. We achieved very high classification sensitivity, specificity and accuracy of 96.2 ± 2.2%, 94.2 ± 2.3%, and 94.4 ± 2.2% across 20 folds, respectively, and also a high mean F1 score (92%, range 90-94%) when a multi-class Naive Bayes classifier was applied. High classification performance has been achieved also in the second sleep dataset. Our method outperformed the accuracy of previous studies not only on different datasets but also on the same database. Single-sensor ASSC makes the entire methodology appropriate for longitudinal monitoring using wearable EEG in real-world and laboratory-oriented environments. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.
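The study's exact CFC estimators are not given in the abstract. As an illustration of phase-amplitude coupling, one aspect of CFC, the sketch below computes a mean-vector-length coupling measure on synthetic signals, with the analytic signal obtained via an FFT-based Hilbert transform. All frequencies and names are illustrative, not taken from the paper.

```python
import numpy as np

def analytic_signal(x):
    """FFT-based Hilbert transform: keep DC, double positive
    frequencies, zero out negative ones."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(np.fft.fft(x) * h)

def mean_vector_length(phase_signal, amp_signal):
    """Phase-amplitude coupling strength: length of the
    amplitude-weighted mean phase vector, normalised to [0, 1]."""
    phase = np.angle(analytic_signal(phase_signal))
    amp = np.abs(analytic_signal(amp_signal))
    return np.abs(np.sum(amp * np.exp(1j * phase))) / np.sum(amp)

t = np.arange(0, 4, 1 / 256)                  # 4 s at 256 Hz
slow = np.cos(2 * np.pi * 6 * t)              # 6 Hz phase driver
coupled = 0.5 * (1 + np.cos(2 * np.pi * 6 * t)) * np.cos(2 * np.pi * 40 * t)
uncoupled = 0.5 * np.cos(2 * np.pi * 40 * t)  # constant-amplitude 40 Hz

mvl_coupled = mean_vector_length(slow, coupled)
mvl_uncoupled = mean_vector_length(slow, uncoupled)
print(mvl_coupled, mvl_uncoupled)  # clearly nonzero vs. near zero
```

In real EEG the two signals would first be band-pass filtered around the frequency pair of interest; the synthetic example skips that step.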
Radiometric and geometric assessment of data from the RapidEye constellation of satellites
Chander, Gyanesh; Haque, Md. Obaidul; Sampath, Aparajithan; Brunn, A.; Trosset, G.; Hoffmann, D.; Roloff, S.; Thiele, M.; Anderson, C.
2013-01-01
To monitor land surface processes over a wide range of temporal and spatial scales, it is critical to have coordinated observations of the Earth's surface using imagery acquired from multiple spaceborne imaging sensors. The RapidEye (RE) satellite constellation acquires high-resolution satellite images covering the entire globe within a very short period of time by sensors identical in construction and cross-calibrated to each other. To evaluate the RE high-resolution Multi-spectral Imager (MSI) sensor capabilities, a cross-comparison between the RE constellation of sensors was performed first using image statistics based on large common areas observed over pseudo-invariant calibration sites (PICS) by the sensors and, second, by comparing the on-orbit radiometric calibration temporal trending over a large number of calibration sites. For any spectral band, the individual responses measured by the five satellites of the RE constellation were found to differ <2–3% from the average constellation response depending on the method used for evaluation. Geometric assessment was also performed to study the positional accuracy and relative band-to-band (B2B) alignment of the image data sets. The position accuracy was assessed by comparing the RE imagery against high-resolution aerial imagery, while the B2B characterization was performed by registering each band against every other band to ensure that the proper band alignment is provided for an image product. The B2B results indicate that the internal alignments of these five RE bands are in agreement, with bands typically registered to within 0.25 pixels of each other or better.
Yoshizawa, Shin; Matsuura, Keiko; Takagi, Ryo; Yamamoto, Mariko; Umemura, Shin-Ichiro
2016-01-01
A noninvasive technique to monitor thermal lesion formation is necessary to ensure the accuracy and safety of high-intensity focused ultrasound (HIFU) treatment. The purpose of this study is to ultrasonically detect the tissue change due to thermal coagulation in HIFU treatment enhanced by cavitation microbubbles. An ultrasound imaging probe transmitted plane waves at a center frequency of 4.5 MHz. Ultrasonic radio-frequency (RF) echo signals during HIFU exposure at a frequency of 1.2 MHz were acquired. Cross-correlation coefficients were calculated between the in-phase and quadrature (IQ) data of two B-mode images with interval times of 50 and 500 ms for estimating the regions of cavitation and coagulation, respectively. Pathological examination of the coagulated tissue was also performed for comparison with the corresponding ultrasonically detected coagulation region. The distribution of the minimum-hold cross-correlation coefficient between two sets of IQ data at 50-ms intervals was compared with a pulse inversion (PI) image. The regions with low cross-correlation coefficients approximately corresponded to those with high brightness in the PI image. The regions with low cross-correlation coefficients at 500-ms intervals showed good agreement with those with significant change in histology. The results show that the regions of coagulation and cavitation can be ultrasonically detected as those with low cross-correlation coefficients between RF frames at certain intervals. This method will contribute to improving the safety and accuracy of HIFU treatment enhanced by cavitation microbubbles.
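The decorrelation measure at the heart of this method can be sketched minimally (synthetic speckle, hypothetical frame size): the normalised cross-correlation coefficient between two complex IQ frames stays near 1 where the tissue is unchanged and drops where the scatterer configuration has changed, e.g. due to cavitation bubbles or coagulation.

```python
import numpy as np

def iq_correlation(frame_a, frame_b):
    """Normalised cross-correlation coefficient between two complex IQ
    frames (or local windows): ~1 for unchanged speckle, ~0 when the
    scatterer configuration has changed."""
    num = np.abs(np.sum(frame_a * np.conj(frame_b)))
    den = np.sqrt(np.sum(np.abs(frame_a) ** 2) * np.sum(np.abs(frame_b) ** 2))
    return num / den

rng = np.random.default_rng(0)

def speckle_frame():
    return rng.standard_normal((32, 32)) + 1j * rng.standard_normal((32, 32))

frame = speckle_frame()
static = iq_correlation(frame, frame)                          # unchanged tissue
noisy = iq_correlation(frame, frame + 0.05 * speckle_frame())  # electronic noise
changed = iq_correlation(frame, speckle_frame())               # fully decorrelated

print(static, noisy, changed)  # ~1.0, just below 1.0, near 0
```

In practice the coefficient is computed per sliding window, so low-coefficient regions form a spatial map of cavitation or coagulation.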
Film and digital periapical radiographs for the measurement of apical root shortening.
El-Angbawi, Ahmed M F; McIntyre, Grant T; Bearn, David R; Thomson, Donald J
2012-12-01
The aim of this study was to compare the accuracy and agreement of scanned film and digital periapical radiographs for the measurement of apical root shortening. Twenty-four film and digital [phosphor plate sensor (PPS)] periapical radiographs were taken using the long-cone paralleling technique for six extracted teeth before and after 1 mm of apical root trimming. All teeth were mounted using a typodont, and the radiographs were recorded using a film holder and a polysiloxane occlusal index for each tooth to ensure standardization across the different radiographic exposures. The film radiographs were scanned, and the tooth length measurements for the scanned film and digital (PPS) images were calculated using Image-J-Link 1.4 software (http://rebweb.nih.gov/ij/index.html) for the two groups. The accuracy and agreement between the tooth length measurements from each group and the true tooth length measurements were calculated using intra-class correlation (ICC) tests and Bland and Altman plots. A high level of agreement was found between the true tooth length measurements and both the scanned film measurements (ICC = 0.979, limits of agreement 0.579 to -0.565) and the digital (PPS) radiograph measurements (ICC = 0.979, limits of agreement 0.596 to -0.763). Moreover, a high level of agreement was found between the scanned film and digital (PPS) radiographs for the measurement of tooth length (ICC = 0.991, limits of agreement 0.411 to -0.231). Film and digital (PPS) periapical radiographs are accurate methods for measuring apical root shortening, with a high level of agreement. Key words: Root shortening, measurement, periapical radiographs, film, digital.
Customer and household matching: resolving entity identity in data warehouses
NASA Astrophysics Data System (ADS)
Berndt, Donald J.; Satterfield, Ronald K.
2000-04-01
The data preparation and cleansing tasks necessary to ensure high quality data are among the most difficult challenges faced in data warehousing and data mining projects. The extraction of source data, transformation into new forms, and loading into a data warehouse environment are all time consuming tasks that can be supported by methodologies and tools. This paper focuses on the problem of record linkage or entity matching, tasks that can be very important in providing high quality data. Merging two or more large databases into a single integrated system is a difficult problem in many industries, especially in the wake of acquisitions. For example, managing customer lists can be challenging when duplicate entries, data entry problems, and changing information conspire to make data quality an elusive target. Common tasks with regard to customer lists include customer matching to reduce duplicate entries and household matching to group customers. These often O(n²) problems can consume significant resources, both in computing infrastructure and human oversight, and the goal of high accuracy in the final integrated database can be difficult to assure. This paper distinguishes between attribute corruption and entity corruption, discussing the various impacts on quality. A metajoin operator is proposed and used to organize past and current entity matching techniques. Finally, a logistic regression approach to implementing the metajoin operator is discussed and illustrated with an example. The metajoin can be used to determine whether two records match, don't match, or require further evaluation by human experts. Properly implemented, the metajoin operator could allow the integration of individual databases with greater accuracy and lower cost.
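A minimal sketch of the three-way metajoin decision follows. The logistic weights, field names, and thresholds are hypothetical stand-ins for coefficients that would in practice be fit by logistic regression on labelled record pairs.

```python
import math

# Hypothetical feature weights for a record pair: similarity of name,
# address and phone fields, each scored in [0, 1]. Real weights would
# be estimated from labelled match/non-match pairs.
WEIGHTS = {"name": 4.0, "address": 3.0, "phone": 2.0}
BIAS = -5.0

def match_probability(sim):
    """Logistic model: probability that two records refer to one entity."""
    z = BIAS + sum(WEIGHTS[k] * sim[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def metajoin(sim, lo=0.2, hi=0.9):
    """Three-way decision: confident match, confident non-match,
    or route the ambiguous pair to a human expert."""
    p = match_probability(sim)
    if p >= hi:
        return "match"
    if p <= lo:
        return "non-match"
    return "review"

print(metajoin({"name": 0.95, "address": 0.9, "phone": 1.0}))  # match
print(metajoin({"name": 0.1, "address": 0.2, "phone": 0.0}))   # non-match
print(metajoin({"name": 0.8, "address": 0.3, "phone": 0.5}))   # review
```

The middle band routed to "review" is what keeps human oversight focused on genuinely ambiguous pairs rather than the whole O(n²) comparison space.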
NASA Astrophysics Data System (ADS)
Harms, Justin D.; Bachmann, Charles M.; Ambeau, Brittany L.; Faulring, Jason W.; Ruiz Torres, Andres J.; Badura, Gregory; Myers, Emily
2017-10-01
Field-portable goniometers are created for a wide variety of applications. Many of these applications require specific types of instruments and measurement schemes and must operate in challenging environments. Therefore, designs are based on the requirements that are specific to the application. We present a field-portable goniometer that was designed for measuring the hemispherical-conical reflectance factor (HCRF) of various soils and low-growing vegetation in austere coastal and desert environments and biconical reflectance factors in laboratory settings. Unlike some goniometers, this system features a requirement for "target-plane tracking" to ensure that measurements can be collected on sloped surfaces, without compromising angular accuracy. The system also features a second upward-looking spectrometer to measure the spatially dependent incoming illumination, an integrated software package to provide full automation, an automated leveling system to ensure a standard frame of reference, a design that minimizes the obscuration due to self-shading to measure the opposition effect, and the ability to record a digital elevation model of the target region. This fully automated and highly mobile system obtains accurate and precise measurements of HCRF in a wide variety of terrain and in less time than most other systems while not sacrificing consistency or repeatability in laboratory environments.
Absorbance and fluorometric sensing with capillary wells microplates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Han Yen; Cheong, Brandon Huey-Ping; Neild, Adrian
2010-12-15
Detection and readout from small volume assays in microplates are a challenge. The capillary wells microplate approach [Ng et al., Appl. Phys. Lett. 93, 174105 (2008)] offers strong advantages in small liquid volume management. An adapted design is described and shown here to be able to detect, in a nonimaging manner, fluorescence and absorbance assays without the error often associated with the meniscus forming at the air-liquid interface. The presence of bubbles in liquid samples residing in microplate wells can cause inaccuracies. Pipetting errors, if not adequately managed, can result in misleading data and wrong interpretations of assay results, particularly in the context of high throughput screening. We show that the adapted design is also able to detect bubbles and pipetting errors during actual assay runs to ensure accuracy in screening.
Integer programming model for optimizing bus timetable using genetic algorithm
NASA Astrophysics Data System (ADS)
Wihartiko, F. D.; Buono, A.; Silalahi, B. P.
2017-01-01
A bus timetable provides passengers with information that ensures the availability of bus services. The optimal timetable condition occurs when bus trip frequency can adapt to passenger demand: in peak periods, the number of bus trips should be larger than in off-peak periods. If the number of bus trips is more frequent than the optimal condition, the operating cost for the bus operator is high; conversely, if the number of trips is less than the optimal condition, service quality for passengers is poor. In this paper, the bus timetabling problem is solved by an integer programming model with a modified genetic algorithm. The modifications are placed in the chromosome design, the initial population recovery technique, chromosome reconstruction, and chromosome extermination at a specific generation. This model gives the optimal solution with an accuracy of 99.1%.
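The paper's specific chromosome design and GA modifications are not detailed in the abstract; the following is only a generic sketch of the underlying idea, with an integer chromosome of trips per time slot and a fitness that trades operator cost against unmet passenger demand. All constants are invented for illustration.

```python
import random

random.seed(1)

DEMAND = [4, 10, 6, 3, 9, 5]  # required trips per time slot (peak / off-peak)
MAX_TRIPS, COST_PER_TRIP, SHORTFALL_PENALTY = 12, 1.0, 5.0

def fitness(chrom):
    """Higher is better: penalise both operating cost and unmet demand."""
    cost = COST_PER_TRIP * sum(chrom)
    shortfall = sum(max(d - c, 0) for d, c in zip(DEMAND, chrom))
    return -(cost + SHORTFALL_PENALTY * shortfall)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.2):
    return [random.randint(0, MAX_TRIPS) if random.random() < rate else g
            for g in chrom]

# Simple elitist GA: keep the best 10, refill by crossover + mutation
pop = [[random.randint(0, MAX_TRIPS) for _ in DEMAND] for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(20)]

best = max(pop, key=fitness)
print(best, fitness(best))  # trips per slot should track DEMAND
```

With these penalties the optimum is exactly the demand profile (extra trips cost money, shortfalls cost more), so the evolved chromosome should converge towards DEMAND.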
The information capacity of hypercycles.
Silvestre, Daniel A M M; Fontanari, José F
2008-10-21
Hypercycles are information integration systems which are thought to overcome the information crisis of prebiotic evolution by ensuring the coexistence of several short templates. For imperfect template replication, we derive a simple expression for the maximum number of distinct templates n_m that can coexist in a hypercycle and show that it is a decreasing function of the length L of the templates. In the case of high replication accuracy we find that the product n_m L tends to a constant value, thus limiting the information content of the hypercycle. Template coexistence is achieved either as a stationary equilibrium (stable fixed point) or a stable periodic orbit in which the total concentration of functional templates is nonzero. For the hypercycle system studied here we find numerical evidence that the existence of an unstable fixed point is a necessary condition for the presence of periodic orbits.
Development of an ultrasonic weld inspection system based on image processing and neural networks
NASA Astrophysics Data System (ADS)
Roca Barceló, Fernando; Jaén del Hierro, Pedro; Ribes Llario, Fran; Real Herráiz, Julia
2018-04-01
Several types of discontinuities and defects may be present in a weld, leading to a considerable reduction of its resistance. Therefore, ensuring high welding quality and reliability has become a matter of key importance for many construction and industrial activities. Among the non-destructive weld testing and inspection techniques, time-of-flight diffraction (TOFD) stands out as a very safe (no ionising radiation), precise, reliable and versatile practice. However, this technique presents a relevant drawback associated with the appearance of speckle noise, which should be addressed. In this regard, this paper presents a new, intelligent and automatic method for weld inspection and analysis based on TOFD, image processing and neural networks. The developed system is capable of detecting weld defects and imperfections with accuracy and classifying them into different categories.
Cross-Coupled Control for All-Terrain Rovers
Reina, Giulio
2013-01-01
Mobile robots are increasingly being used in challenging outdoor environments for applications that include construction, mining, agriculture, military and planetary exploration. In order to accomplish the planned task, it is critical that the motion control system ensure accuracy and robustness. The achievement of high performance on rough terrain is tightly connected with the minimization of vehicle-terrain dynamics effects such as slipping and skidding. This paper presents a cross-coupled controller for a 4-wheel-drive/4-wheel-steer robot, which optimizes the wheel motors' control algorithm to reduce synchronization errors that would otherwise result in wheel slip with conventional controllers. Experimental results, obtained with an all-terrain rover operating on agricultural terrain, are presented to validate the system. It is shown that the proposed approach is effective in reducing slippage and vehicle posture errors. PMID:23299625
NASA Technical Reports Server (NTRS)
Crisp, David
2011-01-01
Space-based remote sensing observations hold substantial promise for future long-term monitoring of CO2 and other greenhouse gases. The principal advantages of space-based measurements include: (1) spatial coverage (especially over oceans and tropical land) and (2) sampling density (needed to resolve CO2 weather). The principal challenge is the need for high precision. To reach their full potential, space-based CO2 measurements must be validated against surface measurements to ensure their accuracy; the TCCON network is providing the transfer standard. There is a need for a long-term vision to establish and address community priorities: (1) it must incorporate ground, air and space-based assets and models, and (2) it must balance calls for new observations with the need to maintain climate data records.
Tick, David; Satici, Aykut C; Shen, Jinglin; Gans, Nicholas
2013-08-01
This paper presents a novel navigation and control system for autonomous mobile robots that includes path planning, localization, and control. A unique vision-based pose and velocity estimation scheme utilizing both the continuous and discrete forms of the Euclidean homography matrix is fused with inertial and optical encoder measurements to estimate the position, orientation, and velocity of the robot and ensure accurate localization and control signals. A depth estimation system is integrated in order to overcome the loss of scale inherent in vision-based estimation. A path following control system is introduced that is capable of guiding the robot along a designated curve. Stability analysis is provided for the control system, and experimental results are presented demonstrating that the combined localization and control system performs with high accuracy.
Characterisation of imperial college reactor centre legacy waste using gamma-ray spectrometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shuhaimi, Alif Imran Mohd
Waste characterisation is a principal component of a waste management strategy. Characterisation includes identification of the chemical, physical and radiochemical parameters of radioactive waste. Failure to determine specific waste properties may result in sentencing waste packages that are not compliant with the regulations for long-term storage or disposal. This project involved measurement of the intensity and energy of gamma photons that may be emitted by radioactive waste generated during decommissioning of the Imperial College Reactor Centre (ICRC). The measurements used a High Purity Germanium (HPGe) gamma-ray detector and ISOTOPIC-32 V4.1 as the analyser. To ensure the measurements provide reliable results, two quality control (QC) measurements using different matrices were conducted. The results from the QC measurements were used to determine the accuracy of the ISOTOPIC software.
Very Long Baseline Interferometry: Dependencies on Frequency Stability
NASA Astrophysics Data System (ADS)
Nothnagel, Axel; Nilsson, Tobias; Schuh, Harald
2018-04-01
Very Long Baseline Interferometry (VLBI) is a differential technique observing the radiation of compact extra-galactic radio sources with pairs of radio telescopes. For these observations, the frequency standards at the telescopes need to have very high stability. In this article we discuss why this is, and we investigate exactly how precise the frequency standards need to be. Four areas where good clock performance is needed are considered: coherence, geodetic parameter estimation, correlator synchronization, and UT1 determination. We show that in order to ensure the highest accuracy of VLBI, stability similar to that of a hydrogen maser is needed for time-scales up to a few hours. In the article, we consider both traditional VLBI, where extra-galactic radio sources are observed, and the observation of man-made artificial radio sources emitted by satellites or spacecraft.
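The coherence requirement can be illustrated with a standard rule of thumb from radio interferometry: for fractional frequency stability σ_y(τ) (Allan deviation), the rms phase error accumulated over an integration of length τ at observing frequency ν is roughly φ ≈ 2πνσ_y(τ)τ, and the coherence factor is approximately exp(−φ²/2). The numbers below are illustrative, not taken from the article.

```python
import math

def coherence_factor(nu_obs_hz, sigma_y, tau_s):
    """Rule-of-thumb coherence estimate for an integration of tau_s seconds:
    rms phase error phi ~ 2*pi*nu*sigma_y(tau)*tau radians, coherence
    ~ exp(-phi**2 / 2). Only indicative; the exact value depends on the
    oscillator noise type."""
    phi = 2 * math.pi * nu_obs_hz * sigma_y * tau_s
    return math.exp(-phi ** 2 / 2)

NU = 8.4e9  # X-band observing frequency in Hz (illustrative)

c_maser = coherence_factor(NU, 1e-15, 100)   # hydrogen-maser-like stability
c_quartz = coherence_factor(NU, 1e-12, 100)  # good crystal oscillator

print(c_maser)   # ~1: negligible coherence loss over 100 s
print(c_quartz)  # far below 1: phase wanders by several radians
```

This is why maser-class stability is needed: at GHz observing frequencies even a 10^-12 fractional instability destroys coherence within a typical scan.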
Increasing leaf vein density by mutagenesis: laying the foundations for C4 rice.
Feldman, Aryo B; Murchie, Erik H; Leung, Hei; Baraoidan, Marietta; Coe, Robert; Yu, Su-May; Lo, Shuen-Fang; Quick, William P
2014-01-01
A high leaf vein density is both an essential feature of C4 photosynthesis and a foundation trait to C4 evolution, ensuring the optimal proportion and proximity of mesophyll and bundle sheath cells for permitting the rapid exchange of photosynthates. Two rice mutant populations, a deletion mutant library with a cv. IR64 background (12,470 lines) and a T-DNA insertion mutant library with a cv. Tainung 67 background (10,830 lines), were screened for increases in vein density. A high throughput method with handheld microscopes was developed and its accuracy was supported by more rigorous microscopy analysis. Eight lines with significantly increased leaf vein densities were identified to be used as genetic stock for the global C4 Rice Consortium. The candidate population was shown to include both shared and independent mutations and so more than one gene controlled the high vein density phenotype. The high vein density trait was found to be linked to a narrow leaf width trait but the linkage was incomplete. The more genetically robust narrow leaf width trait was proposed to be used as a reliable phenotypic marker for finding high vein density variants in rice in future screens.
NASA Astrophysics Data System (ADS)
Chi, Yuxi; Yu, Liping; Pan, Bing
2018-05-01
A low-cost, portable, robust and high-resolution single-camera stereo-digital image correlation (stereo-DIC) system for accurate surface three-dimensional (3D) shape and deformation measurements is described. This system adopts a single consumer-grade high-resolution digital Single Lens Reflex (SLR) camera and a four-mirror adaptor, rather than two synchronized industrial digital cameras, for stereo image acquisition. In addition, monochromatic blue light illumination and coupled bandpass filter imaging are integrated to ensure the robustness of the system against ambient light variations. In contrast to conventional binocular stereo-DIC systems, the developed pseudo-stereo-DIC system offers the advantages of low cost, portability, robustness against ambient light variations, and high resolution. The accuracy and precision of the developed single SLR camera-based stereo-DIC system were validated by measuring the 3D shape of a stationary sphere along with in-plane and out-of-plane displacements of a translated planar plate. Application of the established system to thermal deformation measurement of an alumina ceramic plate and a stainless-steel plate subjected to radiation heating was also demonstrated.
Shunt flow evaluation in congenital heart disease based on two-dimensional speckle tracking.
Fadnes, Solveig; Nyrnes, Siri Ann; Torp, Hans; Lovstakken, Lasse
2014-10-01
High-frame-rate ultrasound speckle tracking was used for quantification of peak velocity in shunt flows resulting from septal defects in congenital heart disease. In a duplex acquisition scheme implemented on a research scanner, unfocused transmit beams and full parallel receive beamforming were used to achieve a frame rate of 107 frames/s for full field-of-view flow images with high accuracy, while also ensuring high-quality focused B-mode tissue imaging. The setup was evaluated in vivo for neonates with atrial and ventricular septal defects. The shunt position was automatically tracked in B-mode images and further used in blood speckle tracking to obtain calibrated shunt flow velocities throughout the cardiac cycle. Validation against color flow imaging and pulsed wave Doppler with manual angle correction indicated that blood speckle tracking could provide accurate estimates of shunt flow velocities. The approach was less biased by clutter filtering than color flow imaging and was able to provide velocity estimates beyond the Nyquist range. Possible placements of sample volumes (and angle corrections) for conventional Doppler resulted in peak shunt velocity variations of 0.49-0.56 m/s for the ventricular septal defect of patient 1 and 0.38-0.58 m/s for the atrial septal defect of patient 2. In comparison, the peak velocities found from speckle tracking were 0.77 and 0.33 m/s for patients 1 and 2, respectively. The results indicated that complex intraventricular flow velocity patterns could be quantified using high-frame-rate speckle tracking of both blood and tissue movement. This could potentially help increase diagnostic accuracy and decrease inter-observer variability when measuring peak velocity in shunt flows. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
New Criteria for Assessing the Accuracy of Blood Glucose Monitors Meeting, October 28, 2011
Walsh, John; Roberts, Ruth; Vigersky, Robert A.; Schwartz, Frank
2012-01-01
Glucose meters (GMs) are routinely used for self-monitoring of blood glucose by patients and for point-of-care glucose monitoring by health care providers in outpatient and inpatient settings. Although widely assumed to be accurate, numerous reports of inaccuracies with resulting morbidity and mortality have been noted. Insulin dosing errors based on inaccurate GMs are most critical. On October 28, 2011, the Diabetes Technology Society invited 45 diabetes technology clinicians who were attending the 2011 Diabetes Technology Meeting to participate in a closed-door meeting entitled New Criteria for Assessing the Accuracy of Blood Glucose Monitors. This report reflects the opinions of most of the attendees of that meeting. The Food and Drug Administration (FDA), the public, and several medical societies are currently in dialogue to establish a new standard for GM accuracy. This update to the FDA standard is driven by improved meter accuracy, technological advances (pumps, bolus calculators, continuous glucose monitors, and insulin pens), reports of hospital and outpatient deaths, consumer complaints about inaccuracy, and research studies showing that several approved GMs failed to meet FDA or International Organization for Standardization standards in post-approval testing. These circumstances mandate a set of new GM standards that appropriately match the GMs’ analytical accuracy to the clinical accuracy required for their intended use, as well as ensuring their ongoing accuracy following approval. The attendees of the New Criteria for Assessing the Accuracy of Blood Glucose Monitors meeting proposed a graduated standard and other methods to improve GM performance, which are discussed in this meeting report. PMID:22538160
A small-angle x-ray scattering system with a vertical layout.
Wang, Zhen; Chen, Xiaowei; Meng, Lingpu; Cui, Kunpeng; Wu, Lihui; Li, Liangbin
2014-12-01
A small-angle x-ray scattering (SAXS) system with a vertical layout (V-SAXS) has been designed and constructed for in situ detection of nanostructures, and is well suited to in situ studies of nanoparticle self-assembly at liquid interfaces and of polymer processing. A steel-tower frame on a reinforced basement serves as the supporting skeleton for the scattering beam path and detector platform, giving the system high working stability and operating accuracy. A micro-focus x-ray source, combining a parabolic three-dimensional multilayer mirror with a scatterless collimation system, provides a highly parallel beam that allows detection at very small angles. With a sample-to-detector distance of 7 m, the largest measurable length scale is 420 nm in real space. The large sample zone accommodates different experimental setups, such as a film-stretching machine, making the system well suited to following the evolution of microstructures in materials during processing. The capability of the V-SAXS for in situ study is demonstrated with a drying experiment on a free latex droplet, which confirms the initial design.
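The quoted 420 nm maximum length scale follows from the relation d = 2π/q_min, where q_min is set by the smallest scattering angle the collimation resolves. A hedged sketch of that geometry, assuming a Cu Kα wavelength of 0.154 nm and a hypothetical smallest resolvable radius r_min on the detector (neither value is stated in the abstract):

```python
import math

def max_length_scale(distance_m, r_min_mm, wavelength_nm=0.154):
    """Largest measurable real-space length d = 2*pi/q_min, where
    q = (4*pi/lambda) * sin(theta) and 2*theta ~= r_min / distance.
    wavelength_nm and r_min_mm are illustrative assumptions."""
    two_theta = (r_min_mm * 1e-3) / distance_m          # rad, small-angle
    q_min = (4 * math.pi / wavelength_nm) * math.sin(two_theta / 2)
    return 2 * math.pi / q_min                          # nm
```

With a 7 m flight path, a beam stop shadow of roughly 2.6 mm radius would reproduce a length scale of about 420 nm, consistent in order of magnitude with the reported figure.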
A highly accurate boundary integral equation method for surfactant-laden drops in 3D
NASA Astrophysics Data System (ADS)
Sorgentone, Chiara; Tornberg, Anna-Karin
2018-05-01
The presence of surfactants alters the dynamics of viscous drops immersed in an ambient viscous fluid. This is especially true at small scales, such as in applications of droplet-based microfluidics, where the interface dynamics become increasingly important. At such small scales, viscous forces dominate and inertial effects are often negligible. Considering Stokes flow, a numerical method based on a boundary integral formulation is presented for simulating 3D drops covered by an insoluble surfactant. The method is able to simulate drops with different viscosities and close interactions, automatically controlling the time step size and maintaining high accuracy even when substantial drop deformation appears. To achieve this, the drop surfaces as well as the surfactant concentration on each surface are represented by spherical harmonics expansions. A novel reparameterization method is introduced to ensure a high-quality representation of the drops even under deformation, specialized quadrature methods are employed for the singular and nearly singular integrals that appear in the formulation, and the adaptive time stepping scheme for the coupled drop and surfactant evolution is designed with a preconditioned implicit treatment of the surfactant diffusion.
Noncontact blood perfusion mapping in clinical applications
NASA Astrophysics Data System (ADS)
Iakovlev, Dmitry; Dwyer, Vincent; Hu, Sijung; Silberschmidt, Vadim
2016-04-01
Non-contact imaging photoplethysmography (iPPG) to detect pulsatile blood microcirculation in tissue has been selected as a successor to the low-spatial-resolution, slow-scanning blood perfusion techniques currently employed by clinicians. The proposed iPPG system employs a novel illumination source constructed of multiple high-power LEDs with narrow spectral emission, which are temporally modulated and synchronised with a high-performance sCMOS sensor. To ensure spectrum stability and prevent thermal wavelength drift due to junction temperature variations, each LED features a custom-designed thermal management system to effectively dissipate generated heat and auto-adjust current flow. The use of a multi-wavelength approach has resulted in simultaneous microvascular perfusion monitoring at various tissue depths, which is an added benefit for specific clinical applications. A synchronous detection algorithm to extract weak photoplethysmographic pulse-waveforms demonstrated robustness and high efficiency when applied to even small regions of 5 mm2. The experimental results showed evidence that the proposed system can achieve noticeable accuracy in blood perfusion monitoring by creating complex amplitude and phase maps for the tissue under examination.
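The synchronous detection described above multiplies the modulated signal by in-phase and quadrature references and low-pass filters the products, as in a lock-in amplifier. A minimal numerical sketch of this principle (not the authors' implementation):

```python
import numpy as np

def lockin_amplitude(signal, fs, f_ref):
    """Recover the amplitude of the component at f_ref by synchronous
    detection: multiply by quadrature references, then average (the
    averaging plays the role of the lock-in's low-pass filter)."""
    t = np.arange(len(signal)) / fs
    i = signal * np.cos(2 * np.pi * f_ref * t)   # in-phase product
    q = signal * np.sin(2 * np.pi * f_ref * t)   # quadrature product
    return 2.0 * np.hypot(i.mean(), q.mean())
```

Over an integer number of reference cycles the cross terms average to zero, which is why the weak pulse-waveform at the modulation frequency survives while uncorrelated ambient light is rejected.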
Interactive-rate Motion Planning for Concentric Tube Robots.
Torres, Luis G; Baykal, Cenk; Alterovitz, Ron
2014-05-01
Concentric tube robots may enable new, safer minimally invasive surgical procedures by moving along curved paths to reach difficult-to-reach sites in a patient's anatomy. Operating these devices is challenging due to their complex, unintuitive kinematics and the need to avoid sensitive structures in the anatomy. In this paper, we present a motion planning method that computes collision-free motion plans for concentric tube robots at interactive rates. Our method's high speed enables a user to continuously and freely move the robot's tip while the motion planner ensures that the robot's shaft does not collide with any anatomical obstacles. Our approach uses a highly accurate mechanical model of tube interactions, which is important since small movements of the tip position may require large changes in the shape of the device's shaft. Our motion planner achieves its high speed and accuracy by combining offline precomputation of a collision-free roadmap with online position control. We demonstrate our interactive planner in a simulated neurosurgical scenario where a user guides the robot's tip through the environment while the robot automatically avoids collisions with the anatomical obstacles.
Hunt, Andrew P; Bach, Aaron J E; Borg, David N; Costello, Joseph T; Stewart, Ian B
2017-01-01
An accurate measure of core body temperature is critical for monitoring individuals, groups and teams undertaking physical activity in situations of high heat stress or prolonged cold exposure. This study examined the range in systematic bias of ingestible temperature sensors compared to a certified and traceable reference thermometer. A total of 119 ingestible temperature sensors were immersed in a circulated water bath at five water temperatures (TEMP A: 35.12 ± 0.60°C, TEMP B: 37.33 ± 0.56°C, TEMP C: 39.48 ± 0.73°C, TEMP D: 41.58 ± 0.97°C, and TEMP E: 43.47 ± 1.07°C) along with a certified traceable reference thermometer. Thirteen sensors (10.9%) demonstrated a systematic bias > ±0.1°C, of which 4 (3.3%) were > ±0.5°C. Limits of agreement (95%) indicated that systematic bias would likely fall in the range of -0.14 to 0.26°C, highlighting that it is possible for temperatures measured between sensors to differ by more than 0.4°C. The proportion of sensors with systematic bias > ±0.1°C (10.9%) confirms that ingestible temperature sensors require correction to ensure their accuracy. An individualized linear correction achieved a mean systematic bias of 0.00°C, and limits of agreement (95%) to 0.00-0.00°C, with 100% of sensors achieving ±0.1°C accuracy. Alternatively, a generalized linear function (Corrected Temperature (°C) = 1.00375 × Sensor Temperature (°C) - 0.205549), produced as the average slope and intercept of a sub-set of 51 sensors and excluding sensors with accuracy outside ±0.5°C, reduced the systematic bias to < ±0.1°C in 98.4% of the remaining sensors (n = 64). In conclusion, these data show that using an uncalibrated ingestible temperature sensor may provide inaccurate data that still appears to be statistically, physiologically, and clinically meaningful.
Correction of sensor temperature to a reference thermometer by linear function eliminates this systematic bias (individualized functions) or ensures systematic bias is within ±0.1°C in 98% of the sensors (generalized function).
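The generalized correction reported in the study is a single linear function applied to every sensor. A direct transcription:

```python
def correct_sensor_temp(sensor_c, slope=1.00375, intercept=-0.205549):
    """Generalized linear correction from the study:
    corrected (degrees C) = 1.00375 * sensor - 0.205549.
    An individualized correction would fit slope/intercept per sensor
    against the reference thermometer."""
    return slope * sensor_c + intercept
```

For a raw reading of 37.0°C this yields approximately 36.93°C, i.e. the function removes a small warm bias of the order reported in the limits of agreement.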
Ekins, Kylie; Morphet, Julia
2015-11-01
The Australasian Triage Scale aims to ensure that the triage category allocated reflects the urgency with which the patient needs medical assistance. This is dependent on triage nurse accuracy in decision making. The Australasian Triage Scale also aims to facilitate triage decision consistency between individuals and organisations. Various studies have explored the accuracy and consistency of triage decisions throughout Australia, yet no studies have specifically focussed on triage decision making in rural health services. Further, no standard has been identified by which accuracy or consistency should be measured. Australian emergency departments are measured against a set of standard performance indicators, including time from triage to patient review, and patient length of stay. There are currently no performance indicators for triage consistency. An online questionnaire was developed to collect demographic data and measure triage accuracy and consistency. The questionnaire utilised previously validated triage scenarios (1). Triage decision accuracy was measured, and consistency was compared by health site type using Fleiss' kappa. Forty-six triage nurses participated in this study. The accuracy of participants' triage decision making decreased with each less urgent triage category. Post-graduate qualifications had no bearing on triage accuracy. There was no significant difference in the consistency of decision making between paediatric and adult scenarios. Overall inter-rater agreement, using Fleiss' kappa coefficient, was 0.4, representing a fair-to-good level of inter-rater agreement. A standard definition of accuracy and consistency in triage nurse decision making is required. Inaccurate triage decisions can result in increased morbidity and mortality. It is recommended that emergency department performance indicator thresholds be utilised as a benchmark for national triage consistency. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
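Fleiss' kappa, used above to quantify inter-rater agreement, compares observed per-subject agreement with the agreement expected by chance. A minimal sketch of the statistic (library implementations such as statsmodels' `fleiss_kappa` exist; this is for illustration):

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for an N-subjects x k-categories matrix of rating
    counts; every row must sum to the same number of raters n."""
    counts = np.asarray(counts, dtype=float)
    N = counts.shape[0]
    n = counts[0].sum()                                   # raters per subject
    p_j = counts.sum(axis=0) / (N * n)                    # category proportions
    P_i = (np.square(counts).sum(axis=1) - n) / (n * (n - 1))  # per-subject agreement
    P_bar, Pe_bar = P_i.mean(), np.square(p_j).sum()
    return (P_bar - Pe_bar) / (1 - Pe_bar)
```

Perfect agreement among raters yields kappa = 1; values near 0.4, as reported here, indicate agreement only moderately above chance.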
LLNL Location and Detection Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myers, S C; Harris, D B; Anderson, M L
2003-07-16
We present two LLNL research projects in the topical areas of location and detection. The first project assesses epicenter accuracy using a multiple-event location algorithm, and the second project employs waveform subspace correlation to detect and identify events at Fennoscandian mines. Accurately located seismic events are the basis of location calibration. A well-characterized set of calibration events enables new Earth model development, empirical calibration, and validation of models. In a recent study, Bondar et al. (2003) developed network coverage criteria for assessing the accuracy of event locations that are determined using single-event, linearized inversion methods. These criteria are conservative and are meant for application to large bulletins, where emphasis is on catalog completeness and any given event location may be improved through detailed analysis or application of advanced algorithms. Relative event location techniques are touted as advancements that may improve absolute location accuracy by (1) ensuring an internally consistent dataset, (2) constraining a subset of events to known locations, and (3) taking advantage of station and event correlation structure. Here we present the preliminary phase of this work, in which we use Nevada Test Site (NTS) nuclear explosions with known locations to test the effect of travel-time model accuracy on relative location accuracy. Like previous studies, we find that reference velocity-model accuracy and relative-location accuracy are highly correlated. We also find that metrics based on the travel-time residuals of relocated events are not reliable for assessing either velocity-model or relative-location accuracy. In the topical area of detection, we develop specialized correlation (subspace) detectors for the principal mines surrounding the ARCES station located in the European Arctic.
Our objective is to provide efficient screens for explosions occurring in the mines of the Kola Peninsula (Kovdor, Zapolyarny, Olenogorsk, Khibiny) and the major iron mines of northern Sweden (Malmberget, Kiruna). In excess of 90% of the events detected by the ARCES station are mining explosions, and a significant fraction are from these northern mining groups. The primary challenge in developing waveform correlation detectors is the degree of variation in the source time histories of the shots, which can result in poor correlation among events even in close proximity. Our approach to solving this problem is to use lagged subspace correlation detectors, which offer some prospect of compensating for variation and uncertainty in source time functions.
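A subspace correlation detector projects an incoming waveform onto an orthonormal basis built from template events and thresholds the captured energy fraction. A generic sketch of the detection statistic (not LLNL's lagged implementation):

```python
import numpy as np

def subspace_statistic(x, templates):
    """Fraction of the waveform's energy captured by the subspace spanned
    by the template waveforms (columns of `templates`). Values near 1
    indicate a close match to the template family; thresholding this
    statistic gives the detector."""
    U, _, _ = np.linalg.svd(np.asarray(templates, float), full_matrices=False)
    proj = U.T @ x                       # coordinates in the orthonormal basis
    return float(proj @ proj) / float(x @ x)
```

In practice the statistic is evaluated over sliding windows (and, in the lagged variant, over a range of template lags) so that explosions with varying source time functions still project strongly onto the mine-specific subspace.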
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittauer, K; Yan, G; Lu, B
2014-06-15
Purpose: Optical tracking systems (OTS) are an acceptable alternative to frame-based stereotactic radiotherapy (SRT). However, current surface-based OTS lack the ability to target exclusively rigid/bony anatomical features. We propose a novel marker-based optical tracking goggle system (OTGS) that provides real-time guidance based on the nose/facial bony anatomy. This ongoing study involves the development and characterization of the OTGS for clinical implementation in intracranial stereotactic radiotherapy. Methods: The OTGS consists of eye goggles, a custom thermoplastic nosepiece, and 6 infrared markers pre-attached to the goggles. A phantom and four healthy volunteers were used to evaluate the calibration/registration accuracy, intrafraction accuracy, interfraction reproducibility, and end-to-end accuracy of the OTGS. The performance of the OTGS was compared with that of the frameless SonArray system and cone-beam computed tomography (CBCT) for volunteer and phantom cases, respectively. The performance of the OTGS with commercial immobilization devices and under treatment conditions (i.e., couch rotation and translation range) was also evaluated. Results: The difference in the calibration/registration accuracy of 24 translation or rotation combinations between CBCT and in-house OTS software was within 0.5 mm/0.4°. The mean intrafraction and interfraction accuracy among the volunteers was 0.004 ± 0.4 mm with −0.09 ± 0.5° (n = 6,170) and −0.26 ± 0.8 mm with 0.15 ± 0.8° (n = 11), respectively. The difference in end-to-end accuracy between the OTGS and CBCT was within 1.3 mm/1.1°. The predetermined marker pattern (1) minimized marker occlusions, (2) allowed for continuous tracking for couch angles of ±90°, and (3) eliminated individual marker misplacement. The device was feasible with open and half masks for immobilization. Conclusion: Bony anatomical localization eliminated potential errors due to facial hair changes and/or soft tissue deformation.
The OTGS offers a workflow-friendly, patient-friendly solution for intracranial SRT, while being comparable to other real-time options. The minimal rotation uncertainty of the OTGS can be combined with CBCT to ensure maximum accuracy for high-precision SRT.
Kielar, Kayla N; Mok, Ed; Hsu, Annie; Wang, Lei; Luxton, Gary
2012-10-01
The dosimetric leaf gap (DLG) in the Varian Eclipse treatment planning system is determined during commissioning and is used to model the effect of the rounded leaf-end of the multileaf collimator (MLC). This parameter attempts to model the physical difference between the radiation and light field and account for inherent leakage between leaf tips. With the increased use of single fraction high dose treatments requiring larger monitor units comes an enhanced concern in the accuracy of leakage calculations, as it accounts for much of the patient dose. This study serves to verify the dosimetric accuracy of the algorithm used to model the rounded leaf effect for the TrueBeam STx, and describes a methodology for determining best-practice parameter values, given the novel capabilities of the linear accelerator such as flattening filter free (FFF) treatments and a high definition MLC (HDMLC). During commissioning, the nominal MLC position was verified and the DLG parameter was determined using MLC-defined field sizes and moving gap tests, as is common in clinical testing. Treatment plans were created, and the DLG was optimized to achieve less than 1% difference between measured and calculated dose. The DLG value found was tested on treatment plans for all energies (6 MV, 10 MV, 15 MV, 6 MV FFF, 10 MV FFF) and modalities (3D conventional, IMRT, conformal arc, VMAT) available on the TrueBeam STx. The DLG parameter found during the initial MLC testing did not match the leaf gap modeling parameter that provided the most accurate dose delivery in clinical treatment plans. Using the physical leaf gap size as the DLG for the HDMLC can lead to 5% differences in measured and calculated doses. Separate optimization of the DLG parameter using end-to-end tests must be performed to ensure dosimetric accuracy in the modeling of the rounded leaf ends for the Eclipse treatment planning system. 
The difference in leaf gap modeling versus physical leaf gap dimensions is more pronounced in the more recent versions of Eclipse for both the HDMLC and the Millennium MLC. Once properly commissioned and tested using a methodology based on treatment plan verification, Eclipse is able to accurately model radiation dose delivered for SBRT treatments using the TrueBeam STx.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Abhijit; Voter, Arthur
2009-01-01
We develop a variation of the temperature accelerated dynamics (TAD) method, called the p-TAD method, that efficiently generates an on-the-fly kinetic Monte Carlo (KMC) process catalog with control over the accuracy of the catalog. It is assumed that transition state theory is valid. The p-TAD method guarantees that processes relevant at the timescales of interest to the simulation are present in the catalog with a chosen confidence. A confidence measure associated with the process catalog is derived. The dynamics is then studied using the process catalog with the KMC method. The effective accuracy of a p-TAD calculation is derived for the case when a KMC catalog is reused for conditions different from those for which the catalog was originally generated. Different KMC catalog generation strategies that exploit the features of the p-TAD method and ensure higher accuracy and/or computational efficiency are presented. The accuracy and the computational requirements of the p-TAD method are assessed, and comparisons to the original TAD method are made. As an example, we study dynamics in sub-monolayer Ag/Cu(110) at the time scale of seconds using the p-TAD method. It is demonstrated that the p-TAD method overcomes several challenges plaguing the conventional KMC method.
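Once the process catalog exists, the dynamics is advanced with standard rejection-free KMC: pick a process with probability proportional to its rate and draw the waiting time from an exponential with the total rate. A generic sketch of one such step (independent of the p-TAD specifics):

```python
import math
import random

def kmc_step(rates, rng=random.random):
    """One rejection-free KMC step: select process i with probability
    rate_i / total, and draw the waiting time dt ~ Exp(total)."""
    total = sum(rates)
    r = rng() * total
    acc = 0.0
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            break
    # 1 - rng() avoids log(0) since rng() lies in [0, 1)
    dt = -math.log(1.0 - rng()) / total
    return i, dt
```

In p-TAD the rate list would come from the confidence-controlled catalog, with the guarantee that processes relevant at the simulated timescale are present.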
Interpolation methods and the accuracy of lattice-Boltzmann mesh refinement
Guzik, Stephen M.; Weisgraber, Todd H.; Colella, Phillip; ...
2013-12-10
A lattice-Boltzmann model to solve the equivalent of the Navier-Stokes equations on adaptively refined grids is presented. A method for transferring information across interfaces between different grid resolutions was developed following established techniques for finite-volume representations. This new approach relies on a space-time interpolation and solving constrained least-squares problems to ensure conservation. The effectiveness of this method at maintaining the second-order accuracy of lattice-Boltzmann is demonstrated through a series of benchmark simulations and detailed mesh refinement studies. These results exhibit smaller solution errors and improved convergence when compared with similar approaches relying only on spatial interpolation. Examples highlighting the mesh adaptivity of this method are also provided.
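The conservation requirement described above amounts to an equality-constrained least-squares problem: fit the interpolant while forcing the conserved quantities to match exactly. A generic solver via the KKT system (the paper's specific interpolation operators are not reproduced here):

```python
import numpy as np

def constrained_lstsq(A, b, C, d):
    """Solve min ||Ax - b||^2 subject to Cx = d by assembling the KKT
    system [[2A^T A, C^T], [C, 0]] [x; lam] = [2A^T b; d].
    Assumes the KKT matrix is nonsingular (C full row rank, A^T A
    positive definite on the null space of C)."""
    n, m = A.shape[1], C.shape[0]
    K = np.block([[2 * A.T @ A, C.T],
                  [C, np.zeros((m, m))]])
    rhs = np.concatenate([2 * A.T @ b, d])
    return np.linalg.solve(K, rhs)[:n]
```

Here the rows of C would encode the conservation sums across the coarse-fine interface, so the interpolated fluxes honor them exactly while the remaining degrees of freedom fit the space-time data in the least-squares sense.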
Absolute Gravity Datum in the Age of Cold Atom Gravimeters
NASA Astrophysics Data System (ADS)
Childers, V. A.; Eckl, M. C.
2014-12-01
The international gravity datum is defined today by the International Gravity Standardization Net of 1971 (IGSN-71). The data supporting this network were measured in the 1950s and 60s using pendulum and spring-based gravimeter ties (plus some new ballistic absolute meters) to replace the prior protocol of referencing all gravity values to the earlier Potsdam value. Since then, gravimeter technology has advanced significantly with the development and refinement of the FG-5 (the current standard of the industry) and again with the soon-to-be-available cold atom interferometric absolute gravimeters. This latest development is anticipated to provide an improvement of roughly two orders of magnitude over the measurement accuracy of the technology used to develop IGSN-71. In this presentation, we will explore how IGSN-71 might best be "modernized" given today's requirements and available instruments and resources. The National Geodetic Survey (NGS), along with other relevant US Government agencies, is concerned with establishing gravity control to establish and maintain high-order geodetic networks as part of the nation's essential infrastructure. The need to modernize the nation's geodetic infrastructure was highlighted in "Precise Geodetic Infrastructure: National Requirements for a Shared Resource," National Academy of Sciences, 2010. The NGS mission, as dictated by Congress, is to establish and maintain the National Spatial Reference System, which includes gravity measurements. Absolute gravimeters measure the total gravity field directly and do not involve ties to other measurements. Periodic "intercomparisons" of multiple absolute gravimeters at reference gravity sites are used to constrain the behavior of the instruments to ensure that each would yield reasonably similar measurements at the same location (i.e., yield a sufficiently consistent datum when measured in disparate locales).
New atomic interferometric gravimeters promise a significant increase in accuracy. Our presentation will also explore the impact of such an instrument on our theory of how to constrain the gravity datum and on how to ensure stability, repeatability, and reproducibility across different absolute gravimeter systems.
Unlocking the potential of small unmanned aircraft systems (sUAS) for Earth observation
NASA Astrophysics Data System (ADS)
Hugenholtz, C.; Riddell, K.; Barchyn, T. E.
2012-12-01
Small unmanned aircraft systems (sUAS, < 25 kg) are emerging as a viable alternative to conventional remote sensing platforms for Earth observation (EO). sUAS technology affords greater control, lower cost, and flexibility for scientists, and provides new opportunities to match the scale of sUAS data to the scale of the geophysical phenomenon under investigation. Although a mechanism is in place to make sUAS available to researchers and other non-military users through the US Federal Aviation Administration's Modernization and Reform Act of 2012 (FAAMRA), there are many regulatory hurdles before they are fully accepted and integrated into the National Airspace System. In this talk we will provide a brief overview of the regulatory landscape for sUAS, both in the USA and in Canada, where sUAS regulations are more flexible. We critically outline potential advantages and disadvantages of sUAS for EO applications under current and potential regulations. We find advantages: relatively low cost, potentially high temporal resolution, rapidly improving technology, and operational flexibility. We also find disadvantages: limited temporal and spatial extent, limited accuracy assessment and methodological development, and an immature regulatory landscape. From a case study we show an example of the accuracy of a photogrammetrically-derived digital terrain map (DTM) from sUAS imagery. We also compare the sUAS DTM to a LiDAR DTM. Our results suggest that sUAS-acquired imagery may provide a low-cost, rapid, and flexible alternative to airborne LiDAR. Overall, we are encouraged about the potential of sUAS for geophysical measurements; however, understanding and compliance with regulations is paramount to ensure that research is conducted legally and responsibly. Because UAS are new outside of military operations, we hope researchers will proceed carefully to ensure this great scientific opportunity remains a long term tool.
NASA's global differential GPS system and the TDRSS augmentation service for satellites
NASA Technical Reports Server (NTRS)
Bar-Sever, Yoaz; Young, Larry; Stocklin, Frank; Rush, John
2004-01-01
NASA is planning to launch a new service for Earth satellites providing them with precise GPS differential corrections and other ancillary information enabling decimeter level orbit determination accuracy, and nanosecond time-transfer accuracy, onboard, in real-time. The TDRSS Augmentation Service for Satellites (TASS) will broadcast its message on the S-band multiple access channel of NASA's Tracking and Data Relay Satellite System (TDRSS). The satellite's phased array antenna has been configured to provide a wide beam, extending coverage up to 1000 km altitude over the poles. Global coverage will be ensured with broadcast from three or more TDRSS satellites. The GPS differential corrections are provided by the NASA Global Differential GPS (GDGPS) System, developed and operated by NASA's Jet Propulsion Laboratory. The GDGPS System employs a global ground network of more than 70 GPS receivers to monitor the GPS constellation in real time. The system provides real-time estimates of the GPS satellite states, as well as many other real-time products such as differential corrections, global ionospheric maps, and integrity monitoring. The unique multiply redundant architecture of the GDGPS System ensures very high reliability, with 99.999% demonstrated since the inception of the system in early 2000. The estimated real time GPS orbit and clock states provided by the GDGPS system are accurate to better than 20 cm 3D RMS, and have been demonstrated to support sub-decimeter real time positioning and orbit determination for a variety of terrestrial, airborne, and spaceborne applications. In addition to the GPS differential corrections, TASS will provide real-time Earth orientation and solar flux information that enable precise onboard knowledge of the Earth-fixed position of the spacecraft, and precise orbit prediction and planning capabilities.
TASS will also provide 5-second alarms for GPS integrity failures, based on the unique GPS integrity monitoring service of the GDGPS System.
An Introduction to the Global Space-based Inter-Calibration System from a EUMETSAT Perspective
NASA Astrophysics Data System (ADS)
Wagner, S. C.; Hewison, T.; Roebeling, R. A.; Koenig, M.; Schulz, J.; Miu, P.
2012-04-01
The Global Space-based Inter-Calibration System (GSICS) (Goldberg et al. 2011) is an international collaborative effort which aims to monitor, improve and harmonize the quality of observations from operational weather and environmental satellites of the Global Observing System (GOS). GSICS aims at ensuring consistent accuracy among space-based observations worldwide for climate monitoring, weather forecasting, and environmental applications. This is achieved through a comprehensive calibration strategy, which involves monitoring instrument performances, operational inter-calibration of satellite instruments, tying the measurements to absolute references and standards, and recalibration of archived data. A major part of this strategy involves direct comparison of collocated observations from pairs of satellite instruments, which are used to systematically generate calibration functions to compare and correct the calibration of monitored instruments to references. These GSICS Corrections are needed for accurately integrating data from multiple observing systems into both near real-time and re-analysis products, applications and services. This paper gives more insight into the activities carried out by EUMETSAT as a GSICS Processing and Research Centre. Currently these are closely bound to the in-house development and operational implementation of calibration methods for solar and thermal band channels of geostationary and polar-orbiting satellites. They include inter-calibration corrections for Meteosat imagers using reference instruments such as the Moderate Resolution Imaging Spectroradiometer (MODIS) on-board the Aqua satellite for solar band channels, the Infrared Atmospheric Sounding Interferometer (IASI) on-board Metop-A and, for historic archive data, the High-resolution InfraRed Sounder (HIRS). Additionally, bias monitoring is routinely performed, allowing users to visualise the calibration accuracy of the instruments in near real-time.
These activities are based on principles and protocols defined by the GSICS Research Working Group and Data Management Working Group, which require assessment of the calibration uncertainties to ensure the traceability to community references.
How do schizophrenia patients use visual information to decode facial emotion?
Lee, Junghee; Gosselin, Frédéric; Wynn, Jonathan K; Green, Michael F
2011-09-01
Impairment in recognizing facial emotions is a prominent feature of schizophrenia patients, but the underlying mechanism of this impairment remains unclear. This study investigated the specific aspects of visual information that are critical for schizophrenia patients to recognize emotional expression. Using the Bubbles technique, we probed the use of visual information during a facial emotion discrimination task (fear vs. happy) in 21 schizophrenia patients and 17 healthy controls. Visual information was sampled through randomly located Gaussian apertures (or "bubbles") at 5 spatial frequency scales. Online calibration of the amount of face exposed through bubbles was used to ensure 75% overall accuracy for each subject. Least-square multiple linear regression analyses between sampled information and accuracy were performed to identify critical visual information that was used to identify emotional expression. To accurately identify emotional expression, schizophrenia patients required more exposure of facial areas (i.e., more bubbles) compared with healthy controls. To identify fearful faces, schizophrenia patients relied less on bilateral eye regions at high-spatial frequency compared with healthy controls. For identification of happy faces, schizophrenia patients relied on the mouth and eye regions; healthy controls did not utilize eyes and used the mouth much less than patients did. Schizophrenia patients needed more facial information to recognize emotional expression of faces. In addition, patients differed from controls in their use of high-spatial frequency information from eye regions to identify fearful faces. This study provides direct evidence that schizophrenia patients employ an atypical strategy of using visual information to recognize emotional faces.
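The Bubbles technique samples a face through Gaussian apertures at random locations. A simplified sketch of mask generation, assuming a single aperture scale (the study used five spatial-frequency scales and online calibration of the number of bubbles to hold accuracy at 75%):

```python
import numpy as np

def bubbles_mask(shape, n_bubbles, sigma, rng=None):
    """Random 'Bubbles' sampling mask: a sum of Gaussian apertures at
    uniformly random locations, clipped to [0, 1]. The stimulus shown to
    the subject is face * mask, so only the regions under the bubbles
    carry usable visual information."""
    rng = np.random.default_rng(rng)
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = np.zeros(shape)
    for cy, cx in rng.uniform(0, [h, w], size=(n_bubbles, 2)):
        mask += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(mask, 0.0, 1.0)
```

Regressing trial-by-trial accuracy against which pixels were revealed (as in the study's least-squares analysis) then identifies the facial regions diagnostic for each emotion.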
Effects of voxelization on dose volume histogram accuracy
NASA Astrophysics Data System (ADS)
Sunderland, Kyle; Pinter, Csaba; Lasso, Andras; Fichtinger, Gabor
2016-03-01
PURPOSE: In radiotherapy treatment planning systems, structures of interest such as targets and organs at risk are stored as 2D contours on evenly spaced planes. In order to be used in various algorithms, contours must be converted into binary labelmap volumes using voxelization. The voxelization process results in lost information, which has little effect on the volume of large structures but significant impact on small structures, which contain few voxels. Volume differences for segmented structures affect metrics such as dose volume histograms (DVH), which are used for treatment planning. Our goal is to evaluate the impact of voxelization on segmented structures, as well as how factors like voxel size affect metrics such as DVH. METHODS: We create a series of implicit functions, which represent simulated structures. These structures are sampled at varying resolutions and compared to labelmaps with high sub-millimeter resolutions. We generate DVH and evaluate voxelization error for the same structures at different resolutions by calculating the agreement acceptance percentage between the DVH. RESULTS: We implemented tools for analysis as modules in the SlicerRT toolkit based on the 3D Slicer platform. We found large DVH variations from the baseline for small structures and for structures located in regions with a high dose gradient, potentially leading to the creation of suboptimal treatment plans. CONCLUSION: This work demonstrates that labelmap and dose volume voxel size is an important factor in DVH accuracy, which must be accounted for in order to ensure the development of accurate treatment plans.
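A cumulative DVH reports, for each dose level, the fraction of a structure's voxels receiving at least that dose; voxelization error enters through the binary mask that selects those voxels. A minimal sketch (not the SlicerRT implementation):

```python
import numpy as np

def cumulative_dvh(dose, mask, bins=100):
    """Cumulative DVH for one structure: at each dose level, the fraction
    of the structure's voxels (mask > 0) receiving at least that dose.
    Coarser voxelization changes which voxels enter `mask`, shifting the
    curve, especially for small structures in high dose gradients."""
    d = dose[mask > 0]
    levels = np.linspace(0.0, d.max(), bins)
    volume = np.array([(d >= lv).mean() for lv in levels])
    return levels, volume
```

Comparing curves computed from the same dose grid but masks voxelized at different resolutions reproduces the kind of agreement test described in the methods.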
Calibration of GPS based high accuracy speed meter for vehicles
NASA Astrophysics Data System (ADS)
Bai, Yin; Sun, Qiao; Du, Lei; Yu, Mei; Bai, Jie
2015-02-01
A GPS based high accuracy speed meter for vehicles is a special type of GPS speed meter that uses Doppler demodulation of GPS signals to calculate the speed of a moving target. It is increasingly used as reference equipment in the field of traffic speed measurement, but acknowledged standard calibration methods are still lacking. To solve this problem, this paper presents set-ups for simulated calibration, field test signal replay calibration, and in-field comparison with an optical sensor based non-contact speed meter. All the experiments were carried out at particular speed values in the range of (40-180) km/h with the same GPS speed meter. The speed measurement errors of simulated calibration fall in the range of +/-0.1 km/h or +/-0.1%, with uncertainties smaller than 0.02% (k=2). The errors of replay calibration fall in the range of +/-0.1%, with uncertainties smaller than 0.10% (k=2). The calibration results demonstrate the effectiveness of the two methods. The relative deviations of the GPS speed meter from the optical sensor based non-contact speed meter fall in the range of +/-0.3%, which validates the use of GPS speed meters as reference instruments. The results of this research can provide a technical basis for the establishment of internationally standardized calibration methods for GPS speed meters, and thus help ensure their legal status as reference equipment in the field of traffic speed metrology.
Forward collision warning based on kernelized correlation filters
NASA Astrophysics Data System (ADS)
Pu, Jinchuan; Liu, Jun; Zhao, Yong
2017-07-01
A vehicle detection and tracking system is one of the indispensable tools for reducing traffic accidents, and the nearest vehicle poses the greatest potential hazard. This paper therefore focuses on the nearest vehicle in the region of interest (ROI). Such a system must be accurate, real-time, and intelligent. We built a system that combines the kernelized correlation filter (KCF) tracking algorithm with a Haar-AdaBoost detection algorithm. The KCF algorithm reduces computation time and increases speed through cyclic shifts and diagonalization, satisfying the real-time requirement; Haar features offer the same advantages of simple operation and high speed for detection. Combining the two algorithms yields an obvious improvement in system running rate compared with previous work. The detection result of the Haar-AdaBoost classifier provides the initial value for the KCF algorithm, which removes the KCF flaw of requiring manual car marking in the initial phase and makes the system more scientific and more intelligent. Haar detection and KCF tracking with Histogram of Oriented Gradients (HOG) features ensure the accuracy of the system. We evaluated the performance of the framework on a self-collected dataset. The experimental results demonstrate that the proposed method is robust and real-time. The algorithm adapts effectively to illumination variation; even at night it meets the detection and tracking requirements, which is an improvement over previous work.
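The speed of KCF comes from the fact that correlation with all cyclic shifts of a template diagonalizes in the Fourier domain. A minimal 1-D sketch of that trick, with a synthetic signal rather than real image patches or the full kernelized/HOG machinery, looks like this:

```python
import numpy as np

def circular_correlation_peak(template, frame):
    """Locate the shift of `frame` relative to `template` via circular
    cross-correlation computed in the Fourier domain -- the circulant-
    matrix diagonalization that makes KCF fast."""
    F_t = np.fft.fft(template)
    F_f = np.fft.fft(frame)
    corr = np.real(np.fft.ifft(np.conj(F_t) * F_f))
    return int(np.argmax(corr))

signal = np.zeros(64)
signal[10:20] = 1.0                   # a simple 1-D "target"
shifted = np.roll(signal, 7)          # target moved by 7 samples
estimated_shift = circular_correlation_peak(signal, shifted)
```

Evaluating all shifts costs one forward and one inverse FFT instead of an explicit sliding-window search, which is why the tracker meets the real-time requirement.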
NASA Astrophysics Data System (ADS)
Tonkin, T. N.; Midgley, N. G.; Graham, D. J.; Labadz, J. C.
2014-12-01
Novel topographic survey methods that integrate both structure-from-motion (SfM) photogrammetry and small unmanned aircraft systems (sUAS) are a rapidly evolving investigative technique. Due to the diverse range of survey configurations available and the infancy of these new methods, further research is required. Here, the accuracy, precision and potential applications of this approach are investigated. A total of 543 images of the Cwm Idwal moraine-mound complex were captured from a light (< 5 kg) semi-autonomous multi-rotor unmanned aircraft system using a consumer-grade 18 MP compact digital camera. The images were used to produce a DSM (digital surface model) of the moraines. The DSM is in good agreement with 7761 total station survey points providing a total vertical RMSE value of 0.517 m and vertical RMSE values as low as 0.200 m for less densely vegetated areas of the DSM. High-precision topographic data can be acquired rapidly using this technique with the resulting DSMs and orthorectified aerial imagery at sub-decimetre resolutions. Positional errors on the total station dataset, vegetation and steep terrain are identified as the causes of vertical disagreement. Whilst this aerial survey approach is advocated for use in a range of geomorphological settings, care must be taken to ensure that adequate ground control is applied to give a high degree of accuracy.
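The vertical RMSE figures quoted above come from comparing DSM elevations against check-point elevations. A minimal sketch of that computation, using invented check-point values rather than the study's survey data:

```python
import numpy as np

def vertical_rmse(dsm_z, survey_z):
    """Vertical RMSE between DSM elevations and check-point elevations (m)."""
    resid = np.asarray(dsm_z, float) - np.asarray(survey_z, float)
    return float(np.sqrt(np.mean(resid ** 2)))

# Hypothetical check points: DSM heights vs. total-station heights (metres)
dsm = [12.31, 12.95, 13.42, 14.08]
survey = [12.10, 13.05, 13.60, 13.90]
rmse = vertical_rmse(dsm, survey)
```

Computing this separately for vegetated and unvegetated subsets of the check points reproduces the kind of stratified error reporting used in the study (0.517 m overall vs. 0.200 m in sparsely vegetated areas).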
Integrated Maintenance Information System (IMIS): A Maintenance Information Delivery Concept.
1987-11-01
Figure 2. Portable Maintenance Computer Concept. ...provide advice for difficult fault-isolation problems. The technician will be able to accomplish... faced with an ever-growing number of paper-based technical orders (TOs). This has greatly increased costs and distribution problems. In addition, it has... compounded problems associated with ensuring accurate data and the lengthy correction times involved. To improve the accuracy of technical data and...
Naito, H K
1989-03-01
We have reached the dawn of a new era in the detection, evaluation, treatment, and monitoring of individuals with elevated blood cholesterol levels who are at increased risk for CHD. The NHLBI's National Cholesterol Education Program will be the major force underlying this national awareness program, which depends on clinical laboratories providing reliable data. Precision, or reproducibility of results, is not a problem for most laboratories, but accuracy is a major concern. Both manufacturers and laboratorians need to standardize the measurement of cholesterol so that the accuracy base is traceable to the NCCLS NRS/CHOL. Manufacturers need to adopt a uniform policy that ensures that the values assigned to calibration, quality control, and quality assurance or survey materials are accurate and traceable to the NCCLS NRS/CHOL. Since, at present, these materials have some limitations caused by matrix effects, laboratories are encouraged to use the CDC-NHLBI National Reference Laboratory Network to evaluate and monitor their ability to measure patient blood cholesterol levels accurately. Major areas of analytical problems are identified, and general as well as specific recommendations are provided to help ensure reliable measurement of cholesterol in patient specimens.
Kelley, A E; Will, M J; Steininger, T L; Zhang, M; Haber, S N
2003-11-01
Brain opioid peptide systems are known to play an important role in motivation, emotion, attachment behaviour, the response to stress and pain, and the control of food intake. Opioid peptides within the ventral striatum are thought to play a key role in the latter function, regulating the affective response to highly palatable, energy-dense foods such as those containing fat and sugar. It has been shown previously that stimulation of mu opiate receptors within the ventral striatum increases intake of palatable food. In the present study, we examined enkephalin peptide gene expression within the striatum in rats that had been given restricted daily access to an energy-dense, palatable liquid food, chocolate Ensure(R). Rats maintained on an ad libitum diet of rat chow and water were given 3-h access to Ensure(R) daily for two weeks. One day following the end of this period, preproenkephalin gene expression was measured with quantitative in situ hybridization. Compared with control animals, rats that had been exposed to Ensure(R) had significantly reduced enkephalin gene expression in several striatal regions including the ventral striatum (nucleus accumbens), a finding that was confirmed in a different group with Northern blot analysis. Rats fed this regimen of Ensure(R) did not differ in weight from controls. In contrast to chronic Ensure(R), acute ingestion of Ensure(R) did not appear to affect enkephalin peptide gene expression. These results suggest that repeated consumption of a highly rewarding, energy-dense food induces neuroadaptations in cognitive-motivational circuits.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Able, Charles M., E-mail: cable@wfubmc.edu; Bright, Megan; Frizzell, Bart
Purpose: Statistical process control (SPC) is a quality control method used to ensure that a process is well controlled and operates with little variation. This study determined whether SPC was a viable technique for evaluating the proper operation of a high-dose-rate (HDR) brachytherapy treatment delivery system. Methods and Materials: A surrogate prostate patient was developed using Vyse ordnance gelatin. A total of 10 metal oxide semiconductor field-effect transistors (MOSFETs) were placed from prostate base to apex. Computed tomography guidance was used to accurately position the first detector in each train at the base. The plan consisted of 12 needles with 129 dwell positions delivering a prescribed peripheral dose of 200 cGy. Sixteen accurate treatment trials were delivered as planned. Subsequently, a number of treatments were delivered with errors introduced, including wrong patient, wrong source calibration, wrong connection sequence, single needle displaced inferiorly 5 mm, and entire implant displaced 2 mm and 4 mm inferiorly. Two process behavior charts (PBC), an individual and a moving range chart, were developed for each dosimeter location. Results: There were 4 false positives resulting from 160 measurements from 16 accurately delivered treatments. For the inaccurately delivered treatments, the PBC indicated that measurements made at the periphery and apex (regions of high-dose gradient) were much more sensitive to treatment delivery errors. All errors introduced were correctly identified by either the individual or the moving range PBC in the apex region. Measurements at the urethra and base were less sensitive to errors. Conclusions: SPC is a viable method for assessing the quality of HDR treatment delivery. Further development is necessary to determine the most effective dose sampling, to ensure reproducible evaluation of treatment delivery accuracy.
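The individual and moving-range (I-MR) process behavior charts used in this study follow standard SPC formulas. A minimal sketch, with invented dosimeter readings rather than the study's MOSFET data:

```python
import numpy as np

def imr_limits(baseline):
    """Control limits for individual (I) and moving-range (MR) process
    behavior charts. 2.66 = 3/d2 (d2 = 1.128) and 3.267 = D4 are the
    standard SPC constants for moving ranges of size 2."""
    x = np.asarray(baseline, float)
    mr_bar = np.abs(np.diff(x)).mean()
    center = x.mean()
    return {"i_ucl": center + 2.66 * mr_bar,
            "i_lcl": center - 2.66 * mr_bar,
            "mr_ucl": 3.267 * mr_bar}

def out_of_control(readings, limits):
    """Indices of readings outside the individual-chart limits."""
    return [i for i, v in enumerate(readings)
            if v > limits["i_ucl"] or v < limits["i_lcl"]]

# Hypothetical MOSFET doses (cGy) from accurately delivered treatments
baseline = [200, 201, 199, 200, 202, 198, 200, 201, 199, 200]
limits = imr_limits(baseline)
flagged = out_of_control([200, 230], limits)   # 230 cGy simulates an error
```

A reading pushed well outside the limits derived from the accurate trials is flagged, which is how the charts detect the introduced delivery errors at the high-gradient apex and periphery.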
A three-dimensional wide-angle BPM for optical waveguide structures.
Ma, Changbao; Van Keuren, Edward
2007-01-22
Algorithms for effective modeling of optical propagation in three- dimensional waveguide structures are critical for the design of photonic devices. We present a three-dimensional (3-D) wide-angle beam propagation method (WA-BPM) using Hoekstra's scheme. A sparse matrix algebraic equation is formed and solved using iterative methods. The applicability, accuracy and effectiveness of our method are demonstrated by applying it to simulations of wide-angle beam propagation, along with a technique for shifting the simulation window to reduce the dimension of the numerical equation and a threshold technique to further ensure its convergence. These techniques can ensure the implementation of iterative methods for waveguide structures by relaxing the convergence problem, which will further enable us to develop higher-order 3-D WA-BPMs based on Padé approximant operators.
Safeguarding Confidentiality in Electronic Health Records.
Shenoy, Akhil; Appel, Jacob M
2017-04-01
Electronic health records (EHRs) offer significant advantages over paper charts, such as ease of portability, facilitated communication, and a decreased risk of medical errors; however, important ethical concerns related to patient confidentiality remain. Although legal protections have been implemented, in practice EHRs may still be prone to breaches that threaten patient privacy. Potential safeguards are essential and have been implemented especially in sensitive areas such as mental illness, substance abuse, and sexual health. Features of one institutional model are described to illustrate efforts to ensure both adequate transparency and patient confidentiality. Trust and the therapeutic alliance are critical to the provider-patient relationship and to quality healthcare services. All of the benefits of an EHR are possible only if patients retain confidence in the security and accuracy of their medical records.
A bio-impedance probe to assess liver steatosis during transplant surgery.
Smith, Penny Probert; You, Fusheng; Vogel, Thomas; Silva, Michael
2011-01-01
This work addresses the design of a bioimpedance probe to assess steatosis on the exposed liver of the donor during liver transplant surgery. Whereas bioimpedance typically uses needle probes to avoid surface effects, for clinical reasons a non-penetrative probe is required. In addition, the need for a measurement representative of the bulk tissue suggests a larger probe than is normally used, to provide a sufficiently large measurement volume. Using a simple model, simulations, and tests on bovine liver, this paper investigates the relationship between probe dimensions and depth of measurement penetration, and investigates the accuracy that might be expected in a configuration suitable for use in the operating theatre on intact but exposed livers. A probe using ECG electrodes is proposed and investigated.
New soft magnetic amorphous cobalt based alloys with high hysteresis loop linearity
NASA Astrophysics Data System (ADS)
Nosenko, V. K.; Maslov, V. V.; Kochkubey, A. P.; Kirilchuk, V. V.
2008-02-01
New amorphous Co56÷59(Fe,Ni,Mn)21÷24(Si0.2B0.8)20-based metal alloys (AMA) with high saturation induction (BS>=1T) were developed. Toroidal tape-wound magnetic cores made from these AMA after heat-magnetic treatment (HMT) in a reversal field are characterized by high hysteresis loop linearity, minimum effective magnetic permeability and its high field stability, in combination with low coercivity Hc (1-3 A/m, 1 kHz). For the most promising alloy compositions, the effective magnetic permeability decreases, compared with known alloys, to 550-670 units and remains constant over the wide magnetic field range of 1100-1300 A/m. Maximum remagnetization loop linearity is achieved after optimum HMT in high-Ni AMAs, which are characterized by record-low squareness ratios Ks=0.002-0.02 and Hc=1.0 A/m. Magnetic cores made from the new amorphous alloys can be used both in filter chokes of switch-mode power supply units and in matching mini-transformers of telecommunication systems; high efficiency and accuracy of signal transmission, including high-frequency pulses, are thus ensured under conditions of long-term dc magnetic bias.
NASA Astrophysics Data System (ADS)
Ying, Jia-ju; Yin, Jian-ling; Wu, Dong-sheng; Liu, Jie; Chen, Yu-dan
2017-11-01
Low-light-level night vision devices and thermal infrared imaging binocular photoelectric instruments are widely used. Misalignment of the parallelism of a binocular instrument's ocular axes causes symptoms such as dizziness and nausea in the observer during prolonged use. A digital calibration instrument for binocular photoelectric equipment was developed to detect ocular axis parallelism, allowing the optical axis deviation to be measured quantitatively. As a testing instrument, its precision must be much higher than that of the equipment under test. This paper analyzes the factors that influence detection accuracy. Such factors exist in each link of the testing process and can be divided into two categories: those that directly affect the position of the reticle image, and those that affect the calculation of the reticle image center. The synthesized error is calculated, and the error budget is then distributed reasonably among the components to ensure the accuracy of the calibration instrument.
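When the contributing error sources are independent, the synthesized error described above is conventionally combined as a root sum of squares. A minimal sketch, with an invented error budget (the component magnitudes are hypothetical, not the paper's values):

```python
import math

def synthesized_error(components):
    """Root-sum-square combination of independent error contributions."""
    return math.sqrt(sum(c * c for c in components))

# Hypothetical error budget (arbitrary angular units):
# reticle-image position errors and reticle-center calculation errors
position_errors = [3.0, 4.0]
centroid_errors = [1.2, 0.9]
total = synthesized_error(position_errors + centroid_errors)
```

Working backwards from a required total accuracy, the same formula lets the designer distribute allowable error among the individual links of the testing process.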
Reconstruction method for fringe projection profilometry based on light beams.
Li, Xuexing; Zhang, Zhijiang; Yang, Chen
2016-12-01
A novel reconstruction method for fringe projection profilometry, based on light beams, is proposed and verified by experiments. Commonly used calibration techniques require either projector calibration parameters or reference planes placed at many known positions. Introducing projector calibration can reduce the accuracy of the reconstruction result, and setting reference planes at many known positions is time-consuming. Therefore, in this paper, a reconstruction method that does not require the projector's parameters is proposed, with only two reference planes introduced. A series of light beams determined by the subpixel point-to-point map on the two reference planes, combined with their reflected light beams determined by the camera model, are used to calculate the 3D coordinates of the reconstruction points. Furthermore, a bundle adjustment strategy and the complementary gray-code phase-shifting method are utilized to ensure accuracy and stability. Qualitative and quantitative comparisons as well as experimental tests demonstrate the performance of the proposed approach; the measurement accuracy can reach about 0.0454 mm.
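The phase-shifting step underlying such fringe projection systems can be illustrated with the standard four-step algorithm: four fringe images shifted by pi/2 yield the wrapped phase at each pixel. This is a generic sketch of phase-shifting profilometry, not the paper's specific gray-code variant, and the intensity values are synthetic:

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four fringe images shifted by pi/2:
    I_k = A + B*cos(phi + (k-1)*pi/2)."""
    i1, i2, i3, i4 = (np.asarray(i, float) for i in (i1, i2, i3, i4))
    # I4 - I2 = 2B*sin(phi),  I1 - I3 = 2B*cos(phi)
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic fringe intensities at one pixel with true phase 0.5 rad
A, B, phi = 100.0, 50.0, 0.5
frames = [A + B * np.cos(phi + k * np.pi / 2) for k in range(4)]
recovered = float(four_step_phase(*frames))
```

The gray-code patterns mentioned in the abstract then disambiguate the 2*pi wraps of this phase before triangulation against the reference planes.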
Novel 3-D free-form surface profilometry for reverse engineering
NASA Astrophysics Data System (ADS)
Chen, Liang-Chia; Huang, Zhi-Xue
2005-01-01
This article proposes an innovative 3-D surface contouring approach for automatic and accurate free-form surface reconstruction using a sensor integration concept. The study addresses a critical problem in the accurate measurement of free-form surfaces by developing an automatic reconstruction approach. Unacceptable measuring accuracy is mainly due to errors arising from inadequate measuring strategies, which end up producing inaccurate digitised data and costly post-data processing in Reverse Engineering (RE). This article therefore aims to develop automatic digitising strategies that ensure surface reconstruction efficiency as well as accuracy. The developed approach consists of two main stages, namely rapid shape identification (RSI) and automated laser scanning (ALS), for completing 3-D surface profilometry. The approach effectively utilises on-line geometric information to evaluate the degree of satisfaction of user-defined digitising accuracy under a triangular topological patch. An industrial case study was used to verify the feasibility of the approach.
Estimating accuracy of land-cover composition from two-stage cluster sampling
Stehman, S.V.; Wickham, J.D.; Fattorini, L.; Wade, T.D.; Baffetta, F.; Smith, J.H.
2009-01-01
Land-cover maps are often used to compute land-cover composition (i.e., the proportion or percent of area covered by each class), for each unit in a spatial partition of the region mapped. We derive design-based estimators of mean deviation (MD), mean absolute deviation (MAD), root mean square error (RMSE), and correlation (CORR) to quantify accuracy of land-cover composition for a general two-stage cluster sampling design, and for the special case of simple random sampling without replacement (SRSWOR) at each stage. The bias of the estimators for the two-stage SRSWOR design is evaluated via a simulation study. The estimators of RMSE and CORR have small bias except when sample size is small and the land-cover class is rare. The estimator of MAD is biased for both rare and common land-cover classes except when sample size is large. A general recommendation is that rare land-cover classes require large sample sizes to ensure that the accuracy estimators have small bias. © 2009 Elsevier Inc.
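The population versions of the four accuracy measures are simple functions of the per-unit differences between mapped and reference proportions. A minimal sketch of those definitions (not the paper's design-based estimators, which additionally weight by the two-stage sampling design), with invented proportions:

```python
import numpy as np

def composition_accuracy(map_props, ref_props):
    """MD, MAD, RMSE and CORR between mapped and reference land-cover
    proportions across spatial units, for a single class."""
    m = np.asarray(map_props, float)
    r = np.asarray(ref_props, float)
    d = m - r
    return {"MD": float(d.mean()),
            "MAD": float(np.abs(d).mean()),
            "RMSE": float(np.sqrt((d ** 2).mean())),
            "CORR": float(np.corrcoef(m, r)[0, 1])}

# Hypothetical proportions of one class in four spatial units
mapped = [0.20, 0.35, 0.50, 0.75]
reference = [0.25, 0.30, 0.55, 0.70]
stats = composition_accuracy(mapped, reference)
```

MD captures systematic bias (here zero, since over- and under-estimates cancel), while MAD and RMSE capture the magnitude of per-unit disagreement.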
The Effect of Normalization in Violence Video Classification Performance
NASA Astrophysics Data System (ADS)
Ali, Ashikin; Senan, Norhalina
2017-08-01
Data pre-processing is an important part of data mining, and normalization is a pre-processing stage for many problem types, especially video classification. Video classification is challenging because of heterogeneous content, large variations in video quality, and the complex semantic meanings of the concepts involved. A thorough pre-processing stage that includes normalization therefore helps to make classification performance robust. Normalization scales all numeric variables into a certain range so that they are more meaningful for the later phases of the data mining process. This paper examines the effect of two normalization techniques, Min-max normalization and Z-score, on violence video classification performance using a multi-layer perceptron (MLP) classifier. With Min-max normalization to the range [0,1], accuracy reaches almost 98%; with Min-max normalization to the range [-1,1], accuracy is 59%; and with Z-score, accuracy is 50%.
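The two normalization techniques compared above are standard. A minimal numpy sketch of both, applied to an invented feature vector:

```python
import numpy as np

def min_max(x, lo=0.0, hi=1.0):
    """Min-max normalization of x into the range [lo, hi]."""
    x = np.asarray(x, float)
    return lo + (hi - lo) * (x - x.min()) / (x.max() - x.min())

def z_score(x):
    """Z-score standardization: zero mean, unit standard deviation."""
    x = np.asarray(x, float)
    return (x - x.mean()) / x.std()

features = np.array([2.0, 4.0, 6.0, 10.0])
scaled01 = min_max(features)              # range [0, 1]
scaled11 = min_max(features, -1.0, 1.0)   # range [-1, 1]
standardized = z_score(features)
```

Min-max preserves the shape of the distribution within a fixed range, whereas Z-score centers and rescales it; which behaves better depends on the classifier and data, as the accuracy differences reported above illustrate.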
Burch, Ezra A; Shyn, Paul B; Chick, Jeffrey F; Chauhan, Nikunj R
2017-04-01
The purpose of this study was to determine whether auditing an online self-reported interventional radiology quality assurance database improves compliance with record entry or improves the accuracy of adverse event (AE) reporting and grading. Physicians were trained in using the database before the study began. An audit of all database entries for the first 3 months, or the first quarter, was performed, at which point physicians were informed of the audit process; entries for the subsequent 3 months, or the second quarter, were again audited. Results between quarters were compared. Compliance with record entry improved from the first to second quarter, but reminders were necessary to ensure 100% compliance with record entry. Knowledge of the audit process did not significantly improve self-reporting of AE or accuracy of AE grading. However, auditing significantly changed the final AE reporting rates and grades. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Precise positioning method for multi-process connecting based on binocular vision
NASA Astrophysics Data System (ADS)
Liu, Wei; Ding, Lichao; Zhao, Kai; Li, Xiao; Wang, Ling; Jia, Zhenyuan
2016-01-01
With the rapid development of aviation and aerospace, the demand for metal-coated parts such as antenna reflectors, eddy-current sensors, and signal transmitters is increasingly urgent. Such parts, with their varied feature dimensions, complex three-dimensional structures, and high geometric accuracy, are generally fabricated by combining different manufacturing technologies. However, it is difficult to ensure the machining precision because of the connection error between different processing methods. Therefore, a precise positioning method based on binocular micro stereo vision is proposed in this paper. Firstly, a novel and efficient camera calibration method for the stereoscopic microscope is presented to address the problems of a narrow field of view, small depth of focus, and numerous nonlinear distortions. Secondly, extraction algorithms for law curves and free curves are given, and the spatial position relationship between the micro vision system and the machining system is determined accurately. Thirdly, a precise positioning system based on micro stereovision is set up and embedded in a CNC machining experiment platform. Finally, a verification experiment of the positioning accuracy is conducted; the experimental results indicate that the average errors of the proposed method in the X and Y directions are 2.250 μm and 1.777 μm, respectively.
Specific issues in small animal dosimetry and irradiator calibration
Yoshizumi, Terry; Brady, Samuel L.; Robbins, Mike E.; Bourland, J. Daniel
2013-01-01
Purpose: In response to the increased risk of a radiological terrorist attack, a network of Centers for Medical Countermeasures against Radiation (CMCR) has been established in the United States, focusing on evaluating animal model responses to uniform, relatively homogeneous whole- or partial-body radiation exposures at relatively high dose rates. The success of such studies depends not only on robust animal models but also on accurate and reproducible dosimetry within and across the CMCR. To address this issue, the Education and Training Core of the Duke University School of Medicine CMCR organised a one-day workshop on small animal dosimetry. Topics included accuracy in animal dosimetry, the characteristics and differences of cesium-137 and X-ray irradiators, methods for dose measurement, and the design of experimental irradiation geometries for uniform dose distributions. This paper summarises the information presented and discussed. Conclusions: Without accurate and reproducible dosimetry, the development and assessment of the efficacy of putative countermeasures will not prove successful. Radiation physics support is needed, but is often the weakest link in the small animal dosimetry chain. We recommend: (i) a user training program for new irradiator users, (ii) subsequent training updates, and (iii) the establishment of a national small animal dosimetry center for all CMCR members. PMID:21961967
NASA Astrophysics Data System (ADS)
Mu, Nan; Wang, Kun; Xie, Zexiao; Ren, Ping
2017-05-01
To realize online rapid measurement for complex workpieces, a flexible measurement system based on an articulated industrial robot with a structured light sensor mounted on the end-effector is developed. A method for calibrating the system parameters is proposed in which the hand-eye transformation parameters and the robot kinematic parameters are synthesized in the calibration process. An initial hand-eye calibration is first performed using a standard sphere as the calibration target. By applying the modified complete and parametrically continuous method, we establish a synthesized kinematic model that combines the initial hand-eye transformation and distal link parameters as a whole with the sensor coordinate system as the tool frame. According to the synthesized kinematic model, an error model is constructed based on spheres' center-to-center distance errors. Consequently, the error model parameters can be identified in a calibration experiment using a three-standard-sphere target. Furthermore, the redundancy of error model parameters is eliminated to ensure the accuracy and robustness of the parameter identification. Calibration and measurement experiments are carried out based on an ER3A-C60 robot. The experimental results show that the proposed calibration method enjoys high measurement accuracy, and this efficient and flexible system is suitable for online measurement in industrial scenes.
NASA Astrophysics Data System (ADS)
Benduch, Piotr; Pęska-Siwik, Agnieszka
2017-06-01
A parcel is the most important object of the real estate cadastre. Its primary spatial attributes are its boundaries, which determine the extent of property rights. Capturing data on boundaries should be performed in a way that ensures sufficiently high accuracy and reliability. In recent years, as part of the project "ZSIN - Construction of Integrated Real Estate Information System - Stage I", in the territories of the participating districts, actions were taken to modernize the register of land and buildings. In many cases, this process was carried out on the basis of photogrammetric materials, which applicable regulations allow. This paper, drawing on documentation from the National Geodetic and Cartographic Documentation Center and on the authors' own surveys, attempts to assess the applicability of the photogrammetric method for capturing data on the boundaries of cadastral parcels. The scope of the research included, most importantly, the accuracy with which it was possible to determine the position of a boundary point using photogrammetric surveys carried out on a terrain model created from processed aerial photographs. The article demonstrates how this information is recorded in the cadastral database, as well as the resulting legal consequences. Moreover, the level of reliability of the entered values of selected attributes of boundary points was assessed.
The accuracy of prehospital diagnosis of acute cerebrovascular accidents: an observational study.
Karliński, Michał; Gluszkiewicz, Marcin; Członkowska, Anna
2015-06-19
Time to treatment is the key factor in stroke care. Although the initial medical assessment is usually made by a non-neurologist or a paramedic, it should ensure correct identification of all acute cerebrovascular accidents (CVAs). Our aim was to evaluate the accuracy of the physician-made prehospital diagnosis of acute CVA in patients referred directly to the neurological emergency department (ED), and to identify conditions mimicking CVAs. This observational study included consecutive patients referred to our neurological ED by emergency physicians with a suspicion of CVA (acute stroke, transient ischemic attack (TIA) or a syndrome-based diagnosis) during 12 months. Referrals were considered correct if the prehospital diagnosis of CVA proved to be stroke or TIA. The prehospital diagnosis of CVA was correct in 360 of 570 cases. Its positive predictive value ranged from 100% for the syndrome-based diagnosis, through 70% for stroke, to 34% for TIA. Misdiagnoses were less frequent among ambulance physicians compared to primary care and outpatient physicians (33% vs. 52%, p < 0.001). The most frequent mimics were vertigo (19%), electrolyte and metabolic disturbances (12%), seizures (11%), cardiovascular disorders (10%), blood hypertension (8%) and brain tumors (5%). Additionally, 6% of all admitted CVA cases were referred with prehospital diagnoses other than CVA. Emergency physicians appear to be sensitive in diagnosing CVAs but their overall accuracy does not seem high. They tend to overuse the diagnosis of TIA. Constant education and adoption of stroke screening scales may be beneficial for emergency care systems based both on physicians and on paramedics.
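The headline accuracy figure in this study is a positive predictive value, computed directly from the counts reported in the abstract. A minimal sketch (using the abstract's own numbers):

```python
def positive_predictive_value(true_positives, all_referred):
    """Fraction of prehospital 'CVA' referrals confirmed as stroke or TIA."""
    return true_positives / all_referred

# Figures reported in the abstract: 360 of 570 referrals were correct
overall_ppv = positive_predictive_value(360, 570)
```

The same function applied per referral category reproduces the reported range: 1.0 for syndrome-based diagnoses, 0.70 for stroke, and 0.34 for TIA.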
Development of an in vitro diaphragm motion reproduction system.
Liao, Ai-Ho; Chuang, Ho-Chiao; Shih, Ming-Chih; Hsu, Hsiao-Yu; Tien, Der-Chi; Kuo, Chia-Chun; Jeng, Shiu-Chen; Chiou, Jeng-Fong
2017-07-01
This study developed an in vitro diaphragm motion reproduction system (IVDMRS) based on noninvasive and real-time ultrasound imaging to track the internal displacement of the human diaphragm and diaphragm phantoms with a respiration simulation system (RSS). An ultrasound image tracking algorithm (UITA) was used to retrieve the displacement data of the tracking target and reproduce the diaphragm motion in real time using a red laser to irradiate the diaphragm phantom in vitro. This study also recorded the respiration patterns in 10 volunteers. Both the simulated signals and the respiration patterns recorded from the 10 volunteers were input to the RSS to conduct experiments reproducing diaphragm motion in vitro with the IVDMRS. The reproduction accuracy of the IVDMRS was calculated and analyzed. The results indicate that the respiration frequency substantially affects the correlation between ultrasound and kV images, as well as the reproduction accuracy of the IVDMRS, due to the system delay time (0.35 s) of ultrasound imaging and signal transmission. The utilization of a phase lead compensator (PLC) reduced the error caused by this delay, thereby improving the reproduction accuracy of the IVDMRS by 14.09-46.98%. Applying the IVDMRS in clinical treatments will allow medical staff to monitor the target displacements in real time by observing the movement of the laser beam. If the target displacement moves outside the planning target volume (PTV), the treatment can be immediately stopped to ensure that healthy tissues do not receive high doses of radiation. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
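The abstract does not give the compensator's form; as a generic illustration, a first-order phase-lead element C(s) = (1 + aTs)/(1 + Ts) with a > 1 can be discretized with the bilinear (Tustin) transform and applied sample by sample. All parameter values here are assumptions, not the paper's tuned design:

```python
import math

def lead_coeffs(a: float, T: float, dt: float):
    """Bilinear-transform coefficients for y[n] = b0*x[n] + b1*x[n-1] - a1*y[n-1]."""
    k = 2.0 / dt
    den = 1.0 + k * T
    b0 = (1.0 + k * a * T) / den
    b1 = (1.0 - k * a * T) / den
    a1 = (1.0 - k * T) / den
    return b0, b1, a1

def apply_filter(x, coeffs):
    """Run the difference equation over an input sequence."""
    b0, b1, a1 = coeffs
    y, x_prev, y_prev = [], 0.0, 0.0
    for xn in x:
        yn = b0 * xn + b1 * x_prev - a1 * y_prev
        y.append(yn)
        x_prev, y_prev = xn, yn
    return y

# Maximum phase lead occurs at w = 1/(T*sqrt(a)) and equals asin((a-1)/(a+1)):
a, T = 4.0, 0.05
phi_max_deg = math.degrees(math.asin((a - 1) / (a + 1)))  # ~36.9 degrees of lead
```

The phase lead advances the output relative to the input around the chosen frequency band, which is how such an element can partially cancel a fixed transport delay like the 0.35 s reported above.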
High Pressure Oxygen A-Band Spectra
NASA Astrophysics Data System (ADS)
Drouin, Brian; Sung, Keeyoon; Yu, Shanshan; Lunny, Elizabeth M.; Bui, Thinh Quoc; Okumura, Mitchio; Rupasinghe, Priyanka; Bray, Caitlin; Long, David A.; Hodges, Joseph; Robichaud, David; Benner, D. Chris; Devi, V. Malathy; Hoo, Jiajun
2015-06-01
Composition measurements from remote sensing platforms require knowledge of air mass to better than the desired precision of the composition. Oxygen spectra allow determination of air mass since the mixing ratio of oxygen is fixed. The OCO-2 mission is currently retrieving carbon dioxide concentration using the oxygen A-band for air mass normalization. The 0.25% accuracy desired for the carbon dioxide concentration has pushed the state-of-the-art for oxygen spectroscopy. To produce atmospheric pressure A-band cross-sections with this accuracy requires a sophisticated line-shape model (Galatry or speed-dependent) with line mixing (LM) and collision-induced absorption (CIA). Models of each of these phenomena exist, but an integrated self-consistent model must be developed to ensure accuracy. This presentation will describe the ongoing effort to parameterize these phenomena on a representative data set created from complementary experimental techniques. The techniques include Fourier transform spectroscopy (FTS), photo-acoustic spectroscopy (PAS) and cavity ring-down spectroscopy (CRDS). CRDS data allow long-pathlength measurements with absolute intensities, providing lineshape information as well as LM and CIA; however, the subtleties of the lineshape are diminished in the saturated line-centers. Conversely, the short paths and large dynamic range of the PAS data allow the full lineshape to be discerned, but with an arbitrary intensity axis. Finally, the FTS data provide intermediate paths and consistency across a broad pressure range. These spectra are all modeled with the Labfit software, first using the HITRAN spectral line database, after which model values are adjusted and fitted for better agreement with the data.
Xian, George; Homer, Collin G.; Granneman, Brian; Meyer, Debra K.
2012-01-01
Remote sensing information has been widely used to monitor vegetation condition and variations in a variety of ecosystems, including shrublands. Careful application of remotely sensed imagery can provide additional spatially explicit, continuous, and extensive data on the composition and condition of shrubland ecosystems. Historically, the most widely available remote sensing information has been collected by Landsat, which has offered large spatial coverage and moderate spatial resolution data globally for nearly three decades. Such medium-resolution satellite remote sensing information can quantify the distribution and variation of terrestrial ecosystems. Landsat imagery has been frequently used with other high-resolution remote sensing data to classify sagebrush components and quantify their spatial distributions (Ramsey and others, 2004; Seefeldt and Booth, 2004; Stow and others, 2008; Underwood and others, 2007). Modeling algorithms have been developed to use field measurements and satellite remote sensing data to quantify the extent and evaluate the quality of shrub ecosystem components in large geographic areas (Homer and others, 2009). The percent cover of sagebrush ecosystem components, including bare ground, herbaceous, litter, sagebrush, and shrub, has been quantified for entire western states (Homer and others, 2012). Furthermore, research has demonstrated the use of current measurements with historical archives of Landsat imagery to quantify the variations of these components for the last two decades (Xian and others, 2012). The modeling method used to quantify the extent and spatial distribution of sagebrush components over a large area also has required considerable amounts of training data to meet targeted accuracy requirements.
These training data have maintained product accuracy because they are derived from good-quality field measurements collected during appropriate ecosystem phenology and subsequently extended by extrapolation over high-resolution remote sensing data (Homer and others, 2012). This method has proven its utility; however, developing these products across even larger areas will require additional cost efficiencies to ensure that an adequate product can be developed for the lowest cost possible. Given the vast geographic extent of shrubland ecosystems in the western United States, identifying cost efficiencies through optimal training data development and subsequent application to medium-resolution satellite imagery provides the most likely avenue for methodological efficiency gains. The primary objective of this research was to conduct a series of sensitivity tests to evaluate the most optimal and practical way to develop Landsat-scale information for estimating the extent and distribution of sagebrush ecosystem components over large areas in the conterminous United States. An existing dataset of sagebrush components developed from extensive field measurements, high-resolution satellite imagery, and medium-resolution Landsat imagery in Wyoming was used as the reference database (Homer and others, 2012). Statistical analysis was performed to analyze the relation between the accuracy of sagebrush components and the amount and distribution of training data on Landsat scenes needed to obtain accurate predictions.
Laser microprocessing and nanoengineering of large-area functional micro/nanostructures
NASA Astrophysics Data System (ADS)
Tang, M.; Xie, X. Z.; Yang, J.; Chen, Z. C.; Xu, L.; Choo, Y. S.; Hong, M. H.
2011-12-01
Laser microprocessing and nanoengineering are of great interest to both scientists and engineers, since the properties of functional micro/nanostructures over large areas can lead to numerous unique applications. Current laser processing systems combined with high-speed automation enable the focused laser beam to process various materials at high throughput and high accuracy over large working areas. UV lasers are widely used in both laser microprocessing and nanoengineering. However, with improved processing methods, a green pulsed laser can replace UV lasers to make high-aspect-ratio micro-grooves on fragile and transparent sapphire substrates. Laser micro-texturing can also tune the wetting property of metal surfaces from hydrophilic to super-hydrophobic, reaching a contact angle of 161° without chemical coating. A laser microlens array (MLA) can split a laser beam into multiple beams and reduce the laser spot size down to sub-microns. It can be applied to fabricate split ring resonator (SRR) meta-materials for THz sensing, surface plasmon resonance (SPR) structures for the NIR, and molding tools for soft lithography. Furthermore, laser interference lithography combined with thermal annealing can produce large areas of sub-50 nm nano-dot clusters for SPR applications.
NASA Astrophysics Data System (ADS)
Duan, Jiandong; Fan, Shaogui; Wu, Fengjiang; Sun, Li; Wang, Guanglin
2018-06-01
This paper proposes an instantaneous power control method for high-speed permanent magnet synchronous generators (PMSGs) that realizes decoupled control of active and reactive power through vector control based on a sliding mode observer (SMO) and a phase-locked loop (PLL). Consequently, the high-speed PMSG operates at a high internal power factor, ensuring efficient operation. Vector control and accurate estimation of the instantaneous power require an accurate estimate of the rotor position. The SMO estimates the back electromotive force (EMF), from which the rotor position and speed are obtained using a combination of the PLL technique and a phase compensation method. This approach is robust and resistant to noise when estimating the rotor position. Using instantaneous power theory, the relationship between the output active power, reactive power, and stator current of the PMSG is deduced, and the power constraint condition is analysed for operation at unity internal power factor. Finally, the accuracy of the rotor position detection, the instantaneous power detection, and the control methods is verified through simulations and experiments.
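As a rough sketch of the PLL stage only, a PI-type phase-locked loop can track rotor angle from estimated back-EMF components (e_α = -E sin θ, e_β = E cos θ). The gains, time step, and signal model below are illustrative assumptions, not the paper's SMO-based design:

```python
import math

def pll_track(e_alpha, e_beta, dt, kp=200.0, ki=20000.0):
    """Track rotor angle with a PI phase-locked loop on back-EMF samples."""
    theta_hat, integ = 0.0, 0.0
    for ea, eb in zip(e_alpha, e_beta):
        # Phase detector: err ~ E*sin(theta - theta_hat), zero when locked
        err = -ea * math.cos(theta_hat) - eb * math.sin(theta_hat)
        integ += ki * err * dt          # integral path removes ramp error
        theta_hat += (kp * err + integ) * dt
    return theta_hat

# Constant-speed rotor at 100 rad/s, sampled at 10 kHz for 0.5 s:
dt, w = 1e-4, 100.0
true_theta = [w * k * dt for k in range(5000)]
ea = [-math.sin(t) for t in true_theta]
eb = [math.cos(t) for t in true_theta]
est = pll_track(ea, eb, dt)
# After the acquisition transient, est tracks the true angle (modulo 2*pi)
```

Because the loop is type 2 (its own integrator plus the PI integral term), a constant rotor speed produces zero steady-state phase error after lock-in.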
Tamm-plasmon and surface-plasmon hybrid-mode based refractometry in photonic bandgap structures.
Das, Ritwick; Srivastava, Triranjita; Jha, Rajan
2014-02-15
The transverse magnetic (TM) polarized hybrid modes formed as a consequence of coupling between Tamm plasmon polariton (TM-TPP) mode and surface plasmon polariton (SPP) mode exhibit interesting dispersive features for realizing a highly sensitive and accurate surface plasmon resonance (SPR) sensor. We found that the TM-TPP modes, formed at the interface of distributed Bragg reflector and metal, are strongly dispersive as compared to SPP modes at optical frequencies. This causes an appreciably narrow interaction bandwidth between TM-TPP and SPP modes, which leads to highly accurate sensing. In addition, appropriate tailoring of dispersion characteristics of TM-TPP as well as SPP modes could ensure high sensitivity of a novel SPR platform. By suitably designing the Au/TiO₂/SiO₂-based geometry, we propose a TM-TPP/SPP hybrid-mode sensor and achieve a sensitivity ≥900 nm/RIU with high detection accuracy (≥30 μm⁻¹) for analyte refractive indices varying between 1.330 and 1.345 in 600-700 nm wavelength range. The possibility to achieve desired dispersive behavior in any spectral band makes the sensing configuration an extremely attractive candidate to design sensors depending on the availability of optical sources.
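Spectral sensitivity here is the resonance-wavelength shift per refractive-index unit, S = Δλ/Δn. The 13.5 nm shift below is an illustrative number chosen only to be consistent with the reported ≥900 nm/RIU over the 1.330-1.345 index range, not a value taken from the paper:

```python
def sensitivity_nm_per_riu(d_lambda_nm: float, d_n: float) -> float:
    """Bulk spectral sensitivity of a resonance-based refractometric sensor."""
    return d_lambda_nm / d_n

# Hypothetical 13.5 nm resonance shift across the reported index range:
s = sensitivity_nm_per_riu(13.5, 1.345 - 1.330)
print(f"{s:.0f} nm/RIU")  # prints "900 nm/RIU"
```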
NASA Astrophysics Data System (ADS)
Chu, Chunlei; Stoffa, Paul L.
2012-01-01
Discrete earth models are commonly represented by uniform structured grids. In order to ensure accurate numerical description of all wave components propagating through these uniform grids, the grid size must be determined by the slowest velocity of the entire model. Consequently, high velocity areas are always oversampled, which inevitably increases the computational cost. A practical solution to this problem is to use nonuniform grids. We propose a nonuniform grid implicit spatial finite difference method which utilizes nonuniform grids to obtain high efficiency and relies on implicit operators to achieve high accuracy. We present a simple way of deriving implicit finite difference operators of arbitrary stencil widths on general nonuniform grids for the first and second derivatives and, as a demonstration example, apply these operators to the pseudo-acoustic wave equation in tilted transversely isotropic (TTI) media. We propose an efficient gridding algorithm that can be used to convert uniformly sampled models onto vertically nonuniform grids. We use a 2D TTI salt model to demonstrate its effectiveness and show that the nonuniform grid implicit spatial finite difference method can produce highly accurate seismic modeling results with enhanced efficiency, compared to uniform grid explicit finite difference implementations.
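The paper derives implicit operators; as a simpler, self-contained illustration of differencing on nonuniform nodes, explicit finite-difference weights at arbitrary grid points can be generated with Fornberg's classic recursion (a generic algorithm, not the authors' implicit scheme):

```python
def fd_weights(z, x, m):
    """Fornberg (1988): weights for the m-th derivative at point z,
    given function values at the (possibly nonuniform) nodes x."""
    n = len(x)
    c = [[0.0] * (m + 1) for _ in range(n)]
    c[0][0] = 1.0
    c1, c4 = 1.0, x[0] - z
    for i in range(1, n):
        mn = min(i, m)
        c2, c5, c4 = 1.0, c4, x[i] - z
        for j in range(i):
            c3 = x[i] - x[j]
            c2 *= c3
            if j == i - 1:
                # weights for the newly added node
                for k in range(mn, 0, -1):
                    c[i][k] = c1 * (k * c[i - 1][k - 1] - c5 * c[i - 1][k]) / c2
                c[i][0] = -c1 * c5 * c[i - 1][0] / c2
            # update weights of the existing nodes
            for k in range(mn, 0, -1):
                c[j][k] = (c4 * c[j][k] - k * c[j][k - 1]) / c3
            c[j][0] = c4 * c[j][0] / c3
        c1 = c2
    return [row[m] for row in c]

# Uniform 3-point stencil recovers the familiar central difference:
print(fd_weights(0.0, [-1.0, 0.0, 1.0], 1))  # [-0.5, 0.0, 0.5]
```

On a stretched grid the same call returns the correct asymmetric weights, which is exactly the flexibility nonuniform-grid methods exploit to coarsen high-velocity regions.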
Research on Horizontal Accuracy Method of High Spatial Resolution Remotely Sensed Orthophoto Image
NASA Astrophysics Data System (ADS)
Xu, Y. M.; Zhang, J. X.; Yu, F.; Dong, S.
2018-04-01
At present, in the inspection and acceptance of high spatial resolution remotely sensed orthophoto images, horizontal accuracy detection tests and evaluates image accuracy, mostly based on a set of check points with uniform accuracy and reliability. However, in areas where field measurement is difficult and high-accuracy reference data are scarce, such a uniform set of check points is hard to obtain, making it difficult to test and evaluate the horizontal accuracy of the orthophoto image. This uncertainty in horizontal accuracy has become a bottleneck for the application of spaceborne high-resolution remote sensing imagery and the expansion of its service scope. Therefore, this paper proposes a new method for testing the horizontal accuracy of orthophoto images. The method uses check points of different accuracy and reliability, drawn from both high-accuracy reference data and field measurements. It solves horizontal accuracy detection of orthophoto images in difficult areas and provides a basis for delivering reliable orthophoto images to users.
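One generic way to combine check points of unequal reliability (a textbook weighting scheme, not necessarily the paper's specific formulation) is to weight each point's squared horizontal error by the inverse variance of its source:

```python
import math

def weighted_rmse(errors_m, sigmas_m):
    """Inverse-variance-weighted RMS of check-point horizontal errors (metres)."""
    weights = [1.0 / s ** 2 for s in sigmas_m]
    num = sum(w * e ** 2 for w, e in zip(weights, errors_m))
    return math.sqrt(num / sum(weights))

# Field-surveyed points (sigma 0.1 m) count far more than points read from
# a coarser reference map (sigma 0.5 m); all numbers are illustrative:
errs = [0.3, 0.4, 0.6, 0.8]
sigs = [0.1, 0.1, 0.5, 0.5]
print(round(weighted_rmse(errs, sigs), 3))
```

With equal sigmas the formula reduces to the ordinary RMSE, so the weighted form is a strict generalization of the uniform-accuracy case described above.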
Diagnostic laboratory for bleeding disorders ensures efficient management of haemorrhagic disorders.
Riddell, A; Chuansumrit, A; El-Ekiaby, M; Nair, S C
2016-07-01
Haemorrhagic disorders such as postpartum haemorrhage and dengue haemorrhagic fever are life threatening and require an active, efficient transfusion service that can provide the most appropriate blood product for their management. This essentially requires prompt identification of the coagulopathy so that the best available product can be given to the bleeding patient to correct the identified haemostatic defect and help control the bleeding. That is only possible if the transfusion service has a laboratory that can correctly detect the haemostatic defect, with an accuracy and precision ensured by good laboratory quality assurance practices. The same processes are necessary for transfusion services to ensure the quality of the blood products they manufacture, and to confirm that those products contain adequate amounts of haemostatic factors to be effective in managing haemorrhagic disorders. These issues are discussed in detail for the management of postpartum haemorrhage and dengue haemorrhagic fever, including the circumstances in which laboratory support can guide the use of rFVIIa in dengue haemorrhagic fever. The requirements for ensuring that good-quality blood products are made available for the management of these disorders are also described. © 2016 John Wiley & Sons Ltd.
Time assignment system and its performance aboard the Hitomi satellite
NASA Astrophysics Data System (ADS)
Terada, Yukikatsu; Yamaguchi, Sunao; Sugimoto, Shigenobu; Inoue, Taku; Nakaya, Souhei; Murakami, Maika; Yabe, Seiya; Oshimizu, Kenya; Ogawa, Mina; Dotani, Tadayasu; Ishisaki, Yoshitaka; Mizushima, Kazuyo; Kominato, Takashi; Mine, Hiroaki; Hihara, Hiroki; Iwase, Kaori; Kouzu, Tomomi; Tashiro, Makoto S.; Natsukari, Chikara; Ozaki, Masanobu; Kokubun, Motohide; Takahashi, Tadayuki; Kawakami, Satoko; Kasahara, Masaru; Kumagai, Susumu; Angelini, Lorella; Witthoeft, Michael
2018-01-01
Fast timing capability in x-ray observation of astrophysical objects is one of the key properties for the ASTRO-H (Hitomi) mission. Absolute timing accuracies of 350 or 35 μs are required to achieve nominal scientific goals or to study fast variabilities of specific sources. The satellite carries a GPS receiver to obtain accurate time information, which is distributed from the central onboard computer through the large and complex SpaceWire network. The details of the time system on the hardware and software design are described. In the distribution of the time information, the propagation delays and jitters affect the timing accuracy. Six other items identified within the timing system will also contribute to absolute time error. These error items have been measured and checked on ground to ensure the time error budgets meet the mission requirements. The overall timing performance in combination with hardware performance, software algorithm, and the orbital determination accuracies, etc. under nominal conditions satisfies the mission requirements of 35 μs. This work demonstrates key points for space-use instruments in hardware and software designs and calibration measurements for fine timing accuracy on the order of microseconds for midsized satellites using the SpaceWire (IEEE1355) network.
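Independent error items in such a budget are conventionally combined by root-sum-square. The seven item values below are placeholders for illustration, not Hitomi's actual measured contributions:

```python
import math

def rss_budget(items_us):
    """Root-sum-square combination of independent timing-error items (microseconds)."""
    return math.sqrt(sum(e ** 2 for e in items_us))

items_us = [20.0, 15.0, 10.0, 8.0, 5.0, 3.0, 2.0]  # hypothetical contributions
total = rss_budget(items_us)
assert total < 35.0  # must fit inside the 35 microsecond mission requirement
print(f"{total:.1f} us")  # prints "28.8 us"
```

The RSS total grows much more slowly than a straight sum of the items, which is why budgets of several comparable error sources can still satisfy a tight overall requirement.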
NASA Astrophysics Data System (ADS)
Kurniawan, Dian; Suparti; Sugito
2018-05-01
Population growth in Indonesia has increased every year. According to the population census conducted by the Central Bureau of Statistics (BPS) in 2010, the population of Indonesia reached 237.6 million people. To control the population growth rate, the government runs the Family Planning (Keluarga Berencana, KB) program for couples of childbearing age. The purpose of this program is to improve the health of mothers and children, and thereby build a prosperous society, by controlling births and managing population growth. The data used in this study are the 2016 updated family data for Semarang city collected by the National Family Planning Coordinating Board (BKKBN). From these data, classifiers were built with kernel discriminant analysis and their classification accuracy was evaluated. The analysis showed that normal kernel discriminant analysis gives 71.05% classification accuracy with a 28.95% classification error, whereas triweight kernel discriminant analysis gives 73.68% classification accuracy with a 26.32% classification error. For classifying family planning participation of childbearing-age couples in Semarang city in 2016, the triweight kernel discriminant therefore performs better than the normal kernel discriminant.
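A minimal sketch of normal-kernel discriminant classification in one dimension: assign an observation to the class with the larger prior-weighted kernel density estimate. The toy data, feature, and bandwidth are invented for illustration; the study's BKKBN data are not reproduced here.

```python
import math

def kde(x, sample, h):
    """Gaussian (normal) kernel density estimate at x."""
    n = len(sample)
    s = sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in sample)
    return s / (n * h * math.sqrt(2.0 * math.pi))

def classify(x, class_samples, h=0.5):
    """Pick the class maximizing prior * kernel density."""
    total = sum(len(s) for s in class_samples.values())
    best, best_score = None, -1.0
    for label, sample in class_samples.items():
        score = (len(sample) / total) * kde(x, sample, h)
        if score > best_score:
            best, best_score = label, score
    return best

# Hypothetical 1-D feature separating the two groups:
groups = {"participant": [1.0, 1.2, 0.8, 1.1], "non-participant": [3.0, 2.8, 3.2]}
print(classify(1.0, groups))  # prints "participant"
print(classify(3.0, groups))  # prints "non-participant"
```

Swapping the Gaussian kernel for a triweight kernel, (35/32)(1 - u²)³ on |u| ≤ 1, changes only the `kde` function, which mirrors the normal-versus-triweight comparison reported in the abstract.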
Aerial photography flight quality assessment with GPS/INS and DEM data
NASA Astrophysics Data System (ADS)
Zhao, Haitao; Zhang, Bing; Shang, Jiali; Liu, Jiangui; Li, Dong; Chen, Yanyan; Zuo, Zhengli; Chen, Zhengchao
2018-01-01
The flight altitude, ground coverage, photo overlap, and other acquisition specifications of an aerial photography flight mission directly affect the quality and accuracy of the subsequent mapping tasks. To ensure smooth post-flight data processing and fulfill the pre-defined mapping accuracy, flight quality assessments should be carried out in time. This paper presents a novel and rigorous approach for flight quality evaluation of frame cameras with GPS/INS data and DEM, using geometric calculation rather than image analysis as in the conventional methods. This new approach is based mainly on the collinearity equations, in which the accuracy of a set of flight quality indicators is derived through a rigorous error propagation model and validated with scenario data. Theoretical analysis and practical flight test of an aerial photography mission using an UltraCamXp camera showed that the calculated photo overlap is accurate enough for flight quality assessment of 5 cm ground sample distance image, using the SRTMGL3 DEM and the POSAV510 GPS/INS data. An even better overlap accuracy could be achieved for coarser-resolution aerial photography. With this new approach, the flight quality evaluation can be conducted on site right after landing, providing accurate and timely information for decision making.
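Forward overlap follows directly from flight geometry: along-track ground coverage L = H·d/f (H: height above ground, d: along-track sensor size, f: focal length) and overlap = 1 − B/L for air base B. The camera numbers below are rough, UltraCamXp-like values used only for illustration, not the mission's actual parameters:

```python
def forward_overlap(height_m, focal_mm, sensor_mm, base_m):
    """Forward overlap fraction of consecutive frames from flight geometry."""
    ground_len_m = height_m * sensor_mm / focal_mm  # along-track footprint
    return 1.0 - base_m / ground_len_m

ov = forward_overlap(height_m=1000.0, focal_mm=100.5, sensor_mm=67.9, base_m=270.0)
print(f"{ov:.1%}")  # prints "60.0%"
```

Because every quantity on the right-hand side comes from GPS/INS data, camera geometry, and the DEM, the overlap can be computed without analyzing image content, which is the core idea of the approach described above.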
Analysis of Gopher Tortoise Population Estimation Techniques
2005-10-01
land use practices on the gopher tortoise, Gopherus polyphemus." Biological Conservation 108: 289-298. Horngren, C.T. and G. Foster. 1991. Cost ... with flagging to ensure complete coverage. A South Carolina census was conducted with a team of 60 to 70 volunteers walking abreast (S. Bennett ... best method in terms of cost and accuracy. Burke and Cox (1988) tested if the direction of tortoise tracks in the burrow was reliable in
Measurement control workshop instructional materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibbs, Philip; Harvel, Charles; Clark, John
2012-09-01
An essential element in an effective nuclear materials control and accountability (MC&A) program is the measurement of the nuclear material as it is received, moved, processed and shipped. Quality measurement systems and methodologies determine the accuracy of the accountability values. Implementation of a measurement control program is essential to ensure that the measurement systems and methodologies perform as expected. A measurement control program also allows for a determination of the level of confidence in the accounting values.