Science.gov

Sample records for adaptive metric knn

  1. Improving GPU-accelerated adaptive IDW interpolation algorithm using fast kNN search.

    PubMed

    Mei, Gang; Xu, Nengxiong; Xu, Liangliang

    2016-01-01

    This paper presents an efficient parallel Adaptive Inverse Distance Weighting (AIDW) interpolation algorithm for modern Graphics Processing Units (GPUs). The presented algorithm improves our previous GPU-accelerated AIDW algorithm by adopting fast k-nearest neighbors (kNN) search. In AIDW, several nearest neighboring data points must be found for each interpolated point to adaptively determine the power parameter; the desired prediction value of the interpolated point is then obtained by weighted interpolation using that power parameter. In this work, we develop a fast kNN search approach based on a space-partitioning data structure, the even (uniform) grid, to improve the previous GPU-accelerated AIDW algorithm. The improved algorithm is composed of two stages: kNN search and weighted interpolation. To evaluate its performance, we perform five groups of experimental tests. The experimental results indicate that: (1) the improved algorithm achieves a speedup of up to 1017 over the corresponding serial algorithm; (2) it is at least two times faster than our previous GPU-accelerated AIDW algorithm; and (3) the use of fast kNN search significantly improves the computational efficiency of the entire GPU-accelerated AIDW algorithm.
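
    The adaptive power idea can be sketched in a few lines: find the k nearest neighbors of each interpolated point, map a local-density statistic to a power parameter, and interpolate with inverse-distance weights. This is a minimal illustration, not the paper's GPU implementation; the density-to-power rule and all names below are assumptions.

```python
import numpy as np

def aidw_interpolate(points, values, query, k=8, p_min=1.0, p_max=3.0):
    # Brute-force kNN search; the paper replaces this with a fast
    # grid-based search executed on the GPU.
    d = np.linalg.norm(points - query, axis=1)
    idx = np.argsort(d)[:k]
    dk = d[idx]
    if dk[0] == 0.0:
        return float(values[idx[0]])      # query coincides with a data point
    # Illustrative adaptive rule: the mean kNN distance relative to the
    # max maps linearly into [p_min, p_max].
    t = dk.mean() / dk.max()
    p = p_min + (p_max - p_min) * t
    w = dk ** (-p)                        # inverse-distance weights
    return float(np.sum(w * values[idx]) / np.sum(w))
```

    Replacing the brute-force `argsort` search with a uniform-grid lookup is exactly the optimization the abstract reports as the main source of speedup.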

  2. Adaptable edge quality metric

    NASA Astrophysics Data System (ADS)

    Strickland, Robin N.; Chang, Dunkai K.

    1990-09-01

    A new quality metric for evaluating edges detected by digital image processing algorithms is presented. The metric is a weighted sum of measures of edge continuity, smoothness, thinness, localization, detection, and noisiness. Through a training process we can design weights that optimize the metric for different users and applications. We have used the metric to compare the results of ten edge detectors applied to edges degraded by varying degrees of blur and varying degrees and types of noise. As expected, the more nearly optimal Difference-of-Gaussians (DOG) and Haralick methods outperform the simpler gradient detectors. At high signal-to-noise ratios (SNRs), Haralick's method is the best choice, although it exhibits a sudden drop in performance at lower SNRs. The DOG filter's performance degrades almost linearly with SNR and maintains a reasonably high level at lower SNRs. The same relative performances are observed as blur is varied. For most of the detectors tested, performance drops with increasing noise correlation. Noise correlated in the same direction as the edge is the most destructive of the noise types tested.
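
    The weighted-sum structure of such a metric is straightforward to sketch. The code below assumes six attribute scores in [0, 1] and weights normalized to sum to one; the paper's training process would supply the actual weights, so everything here is illustrative.

```python
import numpy as np

def edge_quality(measures, weights):
    # `measures` holds one score in [0, 1] per attribute: continuity,
    # smoothness, thinness, localization, detection, noisiness.
    # `weights` would come from the training process in the paper;
    # here they are simply normalized to sum to one.
    m = np.asarray(measures, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return float(w @ m)
```

    With equal weights the metric reduces to the mean attribute score; training reweights the attributes to match a particular user or application.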

  3. Adaptive Optics Metrics & QC Scheme

    NASA Astrophysics Data System (ADS)

    Girard, Julien H.

    2017-09-01

    There are many Adaptive Optics (AO) fed instruments on Paranal, and more to come. To monitor their performance and assess the quality of the scientific data, we have developed a scheme and a set of tools and metrics adapted to each flavour of AO and each data product. Our decisions to repeat observations or not depend heavily on this immediate quality control "zero" (QC0). Atmospheric parameter monitoring can also help predict performance. At the end of the chain, the user must be able to find the data that correspond to his/her needs. In particular, we address the special case of SPHERE.

  4. Adaptive Metric Learning for Saliency Detection.

    PubMed

    Li, Shuang; Lu, Huchuan; Lin, Zhe; Shen, Xiaohui; Price, Brian

    2015-11-01

    In this paper, we propose a novel adaptive metric learning algorithm (AML) for visual saliency detection. A key observation is that the saliency of a superpixel can be estimated by its distance from the most certain foreground and background seeds. Instead of measuring distance in Euclidean space, we present a learning method based on two complementary Mahalanobis distance metrics: 1) generic metric learning (GML) and 2) specific metric learning (SML). GML targets the global distribution of the whole training set, while SML considers the specific structure of a single image. Since multiple similarity measures from different views may enhance the relevant information and suppress the irrelevant, we fuse GML and SML and experimentally find that the combination works well. Unlike most existing methods, which are directly based on low-level features, we devise a superpixelwise Fisher vector coding approach to better distinguish salient objects from the background. We also propose an accurate seed selection mechanism and exploit contextual and multiscale information when constructing the final saliency map. Experimental results on various image sets show that the proposed AML performs favorably against state-of-the-art methods.
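
    A minimal sketch of distance under a learned Mahalanobis metric and the seed-based saliency estimate. The GML/SML fusion and the Fisher vector coding are omitted, and the ratio-based saliency score is an illustrative assumption, not the paper's exact formula.

```python
import numpy as np

def mahalanobis(x, y, M):
    # Distance under a learned metric M (positive semidefinite matrix).
    d = x - y
    return float(np.sqrt(d @ M @ d))

def saliency(feature, fg_seeds, bg_seeds, M):
    # A superpixel is salient when it lies close to the foreground seeds
    # and far from the background seeds under the learned metric.
    d_fg = min(mahalanobis(feature, s, M) for s in fg_seeds)
    d_bg = min(mahalanobis(feature, s, M) for s in bg_seeds)
    return d_bg / (d_fg + d_bg + 1e-12)   # in [0, 1]; higher = more salient
```

    With `M` set to the identity the score reduces to plain Euclidean reasoning; learning `M` (GML globally, SML per image) is what the paper contributes.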

  5. Adapting the M3 Surveillance Metrics for an Unknown Baseline

    SciTech Connect

    Hamada, Michael Scott; Abes, Jeff I.; Jaramillo, Brandon Michael Lee

    2016-11-30

    The original M3 surveillance metrics assume that the baseline is known. In this article, adapted M3 metrics are presented when the baseline is not known and estimated by available data. Deciding on how much available data is enough is also discussed.

  6. Stability and Performance Metrics for Adaptive Flight Control

    NASA Technical Reports Server (NTRS)

    Stepanyan, Vahram; Krishnakumar, Kalmanje; Nguyen, Nhan; VanEykeren, Luarens

    2009-01-01

    This paper addresses the problem of verifying adaptive control techniques for enabling safe flight in the presence of adverse conditions. Since the adaptive systems are non-linear by design, the existing control verification metrics are not applicable to adaptive controllers. Moreover, these systems are in general highly uncertain. Hence, the system's characteristics cannot be evaluated by relying on the available dynamical models. This necessitates the development of control verification metrics based on the system's input-output information. From this point of view, a set of metrics is introduced that compares the uncertain aircraft's input-output behavior under the action of an adaptive controller to that of a closed-loop linear reference model to be followed by the aircraft. This reference model is constructed for each specific maneuver using the exact aerodynamic and mass properties of the aircraft to meet the stability and performance requirements commonly accepted in flight control. The proposed metrics are unified in the sense that they are model independent and not restricted to any specific adaptive control methods. As an example, we present simulation results for a wing-damaged generic transport aircraft with several existing adaptive controllers.
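
    The model-independent flavor of such a metric can be illustrated by comparing recorded input-output signals directly. The normalized L2 tracking-error norm below is an illustrative choice, not the paper's actual metric set.

```python
import numpy as np

def tracking_metric(y_adaptive, y_reference, dt):
    # Compare the closed-loop adaptive response to the linear reference
    # model over a maneuver: L2 error norm normalized by the reference
    # signal's energy. Zero means perfect tracking.
    e = np.asarray(y_adaptive, dtype=float) - np.asarray(y_reference, dtype=float)
    num = np.sqrt(np.sum(e**2) * dt)
    den = np.sqrt(np.sum(np.asarray(y_reference, dtype=float)**2) * dt) + 1e-12
    return float(num / den)
```

    Because the metric consumes only sampled signals, it needs no model of the adaptive law, which is the "unified" property the abstract emphasizes.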

  7. Applications of image metrics in dynamic scene adaptation

    NASA Astrophysics Data System (ADS)

    Sadjadi, Firooz A.

    1992-08-01

    A major problem in dealing with the changing information content of a scene, which is characteristic of any dynamic scene, is how to adapt to these variations so that the performance of an automatic scene analyzer, such as an object recognizer, remains at its optimum. In this paper we examine the use of image and signal metrics for characterizing scene variations, describe an automated system for the extraction of these quality measures, and show how these metrics can be used for the automatic adaptation of an object recognition system, along with the resulting improvement in the system's performance.

  8. Progress in adapting k-NN methods for forest mapping and estimation using the new annual Forest Inventory and Analysis data

    Treesearch

    Reija Haapanen; Kimmo Lehtinen; Jukka Miettinen; Marvin E. Bauer; Alan R. Ek

    2002-01-01

    The k-nearest neighbor (k-NN) method has been undergoing development and testing for applications with USDA Forest Service Forest Inventory and Analysis (FIA) data in Minnesota since 1997. Research began using the 1987-1990 FIA inventory of the state, the then standard 10-point cluster plots, and Landsat TM imagery. In the past year, research has moved to examine...

  9. Improved Segmentation of White Matter Tracts with Adaptive Riemannian Metrics

    PubMed Central

    Hao, Xiang; Zygmunt, Kristen; Whitaker, Ross T.; Fletcher, P. Thomas

    2014-01-01

    We present a novel geodesic approach to segmentation of white matter tracts from diffusion tensor imaging (DTI). Compared to deterministic and stochastic tractography, geodesic approaches treat the geometry of the brain white matter as a manifold, often using the inverse tensor field as a Riemannian metric. The white matter pathways are then inferred from the resulting geodesics, which have the desirable property that they tend to follow the main eigenvectors of the tensors, yet still have the flexibility to deviate from these directions when it results in lower costs. While this makes such methods more robust to noise, the choice of Riemannian metric in these methods is ad hoc. A serious drawback of current geodesic methods is that geodesics tend to deviate from the major eigenvectors in high-curvature areas in order to achieve the shortest path. In this paper we propose a method for learning an adaptive Riemannian metric from the DTI data, where the resulting geodesics more closely follow the principal eigenvector of the diffusion tensors even in high-curvature regions. We also develop a way to automatically segment the white matter tracts based on the computed geodesics. We show the robustness of our method on simulated data with different noise levels. We also compare our method with tractography methods and geodesic approaches using other Riemannian metrics and demonstrate that the proposed method results in improved geodesics and segmentations using both synthetic and real DTI data. PMID:24211814

  10. Adaptive Riemannian Metrics for Improved Geodesic Tracking of White Matter

    PubMed Central

    Hao, Xiang; Whitaker, Ross T.; Fletcher, P. Thomas

    2011-01-01

    We present a new geodesic approach for studying white matter connectivity from diffusion tensor imaging (DTI). Previous approaches have used the inverse diffusion tensor field as a Riemannian metric and constructed white matter tracts as geodesics on the resulting manifold. These geodesics have the desirable property that they tend to follow the main eigenvectors of the tensors, yet still have the flexibility to deviate from these directions when it results in lower costs. While this makes such methods more robust to noise, it also has the serious drawback that geodesics tend to deviate from the major eigenvectors in high-curvature areas in order to achieve the shortest path. In this paper we formulate a modification of the Riemannian metric that results in geodesics adapted to follow the principal eigendirection of the tensor even in high-curvature regions. We show that this correction can be formulated as a simple scalar field modulation of the metric and that the appropriate variational problem results in a Poisson’s equation on the Riemannian manifold. We demonstrate that the proposed method results in improved geodesics using both synthetic and real DTI data. PMID:21761642

  11. Application of Bounded Linear Stability Analysis Method for Metrics-Driven Adaptive Control

    NASA Technical Reports Server (NTRS)

    Bakhtiari-Nejad, Maryam; Nguyen, Nhan T.; Krishnakumar, Kalmanje

    2009-01-01

    This paper presents the application of the Bounded Linear Stability Analysis (BLSA) method for metrics-driven adaptive control. The BLSA method is used for analyzing the stability of adaptive control models without linearizing the adaptive laws. Metrics-driven adaptive control introduces the notion that adaptation should be driven by some stability metrics to achieve robustness. By applying the BLSA method, the adaptive gain is adjusted during adaptation in order to meet certain phase margin requirements. Analysis of metrics-driven adaptive control is evaluated for a second order system that represents the pitch attitude control of a generic transport aircraft. The analysis shows that the system with the metrics-conforming variable adaptive gain becomes more robust to unmodeled dynamics or time delay. The effect of the analysis time-window for BLSA is also evaluated in order to meet the stability margin criteria.
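
    The gain-adjustment loop can be caricatured in one function: if the BLSA-estimated phase margin falls below the requirement, shrink the adaptive gain. The proportional shrink rule, the threshold, and all names here are assumptions for illustration only.

```python
def adjust_gain(gamma, phase_margin_deg, target_pm_deg=45.0,
                step=0.1, gamma_min=0.01):
    # Metrics-driven adjustment sketch: reduce the adaptive gain when the
    # phase margin estimated by BLSA violates the requirement, otherwise
    # leave it alone. A floor keeps the gain strictly positive.
    if phase_margin_deg < target_pm_deg:
        gamma = max(gamma_min, gamma * (1.0 - step))
    return gamma
```

    In the paper the margin itself comes from BLSA applied to the nonlinear adaptive loop; this sketch only shows where that number would feed back into the gain.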

  12. Distributed Computation of the knn Graph for Large High-Dimensional Point Sets

    PubMed Central

    Plaku, Erion; Kavraki, Lydia E.

    2009-01-01

    High-dimensional problems arising from robot motion planning, biology, data mining, and geographic information systems often require the computation of k nearest neighbor (knn) graphs. The knn graph of a data set is obtained by connecting each point to its k closest points. As the research in the above-mentioned fields progressively addresses problems of unprecedented complexity, the demand for computing knn graphs based on arbitrary distance metrics and large high-dimensional data sets increases, exceeding resources available to a single machine. In this work we efficiently distribute the computation of knn graphs for clusters of processors with message passing. Extensions to our distributed framework include the computation of graphs based on other proximity queries, such as approximate knn or range queries. Our experiments show nearly linear speedup with over one hundred processors and indicate that similar speedup can be obtained with several hundred processors. PMID:19847318
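
    The partitioning at the heart of the distributed computation is easy to sketch: split the query indices into chunks, compute each chunk's kNN rows independently, and concatenate. Below the chunks are processed serially; in the paper's setting each chunk would go to a separate processor and the rows would be gathered by message passing. Function names are illustrative.

```python
import numpy as np

def knn_chunk(points, chunk_idx, k):
    # kNN rows for one chunk of query indices, by brute force. In the
    # distributed framework each processor would handle one such chunk.
    d = np.linalg.norm(points[chunk_idx, None, :] - points[None, :, :], axis=2)
    d[np.arange(len(chunk_idx)), chunk_idx] = np.inf   # exclude self-matches
    return np.argsort(d, axis=1)[:, :k]

def knn_graph(points, k, n_workers=4):
    # kNN graph: each point connected to its k closest points.
    chunks = np.array_split(np.arange(len(points)), n_workers)
    rows = [knn_chunk(points, c, k) for c in chunks]   # one call per worker
    return np.vstack(rows)
```

    Because the chunks share no state, the speedup is close to linear in the number of workers, which matches the scaling the abstract reports.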

  13. Selective visual attention based clutter metric with human visual system adaptability.

    PubMed

    Zheng, Bo; Wang, Xiao-Dong; Huang, Jing-Tao; Wang, Jian; Jiang, Yang

    2016-09-20

    Most existing clutter metrics are proposed based on fixed structural features and experienced weight measures. In this paper, we identify the clutter as selective visual attention effects and propose a type of clutter metric. First, adaptive structural features are extracted from the blocks with an edge-structure similarity to the target. Next, the confusing blocks are selected by the similarity threshold based on the attention guidance map. The clutter is estimated by quantifying the effects of confusing blocks on target acquisition performance. The comparative field experiments, with a Search_2 dataset, show that the proposed metric is consistent with the adaptability of the human visual system (HVS) and outperforms other metrics.

  14. Flight Validation of a Metrics Driven L(sub 1) Adaptive Control

    NASA Technical Reports Server (NTRS)

    Dobrokhodov, Vladimir; Kitsios, Ioannis; Kaminer, Isaac; Jones, Kevin D.; Xargay, Enric; Hovakimyan, Naira; Cao, Chengyu; Lizarraga, Mariano I.; Gregory, Irene M.

    2008-01-01

    The paper addresses the initial steps involved in the development and flight implementation of a new metrics-driven L1 adaptive flight control system. The work concentrates on (i) the definition of appropriate control-driven metrics that account for control surface failures; (ii) tailoring a recently developed L1 adaptive controller to the design of adaptive flight control systems that explicitly address these metrics in the presence of control surface failures and dynamic changes under adverse flight conditions; (iii) the development of a flight control system for implementation of the resulting algorithms onboard a small UAV; and (iv) conducting a comprehensive flight test program that demonstrates the performance of the developed adaptive control algorithms in the presence of failures. As an initial milestone, the paper concentrates on the adaptive flight system setup and initial efforts addressing the ability of a commercial off-the-shelf AP, with and without adaptive augmentation, to recover from control surface failures.

  15. Adjustment of Adaptive Gain with Bounded Linear Stability Analysis to Improve Time-Delay Margin for Metrics-Driven Adaptive Control

    NASA Technical Reports Server (NTRS)

    Bakhtiari-Nejad, Maryam; Nguyen, Nhan T.; Krishnakumar, Kalmanje Srinvas

    2009-01-01

    This paper presents the application of the Bounded Linear Stability Analysis (BLSA) method for metrics-driven adaptive control. The BLSA method is used for analyzing the stability of adaptive control models without linearizing the adaptive laws. Metrics-driven adaptive control introduces the notion that adaptation should be driven by some stability metrics to achieve robustness. By applying the BLSA method, the adaptive gain is adjusted during adaptation in order to meet certain phase margin requirements. Analysis of metrics-driven adaptive control is evaluated for a linear damaged twin-engine generic transport model of aircraft. The analysis shows that the system with the adjusted adaptive gain becomes more robust to unmodeled dynamics or time delay.

  16. Stability Metrics for Simulation and Flight-Software Assessment and Monitoring of Adaptive Control Assist Compensators

    NASA Technical Reports Server (NTRS)

    Hodel, A. S.; Whorton, Mark; Zhu, J. Jim

    2008-01-01

    Due to a need for improved reliability and performance in aerospace systems, there is increased interest in the use of adaptive control or other nonlinear, time-varying control designs in aerospace vehicles. While such techniques are built on Lyapunov stability theory, they lack an accompanying set of metrics for the assessment of stability margins such as the classical gain and phase margins used in linear time-invariant systems. Such metrics must both be physically meaningful and permit the user to draw conclusions in a straightforward fashion. We present in this paper a roadmap to the development of metrics appropriate to nonlinear, time-varying systems. We also present two case studies in which frozen-time gain and phase margins incorrectly predict stability or instability. We then present a multi-resolution analysis approach that permits on-line real-time stability assessment of nonlinear systems.

  17. Hyperspectral Imagery Super-Resolution by Adaptive POCS and Blur Metric

    PubMed Central

    Hu, Shaoxing; Zhang, Shuyu; Zhang, Aiwu; Chai, Shatuo

    2017-01-01

    The spatial resolution of a hyperspectral image is often coarse because of limitations of the imaging hardware. A novel super-resolution reconstruction algorithm for hyperspectral imagery (HSI) via adaptive projection onto convex sets and an image blur metric (APOCS-BM) is proposed in this paper to address this problem. Firstly, a no-reference image blur metric assessment method based on the Gabor wavelet transform is utilized to obtain the blur metric of the low-resolution (LR) image. Then, the bound used in the APOCS is automatically calculated via the LR image blur metric. Finally, the high-resolution (HR) image is reconstructed by the APOCS method. With the contribution of APOCS and the image blur metric, the fixed-bound problem in POCS is solved, and the image blur information is utilized during the reconstruction of the HR image, which effectively enhances the spatial-spectral information and improves the reconstruction accuracy. The experimental results for the PaviaU, PaviaC and Jinyin Tan datasets indicate that the proposed method not only enhances the spatial resolution, but also preserves HSI spectral information well. PMID:28054947
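
    The data-consistency step of a POCS iteration amounts to projecting the current estimate into a band around the observation, where the band's half-width is the bound that the paper derives from the blur metric. The sketch below shows only that projection; `bound` is taken as a given parameter and the names are illustrative.

```python
import numpy as np

def pocs_project(estimate, observed_lr, bound):
    # Project `estimate` onto the convex set of images whose residual
    # against the observed low-resolution image lies in [-bound, bound].
    # Adaptive POCS would set `bound` from the LR image's blur metric.
    residual = estimate - observed_lr
    correction = np.clip(residual, -bound, bound) - residual
    return estimate + correction
```

    Iterating this projection together with the other constraint sets is what drives the reconstruction; the adaptive bound is the APOCS-BM contribution.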

  18. Constrained adaptive lifting and the CAL4 metric for helicopter transmission diagnostics

    NASA Astrophysics Data System (ADS)

    Samuel, Paul D.; Pines, Darryll J.

    2009-01-01

    This paper presents a methodology for detecting and diagnosing gear faults in the planetary stage of a helicopter transmission. This diagnostic technique is based on the constrained adaptive lifting (CAL) algorithm, an adaptive manifestation of the lifting scheme. Lifting is a time domain, prediction-error realization of the wavelet transform that allows for greater flexibility in the construction of wavelet bases. Adaptivity is desirable for gear diagnostics as it allows the technique to tailor itself to a specific transmission by selecting a set of wavelets that best represent vibration signals obtained while the gearbox is operating under healthy-state conditions. However, constraints on certain basis characteristics are necessary to enhance the detection of local wave-form changes caused by certain types of gear damage. The proposed methodology analyzes individual tooth-mesh waveforms from a healthy-state gearbox vibration signal that was generated using the vibration separation synchronous signal-averaging algorithm. Each waveform is separated into analysis domains using zeros of its slope and curvature. The bases selected in each analysis domain are chosen to minimize the prediction error, and constrained to have approximately the same-sign local slope and curvature as the original signal. The resulting set of bases is used to analyze future-state vibration signals and the lifting prediction error is inspected. The constraints allow the transform to effectively adapt to global amplitude changes, yielding small prediction errors. However, local waveform changes associated with certain types of gear damage are poorly adapted, causing a significant change in the prediction error. A diagnostic metric based on the lifting prediction error vector termed CAL4 is developed. The CAL diagnostic algorithm is validated using data collected from the University of Maryland Transmission Test Rig and the CAL4 metric is compared with the classic metric FM4.

  19. QRS detection using K-Nearest Neighbor algorithm (KNN) and evaluation on standard ECG databases.

    PubMed

    Saini, Indu; Singh, Dilbag; Khosla, Arun

    2013-07-01

    The performance of computer-aided ECG analysis depends on the precise and accurate delineation of QRS-complexes. This paper presents an application of the K-Nearest Neighbor (KNN) algorithm as a classifier for detection of the QRS-complex in ECG. The proposed algorithm is evaluated on two manually annotated standard databases, the CSE and MIT-BIH Arrhythmia databases. In this work, a digital band-pass filter is used to reduce false detection caused by interference present in the ECG signal, and the gradient of the signal is used as a feature for QRS-detection. In addition, the accuracy of a KNN-based classifier is largely dependent on the value of K and the type of distance metric. The value K = 3 and the Euclidean distance metric have been proposed for the KNN classifier, using fivefold cross-validation. Detection rates of 99.89% and 99.81% are achieved for the CSE and MIT-BIH databases respectively. The QRS detector obtained a sensitivity Se = 99.86% and specificity Sp = 99.86% for the CSE database, and Se = 99.81% and Sp = 99.86% for the MIT-BIH Arrhythmia database. A comparison is also made between the proposed algorithm and other published work using the CSE and MIT-BIH Arrhythmia databases. These results clearly establish the KNN algorithm as a reliable and accurate QRS detector.
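
    The classifier itself is standard: Euclidean distances to the training vectors, votes from the K = 3 nearest neighbors, majority rule. The toy feature vectors in the test are illustrative; the paper's features come from the gradient of a band-pass-filtered ECG signal.

```python
import numpy as np

def knn_classify(train_X, train_y, x, k=3):
    # KNN with Euclidean distance, as proposed for QRS detection.
    # `train_y` holds integer class labels (e.g. 1 = QRS, 0 = non-QRS).
    d = np.linalg.norm(train_X - x, axis=1)   # Euclidean distances
    votes = train_y[np.argsort(d)[:k]]        # labels of the k nearest
    return int(np.bincount(votes).argmax())   # majority vote
```

    K = 3 with an odd vote count avoids ties in the two-class QRS/non-QRS setting, which is presumably why the cross-validation favored it.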

  20. Quality metric in matched Laplacian of Gaussian response domain for blind adaptive optics image deconvolution

    NASA Astrophysics Data System (ADS)

    Guo, Shiping; Zhang, Rongzhi; Yang, Yikang; Xu, Rong; Liu, Changhai; Li, Jisheng

    2016-04-01

    Adaptive optics (AO) in conjunction with subsequent postprocessing techniques has markedly improved the resolution of turbulence-degraded images in ground-based astronomical observations and in the detection and identification of artificial space objects. However, important tasks involved in AO image postprocessing, such as frame selection, stopping iterative deconvolution, and algorithm comparison, commonly need manual intervention and cannot be performed automatically due to a lack of widely agreed-upon image quality metrics. In this work, based on the Laplacian of Gaussian (LoG) local contrast feature detection operator, we propose a LoG-domain matching operation to perceive effective and universal image quality statistics. Further, we extract two no-reference quality assessment indices in the matched LoG domain that can be used for a variety of postprocessing tasks. Three typical space object images with distinct structural features are tested to verify the consistency of the proposed metric with perceptual image quality through subjective evaluation.
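
    The LoG operator underlying the metric can be built directly from its closed form. The sketch below constructs a zero-sum LoG kernel (so flat image regions give zero response); the matching operation and the paper's two quality indices are not reproduced, and the normalization choice is an assumption.

```python
import numpy as np

def log_kernel(size, sigma):
    # Laplacian-of-Gaussian kernel: (r^2 - 2*sigma^2)/sigma^4 * G(r),
    # shifted to sum to zero so a constant image yields zero response.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx**2 + yy**2
    g = np.exp(-r2 / (2.0 * sigma**2))
    log = (r2 - 2.0 * sigma**2) / sigma**4 * g
    return log - log.mean()
```

    Convolving AO frames with this kernel gives the local-contrast response whose statistics the paper's metric evaluates in the matched LoG domain.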

  1. Adaptive imaging system using image quality metric based on statistical analysis of speckle fields

    NASA Astrophysics Data System (ADS)

    Kim, Dai Hyun; Kolesnikov, Kirill; Kostrzewski, Andrew A.; Savant, Gajendra D.; Vasiliev, Anatoly A.; Vorontsov, Mikhail A.

    2000-07-01

    This paper describes an Opto-Silicon Adaptive Imaging (OSAI) system capable of operating at low light intensities with high resolution, high accuracy, wide dynamic range, and high speed. The system consists of three major subsystems: (1) an adaptive imaging system in which a liquid crystal wavefront corrector measures image quality based on statistical analysis of a speckle field; (2) an image quality analyzer (IQA); (3) an opto-silicon multi-chip module combining a high-resolution ferroelectric liquid crystal SLM, CCD photodetector array, field-programmable gate array, and digital signal processor. The OSAI wavefront control applies adaptive optoelectronic feedback for iterative wavefront restoration and distortion compensation, using an image quality metric based on statistical properties of the speckle field produced by moving a diffuser in the Fourier transform plane of an IQA optical system. A prototype IQA system was designed, manufactured, and tested using an input liquid crystal SLM, a Fourier lens, a light-shaping diffuser, and an output photodiode.

  2. Speckle-metric-optimization-based adaptive optics for laser beam projection and coherent beam combining.

    PubMed

    Vorontsov, Mikhail; Weyrauch, Thomas; Lachinova, Svetlana; Gatz, Micah; Carhart, Gary

    2012-07-15

    Maximization of a projected laser beam's power density at a remotely located extended object (speckle target) can be achieved by using an adaptive optics (AO) technique based on sensing and optimization of the target-return speckle field's statistical characteristics, referred to here as speckle metrics (SM). SM AO was demonstrated in a target-in-the-loop coherent beam combining experiment using a bistatic laser beam projection system composed of a coherent fiber-array transmitter and a power-in-the-bucket receiver. SM sensing utilized a 50 MHz rate dithering of the projected beam that provided a stair-mode approximation of the outgoing combined beam's wavefront tip and tilt with subaperture piston phases. Fiber-integrated phase shifters were used for both the dithering and SM optimization with stochastic parallel gradient descent control.

  3. The adaptive response metric: toward an all-hazards tool for planning, decision support, and after-action analytics.

    PubMed

    Potter, Margaret A; Schuh, Russell G; Pomer, Bruce; Stebbins, Samuel

    2013-01-01

    Local health departments are organized, resourced, and operated primarily for routine public health services. For them, responding to emergencies and disasters requires adaptation to meet the demands of an emergency, and they must reallocate or augment resources, adjust work schedules, and, depending on severity and duration of the event, even compromise routine service outputs. These adaptations occur to varying degrees regardless of the type of emergency or disaster. The Adaptive Response Metric was developed through collaboration between a number of California health departments and university-based preparedness researchers. It measures the degree of "stress" from an emergency response as experienced by local health departments at the level of functional units (eg, nursing, administration, environmental services). Pilot testing of the Adaptive Response Metric indicates its utility for emergency planning, real-time decision making, and after-action analytics.

  4. Optimal Stellar Photometry for Multi-conjugate Adaptive Optics Systems Using Science-based Metrics

    NASA Astrophysics Data System (ADS)

    Turri, P.; McConnachie, A. W.; Stetson, P. B.; Fiorentino, G.; Andersen, D. R.; Bono, G.; Massari, D.; Véran, J.-P.

    2017-04-01

    We present a detailed discussion of how to obtain precise stellar photometry in crowded fields using images from multi-conjugate adaptive optics (MCAO) systems, with the intent of informing the scientific development of this key technology for the Extremely Large Telescopes. We use deep J and Ks exposures of NGC 1851 taken with the Gemini Multi-Conjugate Adaptive Optics System (GeMS) on Gemini South to quantify the performance of the instrument and to develop an optimal strategy for stellar photometry using point-spread function (PSF)-fitting techniques. We judge the success of the various methods we employ by using science-based metrics, particularly the width of the main sequence turnoff region. We also compare the GeMS photometry with the exquisite HST data in the visible of the same target. We show that the PSF produced by GeMS possesses significant spatial and temporal variability that must be accounted for during the analysis. We show that the majority of the variation of the PSF occurs within the "control radius" of the MCAO system and that the best photometry is obtained when the PSF radius is chosen to closely match this spatial scale. We identify photometric calibration as a critical issue for next-generation MCAO systems such as those on the Thirty Meter Telescope and European Extremely Large Telescope. Our final CMDs reach Ks ≈ 22 (below the main sequence knee), making it one of the deepest for a globular cluster available from the ground. Theoretical isochrones are in remarkable agreement with the stellar locus in our data from below the main sequence knee to the upper red giant branch.

  5. ELM-KNN for photometric redshift estimation of quasars

    NASA Astrophysics Data System (ADS)

    Zhang, Yanxia; Tu, Yang; Zhao, Yongheng; Tian, Haijun

    2017-06-01

    We explore photometric redshift estimation of quasars with the SDSS DR12 quasar sample. First, the quasar sample is separated into three parts according to different redshift ranges. Then three classifiers based on the Extreme Learning Machine (ELM) are created for the three redshift ranges. Finally, the k-Nearest Neighbor (kNN) approach is applied to the three samples to predict photometric redshifts of quasars from multiwavelength photometric data. We compare the performance of ELM-KNN with different input patterns to that of kNN alone. The experimental results show that ELM-KNN is feasible and superior to kNN (e.g., rms of 0.0751 vs. 0.2626 for the SDSS sample); in other words, the ensemble method has the potential to increase regressor performance beyond the level reached by an individual regressor alone and will be a good choice when facing much more complex data.
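
    The kNN regression stage can be sketched on its own: predict each quasar's redshift as the mean spectroscopic redshift of its k nearest training neighbors in color space. The ELM routing into redshift ranges is omitted; the names and the choice of k below are illustrative.

```python
import numpy as np

def knn_photoz(train_colors, train_z, query_colors, k=5):
    # kNN regression for photometric redshift: average the known
    # redshifts of the k nearest training quasars in color space.
    preds = []
    for q in np.atleast_2d(query_colors):
        d = np.linalg.norm(train_colors - q, axis=1)
        preds.append(train_z[np.argsort(d)[:k]].mean())
    return np.array(preds)
```

    In the ELM-KNN pipeline, this regressor would run separately inside each redshift range selected by the ELM classifiers, which is where the ensemble gain over plain kNN comes from.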

  6. Strain induced giant magnetoelectric coupling in KNN/Metglas/KNN sandwich multilayers

    NASA Astrophysics Data System (ADS)

    Chitra Lekha, C. S.; Kumar, Ajith S.; Vivek, S.; Anantharaman, M. R.; Venkata Saravanan, K.; Nair, Swapna S.

    2017-01-01

    A lead-free magnetoelectric composite with sandwich layers of K0.5Na0.5NbO3 (KNN)/Co76Fe14Ni4Si5B (Metglas)/KNN is fabricated as a cantilever and characterized for its magnetic, ferroelectric, and magnetoelectric properties. Giant magnetoelectric (ME) coupling is recorded under both resonant and sub-resonant conditions and the data are presented here. The observed magnetoelectric coupling coefficient reaches a maximum of 1321 V/cm Oe at resonance (750 Hz) and 9.5 V/cm Oe at a sub-resonant frequency of 50 Hz. The corresponding theoretical calculations are provided for comparison. High magnetostriction as well as initial permeability, fairly good piezoelectric properties, and low dielectric constant cumulatively contribute to the giant magnetoelectric properties in the present system. The high resonance and sub-resonance ME coupling voltages make the system ideal for transducers and energy harvesting device applications.

  7. Complexity and Pilot Workload Metrics for the Evaluation of Adaptive Flight Controls on a Full Scale Piloted Aircraft

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Schaefer, Jacob; Burken, John J.; Larson, David; Johnson, Marcus

    2014-01-01

    Flight research has shown the effectiveness of adaptive flight controls for improving aircraft safety and performance in the presence of uncertainties. The National Aeronautics and Space Administration's (NASA) Integrated Resilient Aircraft Control (IRAC) project designed and conducted a series of flight experiments to study the impact of variations in adaptive controller design complexity on performance and handling qualities. A novel complexity metric was devised to compare the degrees of simplicity achieved in three variations of a model reference adaptive controller (MRAC) for NASA's F-18 (McDonnell Douglas, now The Boeing Company, Chicago, Illinois) Full-Scale Advanced Systems Testbed (Gen-2A) aircraft. The complexity measures of these controllers are also compared to that of an earlier MRAC design for NASA's Intelligent Flight Control System (IFCS) project, flown on a highly modified F-15 aircraft (McDonnell Douglas, now The Boeing Company, Chicago, Illinois). Pilot comments during the IRAC research flights pointed to the importance of workload on handling qualities ratings for failure and damage scenarios. Modifications to existing pilot aggressiveness and duty cycle metrics are presented and applied to the IRAC controllers. Finally, while adaptive controllers may alleviate the effects of failures or damage on an aircraft's handling qualities, they also have the potential to introduce annoying changes to the flight dynamics or to the operation of aircraft systems. A nuisance rating scale is presented for the categorization of nuisance side-effects of adaptive controllers.

  8. Image noise and dose performance across a clinical population: Patient size adaptation as a metric of CT performance.

    PubMed

    Ria, Francesco; Wilson, Joshua Mark; Zhang, Yakun; Samei, Ehsan

    2017-06-01

    Modern CT systems adjust X-ray flux to accommodate patient size and achieve certain image noise values. The effectiveness of this adaptation is an important aspect of CT performance and should ideally be characterized in the context of real patient cases. The objective of this study was to characterize CT performance with a new metric that includes image noise and radiation dose across a clinical patient population. The study included 1526 examinations performed by three CT scanners (one GE Healthcare Discovery CT750HD, one GE Healthcare Lightspeed VCT, and one Siemens SOMATOM Definition Flash) used for two routine clinical protocols (abdominopelvic with contrast and chest without contrast). An institutional monitoring system recorded all the data involved in the study. The dose-patient size and noise-patient size dependencies were linearized by considering a first-order approximation of analytical models that describe the relationship between ionization dose and patient size, as well as image noise and patient size. A 3D fit was performed for each protocol and each scanner with a planar function, and the root mean square error (RMSE) values were estimated as a metric of CT adaptability across the patient population. The data show different scanner dependencies in terms of adaptability: the RMSE values for the three scanners lie between 0.0215 HU^(1/2) and 0.0385 HU^(1/2). A theoretical relationship between image noise, CTDIvol, and patient size was determined based on real patient data. This relationship may be interpreted as a new metric related to the scanners' adaptability concerning image quality and radiation dose across a patient population. This method could be implemented to investigate the adaptability related to other image quality indexes and radiation dose in a clinical population. © 2017 American Association of Physicists in Medicine.
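
The fitting step described above (linearized dose-size and noise-size relations combined into a planar fit whose RMSE quantifies adaptability) can be sketched with ordinary least squares. The variable names and synthetic data below are illustrative stand-ins, not the study's actual transforms:

```python
import numpy as np

# Synthetic stand-ins for the linearized quantities: x = patient size,
# y = a dose term, z = a noise-derived term.
rng = np.random.default_rng(1)
size = rng.uniform(20, 40, 300)                  # effective diameter, cm
dose = 0.05 * size + rng.normal(0, 0.1, 300)
noise_term = 0.8 - 0.01 * size + 0.5 * dose + rng.normal(0, 0.02, 300)

# Fit z ~ a*x + b*y + c (a planar function) by linear least squares.
A = np.column_stack([size, dose, np.ones_like(size)])
coef, *_ = np.linalg.lstsq(A, noise_term, rcond=None)
residuals = noise_term - A @ coef
rmse = np.sqrt(np.mean(residuals**2))  # adaptability metric, per the study
```

A tighter planar fit (smaller RMSE) indicates a scanner that adapts dose to patient size more consistently across the population.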

  9. Comparing Parafoveal Cone Photoreceptor Mosaic Metrics in Younger and Older Age Groups Using an Adaptive Optics Retinal Camera.

    PubMed

    Jacob, Julie; Paques, Michel; Krivosic, Valérie; Dupas, Bénédicte; Erginay, Ali; Tadayoni, Ramin; Gaudric, Alain

    2017-01-01

    To analyze cone mosaic metrics on adaptive optics (AO) images as a function of retinal eccentricity in two different age groups using a commercial flood-illumination AO device. Fifty-three eyes of 28 healthy subjects divided into two age groups were imaged using an AO flood-illumination camera (rtx1; Imagine Eyes, Orsay, France). A 16° × 4° field was obtained horizontally. Cone-packing metrics were determined in five neighboring 50 µm × 50 µm regions. Both retinal (cones/mm² and µm) and visual (cones/degree² and arcmin) units were computed. Results for cone mosaic metrics at 2°, 2.5°, 3°, 4°, and 5° eccentricity were compatible with previous AO scanning laser ophthalmoscopy and histology data. No significant difference was observed between the two age groups. The rtx1 camera enabled reproducible measurements of cone-packing metrics across the extrafoveal retina. These findings may contribute to the development of normative data and act as a reference for future research. [Ophthalmic Surg Lasers Imaging Retina. 2017;48:45-50.]. Copyright 2017, SLACK Incorporated.

  10. Reliability and Agreement Between Metrics of Cone Spacing in Adaptive Optics Images of the Human Retinal Photoreceptor Mosaic.

    PubMed

    Giannini, Daniela; Lombardo, Giuseppe; Mariotti, Letizia; Devaney, Nicholas; Serrao, Sebastiano; Lombardo, Marco

    2017-06-01

    To assess reliability and agreement among three metrics used to evaluate the distribution of cell distances in adaptive optics (AO) images of the cone mosaic. Using an AO flood illumination retinal camera, we acquired images of the cone mosaic in 20 healthy subjects and 12 patients with retinal diseases. The three spacing metrics studied were the center-to-center spacing (Scc), the local cone spacing (LCS), and the density recovery profile distance (DRPD). Each metric was calculated in sampling areas of different sizes (64 × 64 μm and 204 × 204 μm) across the parafovea. Both Scc and LCS were able to discriminate between healthy subjects and patients with retinal diseases; DRPD did not reliably detect any abnormality in the distribution of cell distances in patients with retinal diseases. The agreement between Scc and LCS was high in healthy subjects (intraclass correlation coefficient [ICC] ≥ 0.79) and moderate in patients with retinal diseases (ICC ≤ 0.51). The DRPD had poor agreement with Scc (ICC ≤ 0.47) and LCS (ICC ≤ 0.37). The correlation between the spacing metrics of the two sampling areas was greater in healthy subjects than in patients with retinal diseases. The Scc and LCS provided interchangeable estimates of cone distance in AO retinal images of healthy subjects but could not be used interchangeably when investigating retinal diseases with significant cell reflectivity loss (≥30%). The DRPD was unreliable for describing cell distance in a human retinal cone mosaic and did not correlate with Scc and LCS. Caution is needed when comparing spacing metrics evaluated in sampling areas of different sizes.

  11. Adaptive Training in an Unmanned Aerial Vehicle: Examination of Several Candidate Real-time Metrics

    DTIC Science & Technology

    2010-01-01

    assessed raw EEG and provided a second-by-second workload and engagement metric on a scale of 0-1. In addition, the Tobii X120 off-the-head eye...tracking data from blinks using the Tobii eye-tracking system. Future studies will use EOG to solve this problem. Fixation data and nearest neighbors...analyses also were not possible to analyze because of too much error in the Tobii calibration. This problem has been resolved with new software

  12. Self-organised manifold learning and heuristic charting via adaptive metrics

    NASA Astrophysics Data System (ADS)

    Horvath, Denis; Ulicny, Jozef; Brutovsky, Branislav

    2016-01-01

    Classical metric and non-metric multidimensional scaling (MDS) variants represent the well-known manifold learning (ML) methods which enable construction of low-dimensional representations (projections) of high-dimensional data inputs. However, their use is limited to cases in which the data are inherently reducible to low dimensionality. In general, the drawbacks and limitations of these, as well as pure MDS variants, become more apparent when the exploration (learning) is exposed to structured data of high intrinsic dimension. As we demonstrate on artificial as well as real-world datasets, the over-determination problem can be solved by means of hybrid, multi-component discrete-continuous multi-modal optimisation heuristics. A remarkable feature of the approach is that projections onto 2D are constructed simultaneously with the data categorisation, compensating in part for the loss of original input information. We observed that the optimisation module integrated with ML modelling, metric learning and categorisation leads to a nontrivial mechanism resulting in heuristic charting of data.
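
For reference, the classical (Torgerson) metric MDS that the abstract takes as its starting point reduces to double-centering a squared-distance matrix and taking an eigendecomposition. A minimal sketch:

```python
import numpy as np

def classical_mds(D, dim=2):
    """Classical (Torgerson) metric MDS from a pairwise-distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    w, V = np.linalg.eigh(B)              # eigenvalues in ascending order
    order = np.argsort(w)[::-1][:dim]     # keep the largest eigenvalues
    return V[:, order] * np.sqrt(np.maximum(w[order], 0))

# Points on a line embed into 1D with all pairwise distances preserved.
X = np.array([[0.0], [1.0], [3.0]])
D = np.abs(X - X.T)
Y = classical_mds(D, dim=1)
```

When the data are genuinely low-dimensional this recovers the configuration exactly (up to rotation and reflection); the abstract's point is that this breaks down for structured data of high intrinsic dimension.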

  13. Spectral-Spatial Hyperspectral Image Classification Based on KNN

    NASA Astrophysics Data System (ADS)

    Huang, Kunshan; Li, Shutao; Kang, Xudong; Fang, Leyuan

    2016-12-01

    Fusion of spectral and spatial information is an effective way of improving the accuracy of hyperspectral image classification. In this paper, a novel spectral-spatial hyperspectral image classification method based on K nearest neighbor (KNN) is proposed, which consists of the following steps. First, the support vector machine is adopted to obtain initial classification probability maps, which reflect the probability that each hyperspectral pixel belongs to each class. Then, the obtained pixel-wise probability maps are refined with the proposed KNN filtering algorithm, which is based on matching and averaging nonlocal neighborhoods. The proposed method needs no sophisticated segmentation or optimization strategies while still making full use of the nonlocal principle of real images through KNN, thus providing competitive classification accuracy with fast computation. Experiments performed on two real hyperspectral data sets show that the classification results obtained by the proposed method are comparable to several recently proposed hyperspectral image classification methods.
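
The KNN filtering step can be caricatured as: for each pixel, average the class probabilities of the K pixels with the most similar feature vectors. The toy sketch below uses 1-D "spectra" and invented names; the paper's actual filter matches nonlocal image neighborhoods:

```python
import numpy as np

def knn_filter(features, prob_maps, k=3):
    """Refine per-pixel class probabilities by averaging over the k
    pixels with the most similar feature vectors (nonlocal matching).
    features: (n_pixels, n_features); prob_maps: (n_pixels, n_classes)."""
    refined = np.empty_like(prob_maps)
    for i, f in enumerate(features):
        d = np.linalg.norm(features - f, axis=1)
        idx = np.argsort(d)[:k]           # includes the pixel itself (d = 0)
        refined[i] = prob_maps[idx].mean(axis=0)
    return refined

# Toy demo: two spectral clusters; the noisy third pixel's probabilities
# are pulled toward those of its spectrally similar neighbors.
feats = np.array([[0.0], [0.1], [0.05], [5.0], [5.1]])
probs = np.array([[0.9, 0.1], [0.8, 0.2], [0.4, 0.6],  # third pixel noisy
                  [0.1, 0.9], [0.2, 0.8]])
out = knn_filter(feats, probs, k=3)
```

Because averaging is done over spectrally matched pixels rather than a fixed spatial window, edges between classes are not blurred the way a plain box filter would blur them.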

  14. Do kinematic metrics of walking balance adapt to perturbed optical flow?

    PubMed

    Thompson, Jessica D; Franz, Jason R

    2017-03-31

    Visual (i.e., optical flow) perturbations can be used to study balance control and balance deficits. However, it remains unclear whether walking balance control adapts to such perturbations over time. Our purpose was to investigate the propensity for visuomotor adaptation in walking balance control using prolonged exposure to optical flow perturbations. Ten subjects (age: 25.4 ± 3.8 years) walked on a treadmill while watching a speed-matched virtual hallway with and without continuous mediolateral optical flow perturbations of three different amplitudes. Each of three perturbation trials consisted of 8 min of prolonged exposure followed by 1 min of unperturbed walking. Using 3D motion capture, we analyzed changes in foot placement kinematics and mediolateral sacrum motion. At their onset, perturbations elicited wider and shorter steps, alluding to a more cautious, general anticipatory balance control strategy. As perturbations continued, foot placement tended toward values seen during unperturbed walking while step width variability and mediolateral sacrum motion concurrently increased. Our findings suggest that subjects progressively shifted from a general anticipatory balance control strategy to a reactive, task-specific strategy using step-to-step adjustments. Prolonged exposure to optical flow perturbations may have clinical utility to reinforce reactive, task-specific balance control through training.

  15. Preserved Network Metrics across Translated Texts

    NASA Astrophysics Data System (ADS)

    Cabatbat, Josephine Jill T.; Monsanto, Jica P.; Tapang, Giovanni A.

    2014-09-01

    Co-occurrence language networks based on Bible translations and the Universal Declaration of Human Rights (UDHR) translations in different languages were constructed and compared with random text networks. Among the considered network metrics, the network size, N, the normalized betweenness centrality (BC), and the average k-nearest neighbors, knn, were found to be the most preserved across translations. Moreover, similar frequency distributions of co-occurring network motifs were observed for translated texts networks.
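
The average k-nearest-neighbors degree, knn, reported above as one of the preserved metrics, is simply the mean degree of each node's neighbors, averaged over all nodes. A small sketch on an invented word co-occurrence network:

```python
from collections import defaultdict

def avg_nearest_neighbor_degree(edges):
    """Mean degree of a node's neighbors, averaged over all nodes --
    the knn statistic compared across translations in the study."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    deg = {n: len(nbrs) for n, nbrs in adj.items()}
    knn_per_node = {n: sum(deg[m] for m in nbrs) / len(nbrs)
                    for n, nbrs in adj.items()}
    return sum(knn_per_node.values()) / len(knn_per_node)

# Toy co-occurrence network from word adjacency in one sentence.
words = "the quick brown fox jumps over the lazy dog".split()
edges = {tuple(sorted(p)) for p in zip(words, words[1:]) if p[0] != p[1]}
knn = avg_nearest_neighbor_degree(edges)
```

In the study this statistic, along with network size and normalized betweenness centrality, stayed close across translations of the same text.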

  16. Metric Education Evaluation Package.

    ERIC Educational Resources Information Center

    Kansky, Bob; And Others

    This document was developed out of a need for a complete, carefully designed set of evaluation instruments and procedures that might be applied in metric inservice programs across the nation. Components of this package were prepared in such a way as to permit local adaptation to the evaluation of a broad spectrum of metric education activities.…

  17. Metric Madness

    ERIC Educational Resources Information Center

    Kroon, Cindy D.

    2007-01-01

    Created for a Metric Day activity, Metric Madness is a board game for two to four players. Students review and practice metric vocabulary, measurement, and calculations by playing the game. Playing time is approximately twenty to thirty minutes.

  18. Comparing distance metrics for rotation using the k-nearest neighbors algorithm for entropy estimation

    PubMed Central

    Huggins, David J

    2014-01-01

    Distance metrics facilitate a number of methods for statistical analysis. For statistical mechanical applications, it is useful to be able to compute the distance between two different orientations of a molecule. However, a number of distance metrics for rotation have been employed, and in this study, we consider different distance metrics and their utility in entropy estimation using the k-nearest neighbors (KNN) algorithm. This approach shows a number of advantages over entropy estimation using a histogram method, and the different approaches are assessed using uniform randomly generated data, biased randomly generated data, and data from a molecular dynamics (MD) simulation of bulk water. The results identify quaternion metrics as superior to a metric based on the Euler angles. However, it is demonstrated that samples from MD simulation must be independent for effective use of the KNN algorithm and this finding impacts any application to time series data. PMID:24311273
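
One widely used quaternion metric in such comparisons is the geodesic rotation angle 2·arccos|q1·q2|, which is invariant to the sign ambiguity of unit quaternions (q and -q encode the same rotation). A sketch, not necessarily the exact metric set studied in the paper:

```python
import numpy as np

def quat_distance(q1, q2):
    """Rotation distance theta = 2 * arccos(|<q1, q2>|) between unit
    quaternions; the absolute value handles the q / -q double cover."""
    dot = abs(np.dot(q1, q2))
    return 2.0 * np.arccos(np.clip(dot, -1.0, 1.0))

# Identity vs. a 90-degree rotation about z:
# q = (cos(theta/2), sin(theta/2) * axis).
q_id = np.array([1.0, 0.0, 0.0, 0.0])
theta = np.pi / 2
q_z90 = np.array([np.cos(theta / 2), 0.0, 0.0, np.sin(theta / 2)])
d = quat_distance(q_id, q_z90)
```

A naive Euler-angle distance, by contrast, is not invariant under the double cover and can wildly overestimate the separation of nearby rotations, which is consistent with the paper's finding that quaternion metrics are superior for kNN entropy estimation.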

  20. Synthesis and characterizations of BNT-BT-KNN ceramics for energy storage applications

    NASA Astrophysics Data System (ADS)

    Chandrasekhar, M.; Kumar, P.

    2016-09-01

    Dielectric, ferroelectric and piezoelectric properties of the (0.94-x)Bi0.5Na0.5TiO3-0.06BaTiO3-xK0.5Na0.5NbO3/BNT-BT-KNN ceramics with x = 0.02 and 0.05 (2KNN and 5KNN) were studied in detail. Dielectric study and temperature-dependent polarization hysteresis loops indicated a ferroelectric-to-antiferroelectric transition at the depolarization temperature (Td). The low Td in both ceramic samples suggested dominant antiferroelectric ordering at room temperature (RT), which was also confirmed by RT polarization and strain hysteresis loop studies. The antiferroelectric-to-paraelectric phase transition temperature (Tm) was nearly the same for both systems. The 5KNN ceramic samples showed relaxor behaviour. The values of the dielectric constant, Td, and maximum strain percentage increased, whereas the coercive field and remnant polarization decreased with the increase of the KNN percentage in the BNT-BT-KNN system. A high energy-storage density of ∼0.5 J/cm3 at RT hinted at the suitability of the 5KNN system for energy storage applications.

  1. Integrating patient reported outcome measures and computerized adaptive test estimates on the same common metric: an example from the assessment of activities in rheumatoid arthritis.

    PubMed

    Doğanay Erdoğan, Beyza; Elhan, Atilla Halil; Kaskatı, Osman Tolga; Öztuna, Derya; Küçükdeveci, Ayşe Adile; Kutlay, Şehim; Tennant, Alan

    2015-07-14

    This study aimed to explore the potential of an inclusive and fully integrated measurement system for the Activities component of the International Classification of Functioning, Disability and Health (ICF), incorporating four classical scales, including the Health Assessment Questionnaire (HAQ), and Computerized Adaptive Testing (CAT). Three hundred patients with rheumatoid arthritis (RA) answered relevant questions from four questionnaires. Rasch analysis was performed to create an item bank using this item pool. A further 100 RA patients were recruited for a CAT application. Both real and simulated CATs were applied, and the agreement between these CAT-based scores and 'paper-pencil' scores was evaluated with the intraclass correlation coefficient (ICC). Anchoring strategies were used to obtain a direct translation from the item bank common metric to the HAQ score. The mean age of the 300 patients was 52.3 ± 11.7 years; disease duration was 11.3 ± 8.0 years; 74.7% were women. After testing for the assumptions of Rasch analysis, a 28-item Activities item bank was created. The agreement between CAT-based scores and paper-pencil scores was high (ICC = 0.993). Using those HAQ items in the item bank as anchoring items, another Rasch analysis was performed with HAQ-8 scores as separate items together with anchoring items. Finally, a conversion table from the item bank common metric to the HAQ scores was created. A fully integrated and inclusive health assessment system, illustrating the Activities component of the ICF, was built to assess RA patients. Raw-score-to-metric conversions and vice versa were available, giving access to the metric by a simple look-up table. © 2015 Asia Pacific League of Associations for Rheumatology and Wiley Publishing Asia Pty Ltd.

  2. Color Metric.

    ERIC Educational Resources Information Center

    Illinois State Office of Education, Springfield.

    This booklet was designed to convey metric information in pictorial form. The use of pictures in the coloring book enables the more mature person to grasp the metric message instantly, whereas the younger person, while coloring the picture, will be exposed to the metric information long enough to make the proper associations. Sheets of the booklet…

  4. Structural and optical properties of KNN nanoparticles synthesized by a sol-gel combustion method

    NASA Astrophysics Data System (ADS)

    Khorrami, Gh. H.; Mousavi, M.; Dowran, M.

    2017-05-01

    In this paper, potassium sodium niobate (KNN) nanopowders were successfully obtained by a sol-gel combustion method. According to thermogravimetric analysis (TGA) results, the produced xerogel was calcined at 500°C, 600°C, and 700°C to obtain KNN powders. The structural and optical properties of the prepared powders were studied using X-ray diffraction (XRD), Fourier transform infrared (FTIR) spectroscopy, Raman spectroscopy, transmission electron microscopy (TEM), and UV-vis spectroscopy. The XRD patterns indicated the formation of orthorhombic KNN. The Scherrer formula and the size-strain plot (SSP) method were employed to calculate the crystallite size and lattice strain of the KNN powders. The TEM images revealed that the average particle size of the prepared samples is about 30 nm and that the particles have a cubic shape. The optical band gap energy of the samples was calculated from the UV-vis absorbance spectra using the Tauc relation.
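
The Scherrer formula mentioned above relates XRD peak broadening to crystallite size, D = Kλ / (β cos θ). A quick sketch with illustrative numbers (not values from the paper):

```python
import numpy as np

def scherrer_size(beta_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    """Crystallite size D = K * lambda / (beta * cos(theta)), with the
    FWHM beta converted to radians; Cu K-alpha wavelength by default."""
    beta = np.radians(beta_deg)
    theta = np.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * np.cos(theta))

# Illustrative numbers: a 0.3-degree FWHM peak at 2-theta = 32 degrees
# gives a crystallite size of roughly 27-28 nm.
D = scherrer_size(beta_deg=0.3, two_theta_deg=32.0)
```

The SSP method cited in the abstract extends this by separating the size and strain contributions to the broadening, which the bare Scherrer formula lumps together.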

  5. A high efficient and fast kNN algorithm based on CUDA

    NASA Astrophysics Data System (ADS)

    Pei, Tong; Zhang, Yanxia; Zhao, Yongheng

    2010-07-01

    The k Nearest Neighbor (kNN) algorithm is an effective classification approach among the statistical methods of pattern recognition. But it can be rather time-consuming when applied to massive data, especially for large survey projects in astronomy. NVIDIA CUDA is a general-purpose parallel computing architecture that leverages the parallel compute engine in NVIDIA graphics processing units (GPUs) to solve many complex computational problems in a fraction of the time required on a CPU. In this paper, we implement a CUDA-based kNN algorithm and compare its performance with a CPU-only kNN algorithm, using single-precision and double-precision datatypes, on classifying celestial objects. The results demonstrate that CUDA can speed up the kNN algorithm effectively and could be useful in astronomical applications.
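
The brute-force formulation that maps well onto GPUs computes all query-to-training distances as one matrix expression, via ||q - t||² = ||q||² - 2 q·t + ||t||². This NumPy sketch mirrors that data-parallel structure on the CPU; the CUDA kernel itself is not reproduced:

```python
import numpy as np

def knn_bruteforce(train, query, k):
    """Indices of the k nearest training points for each query row,
    using the matrix expansion ||q - t||^2 = ||q||^2 - 2 q.t + ||t||^2,
    the same formulation that maps onto GPU matrix kernels."""
    d2 = (np.sum(query**2, axis=1)[:, None]
          - 2.0 * query @ train.T
          + np.sum(train**2, axis=1)[None, :])
    return np.argsort(d2, axis=1)[:, :k]   # k nearest per query row

rng = np.random.default_rng(2)
train = rng.normal(size=(1000, 8))
query = train[:5] + 1e-6                   # each query sits on a training point
idx = knn_bruteforce(train, query, k=1)
```

The inner matrix product is exactly the operation GPUs accelerate best, which is why CUDA implementations of this scheme see large speedups over loop-based CPU code.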

  6. Automatic Classification of Protein Structure Using the Maximum Contact Map Overlap Metric

    SciTech Connect

    Andonov, Rumen; Djidjev, Hristo Nikolov; Klau, Gunnar W.; Le Boudic-Jamin, Mathilde; Wohlers, Inken

    2015-10-09

    In this paper, we propose a new distance measure for comparing two protein structures based on their contact map representations. We show that our novel measure, which we refer to as the maximum contact map overlap (max-CMO) metric, satisfies all properties of a metric on the space of protein representations. Having a metric in that space allows one to avoid pairwise comparisons on the entire database and, thus, to significantly accelerate exploring the protein space compared to non-metric spaces. We show on a gold-standard superfamily classification benchmark set of 6759 proteins that our exact k-nearest neighbor (k-NN) scheme classifies up to 224 out of 236 queries correctly, and on a larger, extended version of the benchmark with 60,850 additional structures, up to 1361 out of 1369 queries. Our k-NN classification thus provides a promising approach for the automatic classification of protein structures based on flexible contact map overlap alignments.
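
The practical payoff of having a true metric, as the abstract notes, is that the triangle inequality yields cheap lower bounds that prune most comparisons. A toy sketch with a single pivot and an absolute-difference metric standing in for max-CMO (the "structures" here are just integers):

```python
def nn_with_pruning(db, pivot, query, metric):
    """1-NN search using one pivot: the reverse triangle inequality
    |d(q, pivot) - d(x, pivot)| <= d(q, x) gives a cheap lower bound,
    so many expensive metric evaluations can be skipped."""
    d_qp = metric(query, pivot)
    d_xp = [metric(x, pivot) for x in db]    # precomputable offline
    best_i, best_d, evals = None, float("inf"), 0
    for i, x in enumerate(db):
        if abs(d_qp - d_xp[i]) >= best_d:
            continue                          # pruned without calling metric
        evals += 1
        d = metric(query, x)
        if d < best_d:
            best_i, best_d = i, d
    return best_i, best_d, evals

db = list(range(0, 1000, 10))                 # 100 toy "structures" on a line
metric = lambda a, b: abs(a - b)              # absolute difference is a metric
best_i, best_d, evals = nn_with_pruning(db, pivot=0, query=503, metric=metric)
```

With several pivots and a good search order the fraction of pruned comparisons grows further; that is the mechanism that lets a metric space avoid scanning the entire database.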

  7. Fabrication of transparent lead-free KNN glass ceramics by incorporation method.

    PubMed

    Yongsiri, Ploypailin; Eitssayeam, Sukum; Rujijanagul, Gobwut; Sirisoonthorn, Somnuk; Tunkasiri, Tawee; Pengpat, Kamonpan

    2012-02-16

    The incorporation method was employed to produce potassium sodium niobate [KNN] (K0.5Na0.5NbO3) glass ceramics from the KNN-SiO2 system. This incorporation method combines a simple mixed-oxide technique for producing KNN powder and a conventional melt-quenching technique to form the resulting glass. KNN was calcined at 800°C and subsequently mixed with SiO2 in the KNN:SiO2 ratio of 75:25 (mol%). The successfully produced optically transparent glass was then subjected to a heat treatment schedule at temperatures ranging from 525°C to 575°C for crystallization. All glass ceramics of more than 40% transmittance crystallized into KNN nanocrystals that were rectangular in shape and dispersed well throughout the glass matrix. The crystal size and crystallinity were found to increase with increasing heat treatment temperature, which in turn plays an important role in controlling the properties of the glass ceramics, including physical, optical, and dielectric properties. The transparency of the glass samples decreased with increasing crystal size. The maximum room-temperature dielectric constant (εr) was as high as 474 at 10 kHz with an acceptably low loss (tanδ) of around 0.02 at 10 kHz.

  9. Forensic Metrics

    ERIC Educational Resources Information Center

    Bort, Nancy

    2005-01-01

    One of the most important review topics the author teaches in middle school is the use of metric measurement for problem solving and inquiry. For many years, she had students measuring various objects around the room using the tools of metric measurement. She dutifully taught hypothesizing, data collecting, and drawing conclusions. It was…

  10. Primary Metrics.

    ERIC Educational Resources Information Center

    Otto, Karen; And Others

    These 55 activity cards were created to help teachers implement a unit on metric measurement. They were designed for students aged 5 to 10, but could be used with older students. Cards are color-coded in terms of activities on basic metric terms, prefixes, length, and other measures. Both individual and small-group games and ideas are included.…

  11. Mastering Metrics

    ERIC Educational Resources Information Center

    Parrot, Annette M.

    2005-01-01

    By the time students reach a middle school science course, they are expected to make measurements using the metric system. However, most are not practiced in its use, as their experience in metrics is often limited to one unit they were taught in elementary school. This lack of knowledge is not wholly the fault of formal education. Although the…

  14. Sustainability Metrics: The San Luis Basin Project

    EPA Science Inventory

    Sustainability is about promoting humanly desirable dynamic regimes of the environment. Metrics: ecological footprint, net regional product, exergy, emergy, and Fisher Information. Adaptive management: (1) metrics assess problem, (2) specific problem identified, and (3) managemen...

  16. Key landscape ecology metrics for assessing climate change adaptation options: rate of change and patchiness of impacts

    USGS Publications Warehouse

    López-Hoffman, Laura; Breshears, David D.; Allen, Craig D.; Miller, Marc L.

    2013-01-01

    Under a changing climate, devising strategies to help stakeholders adapt to alterations to ecosystems and their services is of utmost importance. In western North America, diminished snowpack and river flows are causing relatively gradual, homogeneous (system-wide) changes in ecosystems and services. In addition, increased climate variability is also accelerating the incidence of abrupt and patchy disturbances such as fires, floods and droughts. This paper posits that two key variables often considered in landscape ecology—the rate of change and the degree of patchiness of change—can aid in developing climate change adaptation strategies. We use two examples from the “borderland” region of the southwestern United States and northwestern Mexico. In piñon-juniper woodland die-offs that occurred in the southwestern United States during the 2000s, ecosystem services suddenly crashed in some parts of the system while remaining unaffected in other locations. The precise timing and location of die-offs was uncertain. On the other hand, slower, homogeneous change, such as the expected declines in water supply to the Colorado River delta, will likely impact the entire ecosystem, with ecosystem services everywhere in the delta subject to alteration, and all users likely exposed. The rapidity and spatial heterogeneity of faster, patchy climate change exemplified by tree die-off suggests that decision-makers and local stakeholders would be wise to operate under a Rawlsian “veil of ignorance,” and implement adaptation strategies that allow ecosystem service users to equitably share the risk of sudden loss of ecosystem services before actual ecosystem changes occur. On the other hand, in the case of slower, homogeneous, system-wide impacts to ecosystem services as exemplified by the Colorado River delta, adaptation strategies can be implemented after the changes begin, but will require a fundamental rethinking of how ecosystems and services are used and valued.

  17. GPU-FS-kNN: a software tool for fast and scalable kNN computation using GPUs.

    PubMed

    Arefin, Ahmed Shamsul; Riveros, Carlos; Berretta, Regina; Moscato, Pablo

    2012-01-01

    The analysis of biological networks has become a major challenge due to the recent development of high-throughput techniques that are rapidly producing very large data sets. The exploding volumes of biological data call for extreme computational power and special computing facilities (i.e., supercomputers). An inexpensive alternative, General-Purpose computation on Graphics Processing Units (GPGPU), can be adapted to tackle this challenge, but the limited internal memory of the device poses a new problem of scalability. Efficient data and computational parallelism with partitioning is required to provide a fast and scalable solution to this problem. We propose an efficient parallel formulation of the k-Nearest Neighbour (kNN) search problem, which is a popular method for classifying objects in several fields of research, such as pattern recognition, machine learning and bioinformatics. Although very simple and straightforward, the performance of kNN search degrades dramatically for large data sets, since the task is computationally intensive. The proposed approach is not only fast but also scalable to large-scale instances. Based on our approach, we implemented a software tool, GPU-FS-kNN (GPU-based Fast and Scalable k-Nearest Neighbour), for CUDA-enabled GPUs. The basic approach is simple and adaptable to other available GPU architectures. We observed speed-ups of 50-60 times compared with a CPU implementation on a well-known breast microarray study and its associated data sets. Our GPU-based Fast and Scalable k-Nearest Neighbour search technique (GPU-FS-kNN) provides a significant performance improvement for nearest-neighbour computation in large-scale networks. Source code and the software tool are available under the GNU Public License (GPL) at https://sourceforge.net/p/gpufsknn/.
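
The chunked, partitioned evaluation that lets GPU-FS-kNN stay within device memory can be mirrored on the CPU: process the distance matrix one block of rows at a time so only a (chunk × n) slab is ever materialized. A sketch of the structure only (the function name is invented, and the real tool does this on the GPU):

```python
import numpy as np

def knn_chunked(data, k, chunk=256):
    """k nearest neighbors of every point, computed one block of rows at
    a time; only a (chunk x n) distance slab exists in memory at once."""
    n = data.shape[0]
    out = np.empty((n, k), dtype=int)
    for start in range(0, n, chunk):
        block = data[start:start + chunk]
        d2 = ((block[:, None, :] - data[None, :, :]) ** 2).sum(-1)
        rows = d2.shape[0]
        np.fill_diagonal(d2[:, start:start + rows], np.inf)  # exclude self
        out[start:start + rows] = np.argsort(d2, axis=1)[:, :k]
    return out

rng = np.random.default_rng(3)
pts = rng.normal(size=(100, 3))
nbrs = knn_chunked(pts, k=5, chunk=32)
```

Each block is independent of the others, which is exactly what makes the partitioned formulation scale: blocks can be streamed through limited device memory or distributed across devices.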

  18. Adaptive training using an artificial neural network and EEG metrics for within- and cross-task workload classification.

    PubMed

    Baldwin, Carryl L; Penaranda, B N

    2012-01-02

Adaptive training using neurophysiological measures requires efficient classification of mental workload in real time as a learner encounters new and increasingly difficult levels of tasks. Previous investigations have shown that artificial neural networks (ANNs) can accurately classify workload, but only when trained on neurophysiological exemplars from experienced operators on specific tasks. The present study examined classification accuracies for ANNs trained on electroencephalographic (EEG) activity recorded while participants performed the same (within-task) and different (cross-task) tasks for short periods of time with little or no prior exposure to the tasks. Participants performed three working memory tasks at two difficulty levels, with order of task and difficulty level counterbalanced. Within-task classification accuracies were high when ANNs were trained on exemplars from the same task or a set containing the to-be-classified task (M=87.1% and 85.3%, respectively). Cross-task classification accuracies were significantly lower (average 44.8%), indicating consistent systematic misclassification for certain tasks in some individuals. Results are discussed in terms of their implications for developing neurophysiologically driven adaptive training platforms.

  19. Accelerating k-NN Algorithm with Hybrid MPI and OpenSHMEM

    SciTech Connect

    Lin, Jian; Hamidouche, Khaled; Zheng, Jie; Lu, Xiaoyi; Vishnu, Abhinav; Panda, Dhabaleswar

    2015-08-05

Machine learning algorithms are benefiting from the continuous improvement of programming models, including MPI, MapReduce and PGAS. The k-Nearest Neighbors (k-NN) algorithm is a widely used machine learning algorithm, applied to supervised learning tasks such as classification. Several parallel implementations of k-NN have been proposed in the literature and in practice. However, on high-performance computing systems with high-speed interconnects, it is important to further accelerate existing designs of the k-NN algorithm by taking advantage of scalable programming models. To improve the performance of k-NN in large-scale environments with InfiniBand networks, this paper proposes several alternative hybrid MPI+OpenSHMEM designs and performs a systematic evaluation and analysis on typical workloads. The hybrid designs leverage one-sided memory access to better overlap communication with computation than the existing pure MPI design, and introduce better schemes for efficient buffer management. The implementation, based on the k-NN program from MaTEx with MVAPICH2-X (Unified MPI+PGAS Communication Runtime over InfiniBand), shows up to 9.0% time reduction for training the KDD Cup 2010 workload over 512 cores, and 27.6% time reduction for a small workload with balanced communication and computation. Experiments with varying numbers of cores show that our design maintains good scalability.

  20. Structural, dielectric and ferroelectric study of (1-ϕ)(NBT-KNN)-ϕSBexT ceramics

    NASA Astrophysics Data System (ADS)

    Swain, Sridevi; Kumar, Pawan

    2016-11-01

    (1-ϕ)(0.93 Na0.5Bi0.5TiO3-0.07K0.5Na0.5NbO3)-ϕSr0.8Bi2.15Ta2O9/(1-ϕ)(NBT-KNN)-ϕSBexT (ϕ=0, 2, 4, 8, 12, 16 wt%) ceramic samples were synthesized by conventional solid state reaction route. Secondary phases started developing for higher SBexT content in the (1-ϕ)(NBT-KNN)-ϕSBexT ceramic samples. Decrease of transition temperature (Tm) with the increase of SBexT content in (1-ϕ)(NBT-KNN)-ϕSBexT ceramics was attributed to the increase of internal stress. Remnant polarization (Pr), leakage current density and polarization degradation values reduced with the increase of SBexT content in (1-ϕ)(NBT-KNN)-ϕSBexT ceramic samples. Retention of good ferroelectric properties and enhancement of fatigue-free behavior with the incorporation of SBexT phase in (1-ϕ)(NBT-KNN)-ϕSBexT ceramic samples suggested their usefulness for ferroelectric memory applications.

  1. Edible Metrics.

    ERIC Educational Resources Information Center

    Mecca, Christyna E.

    1998-01-01

    Presents an exercise that introduces students to scientific measurements using only metric units. At the conclusion of the exercise, students eat the experiment. Requires dried refried beans, crackers or chips, and dried instant powder for lemonade. (DDR)

  2. Think Metric

    USGS Publications Warehouse

    ,

    1978-01-01

    The International System of Units, as the metric system is officially called, provides for a single "language" to describe weights and measures over the world. We in the United States together with the people of Brunei, Burma, and Yemen are the only ones who have not put this convenient system into effect. In the passage of the Metric Conversion Act of 1975, Congress determined that we also will adopt it, but the transition will be voluntary.

  3. Image set based face recognition using self-regularized non-negative coding and adaptive distance metric learning.

    PubMed

    Mian, Ajmal; Hu, Yiqun; Hartley, Richard; Owens, Robyn

    2013-12-01

Simple nearest neighbor classification fails to exploit the additional information in image sets. We propose self-regularized nonnegative coding to define a between-set distance for robust face recognition. The set distance is measured between the nearest set points (samples), which can be approximated from their orthogonal basis vectors as well as from the set samples under the respective constraints of self-regularization and nonnegativity. Self-regularization constrains the orthogonal basis vectors to be similar to the approximated nearest point. The nonnegativity constraint ensures that each nearest point is approximated from a positive linear combination of the set samples. Both constraints are formulated as a single convex optimization problem, and the accelerated proximal gradient method with linear-time Euclidean projection is adapted to efficiently find the optimal nearest points between two image sets. Using the nearest points between a query set and all the gallery sets, as well as the active samples used to approximate them, we learn a more discriminative Mahalanobis distance for robust face recognition. The proposed algorithm works independently of the chosen features and has been tested on gray pixel values and local binary patterns. Experiments on three standard data sets show that the proposed method consistently outperforms existing state-of-the-art methods.

  4. Efficient kNN Classification With Different Numbers of Nearest Neighbors.

    PubMed

    Zhang, Shichao; Li, Xuelong; Zong, Ming; Zhu, Xiaofeng; Wang, Ruili

    2017-04-12

The k-nearest neighbor (kNN) method is a popular classification method in data mining and statistics because of its simple implementation and significant classification performance. However, it is impractical for traditional kNN methods to assign a fixed k value (even one set by experts) to all test samples. Previous solutions assign different k values to different test samples by cross validation, but are usually time-consuming. This paper proposes a kTree method to learn different optimal k values for different test/new samples, by involving a training stage in the kNN classification. Specifically, in the training stage, the kTree method first learns optimal k values for all training samples via a new sparse reconstruction model, and then constructs a decision tree (namely, the kTree) using the training samples and the learned optimal k values. In the test stage, the kTree quickly outputs the optimal k value for each test sample, and then kNN classification can be conducted using the learned optimal k value and all training samples. As a result, the proposed kTree method has a similar running cost but higher classification accuracy, compared with traditional kNN methods, which assign a fixed k value to all test samples. Moreover, the proposed kTree method needs less running cost but achieves similar classification accuracy, compared with recently proposed kNN methods that assign different k values to different test samples. This paper further proposes an improved version of the kTree method (namely, the k*Tree method) to speed up its test stage by additionally storing information about the training samples in the leaf nodes of the kTree, such as the training samples located in the leaf nodes, their kNNs, and the nearest neighbors of these kNNs. We call the resulting decision tree the k*Tree, which enables kNN classification using a subset of the training samples in the leaf nodes rather than all training samples used by the recently proposed kNN methods. This actually reduces the running cost of…
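
The two-stage idea — learn a per-sample optimal k during training, then look it up cheaply at test time — can be sketched with toy stand-ins: a hold-out rule replaces the sparse reconstruction model, and a 1-NN lookup replaces the learned kTree (all names and the candidate k set are ours, not the paper's):

```python
from collections import Counter

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def knn_predict(x, train, k):
    """Plain kNN majority vote with a given k."""
    neighbors = sorted(train, key=lambda t: dist(x, t[0]))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

def learn_optimal_k(train, candidates=(1, 3, 5)):
    """For each training sample, pick the smallest candidate k that
    classifies it correctly when held out (a crude stand-in for the
    paper's sparse-reconstruction stage)."""
    best_k = {}
    for i, (x, y) in enumerate(train):
        rest = train[:i] + train[i + 1:]
        for k in candidates:
            if knn_predict(x, rest, k) == y:
                best_k[i] = k
                break
        else:
            best_k[i] = candidates[0]
    return best_k

def ktree_like_predict(x, train, best_k):
    """Look up the optimal k of the nearest training sample (a toy
    substitute for the learned kTree), then run kNN with that k."""
    nearest = min(range(len(train)), key=lambda i: dist(x, train[i][0]))
    return knn_predict(x, train, best_k[nearest])

train = [((0.0,), 'a'), ((1.0,), 'a'), ((5.0,), 'b'), ((6.0,), 'b')]
ks = learn_optimal_k(train)
print(ktree_like_predict((0.5,), train, ks))  # -> 'a'
```

The point of the real kTree is that the test-stage k lookup costs a short tree traversal instead of cross validation per test sample.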

  5. Structural and optical properties of KNN nanocubes synthesized by a green route using gelatin

    NASA Astrophysics Data System (ADS)

    Khorrami, Gh. H.; Kompany, A.; Zak, A. Khorsand

    2015-01-01

Sodium potassium niobate nanoparticles [(K0.5Na0.5)NbO3, KNN], KNN-NPs, were synthesized using a modified sol-gel method. Structural and optical properties of the prepared samples were investigated by thermogravimetric analysis (TGA), X-ray diffraction (XRD), transmission electron microscopy (TEM), and Raman and UV-Vis spectroscopy. The XRD patterns showed that formation of the orthorhombic KNN-NPs starts at a calcination temperature of 500°C. Raman spectroscopy was used to investigate the crystalline symmetry and the structural deformation of the prepared KNN-NPs. TEM images showed that the morphology of the prepared particles is cubic, with an average size of about 50 nm. From diffuse reflectance spectroscopy, using the Kubelka-Munk method, the energy bandgaps were determined to be indirect, with values of 3.13 eV and 3.19 eV for the samples calcined at 500°C and 600°C, respectively.

  6. Quality metrics for product defectiveness at KCD

    SciTech Connect

    Grice, J.V.

    1993-07-01

Metrics are discussed for measuring and tracking product defectiveness at AlliedSignal Inc., Kansas City Division (KCD). Three new metrics, the metric (percent defective) that preceded the new metrics, and several alternatives are described. The new metrics, Percent Parts Accepted, Percent Parts Accepted Trouble Free, and Defects Per Million Observations (denoted by PPA, PATF, and DPMO, respectively), were implemented for KCD-manufactured product and purchased material in November 1992. These metrics replace the percent defective metric that had been used for several years. The PPA and PATF metrics primarily measure quality performance, while DPMO measures the effects of continuous improvement activities. The new metrics measure product quality in terms of product defectiveness observed only during the inspection process. The metrics were originally developed for purchased product and were adapted to manufactured product to provide a consistent set of metrics plant-wide. The new metrics provide a meaningful tool to measure the quantity of product defectiveness in terms of the customer's requirements and expectations for quality. Many valid metrics are available and all have deficiencies. These three metrics are among the least sensitive to problems and are easily understood. They will serve as good management tools for KCD in the foreseeable future until new flexible data systems and reporting procedures can be implemented that can provide more detailed and accurate metric computations.
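
The abstract names the metrics but not their formulas. As a hedged sketch, Percent Parts Accepted is a simple acceptance ratio, and DPMO is conventionally defined as defects scaled to a million defect opportunities; the function and variable names below are ours, and the industry-standard DPMO formula is assumed rather than taken from the report:

```python
def percent_parts_accepted(accepted, inspected):
    """PPA: share of inspected parts that were accepted, in percent."""
    return 100.0 * accepted / inspected

def dpmo(defects, units, opportunities_per_unit):
    """DPMO: observed defects scaled to one million defect
    opportunities (standard definition, assumed here)."""
    return 1_000_000 * defects / (units * opportunities_per_unit)

print(percent_parts_accepted(980, 1000))                      # -> 98.0
print(dpmo(defects=15, units=1000, opportunities_per_unit=5)) # -> 3000.0
```

Note the different sensitivities the abstract alludes to: PPA moves only when a whole part is rejected, while DPMO counts every defect opportunity, so it keeps improving under continuous improvement even when acceptance rates are already high.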

  7. Metrication and Dimensional Coordination - A Selected Bibliography.

    ERIC Educational Resources Information Center

Clark, Roy E.; Roat, Candace L.

    The United States changeover to the use of the SI (International Metric) measurement language presents the construction industry with the need to review and adapt many product standards and practices for the use of metric measurement units. These adaptations and changes can bring substantial benefits to the industry in the form of permanently…

  8. STN area detection using K-NN classifiers for MER recordings in Parkinson patients during neurostimulator implant surgery

    NASA Astrophysics Data System (ADS)

    Schiaffino, L.; Rosado Muñoz, A.; Guerrero Martínez, J.; Francés Villora, J.; Gutiérrez, A.; Martínez Torres, I.; Kohan, y. D. R.

    2016-04-01

Deep Brain Stimulation (DBS) applies electric pulses to the subthalamic nucleus (STN), improving tremor and other symptoms associated with Parkinson's disease. Accurate STN detection for proper location and implantation of the stimulating electrodes is a complex task, and surgeons are not always certain about the final location. Signals from the STN acquired during DBS surgery are obtained with microelectrodes and have specific characteristics that differ from other brain areas. Using supervised learning, a trained model based on previous microelectrode recordings (MER) can be obtained that successfully classifies the STN area for new MER signals. The K Nearest Neighbours (K-NN) algorithm has been successfully applied to STN detection. However, the use of the fuzzy form of the K-NN algorithm (KNN-F) has not been reported. This work compares K-NN and KNN-F for STN detection. Real MER recordings from eight patients were previously classified by neurophysiologists, and 15 features were defined. Sensitivity and specificity are obtained for the classifiers, and the Wilcoxon signed-rank non-parametric test is used for statistical hypothesis validation. We conclude that the performance of the KNN-F classifier is higher than that of K-NN, with p<0.01 for STN specificity.
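
Unlike crisp K-NN, the fuzzy variant returns graded class memberships rather than a single vote. A minimal sketch in the spirit of Keller et al.'s classic fuzzy K-NN formulation, with crisp (0/1) neighbor memberships for simplicity — this is not the paper's exact implementation, and the class labels are illustrative:

```python
def fuzzy_knn(x, train, k=3, m=2):
    """Fuzzy kNN: class membership is a distance-weighted average
    over the k nearest neighbors, with weight 1/d**(2/(m-1))."""
    neighbors = sorted(
        train, key=lambda t: sum((a - b) ** 2 for a, b in zip(x, t[0])))[:k]
    weights, total = {}, 0.0
    for point, label in neighbors:
        d = sum((a - b) ** 2 for a, b in zip(x, point)) ** 0.5
        w = 1.0 / max(d, 1e-9) ** (2.0 / (m - 1))
        weights[label] = weights.get(label, 0.0) + w
        total += w
    return {label: w / total for label, w in weights.items()}

train = [((0.0,), 'stn'), ((0.2,), 'stn'), ((1.0,), 'other'), ((1.2,), 'other')]
memberships = fuzzy_knn((0.1,), train, k=3)
print(max(memberships, key=memberships.get))  # -> 'stn'
```

The graded memberships are what make the fuzzy form attractive here: a borderline recording gets a low STN membership rather than an overconfident hard label.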

  9. SU-E-J-124: FDG PET Metrics Analysis in the Context of An Adaptive PET Protocol for Node Positive Gynecologic Cancer Patients

    SciTech Connect

    Nawrocki, J; Chino, J; Light, K; Vergalasova, I; Craciunescu, O

    2014-06-01

Purpose: To compare PET extracted metrics and investigate the role of a gradient-based PET segmentation tool, PET Edge (MIM Software Inc., Cleveland, OH), in the context of an adaptive PET protocol for node positive gynecologic cancer patients. Methods: An IRB approved protocol enrolled women with gynecological, PET visible malignancies. A PET-CT was obtained for treatment planning, prescribed to 45–50.4 Gy with a 55–70 Gy boost to the PET positive nodes. An intra-treatment PET-CT was obtained between 30–36 Gy, and all volumes re-contoured. Standard uptake values (SUVmax, SUVmean, SUVmedian) and GTV volumes were extracted from the clinician contoured GTVs on the pre- and intra-treatment PET-CT for primaries and nodes and compared with a two tailed Wilcoxon signed-rank test. The differences between primary and node GTV volumes contoured in the treatment planning system and those volumes generated using PET Edge were also investigated. Bland-Altman plots were used to describe significant differences between the two contouring methods. Results: Thirteen women were enrolled in this study. The median baseline/intra-treatment primary (SUVmax, mean, median) were (30.5, 9.09, 7.83)/(16.6, 4.35, 3.74), and nodes were (20.1, 4.64, 3.93)/(6.78, 3.13, 3.26). The p values were all < 0.001. The clinical contours were all larger than the PET Edge generated ones, with mean difference of +20.6 ml for primary, and +23.5 ml for nodes. The Bland-Altman analysis revealed changes between clinician/PET Edge contours to be mostly within the margins of the coefficient of variability. However, there was a proportional trend, i.e. the larger the GTV, the larger the clinical contours as compared to PET Edge contours. Conclusion: Primary and node SUV values taken from the intra-treatment PET-CT can be used to assess the disease response and to design an adaptive plan. The PET Edge tool can streamline the contouring process and lead to smaller, less user-dependent contours.
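
The Bland-Altman analysis used here follows the standard recipe: the bias is the mean of the paired differences, and the 95% limits of agreement are the bias ± 1.96 SD of those differences. A sketch with hypothetical clinician vs. gradient-tool volume pairs (the numbers are illustrative, not the study's data):

```python
import statistics

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement for paired
    measurements (mean and mean +/- 1.96*SD of the differences)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical clinician vs. gradient-based GTV volumes (ml)
clinician = [60.0, 85.0, 45.0, 102.0, 78.0]
pet_edge  = [40.0, 60.0, 30.0,  80.0, 60.0]
bias, limits = bland_altman(clinician, pet_edge)
print(round(bias, 1))  # -> 20.0 (mean difference in ml)
```

Plotting each pair's difference against its mean, as in the study, would also expose the proportional trend the authors report (larger GTVs, larger clinician-minus-tool differences).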

  10. Activity Recognition in Egocentric video using SVM, kNN and Combined SVMkNN Classifiers

    NASA Astrophysics Data System (ADS)

    Sanal Kumar, K. P.; Bhavani, R., Dr.

    2017-08-01

Egocentric vision is a unique, human-centric perspective in computer vision. The recognition of egocentric actions is a challenging task which helps in assisting elderly people, disabled patients, and so on. In this work, life-logging activity videos are taken as input, with activities organized into two category levels: a top level and a second level. Recognition is performed using features such as Histogram of Oriented Gradients (HOG), Motion Boundary Histogram (MBH) and Trajectory. The features are fused together to act as a single feature vector, which is then reduced using Principal Component Analysis (PCA). The reduced features are provided as input to classifiers: Support Vector Machine (SVM), k-nearest neighbor (kNN), and a combined SVM and kNN classifier (combined SVMkNN). These classifiers are evaluated, and the combined SVMkNN provided better results than other classifiers in the literature.

  11. The First-Principle Calculation of La-doping Effect on Piezoelectricity in Tetragonal KNN Crystal

    NASA Astrophysics Data System (ADS)

    Zhang, Qiaoli; Zhu, Jiliang; Yuan, Daqing; Zhu, Bo; Wang, Mingsong; Zhu, Xiaohong; Fan, Ping; Zuo, Yi; Zheng, Yongnan; Zhu, Shengyun

    2012-05-01

The La-doping effect on the piezoelectricity of the K0.5Na0.5NbO3 (KNN) crystal with a tetragonal phase is investigated for the first time using first-principles calculation based on density functional theory. The full-potential linearized augmented plane wave plus local orbitals (APW-LO) method and the supercell method are used in the calculation for the KNN crystal with and without La doping. The results show that the piezoelectricity originates from the strong hybridization between the Nb atom and the O atom, and that substitution of the K or Na atom by the La impurity atom introduces anisotropic relaxation and enhances the piezoelectricity at first, and then restrains the hybridization of the Nb-O atoms when the La doping content further increases.

  12. KNN/BNT composite lead-free films for high-frequency ultrasonic transducer applications.

    PubMed

    Lau, Sien Ting; Ji, Hong Fen; Li, Xiang; Ren, Wei; Zhou, Qifa; Shung, K Kirk

    2011-01-01

    Lead-free K(0.5)Na(0.5)NbO(3)/Bi(0.5)Na(0.5)TiO(3) (KNN/ BNT) films have been fabricated by a composite sol-gel technique. Crystalline KNN fine powder was dispersed in the BNT precursor solution to form a composite slurry which was then spin-coated onto a platinum-buffered Si substrate. Repeated layering and vacuum infiltration were applied to produce 5-μm-thick dense composite film. By optimizing the sintering temperature, the films exhibited good dielectric and ferroelectric properties comparable to PZT films. A 193-MHz high-frequency ultrasonic transducer fabricated from this composite film showed a -6-dB bandwidth of approximately 34%. A tungsten wire phantom was imaged to demonstrate the capability of the transducer.

  13. A multiple-point spatially weighted k-NN method for object-based classification

    NASA Astrophysics Data System (ADS)

    Tang, Yunwei; Jing, Linhai; Li, Hui; Atkinson, Peter M.

    2016-10-01

    Object-based classification, commonly referred to as object-based image analysis (OBIA), is now commonly regarded as able to produce more appealing classification maps, often of greater accuracy, than pixel-based classification and its application is now widespread. Therefore, improvement of OBIA using spatial techniques is of great interest. In this paper, multiple-point statistics (MPS) is proposed for object-based classification enhancement in the form of a new multiple-point k-nearest neighbour (k-NN) classification method (MPk-NN). The proposed method first utilises a training image derived from a pre-classified map to characterise the spatial correlation between multiple points of land cover classes. The MPS borrows spatial structures from other parts of the training image, and then incorporates this spatial information, in the form of multiple-point probabilities, into the k-NN classifier. Two satellite sensor images with a fine spatial resolution were selected to evaluate the new method. One is an IKONOS image of the Beijing urban area and the other is a WorldView-2 image of the Wolong mountainous area, in China. The images were object-based classified using the MPk-NN method and several alternatives, including the k-NN, the geostatistically weighted k-NN, the Bayesian method, the decision tree classifier (DTC), and the support vector machine classifier (SVM). It was demonstrated that the new spatial weighting based on MPS can achieve greater classification accuracy relative to the alternatives and it is, thus, recommended as appropriate for object-based classification.
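
The weighting idea — scaling each neighbor's vote by an externally estimated class probability before voting — can be sketched as follows. The priors here stand in for the multiple-point probabilities derived from the training image; the function names, classes, and numbers are illustrative, not the paper's:

```python
from collections import defaultdict

def weighted_knn(x, train, k, class_prob):
    """kNN vote where each neighbor's vote is scaled by an external
    per-class probability -- a toy analogue of folding spatial
    (e.g. multiple-point) probabilities into the k-NN classifier."""
    neighbors = sorted(
        train, key=lambda t: sum((a - b) ** 2 for a, b in zip(x, t[0])))[:k]
    score = defaultdict(float)
    for _, label in neighbors:
        score[label] += class_prob.get(label, 1.0)
    return max(score, key=score.get)

train = [((0.0, 0.0), 'water'), ((0.5, 0.5), 'water'), ((0.4, 0.4), 'urban')]
label = weighted_knn((0.3, 0.3), train, k=3,
                     class_prob={'water': 0.2, 'urban': 0.9})
print(label)  # -> 'urban': the spatial term overrides the raw 2-vs-1 vote
```

With uniform probabilities this reduces to plain majority-vote k-NN, which is why such schemes can only improve on the baseline when the spatial model is informative.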

  14. Lead-free piezoelectric KNN-BZ-BNT films with a vertical morphotropic phase boundary

    NASA Astrophysics Data System (ADS)

    Chen, Wen; Zhao, Jinyan; Wang, Lingyan; Ren, Wei; Liu, Ming

    2015-07-01

    The lead-free piezoelectric 0.915K0.5Na0.5NbO3-0.075BaZrO3-0.01Bi0.5Na0.5TiO3 (0.915KNN-0.075BZ-0.01BNT) films were prepared by a chemical solution deposition method. The films possess a pure rhomobohedral perovskite phase and a dense surface without crack. The temperature-dependent dielectric properties of the specimens manifest that only phase transition from ferroelectric to paraelectric phase occurred and the Curie temperature is 217 oC. The temperature stability of ferroelectric phase was also supported by the stable piezoelectric properties of the films. These results suggest that the slope of the morphotropic phase boundary (MPB) for the solid solution formed with the KNN and BZ in the films should be vertical. The voltage-induced polarization switching, and a distinct piezo-response suggested that the 0.915 KNN-0.075BZ-0.01BNT films show good piezoelectric properties.

  15. Make It Metric.

    ERIC Educational Resources Information Center

    Camilli, Thomas

    Measurement is perhaps the most frequently used form of mathematics. This book presents activities for learning about the metric system designed for upper intermediate and junior high levels. Discussions include: why metrics, history of metrics, changing to a metric world, teaching tips, and formulas. Activities presented are: metrics all around…

  16. GPU based cloud system for high-performance arrhythmia detection with parallel k-NN algorithm.

    PubMed

    Tae Joon Jun; Hyun Ji Park; Hyuk Yoo; Young-Hak Kim; Daeyoung Kim

    2016-08-01

    In this paper, we propose an GPU based Cloud system for high-performance arrhythmia detection. Pan-Tompkins algorithm is used for QRS detection and we optimized beat classification algorithm with K-Nearest Neighbor (K-NN). To support high performance beat classification on the system, we parallelized beat classification algorithm with CUDA to execute the algorithm on virtualized GPU devices on the Cloud system. MIT-BIH Arrhythmia database is used for validation of the algorithm. The system achieved about 93.5% of detection rate which is comparable to previous researches while our algorithm shows 2.5 times faster execution time compared to CPU only detection algorithm.

  17. A Comparison of the Spatial Linear Model to Nearest Neighbor (k-NN) Methods for Forestry Applications

    PubMed Central

    Ver Hoef, Jay M.; Temesgen, Hailemariam

    2013-01-01

Forest surveys provide critical information for many diverse interests. Data are often collected from samples, and from these samples, maps of resources and estimates of aerial totals or averages are required. In this paper, two approaches for mapping and estimating totals, the spatial linear model (SLM) and k-NN (k-Nearest Neighbor), are compared theoretically, through simulations, and as applied to real forestry data. While both methods have desirable properties, a review shows that the SLM has prediction optimality properties and can be quite robust. Simulations of artificial populations and resamplings of real forestry data show that the SLM has smaller empirical root-mean-squared prediction errors (RMSPE) for a wide variety of data types, with generally less bias and better interval coverage than k-NN. These patterns held for both point predictions and for population totals or averages, with the SLM reducing RMSPE from 9% to 67% over some popular k-NN methods, and with the SLM also more robust to spatially imbalanced sampling. Estimating prediction standard errors remains a problem for k-NN predictors, despite recent attempts using model-based methods. Our conclusions are that the SLM should generally be used rather than k-NN if the goal is accurate mapping or estimation of population totals or averages. PMID:23527110
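
The comparison criterion, empirical root-mean-squared prediction error (RMSPE), is straightforward to compute over any set of predictions; the values below are toy numbers, not the paper's data:

```python
def rmspe(predicted, observed):
    """Empirical root-mean-squared prediction error."""
    n = len(predicted)
    return (sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n) ** 0.5

# Toy illustration: two predictors scored against the same observations
obs = [10.0, 12.0, 11.0, 13.0]
slm = [10.5, 11.5, 11.2, 12.8]
knn = [11.0, 13.0, 10.0, 14.0]
print(rmspe(slm, obs) < rmspe(knn, obs))  # -> True with these toy numbers
```

In the paper's resampling design the same quantity is averaged over held-out sample units, for both point predictions and population totals.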

  19. NASA metrication activities

    NASA Technical Reports Server (NTRS)

    Vlannes, P. N.

    1978-01-01

NASA's organization and policy for metrication, history from 1964, NASA participation in Federal agency activities, interaction with nongovernmental metrication organizations, and the proposed metrication assessment study are reviewed.

  20. Quantum Algorithm for K-Nearest Neighbors Classification Based on the Metric of Hamming Distance

    NASA Astrophysics Data System (ADS)

    Ruan, Yue; Xue, Xiling; Liu, Heng; Tan, Jianing; Li, Xi

    2017-08-01

The k-nearest neighbors (KNN) algorithm is a common algorithm used for classification, and also a sub-routine in various complicated machine learning tasks. In this paper, we present a quantum algorithm (QKNN) for implementing this algorithm based on the metric of Hamming distance. We put forward a quantum circuit for computing the Hamming distance between a testing sample and each feature vector in the training set. Taking advantage of this method, we realize a good analog of the classical KNN algorithm by setting a distance threshold value t to select the k nearest neighbors. As a result, QKNN achieves O(n^3) performance, which depends only on the dimension of the feature vectors, and high classification accuracy, outperforming Lloyd's algorithm (Lloyd et al. 2013) and Wiebe's algorithm (Wiebe et al. 2014).
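
Classically, the distance-threshold selection step amounts to the following: keep every training vector within Hamming distance t of the query and take a majority vote over the survivors. This is a sketch of the classical analogue only (the quantum circuit computes the same distances in superposition); the bit strings and labels are illustrative:

```python
def hamming(a, b):
    """Hamming distance between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

def knn_hamming(query, train, t):
    """Keep training vectors within Hamming distance t of the query,
    then majority-vote over their labels (None if nothing qualifies)."""
    votes = [label for vec, label in train if hamming(query, vec) <= t]
    return max(set(votes), key=votes.count) if votes else None

train = [('0001', 'a'), ('0011', 'a'), ('1110', 'b'), ('1111', 'b')]
print(knn_hamming('0000', train, t=2))  # -> 'a'
```

The threshold t plays the role of an implicit k: tightening it shrinks the neighbor set, which is exactly the knob the quantum selection step turns.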

  1. Review of metric learning with transfer learning

    NASA Astrophysics Data System (ADS)

    Pan, Jiajun

    2017-08-01

Metric learning and transfer learning are both active branches of machine learning that focus on different domains. Over the last ten years, work combining them has increased markedly. This review discusses current algorithms that combine transfer learning and metric learning. We introduce generalized transfer learning and its branches with different learning settings, which are mostly linked to multi-task learning and domain adaptation. For every branch related to the metric learning task, we compare the transfer learning algorithms with related ones in metric learning.

  2. An efficient seizure prediction method using KNN-based undersampling and linear frequency measures.

    PubMed

    Ghaderyan, Peyvand; Abbasi, Ataollah; Sedaaghi, Mohammad Hossein

    2014-07-30

Seizure prediction based on analysis of electroencephalogram signals has generated considerable research interest. A reliable seizure prediction algorithm with minimal computational requirements is a prominent issue for medical facilities; however, it has not been adequately addressed. In this study, a novel optimized method is proposed to reduce computational complexity and predict epileptic seizures clinically. It is based on univariate linear features in eight frequency sub-bands. It also employs principal component analysis (PCA) for dimension reduction and optimal feature selection. The class imbalance problem is tackled by K-nearest neighbor (KNN)-based undersampling combined with a support vector machine (SVM) classifier. To find the best results, two types of postprocessing methods were studied. The proposed algorithm was evaluated on seizures and 434.9 h of interictal data from 18 patients of the Freiburg database. It predicted 100% of seizures with an average false alarm rate of 0.13 per hour, ranging between 0 and 0.39. Furthermore, G-Mean and F-measure were used for validation, which were 0.97 and 0.90, respectively. These results confirmed the discriminative ability of the algorithm. In comparison with other studies, the proposed method improves the trade-off between sensitivity and false prediction rate with linear features and low computational requirements, and it can potentially be employed in implantable devices. Achieving high performance with linear features, PCA, KNN-based undersampling, and SVM demonstrates that this method can potentially be used in implantable devices.
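
The abstract does not spell out the undersampling rule. One plausible reading — retain only the majority-class (interictal) samples that appear among the k nearest majority neighbors of some minority-class (preictal) sample — can be sketched as below; both the rule and all names are our assumption, not the paper's exact procedure:

```python
def knn_undersample(majority, minority, k):
    """Keep only majority samples that are among the k nearest
    majority neighbors of at least one minority sample (a plausible
    reading of kNN-based undersampling; the paper's rule may differ)."""
    keep = set()
    for m in minority:
        nearest = sorted(
            range(len(majority)),
            key=lambda i: sum((a - b) ** 2 for a, b in zip(m, majority[i])))[:k]
        keep.update(nearest)
    return [majority[i] for i in sorted(keep)]

majority = [(0.0,), (1.0,), (5.0,), (9.0,)]
minority = [(0.2,)]
print(knn_undersample(majority, minority, k=2))  # -> [(0.0,), (1.0,)]
```

However the rule is defined, the goal is the same: shrink the majority class toward its decision-boundary region so the downstream SVM is not swamped by easy interictal samples.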

  3. Automatic Classification of Protein Structure Using the Maximum Contact Map Overlap Metric

    DOE PAGES

    Andonov, Rumen; Djidjev, Hristo Nikolov; Klau, Gunnar W.; ...

    2015-10-09

In this paper, we propose a new distance measure for comparing two protein structures based on their contact map representations. We show that our novel measure, which we refer to as the maximum contact map overlap (max-CMO) metric, satisfies all properties of a metric on the space of protein representations. Having a metric in that space allows one to avoid pairwise comparisons on the entire database and, thus, to significantly accelerate exploring the protein space compared to no-metric spaces. We show on a gold standard superfamily classification benchmark set of 6759 proteins that our exact k-nearest neighbor (k-NN) scheme classifies up to 224 out of 236 queries correctly and, on a larger, extended version of the benchmark with 60,850 additional structures, up to 1361 out of 1369 queries. Our k-NN classification thus provides a promising approach for the automatic classification of protein structures based on flexible contact map overlap alignments.

  4. Defining Sustainability Metric Targets in an Institutional Setting

    ERIC Educational Resources Information Center

    Rauch, Jason N.; Newman, Julie

    2009-01-01

    Purpose: The purpose of this paper is to expand on the development of university and college sustainability metrics by implementing an adaptable metric target strategy. Design/methodology/approach: A combined qualitative and quantitative methodology is derived that both defines what a sustainable metric target might be and describes the path a…

  6. Evaluation of normalization methods for cDNA microarray data by k-NN classification

    SciTech Connect

    Wu, Wei; Xing, Eric P; Myers, Connie; Mian, Saira; Bissell, Mina J

    2004-12-17

    Non-biological factors give rise to unwanted variations in cDNA microarray data. There are many normalization methods designed to remove such variations. However, to date there have been few published systematic evaluations of these techniques for removing variations arising from dye biases in the context of downstream, higher-order analytical tasks such as classification. Ten location normalization methods that adjust spatial- and/or intensity-dependent dye biases, and three scale methods that adjust scale differences, were applied, individually and in combination, to five distinct, published, cancer biology-related cDNA microarray data sets. Leave-one-out cross-validation (LOOCV) classification error was employed as the quantitative end-point for assessing the effectiveness of a normalization method. In particular, a k-nearest neighbor (k-NN) classifier was estimated from data normalized using a given technique, and the LOOCV error rate of the ensuing model was computed. We found that k-NN classifiers are sensitive to dye biases in the data. Using NONRM and GMEDIAN as baseline methods, our results show that single-bias-removal techniques, which remove either spatial-dependent dye bias (referred to below as the spatial effect) or intensity-dependent dye bias (the intensity effect), moderately reduce LOOCV classification errors, whereas double-bias-removal techniques, which remove both the spatial and intensity effects, reduce LOOCV classification errors even further. Of the 41 different strategies examined, three two-step processes, IGLOESS-SLFILTERW7, ISTSPLINE-SLLOESS and IGLOESS-SLLOESS, all of which removed the intensity effect globally and the spatial effect locally, reduced LOOCV classification errors most consistently and effectively across all data sets. We also found that the investigated scale normalization methods do not reduce LOOCV classification error. Using the LOOCV error of k-NNs as the evaluation criterion, these three double-bias-removal strategies performed best.
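The LOOCV end-point used above can be sketched in a few lines (illustrative names; Euclidean distance assumed):

```python
import numpy as np

def loocv_knn_error(X, y, k=3):
    """LOOCV error rate of a k-NN classifier -- the quantitative end-point
    used to score each normalization method (sketch)."""
    errors = 0
    for i in range(len(y)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                      # leave sample i out
        nn = np.argsort(d)[:k]             # its k nearest remaining samples
        errors += int(np.bincount(y[nn]).argmax() != y[i])
    return errors / len(y)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (20, 2)), rng.normal(5, 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(loocv_knn_error(X, y))             # → 0.0 for well-separated classes
```

A normalization method that removes dye bias would, per the abstract, lower this error on the real microarray data.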

  7. NASA metric transition plan

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA science publications have used the metric system of measurement since 1970. Although NASA has maintained a metric use policy since 1979, practical constraints have restricted actual use of metric units. In 1988, an amendment to the Metric Conversion Act of 1975 required the Federal Government to adopt the metric system except where impractical. In response to Public Law 100-418 and Executive Order 12770, NASA revised its metric use policy and developed this Metric Transition Plan. NASA's goal is to use the metric system for program development and functional support activities to the greatest practical extent by the end of 1995. The introduction of the metric system into new flight programs will determine the pace of the metric transition. Transition of institutional capabilities and support functions will be phased to enable use of the metric system in flight program development and operations. Externally oriented elements of this plan will introduce and actively support use of the metric system in education, public information, and small business programs. The plan also establishes a procedure for evaluating and approving waivers and exceptions to the required use of the metric system for new programs. Coordination with other Federal agencies and departments (through the Interagency Council on Metric Policy) and industry (directly and through professional societies and interest groups) will identify sources of external support and minimize duplication of effort.

  8. NASA metric transition plan

    NASA Astrophysics Data System (ADS)

    NASA science publications have used the metric system of measurement since 1970. Although NASA has maintained a metric use policy since 1979, practical constraints have restricted actual use of metric units. In 1988, an amendment to the Metric Conversion Act of 1975 required the Federal Government to adopt the metric system except where impractical. In response to Public Law 100-418 and Executive Order 12770, NASA revised its metric use policy and developed this Metric Transition Plan. NASA's goal is to use the metric system for program development and functional support activities to the greatest practical extent by the end of 1995. The introduction of the metric system into new flight programs will determine the pace of the metric transition. Transition of institutional capabilities and support functions will be phased to enable use of the metric system in flight program development and operations. Externally oriented elements of this plan will introduce and actively support use of the metric system in education, public information, and small business programs. The plan also establishes a procedure for evaluating and approving waivers and exceptions to the required use of the metric system for new programs. Coordination with other Federal agencies and departments (through the Interagency Council on Metric Policy) and industry (directly and through professional societies and interest groups) will identify sources of external support and minimize duplication of effort.

  9. Adaptive local linear regression with application to printer color management.

    PubMed

    Gupta, Maya R; Garcia, Eric K; Chin, Erika

    2008-06-01

    Local learning methods, such as local linear regression and nearest neighbor classifiers, base estimates on nearby training samples, or neighbors. Usually, the number of neighbors used in estimation is fixed to be a global "optimal" value, chosen by cross-validation. This paper proposes adapting the number of neighbors used for estimation to the local geometry of the data, without the need for cross-validation. The term enclosing neighborhood is introduced to describe a set of neighbors whose convex hull contains the test point when possible. It is proven that enclosing neighborhoods yield bounded estimation variance under some assumptions. Three such enclosing neighborhood definitions are presented: natural neighbors, natural neighbors inclusive, and enclosing k-NN. The effectiveness of these neighborhood definitions with local linear regression is tested for estimating lookup tables for color management. Significant improvements in error metrics are shown, indicating that enclosing neighborhoods may be a promising adaptive neighborhood definition for other local learning tasks as well, depending on the density of training samples.
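A sketch of the enclosing k-NN idea, assuming SciPy is available: the smallest k is sought such that the test point lies in the convex hull of its k nearest neighbors, checked as a linear-programming feasibility problem (all names are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def in_hull(points, x):
    """Is x a convex combination of the rows of `points`? (LP feasibility)"""
    n = len(points)
    A_eq = np.vstack([points.T, np.ones(n)])   # sum_i w_i p_i = x, sum_i w_i = 1
    b_eq = np.append(x, 1.0)
    return linprog(np.zeros(n), A_eq=A_eq, b_eq=b_eq,
                   bounds=[(0, None)] * n).success

def enclosing_knn(X, x):
    """Smallest k such that the k nearest neighbors of x enclose it in their
    convex hull -- a sketch of the 'enclosing k-NN' neighborhood; returns
    len(X) when no neighborhood encloses x."""
    order = np.argsort(np.linalg.norm(X - x, axis=1))
    for k in range(X.shape[1] + 1, len(X) + 1):   # need at least d+1 points
        if in_hull(X[order[:k]], x):
            return k
    return len(X)

X = np.array([[1.0, 0], [0, 1], [-1, 0], [0, -1], [2, 2]])
print(enclosing_knn(X, np.array([0.2, 0.1])))    # → 3
```

The adaptively chosen neighborhood would then feed a local linear regression at x.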

  10. Adaptation.

    PubMed

    Broom, Donald M

    2006-01-01

    The term adaptation is used in biology in three different ways. It may refer to changes which occur at the cell and organ level, or at the individual level, or at the level of gene action and evolutionary processes. Adaptation by cells, especially nerve cells, helps in: communication within the body, the distinguishing of stimuli, the avoidance of overload and the conservation of energy. The time course and complexity of these mechanisms vary. Adaptive characters of organisms, including adaptive behaviours, increase fitness, so this adaptation is evolutionary. The major part of this paper concerns adaptation by individuals and its relationship to welfare. In complex animals, feed-forward control is widely used. Individuals predict problems and adapt by acting before the environmental effect is substantial. Much of adaptation involves brain control, and animals have a set of needs, located in the brain and acting largely via motivational mechanisms, to regulate life. Needs may be for resources but are also for actions and stimuli which are part of the mechanism which has evolved to obtain the resources. Hence pigs do not just need food but need to be able to carry out actions like rooting in earth or manipulating materials which are part of foraging behaviour. The welfare of an individual is its state as regards its attempts to cope with its environment. This state includes various adaptive mechanisms, including feelings and those which cope with disease. The part of welfare which is concerned with coping with pathology is health. Disease, which implies some significant effect of pathology, always results in poor welfare. Welfare varies over a range from very good, when adaptation is effective and there are feelings of pleasure or contentment, to very poor. A key point concerning the concept of individual adaptation in relation to welfare is that welfare may be good or poor while adaptation is occurring. Some adaptation is very easy and energetically cheap and

  11. A Comparison of the Spatial Linear Model to Nearest Neighbor (k-NN) Methods for Forestry Applications

    Treesearch

    Jay M. Ver Hoef; Hailemariam Temesgen; Sergio Gómez

    2013-01-01

    Forest surveys provide critical information for many diverse interests. Data are often collected from samples, and from these samples, maps of resources and estimates of areal totals or averages are required. In this paper, two approaches for mapping and estimating totals, the spatial linear model (SLM) and k-NN (k-Nearest Neighbor), are compared, theoretically,...

  12. An improved k-NN method based on multiple-point statistics for classification of high-spatial resolution imagery

    NASA Astrophysics Data System (ADS)

    Tang, Y.; Jing, L.; Li, H.; Liu, Q.; Ding, H.

    2016-04-01

    In this paper, the potential of multiple-point statistics (MPS) for object-based classification is explored using a modified k-nearest neighbour (k-NN) classification method (MPk-NN). The method first utilises a training image derived from a classified map to characterise the spatial correlation between multiple points of land cover classes, overcoming the limitations of two-point geostatistical methods; the spatial information, in the form of multiple-point probability, is then incorporated into the k-NN classifier. The remotely sensed image of an IKONOS subscene of the Beijing urban area was selected to evaluate the method. The image was classified on an object basis using the MPk-NN method and several alternatives, including the traditional k-NN, the geostatistically weighted k-NN, the Bayesian method, the decision tree classifier (DTC), and the support vector machine classifier (SVM). It was demonstrated that the MPk-NN approach can achieve greater classification accuracy than the alternatives, reaching 82.05% and 89.12% based on pixel and object testing data, respectively. Thus, the proposed method is appropriate for object-based classification.

  13. Principal component analysis (PCA)-based k-nearest neighbor (k-NN) analysis of colonic mucosal tissue fluorescence spectra.

    PubMed

    Kamath, Sudha D; Mahato, Krishna K

    2009-08-01

    The objective of this study was to verify the suitability of principal component analysis (PCA)-based k-nearest neighbor (k-NN) analysis for discriminating normal and malignant autofluorescence spectra of colonic mucosal tissues. Autofluorescence spectroscopy, a noninvasive technique, has high specificity and sensitivity for discrimination of diseased and nondiseased colonic tissues. Previously, we assessed the efficacy of the technique on colonic data using PCA Match/No match and Artificial Neural Networks (ANNs) analyses. To improve the classification reliability, the present work was conducted using PCA-based k-NN analysis and was compared with previously obtained results. A total of 115 fluorescence spectra (69 normal and 46 malignant) were recorded from 13 normal and 10 malignant colonic tissues with 325 nm pulsed laser excitation in the spectral region 350-600 nm in vitro. We applied PCA to extract the relevant information from the spectra and used a nonparametric k-NN analysis for classification. The normal and malignant spectra showed large variations in shape and intensity. Statistically significant differences were found between normal and malignant classes. The performance of the analysis was evaluated by calculating the statistical parameters specificity and sensitivity, which were found to be 100% and 91.3%, respectively. The results obtained in this study showed good discrimination between normal and malignant conditions using PCA-based k-NN analysis.
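The PCA-based k-NN pipeline, with sensitivity and specificity computed on synthetic "spectra", can be sketched as follows (illustrative names; resubstitution is used here purely for brevity and is not the paper's validation protocol):

```python
import numpy as np

def pca_knn(X_train, y_train, X_test, n_comp=3, k=3):
    """PCA scores followed by k-NN voting -- a sketch of the PCA-based
    k-NN discrimination pipeline."""
    mu = X_train.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
    P = Vt[:n_comp].T                       # leading principal directions
    Zt, Zq = (X_train - mu) @ P, (X_test - mu) @ P
    preds = []
    for z in Zq:
        nn = np.argsort(np.linalg.norm(Zt - z, axis=1))[:k]
        preds.append(np.bincount(y_train[nn]).argmax())
    return np.array(preds)

# synthetic "spectra": class 1 (malignant) shifted relative to class 0 (normal)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 50)), rng.normal(1.5, 1, (20, 50))])
y = np.array([0] * 30 + [1] * 20)
pred = pca_knn(X, y, X)                   # resubstitution, for illustration only
sens = np.mean(pred[y == 1] == 1)         # sensitivity: true-positive rate
spec = np.mean(pred[y == 0] == 0)         # specificity: true-negative rate
print(sens, spec)
```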

  14. (100)-Textured KNN-based thick film with enhanced piezoelectric property for intravascular ultrasound imaging

    PubMed Central

    Zhu, Benpeng; Zhang, Zhiqiang; Ma, Teng; Yang, Xiaofei; Li, Yongxiang; Shung, K. Kirk; Zhou, Qifa

    2015-01-01

    Using tape-casting technology, a 35 μm free-standing (100)-textured Li-doped KNN (KNLN) thick film was prepared by employing NaNbO3 (NN) as a template. It exhibited piezoelectric behavior similar to that of lead-containing materials: a longitudinal piezoelectric coefficient (d33) of ∼150 pm/V and an electromechanical coupling coefficient (kt) of 0.44. Based on this thick film, a 52 MHz side-looking miniature transducer with a bandwidth of 61.5% at −6 dB was built for intravascular ultrasound (IVUS) imaging. In comparison with a 40 MHz PMN-PT single crystal transducer, the rabbit aorta image had better resolution and a higher signal-to-noise ratio, indicating that lead-free (100)-textured KNLN thick film may be suitable for IVUS (>50 MHz) imaging. PMID:25991874

  15. (100)-Textured KNN-based thick film with enhanced piezoelectric property for intravascular ultrasound imaging.

    PubMed

    Zhu, Benpeng; Zhang, Zhiqiang; Ma, Teng; Yang, Xiaofei; Li, Yongxiang; Shung, K Kirk; Zhou, Qifa

    2015-04-27

    Using tape-casting technology, a 35 μm free-standing (100)-textured Li-doped KNN (KNLN) thick film was prepared by employing NaNbO3 (NN) as a template. It exhibited piezoelectric behavior similar to that of lead-containing materials: a longitudinal piezoelectric coefficient (d33) of ∼150 pm/V and an electromechanical coupling coefficient (kt) of 0.44. Based on this thick film, a 52 MHz side-looking miniature transducer with a bandwidth of 61.5% at -6 dB was built for intravascular ultrasound (IVUS) imaging. In comparison with a 40 MHz PMN-PT single crystal transducer, the rabbit aorta image had better resolution and a higher signal-to-noise ratio, indicating that lead-free (100)-textured KNLN thick film may be suitable for IVUS (>50 MHz) imaging.

  16. (100)-Textured KNN-based thick film with enhanced piezoelectric property for intravascular ultrasound imaging

    NASA Astrophysics Data System (ADS)

    Zhu, Benpeng; Zhang, Zhiqiang; Ma, Teng; Yang, Xiaofei; Li, Yongxiang; Shung, K. Kirk; Zhou, Qifa

    2015-04-01

    Using tape-casting technology, a 35 μm free-standing (100)-textured Li-doped KNN (KNLN) thick film was prepared by employing NaNbO3 (NN) as a template. It exhibited piezoelectric behavior similar to that of lead-containing materials: a longitudinal piezoelectric coefficient (d33) of ∼150 pm/V and an electromechanical coupling coefficient (kt) of 0.44. Based on this thick film, a 52 MHz side-looking miniature transducer with a bandwidth of 61.5% at -6 dB was built for intravascular ultrasound (IVUS) imaging. In comparison with a 40 MHz PMN-PT single crystal transducer, the rabbit aorta image had better resolution and a higher signal-to-noise ratio, indicating that lead-free (100)-textured KNLN thick film may be suitable for IVUS (>50 MHz) imaging.

  17. Dielectric and piezoelectric properties of the KNN ceramic compound doped with Li, La and Ta

    NASA Astrophysics Data System (ADS)

    Fuentes, J.; Portelles, J.; Durruthy-Rodríguez, M. D.; H'Mok, H.; Raymond, O.; Heiras, J.; Cruz, M. P.; Siqueiros, J. M.

    2015-02-01

    With the purpose of improving the dielectric and piezoelectric properties of (K0.5Na0.5)NbO3 (KNN), a multiple doping strategy was tested in this research. Piezoceramics with composition [(K0.5Na0.5)0.94Li0.06]0.97La0.01(Nb0.9Ta0.1)O3 were prepared by the traditional ceramic method. The calcined powders were sintered in their own atmosphere at 1,100 °C for 1.0, 1.5 and 2.5 h. X-ray diffraction analysis showed that the Li+, La3+ and Ta5+ cations diffuse into the KNN structure to form a perovskite-structured solid solution. For 1 h sintering time, a dominant orthorhombic phase is obtained, whereas for the longer times, the dominant phase was tetragonal. The presence of a tetragonal tungsten-bronze minority second phase is confirmed. Scanning electron micrographs show rectangular-shaped grains with a mean size of 1.1 ± 0.2 μm. The existence of pores and traces of a liquid phase favoring grain growth and homogeneity is also observed. Experimental results show an enhancement of the permittivity associated with the enlargement of the c parameter of the cell that increases with sintering time. Li+ incorporation into the structure is made evident by its transition temperature at 400 °C, different from those of KNNLaTi (81-110 °C) and KNNLaTa (340 °C). An analysis of the phase transition of the samples indicates a normal rather than a diffuse transition. The electromechanical parameters kp, Qm, σp, s11, d31 and g31 are determined and compared to those of commercial PZT ceramics.

  18. Dielectric and piezoelectric properties of the KNN ceramic compound doped with Li, La and Ta

    NASA Astrophysics Data System (ADS)

    Fuentes, J.; Portelles, J.; Durruthy-Rodríguez, M. D.; H'Mok, H.; Raymond, O.; Heiras, J.; Cruz, M. P.; Siqueiros, J. M.

    2014-09-01

    With the purpose of improving the dielectric and piezoelectric properties of (K0.5Na0.5)NbO3 (KNN), a multiple doping strategy was tested in this research. Piezoceramics with composition [(K0.5Na0.5)0.94Li0.06]0.97La0.01(Nb0.9Ta0.1)O3 were prepared by the traditional ceramic method. The calcined powders were sintered in their own atmosphere at 1,100 °C for 1.0, 1.5 and 2.5 h. X-ray diffraction analysis showed that the Li+, La3+ and Ta5+ cations diffuse into the KNN structure to form a perovskite-structured solid solution. For 1 h sintering time, a dominant orthorhombic phase is obtained, whereas for the longer times, the dominant phase was tetragonal. The presence of a tetragonal tungsten-bronze minority second phase is confirmed. Scanning electron micrographs show rectangular-shaped grains with a mean size of 1.1 ± 0.2 μm. The existence of pores and traces of a liquid phase favoring grain growth and homogeneity is also observed. Experimental results show an enhancement of the permittivity associated with the enlargement of the c parameter of the cell that increases with sintering time. Li+ incorporation into the structure is made evident by its transition temperature at 400 °C, different from those of KNNLaTi (81-110 °C) and KNNLaTa (340 °C). An analysis of the phase transition of the samples indicates a normal rather than a diffuse transition. The electromechanical parameters kp, Qm, σp, s11, d31 and g31 are determined and compared to those of commercial PZT ceramics.

  19. Moving to Metric.

    ERIC Educational Resources Information Center

    North Carolina State Dept. of Public Instruction, Raleigh.

    This booklet, designed to help the consumer prepare for the change to the metric system, discusses the following related topics: simplicity and universality of the metric system, weather, shopping, textiles, cooking, and driving. (MP)

  20. Metrication for the Manager.

    ERIC Educational Resources Information Center

    Benedict, John T.

    The scope of this book covers metrication management. It was created to fill the middle management need for condensed, authoritative information about the metrication process and was conceived as a working tool and a prime reference source. Written from a management point of view, it touches on virtually all aspects of metrication and highlights…

  1. Effect of bimaxillary surgery on adaptive condylar head remodeling: metric analysis and image interpretation using cone-beam computed tomography volume superimposition.

    PubMed

    Park, Soo-Byung; Yang, Yu-Mi; Kim, Yong-Il; Cho, Bong-Hae; Jung, Yun-Hoa; Hwang, Dae-Seok

    2012-08-01

    The aim of the present study was to use cone-beam computed tomography volume superimposition to investigate the effect of bimaxillary orthognathic surgery on condylar head remodeling. Using a retrospective study design, 2 investigators evaluated the cone-beam computed tomography data of subjects who had undergone Le Fort I osteotomy and mandibular setback surgery. The predictor variable was time, grouped as preoperative versus postoperative. The outcome variables were the measurement changes of the condylar heads and the distribution of the condylar head remodeling signs. Paired t and χ(2) tests were performed for the purposes of the 2-dimensional metric analysis and the condylar head remodeling distribution. P < .05 was considered significant. The sample was composed of 22 adults (11 men and 11 women, age 20.3 ± 3.2 years) diagnosed with skeletal Class III malocclusion. The intra- and interoperator reliabilities of the image interpretation showed substantial agreement, according to Cohen's kappa index. The condylar heights on the sagittal and coronal planes decreased after surgery. Bone resorption occurred predominantly in the anterior and superior areas on the sagittal plane, the superior and lateral areas on the coronal plane, and the anterolateral and posterolateral areas on the axial plane (P < .05). Bone formation was apparent only in the anteromedial area on the axial plane (P < .05). Bimaxillary orthognathic surgery caused a decrease in the condylar heights and condylar head remodeling. The cone-beam computed tomography volume superimposition method showed that the condylar head had undergone remodeling after bimaxillary surgery. Copyright © 2012 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  2. Metric learning for DNA microarray data analysis

    NASA Astrophysics Data System (ADS)

    Takeuchi, Ichiro; Nakagawa, Masao; Seto, Masao

    2009-12-01

    In many microarray studies, gene set selection is an important preliminary step for subsequent main tasks such as tumor classification, cancer subtype identification, etc. In this paper, we investigate the possibility of using metric learning as an alternative to gene set selection. We develop a simple metric learning algorithm with the aim of using it for microarray data analysis. Exploiting a property of the algorithm, we introduce a novel approach for extending the metric learning to be adaptive. We apply the algorithm to previously studied microarray data on malignant lymphoma subtype identification.
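The record does not give the algorithm itself; as a hedged illustration of a simple learned diagonal metric for microarray data, per-gene Fisher ratios can weight a Euclidean distance (all names are hypothetical, and this is a stand-in, not the paper's method):

```python
import numpy as np

def fisher_ratio_metric(X, y, eps=1e-12):
    """Diagonal distance metric from per-gene Fisher ratios (between-class
    over within-class variance). Genes that separate the classes get large
    weights, playing a role similar to gene set selection."""
    classes = np.unique(y)
    mu = X.mean(axis=0)
    between = sum((y == c).sum() * (X[y == c].mean(axis=0) - mu) ** 2
                  for c in classes)
    within = sum(((X[y == c] - X[y == c].mean(axis=0)) ** 2).sum(axis=0)
                 for c in classes)
    return between / (within + eps)       # per-gene weights w for sqrt(sum w*(a-b)^2)

rng = np.random.default_rng(0)
y = np.array([0] * 25 + [1] * 25)
X = rng.normal(size=(50, 3))
X[y == 1, 0] += 3.0                       # only gene 0 is informative
w = fisher_ratio_metric(X, y)
print(w.argmax())                         # → 0
```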

  3. Adapt

    NASA Astrophysics Data System (ADS)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data File (CDF) format served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were queried on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted files.

  4. Algebraic mesh quality metrics

    SciTech Connect

    KNUPP,PATRICK

    2000-04-24

    Quality metrics for structured and unstructured mesh generation are placed within an algebraic framework to form a mathematical theory of mesh quality metrics. The theory, based on the Jacobian and related matrices, provides a means of constructing, classifying, and evaluating mesh quality metrics. The Jacobian matrix is factored into geometrically meaningful parts. A nodally-invariant Jacobian matrix can be defined for simplicial elements using a weight matrix derived from the Jacobian matrix of an ideal reference element. Scale- and orientation-invariant algebraic mesh quality metrics are defined. The singular value decomposition is used to study relationships between metrics. Equivalence of the element condition number and mean ratio metrics is proved. Condition number is shown to measure the distance of an element to the set of degenerate elements. Algebraic measures for skew, length ratio, shape, volume, and orientation are defined abstractly, with specific examples given. Combined metrics for shape-volume and shape-volume-orientation are algebraically defined and examples of such metrics are given. Algebraic mesh quality metrics are extended to non-simplicial elements. A series of numerical tests verifies the theoretical properties of the metrics defined.
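The element condition-number metric mentioned above can be sketched for a triangle, using a weight matrix built from the ideal equilateral reference element (a sketch after the framework described; names and normalization are illustrative):

```python
import numpy as np

def condition_number_quality(p0, p1, p2):
    """Condition-number mesh quality metric for a triangle:
    kappa(S) = |S|_F |S^-1|_F of the weighted Jacobian S = A W^-1, where A
    maps the reference triangle to the element and W maps it to the ideal
    equilateral element. Normalized so the ideal element scores 1."""
    A = np.column_stack([p1 - p0, p2 - p0])            # element Jacobian
    W = np.array([[1.0, 0.5], [0.0, np.sqrt(3) / 2]])  # ideal-element Jacobian
    S = A @ np.linalg.inv(W)
    kappa = np.linalg.norm(S) * np.linalg.norm(np.linalg.inv(S))
    return kappa / 2.0                                 # 1 for equilateral, larger otherwise

p = [np.array([0.0, 0]), np.array([1.0, 0]), np.array([0.5, np.sqrt(3) / 2])]
print(condition_number_quality(*p))                    # ≈ 1.0 (ideal element)
print(condition_number_quality(np.array([0.0, 0]), np.array([1.0, 0]),
                               np.array([0.5, 0.01])))  # sliver: much larger
```

As the abstract states, this value grows as the element approaches the set of degenerate (zero-area) elements.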

  5. About Using the Metric System.

    ERIC Educational Resources Information Center

    Illinois State Office of Education, Springfield.

    This booklet contains a brief introduction to the use of the metric system. Topics covered include: (1) what is the metric system; (2) how to think metric; (3) some advantages of the metric system; (4) basics of the metric system; (5) how to measure length, area, volume, mass and temperature the metric way; (6) some simple calculations using…

  6. Optical and Piezoelectric Study of KNN Solid Solutions Co-Doped with La-Mn and Eu-Fe

    PubMed Central

    Peña-Jiménez, Jesús-Alejandro; González, Federico; López-Juárez, Rigoberto; Hernández-Alcántara, José-Manuel; Camarillo, Enrique; Murrieta-Sánchez, Héctor; Pardo, Lorena; Villafuerte-Castrejón, María-Elena

    2016-01-01

    The solid-state method was used to synthesize single-phase potassium-sodium niobate (KNN) co-doped with the La3+–Mn4+ and Eu3+–Fe3+ ion pairs. Structural determination of all studied solid solutions was accomplished by XRD and the Rietveld refinement method. Electron paramagnetic resonance (EPR) studies were performed to determine the oxidation state of the paramagnetic centers. Optical spectroscopy measurements, excitation, emission and decay lifetime, were carried out for each solid solution. The present study reveals that doping KNN with La3+–Mn4+ and Eu3+–Fe3+ at concentrations of 0.5 mol % and 1 mol %, respectively, improves the ferroelectric and piezoelectric behavior and induces optical properties in the material for potential applications. PMID:28773925

  7. Optical and Piezoelectric Study of KNN Solid Solutions Co-Doped with La-Mn and Eu-Fe.

    PubMed

    Peña-Jiménez, Jesús-Alejandro; González, Federico; López-Juárez, Rigoberto; Hernández-Alcántara, José-Manuel; Camarillo, Enrique; Murrieta-Sánchez, Héctor; Pardo, Lorena; Villafuerte-Castrejón, María-Elena

    2016-09-28

    The solid-state method was used to synthesize single-phase potassium-sodium niobate (KNN) co-doped with the La(3+)-Mn(4+) and Eu(3+)-Fe(3+) ion pairs. Structural determination of all studied solid solutions was accomplished by XRD and the Rietveld refinement method. Electron paramagnetic resonance (EPR) studies were performed to determine the oxidation state of the paramagnetic centers. Optical spectroscopy measurements, excitation, emission and decay lifetime, were carried out for each solid solution. The present study reveals that doping KNN with La(3+)-Mn(4+) and Eu(3+)-Fe(3+) at concentrations of 0.5 mol % and 1 mol %, respectively, improves the ferroelectric and piezoelectric behavior and induces optical properties in the material for potential applications.

  8. Efficiency analysis of KNN and minimum distance-based classifiers in enzyme family prediction.

    PubMed

    Nasibov, Efendi; Kandemir-Cavas, Cagin

    2009-12-01

    Nearly all enzymes are proteins. They are the biological catalysts that accelerate cellular reactions. Because of the different characteristics of their reaction tasks, they split into six classes: oxidoreductases (EC-1), transferases (EC-2), hydrolases (EC-3), lyases (EC-4), isomerases (EC-5) and ligases (EC-6). Predicting which of these classes an enzyme belongs to is therefore of great importance. Since the number of enzyme sequences increases day by day, data mining techniques become a very useful and time-saving alternative to experimental analysis for predicting the class of a newly found enzyme sequence. In this paper, two kinds of simple minimum distance-based classifier methods are proposed. These methods and the known K-nearest neighbor (KNN) classification algorithm have been applied to classify enzymes according to their amino acid composition. Performance measurements and the elapsed time to execute the algorithms have been compared. In addition, the equality of the two proposed approaches under a special condition has been proved, as a guide for researchers.
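A minimum distance-based classifier of the general kind proposed here can be sketched as a nearest-centroid rule on composition vectors (an illustrative stand-in; the paper's two exact variants are not specified in this record, and all names are hypothetical):

```python
import numpy as np

def min_distance_predict(X, y, x):
    """Minimum-distance classifier: assign x to the class whose mean
    feature vector (e.g. mean amino acid composition) is nearest."""
    classes = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    return classes[np.argmin(np.linalg.norm(centroids - x, axis=1))]

# toy 20-dimensional feature vectors for two enzyme classes
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (30, 20)),   # class 1 (e.g. EC-1), toy data
               rng.normal(1.0, 0.1, (30, 20))])  # class 2 (e.g. EC-2), toy data
y = np.array([1] * 30 + [2] * 30)
print(min_distance_predict(X, y, np.full(20, 1.0)))  # → 2
```

Unlike KNN, which compares a query against every training sample, this rule needs only one distance per class, which is why such classifiers can be much faster in elapsed time.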

  9. Feature Selection and Predictors of Falls with Foot Force Sensors Using KNN-Based Algorithms

    PubMed Central

    Liang, Shengyun; Ning, Yunkun; Li, Huiqi; Wang, Lei; Mei, Zhanyong; Ma, Yingnan; Zhao, Guoru

    2015-01-01

    The aging process may lead to the degradation of lower extremity function in the elderly population, which can restrict their daily quality of life and gradually increase the fall risk. We aimed to determine whether objective measures of physical function could predict subsequent falls. Ground reaction force (GRF) data, quantified by sample entropy, were collected by foot force sensors. Thirty-eight subjects (23 fallers and 15 non-fallers) participated in functional movement tests, including walking and sit-to-stand (STS). A feature selection algorithm was used to select relevant features to classify the elderly into two groups, at risk and not at risk of falling down, using three KNN-based classifiers: local mean-based k-nearest neighbor (LMKNN), pseudo nearest neighbor (PNN), and local mean pseudo nearest neighbor (LMPNN). We compared classification performances, and achieved the best results with LMPNN, with sensitivity, specificity and accuracy all 100%. Moreover, a subset of GRFs was significantly different between the two groups via the Wilcoxon rank sum test, which is compatible with the classification results. This method could potentially be used by non-experts to monitor balance and the risk of falling down in the elderly population. PMID:26610503
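Of the three classifiers compared, LMKNN is the simplest to sketch (illustrative names; LMPNN extends the same local-mean idea with distance-weighted pseudo neighbors):

```python
import numpy as np

def lmknn_predict(X, y, x, k=3):
    """Local mean-based k-NN (LMKNN): for each class, take its k nearest
    training samples to x, form their local mean vector, and assign x to
    the class with the closest local mean (sketch)."""
    best_cls, best_d = None, np.inf
    for c in np.unique(y):
        Xc = X[y == c]
        nn = np.argsort(np.linalg.norm(Xc - x, axis=1))[:min(k, len(Xc))]
        d = np.linalg.norm(Xc[nn].mean(axis=0) - x)
        if d < best_d:
            best_cls, best_d = c, d
    return best_cls

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (15, 4)), rng.normal(4, 1, (23, 4))])
y = np.array([0] * 15 + [1] * 23)        # e.g. non-fallers vs fallers (toy features)
print(lmknn_predict(X, y, np.array([4.0, 4, 4, 4])))   # → 1
```

Averaging the k neighbors per class makes the decision less sensitive to single outlier samples than plain k-NN voting.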

  10. BS-KNN: An Effective Algorithm for Predicting Protein Subchloroplast Localization

    PubMed Central

    Hu, Jing; Yan, Xianghe

    2012-01-01

    Chloroplasts are organelles found in cells of green plants and eukaryotic algae that conduct photosynthesis. Knowing a protein’s subchloroplast location provides in-depth insights about the protein’s function and the microenvironment where it interacts with other molecules. In this paper, we present BS-KNN, a bit-score weighted K-nearest neighbor method for predicting proteins’ subchloroplast locations. The method makes predictions based on the bit-score weighted Euclidean distance calculated from the composition of selected pseudo-amino acids. Our method achieved 76.4% overall accuracy in assigning proteins to 4 subchloroplast locations in cross-validation. When tested on an independent set that was not seen by the method during the training and feature selection, the method achieved a consistent overall accuracy of 76.0%. The method was also applied to predict subchloroplast locations of proteins in the chloroplast proteome and validated against proteins in Arabidopsis thaliana. The software and datasets of the proposed method are available at https://edisk.fandm.edu/jing.hu/bsknn/bsknn.html. PMID:22267906

  11. Feature Combination and the kNN Framework in Object Classification.

    PubMed

    Hou, Jian; Gao, Huijun; Xia, Qi; Qi, Naiming

    2016-06-01

    In object classification, feature combination can usually be used to combine the strength of multiple complementary features and produce better classification results than any single one. While multiple kernel learning (MKL) is a popular approach to feature combination in object classification, it does not always perform well in practical applications. On one hand, the optimization process in MKL usually involves a huge consumption of computation and memory space. On the other hand, in some cases, MKL is found to perform no better than the baseline combination methods. This observation motivates us to investigate the underlying mechanism of feature combination with average combination and weighted average combination. As a result, we empirically find that in average combination, it is better to use a sample of the most powerful features instead of all, whereas in one type of weighted average combination, the best classification accuracy comes from a nearly sparse combination. We integrate these observations into the k-nearest neighbors (kNNs) framework, based on which we further discuss some issues related to sparse solution and MKL. Finally, by making use of the kNN framework, we present a new weighted average combination method, which is shown to perform better than MKL in both accuracy and efficiency in experiments. We believe that the work in this paper is helpful in exploring the mechanism underlying feature combination.
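
    The average and weighted-average combination schemes discussed above can be sketched in a kNN setting by blending per-feature distances before neighbor voting. This is a generic illustration; the distance values, weights, and labels are invented:

```python
def combine_distances(dist_lists, weights=None):
    """Blend several per-feature distance lists (one entry per training
    sample) by a plain or weighted average."""
    if weights is None:                      # plain average combination
        weights = [1.0 / len(dist_lists)] * len(dist_lists)
    n = len(dist_lists[0])
    return [sum(w * d[i] for w, d in zip(weights, dist_lists))
            for i in range(n)]

def knn_vote(combined, labels, k=3):
    """Majority vote among the k training samples with the smallest combined distance."""
    order = sorted(range(len(combined)), key=lambda i: combined[i])[:k]
    votes = {}
    for i in order:
        votes[labels[i]] = votes.get(labels[i], 0) + 1
    return max(votes, key=votes.get)

# Distances from one query to four training samples, under two features.
shape_d = [0.1, 0.2, 0.9, 0.8]
color_d = [0.2, 0.1, 0.8, 0.9]
labels = ["cat", "cat", "dog", "dog"]
print(knn_vote(combine_distances([shape_d, color_d]), labels, k=3))
print(knn_vote(combine_distances([shape_d, color_d], weights=[0.9, 0.1]), labels, k=3))
```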

  12. Nanoscale Insight into Lead-Free BNT-BT-xKNN

    SciTech Connect

    Dittmer, Robert; Jo, Wook; Rödel, Jürgen; Kalinin, Sergei V

    2012-01-01

    Piezoresponse force microscopy (PFM) is used to afford insight into the nanoscale electromechanical behavior of lead-free piezoceramics. Materials based on Bi₀.₅Na₀.₅TiO₃ exhibit high strains mediated by a field-induced phase transition. Using the band excitation technique, the initial domain morphology, the poling behavior, the switching behavior, and the time-dependent phase stability in the pseudo-ternary system (1-x)(0.94Bi₀.₅Na₀.₅TiO₃-0.06BaTiO₃)-xK₀.₅Na₀.₅NbO₃ (0 ≤ x ≤ 18 mol%) are revealed. In the base material (x = 0 mol%), macroscopic domains and ferroelectric switching can be induced from the initial relaxor state with a sufficiently high electric field, yielding large macroscopic remanent strain and polarization. The addition of KNN increases the threshold field required to induce long-range order and decreases its stability. For x = 3 mol% the field-induced domains relax completely, which is also reflected in zero macroscopic remanence. Eventually, no long-range order can be induced for x ≥ 3 mol%. This PFM study provides a novel perspective on the interplay between macroscopic and nanoscopic material properties in bulk lead-free piezoceramics.

  13. Dielectric properties of lead-free BZT-KNN perovskite ceramics for energy storage.

    PubMed

    Gui, Dong-Yun; Liu, Han-Xing; Hao, Hua; Sun, Yue; Cao, Ming-He; Yu, Zhi-Yong

    2011-10-17

    Lead-free (1-x)Ba(Zr₀.₁₅Ti₀.₈₅)O₃-x(K₀.₅Na₀.₅)NbO₃ (x=0-0.05) (BZT-KNN) perovskite ceramics, materials with potential applications for energy storage, are investigated. The samples are prepared by a solid-state reaction method. Powder X-ray diffraction (XRD) and scanning electron microscopy (SEM) are used to study the microstructure of the samples. Their dielectric properties and impedance spectra are reported as functions of temperature and frequency. The addition of 1 mol % (K₀.₅Na₀.₅)NbO₃ to Ba(Zr₀.₁₅Ti₀.₈₅)O₃ improves the dielectric constant and enhances its diffuseness over a wide temperature range. The small amount of (K₀.₅Na₀.₅)NbO₃ is found to markedly affect the microstructure of the Ba(Zr₀.₁₅Ti₀.₈₅)O₃ ceramic (grain size and other characteristics) without changing the phase or crystal symmetry. In addition, we report that fine substructures in the grains, so-called sheet structures, are responsible for the dielectric properties (both diffuseness and dielectric constant) of (1-x)Ba(Zr₀.₁₅Ti₀.₈₅)O₃-x(K₀.₅Na₀.₅)NbO₃ (x=0-0.03; especially x=0.01) ceramics. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Topics in Metric Approximation

    NASA Astrophysics Data System (ADS)

    Leeb, William Edward

    This thesis develops effective approximations of certain metrics that occur frequently in pure and applied mathematics. We show that distances that often arise in applications, such as the Earth Mover's Distance between two probability measures, can be approximated by easily computed formulas for a wide variety of ground distances. We develop simple and easily computed characterizations both of norms measuring a function's regularity -- such as the Lipschitz norm -- and of their duals. We are particularly concerned with the tensor product of metric spaces, where the natural notion of regularity is not the Lipschitz condition but the mixed Lipschitz condition. A theme that runs throughout this thesis is that snowflake metrics (metrics raised to a power less than 1) are often better-behaved than ordinary metrics. For example, we show that snowflake metrics on finite spaces can be approximated by the average of tree metrics with a distortion bounded by intrinsic geometric characteristics of the space and not the number of points. Many of the metrics for which we characterize the Lipschitz space and its dual are snowflake metrics. We also present applications of the characterization of certain regularity norms to the problem of recovering a matrix that has been corrupted by noise. We are able to achieve an optimal rate of recovery for certain families of matrices by exploiting the relationship between mixed-variable regularity conditions and the decay of a function's coefficients in a certain orthonormal basis.
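
    The good behavior of snowflake metrics noted above follows from the elementary inequality (x + y)^α ≤ x^α + y^α for 0 < α ≤ 1, which lets d^α inherit the triangle inequality from d. A small brute-force check on random planar points (illustrative only, not from the thesis):

```python
import itertools
import random

random.seed(0)
points = [(random.random(), random.random()) for _ in range(15)]

def d(a, b):
    """Ordinary Euclidean distance in the plane."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

alpha = 0.5  # snowflake exponent, 0 < alpha < 1
for a, b, c in itertools.permutations(points, 3):
    # d^alpha must satisfy the triangle inequality for every triple
    assert d(a, c) ** alpha <= d(a, b) ** alpha + d(b, c) ** alpha + 1e-12
print("triangle inequality holds for the snowflake metric d^0.5")
```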

  15. Robust Transfer Metric Learning for Image Classification.

    PubMed

    Ding, Zhengming; Fu, Yun

    2017-02-01

    Metric learning has attracted increasing attention due to its critical role in image analysis and classification. Conventional metric learning assumes that the training and test data are sampled from the same or similar distribution. However, building an effective distance metric requires abundant supervised knowledge (i.e., side/label information), which is generally inaccessible in practice because of the expensive labeling cost. In this paper, we develop a robust transfer metric learning (RTML) framework to effectively assist unlabeled target learning by transferring knowledge from a well-labeled source domain. Specifically, RTML exploits knowledge transfer to mitigate the domain shift in two directions, i.e., sample space and feature space. In the sample space, domain-wise and class-wise adaptation schemes are adopted to bridge the gap of marginal and conditional distribution disparities across the two domains. In the feature space, our metric is built in a marginalized denoising fashion with a low-rank constraint, which makes it more robust to noisy real-world data. Furthermore, we design an explicit rank-constraint regularizer to replace the NP-hard rank minimization problem and guide the low-rank metric learning. Experimental results on several standard benchmarks demonstrate the effectiveness of the proposed RTML in comparison with state-of-the-art transfer learning and metric learning algorithms.

  16. Metrics for Blueprint Reading.

    ERIC Educational Resources Information Center

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of blueprint reading students, this instructional package is one of eight for the manufacturing occupations cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational terminology, measurement…

  17. Metrics for Food Distribution.

    ERIC Educational Resources Information Center

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of students interested in food distribution, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…

  18. Metrics for Transportation.

    ERIC Educational Resources Information Center

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of students interested in transportation, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational terminology,…

  19. Surveillance Metrics Sensitivity Study

    SciTech Connect

    Bierbaum, R; Hamada, M; Robertson, A

    2011-11-01

    In September of 2009, a Tri-Lab team was formed to develop a set of metrics relating to the NNSA nuclear weapon surveillance program. The purpose of the metrics was to develop a more quantitative and/or qualitative metric(s) describing the results of realized or non-realized surveillance activities on our confidence in reporting reliability and assessing the stockpile. As a part of this effort, a statistical sub-team investigated various techniques and developed a complementary set of statistical metrics that could serve as a foundation for characterizing aspects of meeting the surveillance program objectives. The metrics are a combination of tolerance limit calculations and power calculations, intending to answer level-of-confidence type questions with respect to the ability to detect certain undesirable behaviors (catastrophic defects, margin insufficiency defects, and deviations from a model). Note that the metrics are not intended to gauge product performance but instead the adequacy of surveillance. This report gives a short description of four metrics types that were explored and the results of a sensitivity study conducted to investigate their behavior for various inputs. The results of the sensitivity study can be used to set the risk parameters that specify the level of stockpile problem that the surveillance program should be addressing.

  1. Arbitrary Metrics in Psychology

    ERIC Educational Resources Information Center

    Blanton, Hart; Jaccard, James

    2006-01-01

    Many psychological tests have arbitrary metrics but are appropriate for testing psychological theories. Metric arbitrariness is a concern, however, when researchers wish to draw inferences about the true, absolute standing of a group or individual on the latent psychological dimension being measured. The authors illustrate this in the context of 2…

  2. Metrics for Cosmetology.

    ERIC Educational Resources Information Center

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of cosmetology students, this instructional package on cosmetology is part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational terminology, measurement terms, and tools currently in use. Each of the…

  3. Metrics for Aviation Electronics.

    ERIC Educational Resources Information Center

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of aviation electronics students, this instructional package is one of four for the transportation occupations cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational terminology,…

  5. Metrics for Dental Assistants.

    ERIC Educational Resources Information Center

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of students interested in becoming dental assistants, this instructional package is one of five for the health occupations cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…

  6. A Measured Metric Statement.

    ERIC Educational Resources Information Center

    Gaughan, Edward D.; Wisner, Robert J.

    1981-01-01

    A middle-road approach towards adopting the instruction of the metric system is presented. The realities of our cultural, economic, and political processes are taken into account and a 100 percent metric curriculum is viewed as unrealistic and anachronistic. (MP)

  7. Metrics for Fire Service.

    ERIC Educational Resources Information Center

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of students interested in fire science education, this instructional package is one of two for the public service occupations cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…

  8. The Science News Metrics

    NASA Astrophysics Data System (ADS)

    Christian, Carol A.; Davidson, Greg

    2006-01-01

    Scientists, observatories, academic institutions and funding agencies persistently review the usefulness and productivity of investment in scientific research. The Science News Metrics was created over 10 years ago to review NASA's performance in this arena. The metric has been useful for many years as one facet in measuring the scientific discovery productivity of NASA-funded missions. The metric is computed independently of the agency and has been compiled in a consistent manner. Examination of the metric yields year-by-year insight into NASA science successes in a world wide context. The metric has shown that NASA's contribution to worldwide top science news stories has been approximately 5% overall with the Hubble Space Telescope dominating the performance.

  9. Holographic Spherically Symmetric Metrics

    NASA Astrophysics Data System (ADS)

    Petri, Michael

    The holographic principle (HP) conjectures, that the maximum number of degrees of freedom of any realistic physical system is proportional to the system's boundary area. The HP has its roots in the study of black holes. It has recently been applied to cosmological solutions. In this article we apply the HP to spherically symmetric static space-times. We find that any regular spherically symmetric object saturating the HP is subject to tight constraints on the (interior) metric, energy-density, temperature and entropy-density. Whenever gravity can be described by a metric theory, gravity is macroscopically scale invariant and the laws of thermodynamics hold locally and globally, the (interior) metric of a regular holographic object is uniquely determined up to a constant factor and the interior matter-state must follow well defined scaling relations. When the metric theory of gravity is general relativity, the interior matter has an overall string equation of state (EOS) and a unique total energy-density. Thus the holographic metric derived in this article can serve as simple interior 4D realization of Mathur's string fuzzball proposal. Some properties of the holographic metric and its possible experimental verification are discussed. The geodesics of the holographic metric describe an isotropically expanding (or contracting) universe with a nearly homogeneous matter-distribution within the local Hubble volume. Due to the overall string EOS the active gravitational mass-density is zero, resulting in a coasting expansion with Ht = 1, which is compatible with the recent GRB-data.

  10. Sustainability Indicators and Metrics

    EPA Science Inventory

    Sustainability is about preserving human existence. Indicators and metrics are absolutely necessary to provide at least a semi-quantitative assessment of progress towards or away from sustainability. Otherwise, it becomes impossible to objectively assess whether progress is bei...

  11. An Arithmetic Metric

    ERIC Educational Resources Information Center

    Dominici, Diego

    2011-01-01

    This work introduces a distance between natural numbers not based on their position on the real line but on their arithmetic properties. We prove some metric properties of this distance and consider a possible extension.

  12. General Motors Goes Metric

    ERIC Educational Resources Information Center

    Webb, Ted

    1976-01-01

    Describes the program to convert to the metric system all of General Motors Corporation products. Steps include establishing policy regarding employee-owned tools, setting up training plans, and making arrangements with suppliers. (MF)

  15. A metric for success

    NASA Astrophysics Data System (ADS)

    Carver, Gary P.

    1994-05-01

    The federal agencies are working with industry to ease adoption of the metric system. The goal is to help U.S. industry compete more successfully in the global marketplace, increase exports, and create new jobs. The strategy is to use federal procurement, financial assistance, and other business-related activities to encourage voluntary conversion. Based upon the positive experiences of firms and industries that have converted, federal agencies have concluded that metric use will yield long-term benefits that are beyond any one-time costs or inconveniences. It may be time for additional steps to move the Nation out of its dual-system comfort zone and continue to progress toward metrication. This report includes 'Metric Highlights in U.S. History'.

  16. Enterprise Sustainment Metrics

    DTIC Science & Technology

    The Air Force sustainment enterprise does not have metrics that ... adequately measure key sustainment parameters, according to the 2011 National Research Council of the National Academies study, Examination of the U.S. Air Force's Aircraft Sustainment Needs in the Future and Its Strategy to Meet ... standardized and do not contribute to the overall assessment of the sustainment enterprise. This paper explores the development of a single metric ...

  17. METRICS DEVELOPMENT FOR PATENTS.

    PubMed

    Veiga, Daniela Francescato; Ferreira, Lydia Masako

    2015-01-01

    To develop a proposal for metrics for patents to be applied in assessing the postgraduate programs of Medicine III - Capes. From the reading and analysis of the 2013 area documents of all 48 areas of Capes, a proposal for metrics for patents was developed to be applied to Medicine III programs. Except for the areas Biotechnology, Food Science, Biological Sciences III, Physical Education, Engineering I, III and IV, and Interdisciplinary, most areas do not adopt a scoring system for patents. The proposal developed was based on the criteria of Biotechnology, with adaptations. In general, the deposit, the granting, and licensing/production will be valued, in ascending order. Higher scores will also be assigned to patents registered abroad and to those with student participation. This proposal can be applied to the item Intellectual Production of the evaluation form, in the subsection Technical Production/Patents. The percentages of 10% for academic programs and 40% for Professional Masters should be maintained. A program will be scored as Very Good when it reaches 400 points or more; Good, between 200 and 399 points; Regular, between 71 and 199 points; Weak, up to 70 points; and Insufficient, with no score.

  18. Analyzing the defect structure of CuO-doped PZT and KNN piezoelectrics from electron paramagnetic resonance.

    PubMed

    Jakes, Peter; Kungl, Hans; Schierholz, Roland; Eichel, Rüdiger-A

    2014-09-01

    The defect structure of copper-doped sodium potassium niobate (KNN) ferroelectrics has been analyzed. In particular, the interplay of the mutually compensating dimeric (Cu(Nb)'''-V(O)··) and trimeric (V(O)··-Cu(Nb)'''-V(O)··)· defect complexes with 180° and non-180° domain walls has been analyzed and compared to the effects of (Cu''-V(O)··)× dipoles in CuO-doped lead zirconate titanate (PZT). Attempts are made to relate the rearrangement of defect complexes to macroscopic electromechanical properties.

  19. Fault Management Metrics

    NASA Technical Reports Server (NTRS)

    Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig

    2017-01-01

    This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management, and as such is considered as a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics to estimate and measure the effectiveness of fault management are similar to those of classical control loops in being divided into two major classes: state estimation, and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to each and every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these fault management control loops to preserve the relevant system goals that they are intended to protect.

  20. Successful Experiences in Teaching Metric.

    ERIC Educational Resources Information Center

    Odom, Jeffrey V., Ed.

    In this publication are presentations on specific experiences in teaching metrics, made at a National Bureau of Standards conference. Ideas of value to teachers and administrators are described in reports on: SI units of measure; principles and practices of teaching metric; metric and the school librarian; teaching metric through television and…

  1. Changing to the Metric System.

    ERIC Educational Resources Information Center

    Chambers, Donald L.; Dowling, Kenneth W.

    This report examines educational aspects of the conversion to the metric system of measurement in the United States. Statements of positions on metrication and basic mathematical skills are given from various groups. Base units, symbols, prefixes, and style of the metric system are outlined. Guidelines for teaching metric concepts are given,…

  2. Using Metrics in Industrial Arts.

    ERIC Educational Resources Information Center

    Bame, E. Allen

    This metric supplement is intended as a guide to aid the industrial arts teacher in incorporating metrics in the classroom. A list of student objectives for measurement skills is followed by an overview of the history of measurement, an argument for change to the metric system in the United States, and a discussion of metric basics (common terms).…

  3. Multi-color space threshold segmentation and self-learning k-NN algorithm for surge test EUT status identification

    NASA Astrophysics Data System (ADS)

    Huang, Jian; Liu, Gui-xiong

    2016-09-01

    The targets to be identified vary across different surge tests. A multi-color space threshold segmentation and self-learning k-nearest neighbor (k-NN) algorithm for equipment-under-test status identification was proposed, since the earlier feature-matching approach to identifying equipment status had to train new patterns before every test. First, the color space to segment in (L*a*b*, hue saturation lightness (HSL), or hue saturation value (HSV)) was selected according to the high-luminance and white-luminance pixel ratios of the image. Second, the unknown sample S_r was classified by the k-NN algorithm with training set T_z according to its feature vector, formed from the number of pixels, eccentricity ratio, compactness ratio, and Euler number. Last, when the classification confidence coefficient equaled k, S_r was taken as a sample of the pre-training set T_z'. Once T_z' was saturated, the training set grew from T_z to T_z+1. On nine series of illuminant, indicator light, screen, and disturbance samples (21600 frames in total), the algorithm achieved 98.65% identification accuracy and enlarged the training set by itself from T_0 to T_5 over five groups of samples.
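
    The self-learning step described above can be sketched as follows. This is a hedged reconstruction: it assumes the "confidence coefficient equals k" criterion means all k nearest neighbors vote for the same class, and the feature vectors, state labels, and saturation threshold are invented.

```python
import math

def knn_with_confidence(sample, training, k=2):
    """Classify and report how many of the k nearest neighbors agree."""
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = sorted(training, key=lambda t: dist(sample, t[0]))[:k]
    votes = {}
    for _, label in nearest:
        votes[label] = votes.get(label, 0) + 1
    label, count = max(votes.items(), key=lambda kv: kv[1])
    return label, count  # count == k is a unanimous, confident vote

def self_learn(samples, training, k=2, batch=2):
    """Grow the training set from confidently classified samples."""
    pending = []  # the pre-training set
    for s in samples:
        label, conf = knn_with_confidence(s, training, k)
        if conf == k:                  # confidence coefficient equals k
            pending.append((s, label))
        if len(pending) >= batch:      # pre-training set saturated
            training = training + pending
            pending = []
    return training

# Invented feature vectors for two equipment states.
train = [((0.0, 0.0), "off"), ((0.1, 0.0), "off"),
         ((1.0, 1.0), "on"), ((1.0, 0.9), "on")]
grown = self_learn([(0.05, 0.05), (0.95, 0.95)], train, k=2, batch=2)
print(len(grown))  # training set enlarged by the two unanimous samples
```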

  4. Dynamic partial reconfiguration implementation of the SVM/KNN multi-classifier on FPGA for bioinformatics application.

    PubMed

    Hussain, Hanaa M; Benkrid, Khaled; Seker, Huseyin

    2015-01-01

    Bioinformatics data tend to be highly dimensional in nature and thus impose significant computational demands. To overcome the limitations of conventional computing methods, several alternative high-performance computing solutions have been proposed, such as Graphical Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). The latter have been shown to be efficient and high in performance. In recent years, FPGAs have benefited from the dynamic partial reconfiguration (DPR) feature, which adds the flexibility to alter specific regions within the chip. This work proposes combining FPGAs and DPR to build a dynamic multi-classifier architecture for processing bioinformatics data. In bioinformatics, applying different classification algorithms to the same dataset is desirable in order to obtain comparable, more reliable, and consensus decisions, but it can be time-consuming on a conventional PC. The DPR implementations of two common classifiers, namely support vector machines (SVMs) and K-nearest neighbor (KNN), are combined into a multi-classifier FPGA architecture in which a specific region of the FPGA can work as either an SVM or a KNN classifier. This multi-classifier DPR implementation achieved at least a ~8x reduction in reconfiguration time over the single non-DPR classifier implementation, and occupied less space and fewer hardware resources than having both classifiers. The proposed architecture can be extended to work as an ensemble classifier.

  5. KNN-based local linear regression for the analysis and simulation of low flow extremes under climatic influence

    NASA Astrophysics Data System (ADS)

    Lee, Taesam; Ouarda, Taha B. M. J.; Yoon, Sunkwon

    2017-02-01

    Climate change frequently causes highly nonlinear and irregular behaviors in hydroclimatic systems. Stochastic simulation of hydroclimatic variables reproduces such irregular behaviors and is beneficial for assessing their impact on other regimes. The objective of the current study is to propose a novel method, a k-nearest neighbor (KNN) method based on local linear regression (KLR), to reproduce nonlinear and heteroscedastic relations in hydroclimatic variables. The proposed model was validated with a nonlinear, heteroscedastic, lag-1 time-dependent test function. The validation results show that the key statistics, nonlinear dependence, and heteroscedasticity of the test data are reproduced well by the KLR model. In contrast, a traditional resampling technique, KNN resampling (KNNR), shows some biases with respect to key statistics, such as the variance and lag-1 correlation. Furthermore, the proposed KLR model was used to simulate the annual minimum of the consecutive 7-day average daily mean flow (Min7D) of the Romaine River, Quebec. The observed and extended North Atlantic Oscillation (NAO) index is incorporated into the model. The case study results for the observed period illustrate that the KLR model sufficiently reproduced the key statistics and the nonlinear, heteroscedastic relation. For the future period, a lower mean is observed, indicating that drier-than-normal conditions might be expected in the next decade in the Romaine River. Overall, it is concluded that the KLR model can be a good alternative for simulating irregular and nonlinear behaviors in hydroclimatic variables.
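
    A minimal one-dimensional sketch of the KLR idea: locate the k nearest predictor values, fit an ordinary least-squares line to those neighbors only, and evaluate it at the query point. This is an illustrative reconstruction, not the authors' exact formulation:

```python
def klr_predict(x0, xs, ys, k=3):
    """KNN-based local linear regression: least-squares line through the
    k nearest neighbors of x0, evaluated at x0."""
    order = sorted(range(len(xs)), key=lambda i: abs(xs[i] - x0))[:k]
    nx = [xs[i] for i in order]
    ny = [ys[i] for i in order]
    mx = sum(nx) / k
    my = sum(ny) / k
    sxx = sum((x - mx) ** 2 for x in nx)
    if sxx == 0.0:                     # degenerate neighborhood: fall back
        return my                      # to the local mean (KNNR-like)
    slope = sum((x - mx) * (y - my) for x, y in zip(nx, ny)) / sxx
    return my + slope * (x0 - mx)

# On data sampled from a line, the local fit recovers the line between samples.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 2.0, 4.0, 6.0, 8.0]   # y = 2x
print(klr_predict(2.5, xs, ys, k=3))
```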

  6. Cyber threat metrics.

    SciTech Connect

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  7. Evaluation Metrics for Biostatistical and Epidemiological Collaborations

    PubMed Central

    Rubio, Doris McGartland; del Junco, Deborah J.; Bhore, Rafia; Lindsell, Christopher J.; Oster, Robert A.; Wittkowski, Knut M.; Welty, Leah J.; Li, Yi-Ju; DeMets, Dave

    2011-01-01

    Increasing demands for evidence-based medicine and for the translation of biomedical research into individual and public health benefit have been accompanied by the proliferation of special units that offer expertise in biostatistics, epidemiology, and research design (BERD) within academic health centers. Objective metrics that can be used to evaluate, track, and improve the performance of these BERD units are critical to their successful establishment and sustainable future. To develop a set of reliable but versatile metrics that can be adapted easily to different environments and evolving needs, we consulted with members of BERD units from the consortium of academic health centers funded by the Clinical and Translational Science Award Program of the National Institutes of Health. Through a systematic process of consensus building and document drafting, we formulated metrics that covered the three identified domains of BERD practices: the development and maintenance of collaborations with clinical and translational science investigators, the application of BERD-related methods to clinical and translational research, and the discovery of novel BERD-related methodologies. In this article, we describe the set of metrics and advocate their use for evaluating BERD practices. The routine application, comparison of findings across diverse BERD units, and ongoing refinement of the metrics will identify trends, facilitate meaningful changes, and ultimately enhance the contribution of BERD activities to biomedical research. PMID:21284015

  8. Metrics of Scholarly Impact

    ERIC Educational Resources Information Center

    Cacioppo, John T.; Cacioppo, Stephanie

    2012-01-01

    Ruscio and colleagues (Ruscio, Seaman, D'Oriano, Stremlo, & Mahalchik, this issue) provide a thoughtful empirical analysis of 22 different measures of individual scholarly impact. The simplest metric is number of publications, which Simonton (1997) found to be a reasonable predictor of career trajectories. Although the assessment of the scholarly…

  9. Metrical Phonology and SLA.

    ERIC Educational Resources Information Center

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English language with the intention that it may be used in second language instruction. Stress is defined by its physical and acoustical correlates, and the principles of…

  10. Metrics and Sports.

    ERIC Educational Resources Information Center

    National Collegiate Athletic Association, Shawnee Mission, KS.

    Designed as a guide to aid the National Collegiate Athletic Association membership and others who must relate measurement of distances, weights, and volumes to athletic activity, this document presents diagrams of performance areas with measurements delineated in both imperial and metric terms. Illustrations are given for baseball, basketball,…

  11. Engineering performance metrics

    NASA Astrophysics Data System (ADS)

    Delozier, R.; Snyder, N.

    1993-03-01

    Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved the normal systems design phases: conceptual design, detailed design, implementation, and integration. The lessons learned from this effort are explored in this paper and may provide a starting point for other large engineering organizations seeking to institute a performance measurement system. To facilitate this effort, a team consisting of customers and Engineering staff members was chartered to assist in the development of the metrics system and to ensure that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different than the development of any other type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.

  12. Arbitrary Metrics Redux

    ERIC Educational Resources Information Center

    Blanton, Hart; Jaccard, James

    2006-01-01

    Reducing the arbitrariness of a metric is distinct from the pursuit of validity, rational zero points, data transformations, standardization, and the types of statistical procedures one uses to analyze interval-level versus ordinal-level data. A variety of theoretical, methodological, and statistical tools can assist researchers who wish to make…

  14. Software Quality Metrics

    DTIC Science & Technology

    1991-07-01

    March 1979, pp. 121-128. Gorla, Narasimhaiah, Alan C. Benander, and Barbara A. Benander, "Debugging Effort Estimation Using Software Metrics", IEEE...Society, IEEE Guide for the Use of IEEE Standard Dictionary of Measures to Produce Reliable Software, IEEE Std 982.2-1988, June 1989. Jones, Capers

  15. Software Quality Assurance Metrics

    NASA Technical Reports Server (NTRS)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  17. Predicting Subcellular Localization of Apoptosis Proteins Combining GO Features of Homologous Proteins and Distance Weighted KNN Classifier

    PubMed Central

    Wang, Xiao; Li, Hui; Zhang, Qiuwen; Wang, Rong

    2016-01-01

    Apoptosis proteins play a key role in maintaining the stability of the organism; their functions are related to their subcellular locations, which are used to understand the mechanism of programmed cell death. In this paper, we utilize GO annotation information of apoptosis proteins and their homologous proteins, retrieved from the GOA database, to formulate feature vectors, and then combine them with the distance-weighted KNN classification algorithm to address the data imbalance problem in the CL317 data set and predict the subcellular locations of apoptosis proteins. It is found that the number of homologous proteins can affect the overall prediction accuracy. Under the optimal number of homologous proteins, the overall prediction accuracy of our method on the CL317 data set reaches 96.8% by the jackknife test. Comparison with other existing methods shows that our proposed method is very effective and better than others at predicting the subcellular localization of apoptosis proteins. PMID:27213149
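
    A minimal distance-weighted KNN sketch. The toy 2-D points and class labels below are illustrative assumptions; the paper's feature vectors are GO-annotation profiles built from the GOA database.

    ```python
    import numpy as np
    from collections import defaultdict

    def weighted_knn_predict(X_train, y_train, x, k=5):
        """Distance-weighted KNN: each of the k nearest neighbours votes
        with weight 1/distance, so a few close minority-class samples can
        outvote many distant majority-class samples (helps with imbalance)."""
        d = np.linalg.norm(X_train - x, axis=1)
        nn = np.argsort(d)[:k]
        votes = defaultdict(float)
        for i in nn:
            votes[y_train[i]] += 1.0 / (d[i] + 1e-12)
        return max(votes, key=votes.get)

    # toy imbalanced data: 2 "membrane" samples vs 4 "nucleus" samples
    X = np.array([[0.0, 0.0], [0.1, 0.0],
                  [2.0, 2.0], [2.1, 2.0], [1.9, 2.1], [2.2, 1.9]])
    y = np.array(["membrane", "membrane", "nucleus", "nucleus", "nucleus", "nucleus"])
    pred = weighted_knn_predict(X, y, np.array([0.2, 0.1]), k=3)
    print(pred)  # the two nearby membrane votes outweigh the distant nucleus vote
    ```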

  18. Accounting for dependence induced by weighted KNN imputation in paired samples, motivated by a colorectal cancer study.

    PubMed

    Suyundikov, Anvar; Stevens, John R; Corcoran, Christopher; Herrick, Jennifer; Wolff, Roger K; Slattery, Martha L

    2015-01-01

    Missing data can arise in bioinformatics applications for a variety of reasons, and imputation methods are frequently applied to such data. We are motivated by a colorectal cancer study where miRNA expression was measured in paired tumor-normal samples of hundreds of patients, but data for many normal samples were missing due to lack of tissue availability. We compare the precision and power performance of several imputation methods, and draw attention to the statistical dependence induced by K-Nearest Neighbors (KNN) imputation. This imputation-induced dependence has not previously been addressed in the literature. We demonstrate how to account for this dependence, and show through simulation how the choice to ignore or account for this dependence affects both power and type I error rate control.
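
    The induced-dependence issue is easiest to see in a minimal weighted-KNN imputation sketch (toy data and feature encoding are illustrative assumptions; the study's data are paired tumor-normal miRNA profiles). Each imputed value is a weighted function of other patients' observed values, so imputed and donor samples are no longer statistically independent.

    ```python
    import numpy as np

    def knn_impute(tumor, normal, k=3):
        """Impute a patient's missing normal-sample value from the k patients
        with the most similar tumor profiles, weighted by inverse distance.
        Note: each imputed entry reuses other subjects' data, which induces
        the cross-sample dependence discussed in the paper."""
        out = normal.copy()
        obs = np.where(~np.isnan(normal))[0]
        for i in np.where(np.isnan(normal))[0]:
            d = np.linalg.norm(tumor[obs] - tumor[i], axis=1)
            nn = obs[np.argsort(d)[:k]]           # k nearest observed donors
            w = 1.0 / (np.sort(d)[:k] + 1e-9)     # inverse-distance weights
            out[i] = np.sum(w * normal[nn]) / np.sum(w)
        return out

    # toy data: 6 patients, 2 tumor features, one normal-sample value each;
    # patients 4 and 5 are missing their normal sample
    tumor = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],
                      [5.0, 5.0], [0.2, 0.2], [5.1, 4.9]])
    normal = np.array([1.0, 1.2, 0.9, 3.0, np.nan, np.nan])
    imputed = knn_impute(tumor, normal, k=2)
    print(imputed)
    ```

    Patient 4's imputed value is built mostly from patients 0 and 1, and patient 5's mostly from patient 3, so any downstream paired test treating these rows as independent observations would be miscalibrated.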

  19. A Monitoring and Advisory System for Diabetes Patient Management Using a Rule-Based Method and KNN

    PubMed Central

    Lee, Malrey; Gatton, Thomas M.; Lee, Keun-Kwang

    2010-01-01

    Diabetes is difficult to control, and it is important to manage the diabetic’s blood sugar level and prevent the associated complications through appropriate treatment. This paper proposes a system that can provide appropriate management for diabetes patients according to their blood sugar level. The system is designed to send information about the blood sugar levels, blood pressure, food consumption, exercise, etc., of diabetes patients, and to manage treatment by recommending and monitoring food consumption, physical activity, insulin dosage, etc., so that patients can better manage their condition. The system is based on rules and the K-Nearest Neighbor (KNN) classifier algorithm to obtain the optimum treatment recommendation. A monitoring system for diabetes patients is also implemented using Web Services and Personal Digital Assistant (PDA) programming. PMID:22319334
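
    A hypothetical sketch of the rule-plus-KNN idea: rules handle unambiguous, safety-critical ranges first, and a nearest-case lookup covers everything else. The thresholds, feature encoding, past cases, and advice strings are all invented for illustration and are not from the paper.

    ```python
    import numpy as np

    # hypothetical past cases: [glucose mg/dL, systolic BP, exercise minutes]
    CASES = np.array([[110.0, 120.0, 30.0],
                      [180.0, 135.0, 10.0],
                      [150.0, 128.0, 20.0]])
    ADVICE = ["maintain current plan", "increase insulin dose", "increase exercise"]

    def recommend(glucose, systolic, exercise_min):
        # rules first, for ranges where the action is unambiguous
        if glucose < 70:
            return "treat hypoglycemia immediately"
        if glucose > 250:
            return "contact physician"
        # otherwise 1-NN: reuse the advice of the most similar past case
        x = np.array([glucose, systolic, exercise_min])
        d = np.linalg.norm(CASES - x, axis=1)
        return ADVICE[int(np.argmin(d))]

    print(recommend(175, 130, 12))
    ```

    In practice the distance would be computed on normalized features; raw units are kept here only to keep the sketch short.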

  20. Optimal Detection Range of RFID Tag for RFID-based Positioning System Using the k-NN Algorithm.

    PubMed

    Han, Soohee; Kim, Junghwan; Park, Choung-Hwan; Yoon, Hee-Cheon; Heo, Joon

    2009-01-01

    Positioning technology to track a moving object is an important and essential component of ubiquitous computing environments and applications. An RFID-based positioning system using the k-nearest neighbor (k-NN) algorithm can determine the position of a moving reader from observed reference data. In this study, the optimal detection range of an RFID-based positioning system was determined on the principle that tag spacing can be derived from the detection range. It was assumed that reference tags without signal strength information are regularly distributed in 1-, 2-, and 3-dimensional spaces. The optimal detection range was determined, through analytical and numerical approaches, to be 125% of the tag-spacing distance in 1-dimensional space. Through numerical approaches, the range was 134% in 2-dimensional space and 143% in 3-dimensional space.
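
    A toy 1-D simulation of the estimator (not a reproduction of the paper's optimality derivation): the reader's position is estimated as the mean of the reference tags it detects, and the error is compared across detection ranges expressed as multiples of the tag spacing.

    ```python
    import numpy as np

    def estimate_position(reader_x, tag_xs, detection_range):
        """Estimate the reader position as the mean of the detected
        reference tags (no signal-strength information, as in the paper)."""
        detected = tag_xs[np.abs(tag_xs - reader_x) <= detection_range]
        return detected.mean()

    spacing = 1.0
    tags = np.arange(0.0, 50.0 + spacing, spacing)   # 1-D grid of reference tags
    rng = np.random.default_rng(1)
    truth = rng.uniform(5.0, 45.0, 2000)             # true reader positions

    rmse = {}
    for r in (0.75, 1.0, 1.25, 1.5):                 # range as a multiple of spacing
        est = np.array([estimate_position(t, tags, r * spacing) for t in truth])
        rmse[r] = float(np.sqrt(np.mean((est - truth) ** 2)))
    print(rmse)
    ```

    Even this crude simulation shows that a range around 125% of the spacing beats both 100% (where exactly two tags are always detected, centering every estimate between them) and 150% (where a third distant tag pulls the mean off).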

  1. Metric Measurement and Instructional Television

    ERIC Educational Resources Information Center

    Meiring, Steven P.

    1977-01-01

    The television series "MeasureMetric," an instructional series introducing length, area, volume, mass, and temperature measurement in metric settings, is described. Guidelines are given for using the series as a complete learning unit. (JT)

  2. Quality Metrics in Endoscopy

    PubMed Central

    Gurudu, Suryakanth R.

    2013-01-01

    Endoscopy has evolved in the past 4 decades to become an important tool in the diagnosis and management of many digestive diseases. Greater focus on endoscopic quality has highlighted the need to ensure competency among endoscopists. A joint task force of the American College of Gastroenterology and the American Society for Gastrointestinal Endoscopy has proposed several quality metrics to establish competence and help define areas of continuous quality improvement. These metrics represent quality in endoscopy pertinent to pre-, intra-, and postprocedural periods. Quality in endoscopy is a dynamic and multidimensional process that requires continuous monitoring of several indicators and benchmarking with local and national standards. Institutions and practices should have a process in place for credentialing endoscopists and for the assessment of competence regarding individual endoscopic procedures. PMID:24711767

  3. The Kerr metric

    NASA Astrophysics Data System (ADS)

    Teukolsky, Saul A.

    2015-06-01

    This review describes the events leading up to the discovery of the Kerr metric in 1963 and the enormous impact the discovery has had in the subsequent 50 years. The review discusses the Penrose process, the four laws of black hole mechanics, uniqueness of the solution, and the no-hair theorems. It also includes Kerr perturbation theory and its application to black hole stability and quasi-normal modes. The Kerr metric's importance in the astrophysics of quasars and accreting stellar-mass black hole systems is detailed. A theme of the review is the ‘miraculous’ nature of the solution, both in describing in a simple analytic formula the most general rotating black hole, and in having unexpected mathematical properties that make many calculations tractable. Also included is a pedagogical derivation of the solution suitable for a first course in general relativity.

  4. Performance Metrics for Commercial Buildings

    SciTech Connect

    Fowler, Kimberly M.; Wang, Na; Romero, Rachel L.; Deru, Michael P.

    2010-09-30

    Commercial building owners and operators have requested a standard set of key performance metrics to provide a systematic way to evaluate the performance of their buildings. The performance metrics included in this document provide standard metrics for the energy, water, operations and maintenance, indoor environmental quality, purchasing, waste and recycling, and transportation impacts of their buildings. The metrics can be used for comparative performance analysis between existing buildings and industry standards to clarify the impact of sustainably designed and operated buildings.

  5. Aquatic Acoustic Metrics Interface

    SciTech Connect

    2012-12-18

    Fishes and marine mammals may suffer a range of potential effects from exposure to intense underwater sound generated by anthropogenic activities such as pile driving, shipping, sonars, and underwater blasting. Several underwater sound recording (USR) devices have been built to acquire samples of the underwater sound generated by anthropogenic activities. Software becomes indispensable for processing and analyzing the audio files recorded by these USRs. The new Aquatic Acoustic Metrics Interface Utility Software (AAMI) is specifically designed for analysis of underwater sound recordings to provide data in metrics that facilitate evaluation of the potential impacts of the sound on aquatic animals. In addition to the basic functions, such as loading and editing audio files recorded by USRs and batch processing of sound files, the software utilizes recording system calibration data to compute important parameters in physical units. The software also facilitates comparison of the noise sound sample metrics with biological measures such as audiograms of the sensitivity of aquatic animals to the sound, integrating various components into a single analytical frame.

  6. Metrics for Energy Resilience

    SciTech Connect

    Paul E. Roege; Zachary A. Collier; James Mancillas; John A. McDonagh; Igor Linkov

    2014-09-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However, there is an Achilles heel to today's energy and technology relationship; namely, a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth.

  7. Parallel Anisotropic Tetrahedral Adaptation

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Darmofal, David L.

    2008-01-01

    An adaptive method that robustly produces high aspect ratio tetrahedra to a general 3D metric specification without introducing hybrid semi-structured regions is presented. The elemental operators and higher-level logic are described with their respective domain-decomposed parallelizations. An anisotropic tetrahedral grid adaptation scheme is demonstrated for 1000:1 stretching for a simple cube geometry. This form of adaptation is applicable to more complex domain boundaries via a cut-cell approach, as demonstrated by a parallel 3D supersonic simulation of a complex fighter aircraft. To avoid the assumptions and approximations required to form a metric to specify adaptation, an approach is introduced that directly evaluates interpolation error. The grid is adapted to reduce and equidistribute this interpolation error calculation without the use of an intervening anisotropic metric. Direct interpolation error adaptation is illustrated for 1D and 3D domains.

  8. Some References on Metric Information.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC.

    This resource work lists metric information published by the U.S. Government and the American National Standards Institute. Also organizations marketing metric materials for education are given. A short table of conversions is included as is a listing of basic metric facts for everyday living. (LS)

  9. Metrication, American Style. Fastback 41.

    ERIC Educational Resources Information Center

    Izzi, John

    The purpose of this pamphlet is to provide a starting point of information on the metric system for any concerned or interested reader. The material is organized into five brief chapters: Man and Measurement; Learning the Metric System; Progress Report: Education; Recommended Sources; and Metrication, American Style. Appendixes include an…

  10. Say "Yes" to Metric Measure.

    ERIC Educational Resources Information Center

    Monroe, Eula Ewing; Nelson, Marvin N.

    2000-01-01

    Provides a brief history of the metric system. Discusses the infrequent use of the metric measurement system in the United States, why conversion from the customary system to the metric system is difficult, and the need for change. (Contains 14 resources.) (ASK)

  11. Metrication in a global environment

    NASA Technical Reports Server (NTRS)

    Aberg, J.

    1994-01-01

    A brief history about the development of the metric system of measurement is given. The need for the U.S. to implement the 'SI' metric system in the international markets, especially in the aerospace and general trade, is discussed. Development of metric implementation and experiences locally, nationally, and internationally are included.

  12. Metric Education Plan for Virginia.

    ERIC Educational Resources Information Center

    Virginia State Dept. of Education, Richmond. Div. of Secondary Education.

    This comprehensive document is the result of statewide planning for the implementation of metric education in Virginia schools. An introduction discusses the rationale, needs, and continuing objectives for metric education. An organizational structure for metric education in Virginia is outlined. Guidelines for administrative planning are…

  13. Metric Education for Adult Learners.

    ERIC Educational Resources Information Center

    Goetsch, David L.

    1978-01-01

    The metric education program developed at Okaloosa-Walton Junior College, Niceville, Florida, for students and the community in general consists of three components: a metric measurement course; a multimedia lab for independent study; and metric signs located throughout the campus. Instructional approaches for adult students are noted. (MF)

  14. Implications of Metric Choice for Common Applications of Readmission Metrics

    PubMed Central

    Davies, Sheryl; Saynina, Olga; Schultz, Ellen; McDonald, Kathryn M; Baker, Laurence C

    2013-01-01

    Objective. To quantify the differential impact on hospital performance of three readmission metrics: all-cause readmission (ACR), 3M Potential Preventable Readmission (PPR), and Centers for Medicare and Medicaid 30-day readmission (CMS). Data Sources. 2000–2009 California Office of Statewide Health Planning and Development Patient Discharge Data Nonpublic file. Study Design. We calculated 30-day readmission rates using three metrics, for three disease groups: heart failure (HF), acute myocardial infarction (AMI), and pneumonia. Using each metric, we calculated the absolute change and correlation between performance; the percent of hospitals remaining in extreme deciles and level of agreement; and differences in longitudinal performance. Principal Findings. Average hospital rates for HF patients and the CMS metric were generally higher than for other conditions and metrics. Correlations between the ACR and CMS metrics were highest (r = 0.67–0.84). Rates calculated using the PPR and either ACR or CMS metrics were moderately correlated (r = 0.50–0.67). Between 47 and 75 percent of hospitals in an extreme decile according to one metric remained when using a different metric. Correlations among metrics were modest when measuring hospital longitudinal change. Conclusions. Different approaches to computing readmissions can produce different hospital rankings and impact pay-for-performance. Careful consideration should be placed on readmission metric choice for these applications. PMID:23742056
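
    The agreement analysis can be sketched as follows. The data are simulated (the study used nonpublic California OSHPD discharge data), and the two metric vectors below merely stand in for hospital-level rates computed under two different readmission definitions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 300                                            # hypothetical hospitals
    acr = rng.beta(2, 8, n)                            # rates under metric 1 (e.g., ACR)
    cms = np.clip(acr + rng.normal(0, 0.03, n), 0, 1)  # correlated rates under metric 2

    # cross-metric correlation of hospital performance
    r = float(np.corrcoef(acr, cms)[0, 1])

    def top_decile(x):
        """Indices of the hospitals in the worst (highest-rate) decile."""
        return set(np.argsort(x)[-len(x) // 10:])

    # fraction of extreme-decile hospitals identified by both metrics
    overlap = len(top_decile(acr) & top_decile(cms)) / (n // 10)
    print(f"correlation {r:.2f}, top-decile overlap {overlap:.0%}")
    ```

    Even with highly correlated metrics, decile membership is unstable for hospitals near the cutoff, which is the paper's point about pay-for-performance sensitivity to metric choice.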

  15. Implicit Contractive Mappings in Modular Metric and Fuzzy Metric Spaces

    PubMed Central

    Hussain, N.; Salimi, P.

    2014-01-01

    The notion of modular metric spaces, a natural generalization of classical modulars over linear spaces such as Lebesgue, Orlicz, Musielak-Orlicz, Lorentz, Orlicz-Lorentz, and Calderon-Lozanovskii spaces, was recently introduced. In this paper we investigate the existence of fixed points of generalized α-admissible modular contractive mappings in modular metric spaces. As applications, we derive some new fixed point theorems in partially ordered modular metric spaces, Suzuki-type fixed point theorems in modular metric spaces, and new fixed point theorems for integral contractions. In the last section, we develop an important relation between fuzzy metrics and modular metrics and deduce certain new fixed point results in triangular fuzzy metric spaces. Moreover, some examples are provided to illustrate the usability of the obtained results. PMID:25003157

  16. Random Kähler metrics

    NASA Astrophysics Data System (ADS)

    Ferrari, Frank; Klevtsov, Semyon; Zelditch, Steve

    2013-04-01

    The purpose of this article is to propose a new method to define and calculate path integrals over metrics on a Kähler manifold. The main idea is to use finite dimensional spaces of Bergman metrics, as an approximation to the full space of Kähler metrics. We use the theory of large deviations to decide when a sequence of probability measures on the spaces of Bergman metrics tends to a limit measure on the space of all Kähler metrics. Several examples are considered.

  17. Optical metrics and projective equivalence

    SciTech Connect

    Casey, Stephen; Dunajski, Maciej; Gibbons, Gary; Warnick, Claude

    2011-04-15

    Trajectories of light rays in a static spacetime are described by unparametrized geodesics of the Riemannian optical metric associated with the Lorentzian spacetime metric. We investigate the uniqueness of this structure and demonstrate that two different observers, moving relative to one another, who both see the Universe as static may determine the geometry of the light rays differently. More specifically, we classify Lorentzian metrics admitting more than one hyper-surface orthogonal timelike Killing vector and analyze the projective equivalence of the resulting optical metrics. These metrics are shown to be projectively equivalent up to diffeomorphism if the static Killing vectors generate a group SL(2,R), but not projectively equivalent in general. We also consider the cosmological C metrics in Einstein-Maxwell theory and demonstrate that optical metrics corresponding to different values of the cosmological constant are projectively equivalent.

  18. Comparing Resource Adequacy Metrics

    SciTech Connect

    Ibanez, Eduardo; Milligan, Michael

    2014-11-13

    As the penetration of variable generation (wind and solar) increases around the world, there is an accompanying growing interest and importance in accurately assessing the contribution that these resources can make toward planning reserve. This contribution, also known as the capacity credit or capacity value of the resource, is best quantified by using a probabilistic measure of overall resource adequacy. In recognizing the variable nature of these renewable resources, there has been interest in exploring the use of reliability metrics other than loss of load expectation. In this paper, we undertake some comparisons using data from the Western Electricity Coordinating Council in the western United States.
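
    A minimal Monte Carlo sketch of loss-of-load expectation (LOLE), the baseline adequacy metric the paper compares against. The generator fleet, forced outage rates, and load shape are illustrative assumptions, not data from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    cap = np.array([400.0, 400.0, 300.0, 300.0, 200.0])   # unit capacities (MW)
    efor = np.array([0.05, 0.05, 0.08, 0.08, 0.10])       # forced outage rates
    hours = 8760
    # toy annual load shape peaking at 1150 MW
    load = 900.0 + 250.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, hours))

    trials = 200
    shortfall_hours = 0
    for _ in range(trials):
        # sample hourly unit availability, then sum available capacity
        up = rng.random((hours, cap.size)) > efor
        available = up.astype(float) @ cap
        shortfall_hours += int(np.sum(available < load))

    lole = shortfall_hours / trials                        # expected hours/year short
    print(f"LOLE ~ {lole:.1f} hours/year")
    ```

    Adding a variable resource to `cap` requires replacing its fixed capacity with an hourly profile, which is exactly why capacity-credit calculations for wind and solar lean on probabilistic metrics like this rather than nameplate ratings.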

  19. SI (Metric) handbook

    NASA Technical Reports Server (NTRS)

    Artusa, Elisa A.

    1994-01-01

    This guide provides information for an understanding of SI units, symbols, and prefixes; style and usage in documentation in both the US and in the international business community; conversion techniques; limits, fits, and tolerance data; and drawing and technical writing guidelines. Also provided is information on SI usage for specialized applications such as data processing and computer programming, science, engineering, and construction. Related information in the appendixes includes legislative documents, historical and biographical data, a list of metric documentation, rules for determining significant digits and rounding, conversion factors, shorthand notation, and a unit index.

  20. Degenerate metric phase boundaries

    NASA Astrophysics Data System (ADS)

    Bengtsson, I.; Jacobson, T.

    1997-11-01

    The structure of boundaries between degenerate and non-degenerate solutions of Ashtekar's canonical reformulation of Einstein's equations is studied. Several examples are given of such `phase boundaries' in which the metric is degenerate on one side of a null hypersurface and non-degenerate on the other side. These include portions of flat space, Schwarzschild and plane-wave solutions joined to degenerate regions. In the last case, the wave collides with a planar phase boundary and continues on with the same curvature but degenerate triad, while the phase boundary continues in the opposite direction. We conjecture that degenerate phase boundaries are always null.

  1. Distance Metric Tracking

    DTIC Science & Technology

    2016-03-02

    520, 2004. [12] E.C. Hall and R.M. Willett. Online convex optimization in dynamic environments. Selected Topics in Signal Processing, IEEE Journal...Conference on Machine Learning, pages 1160–1167. ACM, 2008. [25] Eric P. Xing, Michael I. Jordan, Stuart Russell, and Andrew Y. Ng. Distance metric...where B_ψ is any Bregman divergence and η_t is the learning rate parameter. From (Hall & Willett, 2015) we have: Theorem 1. G_ℓ = max_{θ ∈ Θ, ℓ ∈ L} ‖∇f(θ)‖, φ_max = 1

  2. Metrics for Multiagent Systems

    NASA Astrophysics Data System (ADS)

    Lass, Robert N.; Sultanik, Evan A.; Regli, William C.

    A Multiagent System (MAS) is a software paradigm for building large scale intelligent distributed systems. Increasingly these systems are being deployed on handheld computing devices that rely on non-traditional communications mediums such as mobile ad hoc networks and satellite links. These systems present new challenges for computer scientists in describing system performance and analyzing competing systems. This chapter surveys existing metrics that can be used to describe MASs and related components. A framework for analyzing MASs is provided and an example of how this framework might be employed is given for the domain of distributed constraint reasoning.

  3. SI (Metric) handbook

    NASA Astrophysics Data System (ADS)

    Artusa, Elisa A.

    1994-03-01

    This guide provides information for an understanding of SI units, symbols, and prefixes; style and usage in documentation in both the US and in the international business community; conversion techniques; limits, fits, and tolerance data; and drawing and technical writing guidelines. Also provided is information of SI usage for specialized applications like data processing and computer programming, science, engineering, and construction. Related information in the appendixes include legislative documents, historical and biographical data, a list of metric documentation, rules for determining significant digits and rounding, conversion factors, shorthand notation, and a unit index.

  4. Validating Software Metrics

    DTIC Science & Technology

    1990-09-30

    validation test: Spearman Rank Correlation and Wald-Wolfowitz Runs Test (test for randomness) [5,8]. For example, if a complexity metric is claimed to be... error count (E). Validity Criteria: Select values of V, B, A, and P. The values of V, B, A, and P used in the example are .7, .7, 20%, and 80...Procedures with no errors: Average rank of first group = 85.2419 based on 31 values. Average rank of second group = 45.5 based on 81 values. Large sample test

  5. Sustainable chemistry metrics.

    PubMed

    Calvo-Flores, Francisco García

    2009-01-01

    Green chemistry has developed mathematical parameters to describe the sustainability of chemical reactions and processes, in order to quantify their environmental impact. These parameters are related to mass and energy magnitudes, and enable analyses and numerical diagnoses of chemical reactions. The environmental impact factor (E factor), atom economy, and reaction mass efficiency have been the most influential metrics, and they are interconnected by mathematical equations. The ecodesign concept must also be considered for complex industrial syntheses, as a part of the sustainability of manufacturing processes. The aim of this Concept article is to identify the main parameters for evaluating undesirable environmental consequences.
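
    The metrics named above reduce to simple mass-balance formulas. As a sketch using the standard definitions (atom economy as the percentage of reactant mass incorporated into the product, E factor as mass of waste per mass of product), with invented numbers:

```python
def atom_economy(product_mw, reactant_mws):
    """Atom economy (%): fraction of reactant mass ending up in the product."""
    return 100.0 * product_mw / sum(reactant_mws)

def e_factor(total_waste_mass, product_mass):
    """Environmental impact factor: kg of waste per kg of product."""
    return total_waste_mass / product_mass

# Invented illustration: a product of MW 60 formed from reactants of MW 46 and 32.
print(round(atom_economy(60.0, [46.0, 32.0]), 1))  # -> 76.9
print(e_factor(5.0, 1.0))                          # -> 5.0
```

    A low E factor and a high atom economy both indicate a less wasteful process, which is how the two metrics interconnect mathematically.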

  6. Early Warning Look Ahead Metrics: The Percent Milestone Backlog Metric

    NASA Technical Reports Server (NTRS)

    Shinn, Stephen A.; Anderson, Timothy P.

    2017-01-01

    All complex development projects experience delays and corresponding backlogs of their project control milestones during their acquisition lifecycles. NASA Goddard Space Flight Center (GSFC) Flight Projects Directorate (FPD) teamed with The Aerospace Corporation (Aerospace) to develop a collection of Early Warning Look Ahead metrics that would provide GSFC leadership with some independent indication of the programmatic health of GSFC flight projects. As part of the collection of Early Warning Look Ahead metrics, the Percent Milestone Backlog metric is particularly revealing, and has utility as a stand-alone execution performance monitoring tool. This paper describes the purpose, development methodology, and utility of the Percent Milestone Backlog metric. The other four Early Warning Look Ahead metrics are also briefly discussed. Finally, an example of the use of the Percent Milestone Backlog metric in providing actionable insight is described, along with examples of its potential use in other commodities.

  7. Enhanced Data Representation by Kernel Metric Learning for Dementia Diagnosis.

    PubMed

    Cárdenas-Peña, David; Collazos-Huertas, Diego; Castellanos-Dominguez, German

    2017-01-01

    Alzheimer's disease (AD) is the form of dementia that affects the most people around the world. Therefore, early identification supporting effective treatment is required to increase the quality of life of a large number of patients. Recently, computer-aided diagnosis tools for dementia using Magnetic Resonance Imaging scans have been successfully proposed to discriminate between patients with AD, mild cognitive impairment, and healthy controls. Much of the attention has been given to the clinical data provided by initiatives such as the ADNI, supporting reliable research on intervention, prevention, and treatment of AD. Therefore, there is a need to improve the performance of classification machines. In this paper, we propose a kernel framework for learning metrics that enhances conventional machines and supports the diagnosis of dementia. Our framework aims at building discriminative spaces through the maximization of a centered kernel alignment function, improving the discrimination of the three considered neurological classes. The proposed metric learning performance is evaluated on the widely known ADNI database using three supervised classification machines (k-nn, SVM and NNs) for multi-class and bi-class scenarios from structural MRIs. Specifically, from the ADNI collection, 286 AD patients, 379 MCI patients and 231 healthy controls are used for development and validation of our proposed metric learning framework. For the experimental validation, we split the data into two subsets: 30% of subjects used as a blindfolded assessment and 70% employed for parameter tuning. Then, in the preprocessing stage, a total of 310 morphological measurements are automatically extracted from each structural MRI scan by the FreeSurfer software package and concatenated to build an input feature matrix. The obtained test performance results show that including supervised metric learning improves the compared baseline classifiers in both scenarios. In the multi-class scenario
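
    The kernel alignment at the heart of such a framework can be computed in a few lines. The sketch below is a generic implementation of the centered alignment between two kernel matrices, not the authors' code:

```python
import numpy as np

def centered_kernel_alignment(K, L):
    """Alignment between two n x n kernel matrices after centering.

    Returns a value in [-1, 1]; 1 means the centered kernels are
    proportional, i.e. the two similarity structures agree perfectly.
    """
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    Kc, Lc = H @ K @ H, H @ L @ H
    return float(np.sum(Kc * Lc) / (np.linalg.norm(Kc) * np.linalg.norm(Lc)))
```

    A metric learning procedure would then search for a parameterized kernel that maximizes this alignment against an ideal kernel derived from the class labels.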

  8. Exploring Metric Symmetry

    SciTech Connect

    Zwart, P.H.; Grosse-Kunstleve, R.W.; Adams, P.D.

    2006-07-31

    Relatively minor perturbations to a crystal structure can in some cases result in apparently large changes in symmetry. Changes in space group or even lattice can be induced by heavy metal or halide soaking (Dauter et al, 2001), flash freezing (Skrzypczak-Jankun et al, 1996), and Se-Met substitution (Poulsen et al, 2001). Relations between various space groups and lattices can provide insight into the underlying structural causes for the symmetry or lattice transformations. Furthermore, these relations can be useful in understanding twinning and how to efficiently solve two different but related crystal structures. Although (pseudo) symmetric properties of a certain combination of unit cell parameters and a space group are immediately obvious (such as a pseudo four-fold axis if a is approximately equal to b in an orthorhombic space group), other relations (e.g. Lehtio, et al, 2005) that are less obvious might be crucial to the understanding and detection of certain idiosyncrasies of experimental data. We have developed a set of tools that allows straightforward exploration of possible metric symmetry relations given unit cell parameters and a space group. The new iotbx.explore_metric_symmetry command produces an overview of the various relations between several possible point groups for a given lattice. Methods for finding relations between a pair of unit cells are also available. The tools described in this newsletter are part of the CCTBX libraries, which are included in the latest (versions July 2006 and up) PHENIX and CCI Apps distributions.

  9. Handbook of aircraft noise metrics

    NASA Technical Reports Server (NTRS)

    Bennett, R. L.; Pearsons, K. S.

    1981-01-01

    Information is presented on 22 noise metrics that are associated with the measurement and prediction of the effects of aircraft noise. Some of the instantaneous frequency weighted sound level measures, such as A-weighted sound level, are used to provide multiple assessment of the aircraft noise level. Other multiple event metrics, such as day-night average sound level, were designed to relate sound levels measured over a period of time to subjective responses in an effort to determine compatible land uses and aid in community planning. The various measures are divided into: (1) instantaneous sound level metrics; (2) duration corrected single event metrics; (3) multiple event metrics; and (4) speech communication metrics. The scope of each measure is examined in terms of its: definition, purpose, background, relationship to other measures, calculation method, example, equipment, references, and standards.
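
    As an illustration of one of the multiple event metrics, day-night average sound level can be computed from hourly equivalent levels. The sketch below assumes 24 hourly Leq values in dB and the conventional 10 dB penalty for the 22:00-07:00 night period:

```python
import numpy as np

def day_night_level(hourly_leq):
    """Day-night average sound level (DNL, dB) from 24 hourly Leq values.

    A 10 dB penalty is added to the nighttime hours (22:00-07:00) before
    energy-averaging, following the usual definition of the metric.
    """
    leq = np.asarray(hourly_leq, dtype=float)
    penalty = np.array([10.0 if (h >= 22 or h < 7) else 0.0 for h in range(24)])
    return 10.0 * np.log10(np.mean(10.0 ** ((leq + penalty) / 10.0)))

print(round(day_night_level([60.0] * 24), 1))  # constant 60 dB day -> 66.4
```

    The averaging happens on an energy (10^(L/10)) basis, which is why a constant 60 dB day yields a DNL above 60 dB once the nighttime penalty is applied.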

  10. Ligand efficiency metrics considered harmful

    NASA Astrophysics Data System (ADS)

    Kenny, Peter W.; Leitão, Andrei; Montanari, Carlos A.

    2014-07-01

    Ligand efficiency metrics are used in drug discovery to normalize biological activity or affinity with respect to physicochemical properties such as lipophilicity and molecular size. This Perspective provides an overview of ligand efficiency metrics and summarizes thermodynamics of protein-ligand binding. Different classes of ligand efficiency metric are critically examined and the study concludes with suggestions for alternative ways to account for physicochemical properties when prioritizing and optimizing leads.
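
    Two of the most common ligand efficiency metrics can be written down directly. The sketch below uses standard textbook definitions (LE as approximate binding free energy per heavy atom, LipE as pIC50 minus logP) with invented input values; it is not taken from this Perspective:

```python
def ligand_efficiency(pic50, n_heavy_atoms):
    """LE in kcal/mol per heavy atom, using -RT ln(10) ~ 1.37 kcal/mol near 300 K."""
    return 1.37 * pic50 / n_heavy_atoms

def lipophilic_efficiency(pic50, logp):
    """LipE (also called LLE): potency normalized for lipophilicity."""
    return pic50 - logp

# Invented example compound: pIC50 = 8.0, 20 heavy atoms, logP = 3.0.
print(round(ligand_efficiency(8.0, 20), 3))   # -> 0.548
print(lipophilic_efficiency(8.0, 3.0))        # -> 5.0
```

    The critique in the Perspective concerns exactly this kind of normalization: dividing or subtracting a property implicitly assumes a particular functional relationship between activity and size or lipophilicity.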

  11. Do-It-Yourself Metrics

    ERIC Educational Resources Information Center

    Klubeck, Martin; Langthorne, Michael; Padgett, Don

    2006-01-01

    Something new is on the horizon, and depending on one's role on campus, it might be storm clouds or a cleansing shower. Either way, no matter how hard one tries to avoid it, sooner rather than later he/she will have to deal with metrics. Metrics do not have to cause fear and resistance. Metrics can, and should, be a powerful tool for improvement.…

  12. The metric system: An introduction

    SciTech Connect

    Lumley, S.M.

    1995-05-01

    On July 13, 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on July 25, 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first we examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  13. The metric system: An introduction

    NASA Astrophysics Data System (ADS)

    Lumley, Susan M.

    On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first we examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  14. A comparative study of the SVM and K-nn machine learning algorithms for the diagnosis of respiratory pathologies using pulmonary acoustic signals.

    PubMed

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-06-27

    Pulmonary acoustic parameters extracted from recorded respiratory sounds provide valuable information for the detection of respiratory pathologies. The automated analysis of pulmonary acoustic signals can serve as a differential diagnosis tool for medical professionals, a learning tool for medical students, and a self-management tool for patients. In this context, we intend to evaluate and compare the performance of the support vector machine (SVM) and K-nearest neighbour (K-nn) classifiers in diagnosing respiratory pathologies using respiratory sounds from the R.A.L.E database. The pulmonary acoustic signals used in this study were obtained from the R.A.L.E lung sound database. The pulmonary acoustic signals were manually categorised into three different groups, namely normal, airway obstruction pathology, and parenchymal pathology. The mel-frequency cepstral coefficient (MFCC) features were extracted from the pre-processed pulmonary acoustic signals. The MFCC features were analysed by one-way ANOVA and then fed separately into the SVM and K-nn classifiers. The performances of the classifiers were analysed using the confusion matrix technique. The statistical analysis of the MFCC features using one-way ANOVA showed that the extracted MFCC features are significantly different (p < 0.001). The classification accuracies of the SVM and K-nn classifiers were found to be 92.19% and 98.26%, respectively. Although the data used to train and test the classifiers are limited, the classification accuracies found are satisfactory. The K-nn classifier was better than the SVM classifier for the discrimination of pulmonary acoustic signals from pathological and normal subjects obtained from the R.A.L.E database.
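
    A comparison of this kind can be sketched with scikit-learn. The snippet below substitutes synthetic 13-dimensional feature vectors for the MFCC features and R.A.L.E recordings, which are not reproduced here; it only illustrates the evaluation pattern, not the reported accuracies:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for MFCC feature vectors with three classes
# (normal, airway obstruction, parenchymal).
X, y = make_classification(n_samples=300, n_features=13, n_informative=8,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

scores = {}
for name, clf in [("SVM", make_pipeline(StandardScaler(), SVC())),
                  ("K-nn", make_pipeline(StandardScaler(),
                                         KNeighborsClassifier(n_neighbors=5)))]:
    clf.fit(X_tr, y_tr)
    scores[name] = clf.score(X_te, y_te)
    print(name, round(scores[name], 3))
```

    In practice the confusion matrix (sklearn.metrics.confusion_matrix) would also be inspected per class, as the study does.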

  15. A comparative study of the svm and k-nn machine learning algorithms for the diagnosis of respiratory pathologies using pulmonary acoustic signals

    PubMed Central

    2014-01-01

    Background Pulmonary acoustic parameters extracted from recorded respiratory sounds provide valuable information for the detection of respiratory pathologies. The automated analysis of pulmonary acoustic signals can serve as a differential diagnosis tool for medical professionals, a learning tool for medical students, and a self-management tool for patients. In this context, we intend to evaluate and compare the performance of the support vector machine (SVM) and K-nearest neighbour (K-nn) classifiers in diagnosing respiratory pathologies using respiratory sounds from the R.A.L.E database. Results The pulmonary acoustic signals used in this study were obtained from the R.A.L.E lung sound database. The pulmonary acoustic signals were manually categorised into three different groups, namely normal, airway obstruction pathology, and parenchymal pathology. The mel-frequency cepstral coefficient (MFCC) features were extracted from the pre-processed pulmonary acoustic signals. The MFCC features were analysed by one-way ANOVA and then fed separately into the SVM and K-nn classifiers. The performances of the classifiers were analysed using the confusion matrix technique. The statistical analysis of the MFCC features using one-way ANOVA showed that the extracted MFCC features are significantly different (p < 0.001). The classification accuracies of the SVM and K-nn classifiers were found to be 92.19% and 98.26%, respectively. Conclusion Although the data used to train and test the classifiers are limited, the classification accuracies found are satisfactory. The K-nn classifier was better than the SVM classifier for the discrimination of pulmonary acoustic signals from pathological and normal subjects obtained from the R.A.L.E database. PMID:24970564

  16. Image Labeling for LIDAR Intensity Image Using K-Nn of Feature Obtained by Convolutional Neural Network

    NASA Astrophysics Data System (ADS)

    Umemura, Masaki; Hotta, Kazuhiro; Nonaka, Hideki; Oda, Kazuo

    2016-06-01

    We propose an image labeling method for LIDAR intensity images obtained by a Mobile Mapping System (MMS), using K-nearest neighbor (KNN) matching of features obtained by a Convolutional Neural Network (CNN). Image labeling assigns labels (e.g., road, cross-walk and road shoulder) to semantic regions in an image. Since CNNs are effective for various image recognition tasks, we use features from a CNN (Caffenet) pre-trained on ImageNet. We use the 4,096-dimensional feature at the fc7 layer of the Caffenet as the descriptor of a region because this feature carries effective information for object classification. We extract the feature with the Caffenet from regions cropped from images. Since the similarity between features reflects the similarity of the contents of regions, we can select the top K regions from the training samples most similar to a test region. Since regions in training images have manually-annotated ground truth labels, we vote the labels attached to the top K similar regions for the test region. The class label with the maximum vote is assigned to each pixel in the test image. In experiments, we use 36 LIDAR intensity images with ground truth labels. We divide the 36 images into training (28 images) and test (8 images) sets. We use class average accuracy and pixel-wise accuracy as evaluation measures. Our method was able to assign the same label as human beings in 97.8% of the pixels in test LIDAR intensity images.
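
    The labeling step can be sketched as follows, with random 64-dimensional vectors standing in for the 4,096-dimensional fc7 descriptors and class ids 0-2 standing in for the semantic labels; KNeighborsClassifier implements exactly the top-K majority vote described above:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical sketch of the voting stage: descriptors of cropped regions
# (fc7 features in the paper, random 64-d vectors here) are matched to
# training regions by K-NN, and the top-K labels are voted.
rng = np.random.default_rng(0)
train_feats = rng.normal(size=(200, 64))
train_labels = rng.integers(0, 3, size=200)   # 0-2: road / cross-walk / shoulder

knn = KNeighborsClassifier(n_neighbors=5)     # majority vote over top-5 regions
knn.fit(train_feats, train_labels)
pred = knn.predict(rng.normal(size=(10, 64)))
print(pred)                                   # one semantic label per test region
```

    In the full method the winning label is then propagated to every pixel of the corresponding region in the test image.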

  17. Implementing the Metric System in Business Occupations. Metric Implementation Guide.

    ERIC Educational Resources Information Center

    Retzer, Kenneth A.; And Others

    Addressed to the business education teacher, this guide is intended to provide appropriate information, viewpoints, and attitudes regarding the metric system and to make suggestions regarding presentation of the material in the classroom. An introductory section on teaching suggestions emphasizes the need for a "think metric" approach made up of…

  18. Implementing the Metric System in Industrial Occupations. Metric Implementation Guide.

    ERIC Educational Resources Information Center

    Retzer, Kenneth A.

    Addressed to the industrial education teacher, this guide is intended to provide appropriate information, viewpoints, and attitudes regarding the metric system and to make suggestions regarding presentation of the material in the classroom. An introductory section on teaching suggestions emphasizes the need for a "think metric" approach made up of…

  19. Implementing the Metric System in Health Occupations. Metric Implementation Guide.

    ERIC Educational Resources Information Center

    Banks, Wilson P.; And Others

    Addressed to the health occupations education teacher, this guide is intended to provide appropriate information, viewpoints, and attitudes regarding the metric system and to make suggestions regarding presentation of the material in the classroom. An introductory section on teaching suggestions emphasizes the need for a "think metric" approach…

  20. Implementing the Metric System in Agricultural Occupations. Metric Implementation Guide.

    ERIC Educational Resources Information Center

    Gilmore, Hal M.; And Others

    Addressed to the agricultural education teacher, this guide is intended to provide appropriate information, viewpoints, and attitudes regarding the metric system and to make suggestions regarding presentation of the material in the classroom. An introductory section on teaching suggestions emphasizes the need for a "think metric" approach made up…

  1. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    NASA Technical Reports Server (NTRS)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  2. Cancer therapy prognosis using quantitative ultrasound spectroscopy and a kernel-based metric

    NASA Astrophysics Data System (ADS)

    Gangeh, Mehrdad J.; Hashim, Amr; Giles, Anoja; Czarnota, Gregory J.

    2014-03-01

    In this study, a kernel-based metric based on the Hilbert-Schmidt independence criterion (HSIC) is proposed in a computer-aided-prognosis system to monitor cancer therapy effects. In order to induce tumour cell death, sarcoma xenograft tumour-bearing mice were injected with microbubbles followed by ultrasound and X-ray radiation therapy successively as a new anti-vascular treatment. High frequency (central frequency 30 MHz) ultrasound imaging was performed before and at different times after treatment, and using spectroscopy, quantitative ultrasound (QUS) parametric maps were derived from the radiofrequency (RF) signals. The intensity histogram of midband fit parametric maps was computed to represent the pre- and post-treatment images. Subsequently, the HSIC-based metric between pre- and post-treatment samples was computed for each animal as a measure of distance between the two distributions. The HSIC-based metric computes the distance between two distributions in a reproducing kernel Hilbert space (RKHS), meaning that by using a kernel, the input vectors are non-linearly mapped into a different, possibly high dimensional feature space. By computing the population means in this new space, enhanced group separability (compared to, e.g., Euclidean distance in the original feature space) is ideally obtained. The pre- and post-treatment parametric maps for each animal were thus represented by a dissimilarity measure, in which a high value of this metric indicated more treatment effect on the animal. It was shown in this research that this metric has a high correlation with cell death, and when it was used in supervised learning, high classification accuracy was obtained using a k-nearest-neighbor (k-NN) classifier.
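
    A biased empirical HSIC estimate between two kernel matrices is straightforward to compute. The sketch below uses a generic RBF kernel and synthetic data rather than the QUS parametric maps of the study:

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian (RBF) kernel matrix for row-vector samples X."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(X**2, axis=1)[None, :] - 2 * X @ X.T
    return np.exp(-sq / (2.0 * sigma**2))

def hsic(K, L):
    """Biased empirical HSIC between two n x n kernel matrices."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 1))
# Dependent samples give a larger HSIC than independent ones.
print(hsic(rbf_kernel(x), rbf_kernel(x + 0.1 * rng.normal(size=(100, 1)))))
print(hsic(rbf_kernel(x), rbf_kernel(rng.normal(size=(100, 1)))))
```

    In the study, a larger distance between pre- and post-treatment distributions (a higher metric value) indicates a stronger treatment effect.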

  3. Fast clustering algorithm for large ECG data sets based on CS theory in combination with PCA and K-NN methods.

    PubMed

    Balouchestani, Mohammadreza; Krishnan, Sridhar

    2014-01-01

    Long-term recording of Electrocardiogram (ECG) signals plays an important role in health care systems for the diagnosis and treatment of heart diseases. Clustering and classification of the collected data are essential for detecting concealed information of P-QRS-T waves in long-term ECG recordings. Currently used algorithms have their share of drawbacks: 1) clustering and classification cannot be done in real time; 2) they suffer from huge energy consumption and a heavy sampling load. These drawbacks motivated us to develop a novel optimized clustering algorithm that can easily scan large ECG data sets to establish low-power long-term ECG recording. In this paper, we present an advanced K-means clustering algorithm based on Compressed Sensing (CS) theory as a random sampling procedure. Then, two dimensionality reduction methods, Principal Component Analysis (PCA) and Linear Correlation Coefficient (LCC), followed by classification using the K-Nearest Neighbours (K-NN) and Probabilistic Neural Network (PNN) classifiers, are applied to the proposed algorithm. We show that our algorithm based on PCA features in combination with the K-NN classifier performs better than the other methods. The proposed algorithm outperforms existing algorithms by increasing classification accuracy by 11%. In addition, the proposed algorithm achieves classification accuracies for the K-NN and PNN classifiers, and a Receiver Operating Characteristic (ROC) area, of 99.98%, 99.83%, and 99.75%, respectively.
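
    The PCA-plus-K-NN stage can be sketched with scikit-learn on synthetic waveforms standing in for segmented ECG beats; the Compressed Sensing sampling and the LCC/PNN variants of the paper are not reproduced:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for segmented ECG beats: two classes of noisy waveforms.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 128)
class0 = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.normal(size=(100, 128))
class1 = np.sin(2 * np.pi * 8 * t) + 0.3 * rng.normal(size=(100, 128))
X = np.vstack([class0, class1])
y = np.array([0] * 100 + [1] * 100)

# PCA for dimensionality reduction, then a K-NN classifier.
pipe = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=5))
acc = cross_val_score(pipe, X, y, cv=5).mean()
print(round(acc, 3))
```

    Reducing 128 samples per beat to 10 principal components keeps the class structure while sharply cutting the cost of each nearest-neighbour distance computation, which is the point of the combination.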

  4. The implicit learning of metrical and non-metrical rhythms in blind and sighted adults.

    PubMed

    Carrara-Augustenborg, Claudia; Schultz, Benjamin G

    2017-09-15

    Forming temporal expectancies plays a crucial role in our survival as it allows us to identify the occurrence of temporal deviants that might signal potential dangers. The dynamic attending theory suggests that temporal expectancies are formed more readily for rhythms that imply a beat (i.e., metrical rhythms) compared to those that do not (i.e., nonmetrical rhythms). Moreover, metrical frameworks can be used to detect temporal deviants. Although several studies have demonstrated that congenital or early blindness correlates with modality-specific neural changes that reflect compensatory mechanisms, few have examined whether blind individuals show a learning advantage for auditory rhythms and whether learning can occur unintentionally and without awareness, that is, implicitly. We compared blind to sighted controls in their ability to implicitly learn metrical and nonmetrical auditory rhythms. We reasoned that the loss of sight in blindness might lead to improved sensitivity to rhythms and predicted that the blind learn rhythms more readily than the sighted. We further hypothesized that metrical rhythms are learned more readily than nonmetrical rhythms. Results partially confirmed our predictions; the blind group learned nonmetrical rhythms more readily than the sighted group but the blind group learned metrical rhythms less readily than the sighted group. Only the sighted group learned metrical rhythms more readily than nonmetrical rhythms. The blind group demonstrated awareness of the nonmetrical rhythms while learning was implicit for all other conditions. Findings suggest that improved deviant-sensitivity might have provided the blind group a learning advantage for nonmetrical rhythms. Future research could explore the plastic changes that affect deviance-detection and stimulus-specific adaptation in blindness.

  5. Toward a simplified perceptual quality metric for watermarking applications

    NASA Astrophysics Data System (ADS)

    Carosi, Maurizio; Pankajakshan, Vinod; Autrusseau, Florent

    2010-01-01

    This work is motivated by the limitations of statistical quality metrics in assessing the quality of images distorted in distinct frequency ranges. Common quality metrics, which have basically been designed and tested for various kinds of global distortions such as image coding, may not be efficient for watermarking applications, where the distortions might be restricted to a very narrow portion of the frequency spectrum. We therefore propose an objective quality metric whose performance does not depend on the distortion frequency range, yet which remains simple in contrast to the complex Human Visual System (HVS) based quality metrics recently made available. The proposed algorithm is generic (not designed for a particular distortion) and exploits the contrast sensitivity function (CSF) along with an adapted Minkowski error pooling. The results show a high correlation between the proposed objective metric and the mean opinion score (MOS) given by observers. A comparison with relevant existing objective quality metrics is provided.
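
    Minkowski error pooling collapses a per-pixel error map into a single score; larger exponents weight the worst errors more heavily. A minimal sketch, independent of any particular CSF weighting:

```python
import numpy as np

def minkowski_pool(error_map, p=4.0):
    """Pool a per-pixel perceptual error map into one quality score.

    As p grows, the pooled value approaches the maximum error, modelling
    the visual dominance of the most visible artifacts.
    """
    e = np.abs(np.asarray(error_map, dtype=float))
    return float(np.mean(e ** p) ** (1.0 / p))

print(minkowski_pool(np.full((4, 4), 2.0)))   # uniform map pools to its value
```

    In a metric of the kind described here, the error map would first be weighted by the CSF so that errors in frequency bands the eye is sensitive to contribute more.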

  6. Multimetric indices: How many metrics?

    EPA Science Inventory

    Multimetric indices (MMI’s) often include 5 to 15 metrics, each representing a different attribute of assemblage condition, such as species diversity, tolerant taxa, and nonnative taxa. Is there an optimal number of metrics for MMIs? To explore this question, I created 1000 9-met...

  7. Metrical Phonology: German Sound System.

    ERIC Educational Resources Information Center

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English and German languages. The objective is to promote use of metrical phonology as a tool for enhancing instruction in stress patterns in words and sentences, particularly in…

  8. Metrics for Soft Goods Merchandising.

    ERIC Educational Resources Information Center

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of students interested in soft goods merchandising, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…

  9. Metrics for Hard Goods Merchandising.

    ERIC Educational Resources Information Center

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of students interested in hard goods merchandising, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…

  10. Metrication: A Guide for Consumers.

    ERIC Educational Resources Information Center

    Consumer and Corporate Affairs Dept., Ottawa (Ontario).

    The widespread use of the metric system by most of the major industrial powers of the world has prompted the Canadian government to investigate and consider use of the system. This booklet was developed to aid the consuming public in Canada in gaining some knowledge of metrication and how its application would affect their present economy.…

  11. Metrics--Libraries and Librarians

    ERIC Educational Resources Information Center

    Hall, Vivian S.; Anderson, Gregg

    1975-01-01

    The 1975 librarian must determine whether to begin collecting materials on the International System of Measurements (metric system). Librarians are urged to learn and use the metric system, provide displays, and collect materials to better serve their patrons. Bibliography. (Author/LS)

  12. Inching toward the Metric System.

    ERIC Educational Resources Information Center

    Moore, Randy

    1989-01-01

    Provides an overview and description of the metric system. Discusses the evolution of measurement systems and their early cultures, the beginnings of metric measurement, the history of measurement systems in the United States, the International System of Units, its general style and usage, and supplementary units. (RT)

  13. Numerical Calabi-Yau metrics

    NASA Astrophysics Data System (ADS)

    Douglas, Michael R.; Karp, Robert L.; Lukic, Sergio; Reinbacher, René

    2008-03-01

    We develop numerical methods for approximating Ricci flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one parameter family of quintics. We also suggest several ways to extend our results.

  14. How to Teach Metric Now.

    ERIC Educational Resources Information Center

    Worcester Public Schools, MA.

    This curriculum guide for grades K-6 was prepared to assist teachers and students in learning about the metric system. An introductory section presents a brief history of the metric system and the rationale for introducing it into the schools. Instructional objectives and suggested learning activities are presented for each grade level. The…

  15. Metric Activities, Grades K-6.

    ERIC Educational Resources Information Center

    Draper, Bob, Comp.

    This pamphlet presents worksheets for use in fifteen activities or groups of activities designed for teaching the metric system to children in grades K through 6. The approach taken in several of the activities is one of conversion between metric and English units. The majority of the activities concern length, area, volume, and capacity. A…

  16. What About Metric? Revised Edition.

    ERIC Educational Resources Information Center

    Barbrow, Louis E.

    Described are the advantages of using the metric system over the English system. The most common units of both systems are listed and compared. Pictures are used to exhibit use of the metric system in connection with giving prices or sizes of common items. Several examples provide computations of area, total weight of several objects, and volume;…

  17. Conversion to the Metric System

    ERIC Educational Resources Information Center

    Crunkilton, John C.; Lee, Jasper S.

    1974-01-01

    The authors discuss background information about the metric system and explore the effect of metrication of agriculture in areas such as equipment calibration, chemical measurement, and marketing of agricultural products. Suggestions are given for possible leadership roles and approaches that agricultural education might take in converting to the…

  18. Metric Supplement to Technical Drawing.

    ERIC Educational Resources Information Center

    Henschel, Mark

    This manual is intended for use in training persons whose vocations involve technical drawing to use the metric system of measurement. It could be used in a short course designed for that purpose or for individual study. The manual begins with a brief discussion of the rationale for conversion to the metric system. It then provides a…

  20. Metrication report to the Congress

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The major NASA metrication activity of 1988 concerned the Space Station. Although the metric system was the baseline measurement system for preliminary design studies, solicitations for final design and development of the Space Station Freedom requested use of the inch-pound system because of concerns with cost impact and potential safety hazards. Under that policy, however, use of the metric system would be permitted through waivers where its use was appropriate. Late in 1987, several Department of Defense decisions were made to increase commitment to the metric system, thereby broadening the potential base of metric involvement in U.S. industry. A re-evaluation of Space Station Freedom units-of-measure policy was, therefore, initiated in January 1988.

  1. Poisson disk sampling in geodesic metric for DEM simplification

    NASA Astrophysics Data System (ADS)

    Hou, Wenguang; Zhang, Xuming; Li, Xin; Lai, Xudong; Ding, Mingyue

    2013-08-01

    To generate highly compressed digital elevation models (DEMs) with fine details, a method of Poisson disk sampling in the geodesic metric is proposed. The main idea is to pick points uniformly from DEM nodes in the geodesic metric, resulting in terrain-adaptive samples in the Euclidean metric. The method randomly selects a point from the mesh nodes and then judges whether that point can be accepted according to its geodesic distances from the already sampled points. The whole process is repeated until no more points can be selected. To further adjust the sampling ratios in different areas, a weighted geodesic distance that reflects terrain characteristics is introduced. In addition to being adaptive, the sample distributions visualise well. The method is simple and easy to implement. Cases are provided to illustrate the feasibility and superiority of the proposed method.
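The accept/reject loop described above can be sketched in a few lines. In the sketch below (the grid and all parameter values are illustrative), plain Euclidean distance stands in for the paper's geodesic distance, which would require a shortest-path computation over the DEM mesh:

```python
import math
import random

def poisson_disk_sample(nodes, radius, max_tries=10000, seed=0):
    """Dart-throwing Poisson disk sampling over a fixed node set.

    A randomly chosen candidate node is accepted only if it lies at
    least `radius` away from every accepted sample. The paper measures
    this distance geodesically over the terrain; Euclidean distance is
    used here as a stand-in.
    """
    rng = random.Random(seed)
    samples = []
    tries = 0
    while tries < max_tries:
        cand = rng.choice(nodes)
        if all(math.dist(cand, s) >= radius for s in samples):
            samples.append(cand)
            tries = 0          # reset the failure counter on success
        else:
            tries += 1         # stop after max_tries consecutive rejections
    return samples

# Sample a 20x20 unit grid with a minimum spacing of 3 units.
grid = [(float(x), float(y)) for x in range(20) for y in range(20)]
picked = poisson_disk_sample(grid, radius=3.0)
```

Terrain adaptivity would come from replacing `math.dist` with a weighted geodesic distance, so that rugged areas (large geodesic distances) receive denser samples in Euclidean terms.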

  2. Separating metric perturbations in near-horizon extremal Kerr spacetimes

    NASA Astrophysics Data System (ADS)

    Chen, Baoyi; Stein, Leo C.

    2017-09-01

    Linear perturbation theory is a powerful toolkit for studying black hole spacetimes. However, the perturbation equations are hard to solve unless we can use separation of variables. In the Kerr spacetime, metric perturbations do not separate, but curvature perturbations do. The cost of curvature perturbations is a very complicated metric-reconstruction procedure. This procedure can be avoided using a symmetry-adapted choice of basis functions in highly symmetric spacetimes, such as near-horizon extremal Kerr. In this paper, we focus on this spacetime and (i) construct the symmetry-adapted basis functions; (ii) show their orthogonality; and (iii) show that they lead to separation of variables of the scalar, Maxwell, and metric perturbation equations. This separation turns the system of partial differential equations into one of ordinary differential equations over a compact domain, the polar angle.

  3. Weyl metrics and wormholes

    NASA Astrophysics Data System (ADS)

    Gibbons, Gary W.; Volkov, Mikhail S.

    2017-05-01

    We study solutions obtained via applying dualities and complexifications to the vacuum Weyl metrics generated by massive rods and by point masses. Rescaling them and extending to complex parameter values yields axially symmetric vacuum solutions containing singularities along circles that can be viewed as singular matter sources. These solutions have wormhole topology with several asymptotic regions interconnected by throats and their sources can be viewed as thin rings of negative tension encircling the throats. For a particular value of the ring tension the geometry becomes exactly flat although the topology remains non-trivial, so that the rings literally produce holes in flat space. To create a single ring wormhole of one metre radius one needs a negative energy equivalent to the mass of Jupiter. Further duality transformations dress the rings with the scalar field, either conventional or phantom. This gives rise to large classes of static, axially symmetric solutions, presumably including all previously known solutions for a gravity-coupled massless scalar field, as for example the spherically symmetric Bronnikov-Ellis wormholes with phantom scalar. The multi-wormholes contain infinite struts everywhere at the symmetry axes, apart from solutions with locally flat geometry.

  4. Quantum correlations for the metric

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2017-06-01

    We discuss the correlation function for the metric for homogeneous and isotropic cosmologies. The exact propagator equation determines the correlation function as the inverse of the second functional derivative of the quantum effective action. This formulation relates the metric correlation function employed in quantum gravity computations to cosmological observables such as the graviton power spectrum. In the Einstein-Hilbert approximation for the effective action, the on-shell graviton correlation function can be obtained equivalently from a product of mode functions which are solutions of the linearized Einstein equations. In contrast, the product of mode functions, often employed in the context of cosmology, does not yield the correlation function for the vector and scalar components of the metric fluctuations. We divide the metric fluctuations into "physical fluctuations," which couple to a conserved energy momentum tensor, and gauge fluctuations. On the subspace of physical metric fluctuations the relation to physical sources becomes invertible, such that the effective action and its relation to correlation functions no longer needs to involve a gauge fixing term. The physical metric fluctuations have a similar status as the Bardeen potentials, while being formulated in a covariant way. We compute the effective action for the physical metric fluctuations for geometries corresponding to realistic cosmologies.

  5. A Dynamic Testing Complexity Metric

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey

    1991-01-01

    This paper introduces a dynamic metric that is based on the estimated ability of a program to withstand the effects of injected "semantic mutants" during execution by computing the same function as if the semantic mutants had not been injected. Semantic mutants include: (1) syntactic mutants injected into an executing program and (2) randomly selected values injected into an executing program's internal states. The metric is a function of a program, the method used for injecting these two types of mutants, and the program's input distribution; this metric is found through dynamic executions of the program. A program's ability to withstand the effects of injected semantic mutants by computing the same function when executed is then used as a tool for predicting the difficulty that will be incurred during random testing to reveal the existence of faults, i.e., the metric suggests the likelihood that a program will expose the existence of faults during random testing assuming faults were to exist. If the metric is applied to a module rather than to a program, the metric can be used to guide the allocation of testing resources among a program's modules. In this manner the metric acts as a white-box testing tool for determining where to concentrate testing resources. Index Terms: Revealing ability, random testing, input distribution, program, fault, failure.
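A minimal sketch of the estimation idea follows. The toy `clamp_program`, its saturation step, and the ±1 offset are purely illustrative stand-ins for the paper's injected "semantic mutants"; the score is the fraction of executions in which the perturbation is masked:

```python
import random

def perturb_score(program, inputs, perturb, trials=200, seed=1):
    """Estimate how often an injected internal-state perturbation is
    masked, i.e. the program still computes its original output. A high
    score suggests random testing is unlikely to reveal a fault at that
    location (a crude stand-in for the dynamic metric described above)."""
    rng = random.Random(seed)
    masked = 0
    for _ in range(trials):
        x = rng.choice(inputs)
        if program(x, perturb=None) == program(x, perturb=perturb(rng)):
            masked += 1
    return masked / trials

# Toy program: an internal value is clamped, so saturation can mask the
# injected mutant (the +/-1 offset chosen by `perturb`).
def clamp_program(x, perturb=None):
    t = x + (perturb or 0)   # internal state, possibly perturbed
    return min(t, 10)        # clamping may hide the perturbation

score = perturb_score(clamp_program, inputs=[0, 5, 20],
                      perturb=lambda r: r.choice([-1, 1]))
```

For inputs already in the saturated region (here, 20) the perturbation never propagates to the output, so the score is strictly between 0 and 1: exactly the situation in which random testing would struggle to expose a fault at that location.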

  6. Variable metric conjugate gradient methods

    SciTech Connect

    Barth, T.; Manteuffel, T.

    1994-07-01

    1.1 Motivation. In this paper we present a framework that includes many well known iterative methods for the solution of nonsymmetric linear systems of equations, Ax = b. Section 2 begins with a brief review of the conjugate gradient method. Next, we describe a broader class of methods, known as projection methods, to which the conjugate gradient (CG) method and most conjugate gradient-like methods belong. The concept of a method having either a fixed or a variable metric is introduced. Methods that have a metric are referred to as either fixed or variable metric methods. Some relationships between projection methods and fixed (variable) metric methods are discussed. The main emphasis of the remainder of this paper is on variable metric methods. In Section 3 we show how the biconjugate gradient (BCG), and the quasi-minimal residual (QMR) methods fit into this framework as variable metric methods. By modifying the underlying Lanczos biorthogonalization process used in the implementation of BCG and QMR, we obtain other variable metric methods. These, we refer to as generalizations of BCG and QMR.
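For reference, the fixed-metric baseline that this framework generalizes, the classical conjugate gradient iteration for a symmetric positive-definite system, can be sketched as follows (a plain-Python illustration of CG itself, not of the BCG/QMR variable-metric variants the paper develops):

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Textbook CG for symmetric positive-definite A, solving A x = b.
    BCG and QMR replace the fixed inner product used here with a
    variable metric built from Lanczos biorthogonalization."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                       # residual r = b - A x, with x = 0
    p = r[:]                       # initial search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)       # exact after at most 2 iterations here
```

For a 2x2 system CG terminates in at most two steps, recovering the exact solution (1/11, 7/11) for this example.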

  7. A Metric for Heterotic Moduli

    NASA Astrophysics Data System (ADS)

    Candelas, Philip; de la Ossa, Xenia; McOrist, Jock

    2017-09-01

    Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in α′, in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Kähler, as is required by supersymmetry. Checking the metric is Kähler is intricate and the anomaly cancellation equation for the H field plays an essential role. The Kähler potential nevertheless takes a remarkably simple form: it is the Kähler potential of special geometry with the Kähler form replaced by the α′-corrected hermitian form.

  8. GPS Metric Tracking Unit

    NASA Technical Reports Server (NTRS)

    2008-01-01

    As Global Positioning Satellite (GPS) applications become more prevalent for land- and air-based vehicles, GPS applications for space vehicles will also increase. The Applied Technology Directorate of Kennedy Space Center (KSC) has developed a lightweight, low-cost GPS Metric Tracking Unit (GMTU), the first of two steps in developing a lightweight, low-cost Space-Based Tracking and Command Subsystem (STACS) designed to meet Range Safety's link margin and latency requirements for vehicle command and telemetry data. The goals of STACS are to improve Range Safety operations and expand tracking capabilities for space vehicles. STACS will track the vehicle, receive commands, and send telemetry data through the space-based asset, which will dramatically reduce dependence on ground-based assets. The other step was the Low-Cost Tracking and Data Relay Satellite System (TDRSS) Transceiver (LCT2), developed by the Wallops Flight Facility (WFF), which allows the vehicle to communicate with a geosynchronous relay satellite. Although the GMTU and LCT2 were independently implemented and tested, the design collaboration of KSC and WFF engineers allowed GMTU and LCT2 to be integrated into one enclosure, leading to the final STACS. In operation, GMTU needs only a radio frequency (RF) input from a GPS antenna and outputs position and velocity data to the vehicle through a serial or pulse code modulation (PCM) interface. GMTU includes one commercial GPS receiver board and a custom board, the Command and Telemetry Processor (CTP) developed by KSC. The CTP design is based on a field-programmable gate array (FPGA) with embedded processors to support GPS functions.

  9. Double metric, generalized metric, and α' -deformed double field theory

    NASA Astrophysics Data System (ADS)

    Hohm, Olaf; Zwiebach, Barton

    2016-03-01

    We relate the unconstrained "double metric" of the "α' -geometry" formulation of double field theory to the constrained generalized metric encoding the spacetime metric and b -field. This is achieved by integrating out auxiliary field components of the double metric in an iterative procedure that induces an infinite number of higher-derivative corrections. As an application, we prove that, to first order in α' and to all orders in fields, the deformed gauge transformations are Green-Schwarz-deformed diffeomorphisms. We also prove that to first order in α' the spacetime action encodes precisely the Green-Schwarz deformation with Chern-Simons forms based on the torsionless gravitational connection. This seems to be in tension with suggestions in the literature that T-duality requires a torsionful connection, but we explain that these assertions are ambiguous since actions that use different connections are related by field redefinitions.

  10. Daylight metrics and energy savings

    SciTech Connect

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics account neither for the temporal and spatial aspects of daylight nor for occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

  11. Truss Performance and Packaging Metrics

    NASA Technical Reports Server (NTRS)

    Mikulas, Martin M.; Collins, Timothy J.; Doggett, William; Dorsey, John; Watson, Judith

    2006-01-01

    In the present paper a set of performance metrics is derived from first principles to assess the efficiency of competing space truss structural concepts in terms of mass, stiffness, and strength, for designs that are constrained by packaging. The use of these performance metrics provides unique insight into the primary drivers for lowering structural mass and packaging volume as well as enabling quantitative concept performance evaluation and comparison. To demonstrate the use of these performance metrics, data for existing structural concepts are plotted and discussed. Structural performance data is presented for various mechanical deployable concepts, for erectable structures, and for rigidizable structures.

  12. A Sequential Development of Metric Measurement Activities for Middle School Students Using the Outdoor Environment.

    ERIC Educational Resources Information Center

    Thompson, Thomas E.; Morford, Catherine A.

    1980-01-01

    Using an adaptation of a measurement model suggested by the California State Department of Education, the following components are synthesized into the middle school curriculum: measurement, metric system, outdoor education, mathematics, and science. Seasonal outdoor metric measurement activities for grades five through eight are presented.…

  13. Measure for Measure: A Guide to Metrication for Workshop Crafts and Technical Studies.

    ERIC Educational Resources Information Center

    Schools Council, London (England).

    This booklet is designed to help teachers of the industrial arts in Great Britain during the changeover to metric units which is due to be substantially completed during the period 1970-1975. General suggestions are given for adapting equipment in metalwork and engineering and woodwork and technical drawing by adding some metric equipment…

  15. Using TRACI for Sustainability Metrics

    EPA Science Inventory

    TRACI, the Tool for the Reduction and Assessment of Chemical and other environmental Impacts, has been developed for sustainability metrics, life cycle impact assessment, and product and process design impact assessment for developing increasingly sustainable products, processes,...

  16. Let's Make Metric Ice Cream

    ERIC Educational Resources Information Center

    Zimmerman, Marianna

    1975-01-01

    Describes a classroom activity which involved sixth grade students in a learning situation including making ice cream, safety procedures in a science laboratory, calibrating a thermometer, using metric units of volume and mass. (EB)

  17. Sizing Up the Metric System.

    ERIC Educational Resources Information Center

    Sherman, Helene J.

    1997-01-01

    Presents estimation as a tool for learning observation and measurement relationships for the metric system. Activities include constructing a meter tape and using mystery boxes to practice volume estimation and measurement. (AIM)

  20. Using principal component analysis for selecting network behavioral anomaly metrics

    NASA Astrophysics Data System (ADS)

    Gregorio-de Souza, Ian; Berk, Vincent; Barsamian, Alex

    2010-04-01

    This work addresses new approaches to behavioral analysis of networks and hosts for the purposes of security monitoring and anomaly detection. Most commonly used approaches simply implement anomaly detectors for one, or a few, simple metrics, and those metrics can exhibit unacceptable false alarm rates. For instance, if the anomaly score of network communication is defined as the reciprocal of the likelihood that a given host uses a particular protocol (or destination), this definition may result in an unrealistically high threshold for alerting to avoid being flooded by false positives. We demonstrate that selecting and adapting the metrics and thresholds on a host-by-host or protocol-by-protocol basis can be done by established multivariate analyses such as PCA. We show how to determine one or more metrics, for each network host, that record the highest available amount of information regarding the baseline behavior and show relevant deviances reliably. We describe the methodology used to pick from a large selection of available metrics, and illustrate a method for comparing the resulting classifiers. Using our approach we are able to reduce the resources required to properly identify misbehaving hosts, protocols, or networks, by dedicating system resources to only those metrics that actually matter in detecting network deviations.
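A hedged sketch of the selection step: rank candidate metrics by the magnitude of their loading on the first principal component of baseline observations, a simple proxy for "records the most baseline information". The metric names and toy data below are invented for illustration:

```python
import numpy as np

def rank_metrics_by_pca(X, names):
    """Rank metrics (columns of X) by |loading| on the first principal
    component of standardized baseline observations."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each metric
    cov = np.cov(Xs, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
    pc1 = vecs[:, -1]                           # leading component
    return [names[i] for i in np.argsort(-np.abs(pc1))]

# Toy baseline: 200 observations of three hypothetical per-host metrics;
# 'bytes' and 'packets' are strongly correlated, 'dns_queries' is noise.
rng = np.random.default_rng(0)
base = rng.normal(size=200)
X = np.column_stack([base + 0.1 * rng.normal(size=200),   # bytes
                     base + 0.1 * rng.normal(size=200),   # packets
                     rng.normal(size=200)])               # dns_queries
ranking = rank_metrics_by_pca(X, ["bytes", "packets", "dns_queries"])
```

The uncorrelated noise metric ends up ranked last, which is the intended behavior: anomaly detectors would then be instantiated only for the top-ranked metrics of each host or protocol.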

  1. Metrics for Linear Kinematic Features in Sea Ice

    NASA Technical Reports Server (NTRS)

    Levy, G.; Coon, M.; Sulsky, D.

    2006-01-01

    The treatment of leads as cracks or discontinuities (see Coon et al. presentation) requires some shift in the procedure for evaluating and comparing lead-resolving models and validating them against observations. Common metrics used to evaluate ice model skill are by and large an adaptation of a least-squares "metric" adopted from operational numerical weather prediction data assimilation systems and are most appropriate for continuous fields and Eulerian systems where the observations and predictions are commensurate. However, this class of metrics suffers from some flaws in areas of sharp gradients and discontinuities (e.g., leads) and when Lagrangian treatments are more natural. After a brief review of these metrics and their performance in areas of sharp gradients, we present two new metrics specifically designed to measure model accuracy in representing linear features (e.g., leads). The indices developed circumvent the requirement that both the observations and model variables be commensurate (i.e., measured in the same units) by considering the frequencies of the features of interest/importance. We illustrate the metrics by scoring several hypothetical "simulated" discontinuity fields against leads interpreted from RGPS observations.

  2. SiMPSON: Efficient Similarity Search in Metric Spaces over P2P Structured Overlay Networks

    NASA Astrophysics Data System (ADS)

    Vu, Quang Hieu; Lupu, Mihai; Wu, Sai

    Similarity search in metric spaces over centralized systems has been significantly studied in the database research community. However, not so much work has been done in the context of P2P networks. This paper introduces SiMPSON: a P2P system supporting similarity search in metric spaces. The aim is to answer queries faster and using less resources than existing systems. For this, each peer first clusters its own data using any off-the-shelf clustering algorithms. Then, the resulting clusters are mapped to one-dimensional values. Finally, these one-dimensional values are indexed into a structured P2P overlay. Our method slightly increases the indexing overhead, but allows us to greatly reduce the number of peers and messages involved in query processing: we trade a small amount of overhead in the data publishing process for a substantial reduction of costs in the querying phase. Based on this architecture, we propose algorithms for processing range and kNN queries. Extensive experimental results validate the claims of efficiency and effectiveness of SiMPSON.
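The cluster-to-one-dimensional mapping can be illustrated with an iDistance-style key, key = cluster_id * C + distance(point, centroid). This is one plausible reading of the mapping described above, shown as a single-process sketch rather than over a P2P overlay; all names and constants are illustrative:

```python
import math

def build_index(clusters, C=1000.0):
    """Map each point to a 1-D key: key = cluster_id * C + distance to
    the cluster centroid. In SiMPSON-like systems such keys would be
    published into a structured P2P overlay; here we just sort them."""
    index = []
    for cid, (centroid, points) in enumerate(clusters):
        for p in points:
            index.append((cid * C + math.dist(p, centroid), p))
    index.sort()
    return index

def range_query(index, clusters, q, r, C=1000.0):
    """Metric range query: for each cluster the query ball can reach,
    scan only the key interval that cluster maps to."""
    hits = []
    for cid, (centroid, points) in enumerate(clusters):
        d = math.dist(q, centroid)
        radius = max((math.dist(p, centroid) for p in points), default=0.0)
        if d - r > radius:                     # ball misses this cluster
            continue
        lo = cid * C + max(0.0, d - r)         # reachable key interval
        hi = cid * C + min(radius, d + r)
        for key, p in index:
            if lo <= key <= hi and math.dist(p, q) <= r:
                hits.append(p)
    return hits

clusters = [((0.0, 0.0), [(0.0, 1.0), (1.0, 0.0)]),
            ((10.0, 10.0), [(10.0, 11.0), (9.0, 10.0)])]
idx = build_index(clusters)
near_origin = range_query(idx, clusters, q=(0.0, 0.0), r=1.5)
```

The pruning test `d - r > radius` is what keeps distant clusters (and, in the distributed setting, the peers holding them) out of query processing, which is the cost trade the abstract describes.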

  3. Metric Selection for Ecosystem Restoration

    DTIC Science & Technology

    2013-06-01

    Conceptual modeling can be used in a situation where there is little funding for monitoring and evaluation planning, and when planning needs to be done ... ecosystem restoration monitoring and evaluation programs, compile a list of these previous metrics, and assess and narrow them down based on ... and understanding of the system will likely correlate with the benefits gained from monitoring and evaluation. A more appropriate, robust metric ...

  4. Electromagnetic Metrics of Mental Workload.

    DTIC Science & Technology

    1987-09-01

    Electromagnetic Metrics of Mental Workload (Aunon et al., Purdue University EEG Signal Processing Lab, September 1987; AFOSR technical report). A sustained high level of workload can lead to mental exhaustion. Previous research has indicated that heart rate variability and evoked potentials ...

  5. Coverage Metrics for Model Checking

    NASA Technical Reports Server (NTRS)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.

  6. Validity of ligand efficiency metrics.

    PubMed

    Murray, Christopher W; Erlanson, Daniel A; Hopkins, Andrew L; Keserü, György M; Leeson, Paul D; Rees, David C; Reynolds, Charles H; Richmond, Nicola J

    2014-06-12

    A recent viewpoint article (Improving the plausibility of success with inefficient metrics. ACS Med. Chem. Lett. 2014, 5, 2-5) argued that the standard definition of ligand efficiency (LE) is mathematically invalid. In this viewpoint, we address this criticism and show categorically that the definition of LE is mathematically valid. LE and other metrics such as lipophilic ligand efficiency (LLE) can be useful during the multiparameter optimization challenge faced by medicinal chemists.
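For concreteness, the standard definitions the viewpoint defends are straightforward to compute. The sketch below uses the common conventions LE ≈ 1.37 · pIC50 / (heavy atom count), in kcal/mol per heavy atom, and LLE = pIC50 − cLogP; the numeric inputs are invented:

```python
def ligand_efficiency(pic50, heavy_atoms):
    """LE in kcal/mol per heavy atom, using the common approximation
    deltaG ~= -1.37 * pIC50 kcal/mol at room temperature."""
    return 1.37 * pic50 / heavy_atoms

def lipophilic_ligand_efficiency(pic50, clogp):
    """LLE = pIC50 - cLogP, separating potency from lipophilicity."""
    return pic50 - clogp

# Hypothetical compound: pIC50 = 8.0, 25 heavy atoms, cLogP = 3.0.
le = ligand_efficiency(8.0, 25)
lle = lipophilic_ligand_efficiency(8.0, 3.0)
```

Values of LE above roughly 0.3 and LLE above roughly 5 are often quoted as rules of thumb for attractive leads, which is why these metrics are useful in multiparameter optimization.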

  7. Phylogenetic metrics of community similarity.

    PubMed

    Ives, Anthony R; Helmus, Matthew R

    2010-11-01

    We derive a new metric of community similarity that takes into account the phylogenetic relatedness among species. This metric, phylogenetic community dissimilarity (PCD), can be partitioned into two components: a nonphylogenetic component that reflects shared species between communities (analogous to Sørensen's similarity metric) and a phylogenetic component that reflects the evolutionary relationships among nonshared species. Therefore, even if a species is not shared between two communities, it will increase the similarity of the two communities if it is phylogenetically related to species in the other community. We illustrate PCD with data on fish and aquatic macrophyte communities from 59 temperate lakes. Dissimilarity between fish communities associated with environmental differences between lakes often has a phylogenetic component, whereas this is not the case for macrophyte communities. With simulations, we then compare PCD with two other metrics of phylogenetic community similarity, Π(ST) and UniFrac. Of the three metrics, PCD was best at identifying environmental drivers of community dissimilarity, showing lower variability and greater statistical power. Thus, PCD is a statistically powerful metric that separates the effects of environmental drivers on compositional versus phylogenetic components of community structure.
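The nonphylogenetic component mentioned above is essentially Sørensen dissimilarity over species lists, sketched here (species names invented; the phylogenetic component of PCD additionally requires a phylogeny and is omitted):

```python
def sorensen_dissimilarity(a, b):
    """Sørensen dissimilarity between two communities, given as species
    sets: 1 - 2 * |shared| / (|a| + |b|). This is the species-overlap
    component that PCD augments with phylogenetic credit for nonshared
    but related species."""
    a, b = set(a), set(b)
    shared = len(a & b)
    return 1.0 - 2.0 * shared / (len(a) + len(b))

# Two hypothetical lake fish communities sharing 2 of their species.
lake1 = {"perch", "pike", "walleye"}
lake2 = {"perch", "pike", "bass"}
d = sorensen_dissimilarity(lake1, lake2)
```

Here d = 1 − 4/6 = 1/3; under PCD, "walleye" and "bass" would further reduce the dissimilarity to the extent that each is phylogenetically close to species in the other lake.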

  8. Object recognition using metric shape.

    PubMed

    Lee, Young-Lim; Lind, Mats; Bingham, Ned; Bingham, Geoffrey P

    2012-09-15

    Most previous studies of 3D shape perception have shown a general inability to visually perceive metric shape. In line with this, studies of object recognition have shown that only qualitative differences, not quantitative or metric ones, can be used effectively for object recognition. Recently, Bingham and Lind (2008) found that large perspective changes (≥ 45°) allow perception of metric shape, and Lee and Bingham (2010) found that this, in turn, allowed accurate feedforward reaches to grasp objects varying in metric shape. We now investigated whether this information would allow accurate and effective recognition of objects that vary in metric shape. Both judgment accuracies (d') and reaction times confirmed that, with the availability of visual information in large perspective changes, recognition of objects using quantitative as compared to qualitative properties was equivalent in accuracy and speed of judgments. The ability to recognize objects based on their metric shape is, therefore, a function of the availability or unavailability of the requisite visual information. These issues and results are discussed in the context of the Two Visual Systems hypothesis of Milner and Goodale (1995, 2006).

  9. Improved Adaptive-Reinforcement Learning Control for morphing unmanned air vehicles.

    PubMed

    Valasek, John; Doebbler, James; Tandale, Monish D; Meade, Andrew J

    2008-08-01

    This paper presents an improved Adaptive-Reinforcement Learning Control methodology for the problem of unmanned air vehicle morphing control. The reinforcement learning morphing control function that learns the optimal shape change policy is integrated with an adaptive dynamic inversion control trajectory tracking function. An episodic unsupervised learning simulation using the Q-learning method is developed to replace an earlier and less accurate Actor-Critic algorithm. Sequential Function Approximation, a Galerkin-based scattered data approximation scheme, replaces a K-Nearest Neighbors (KNN) method and is used to generalize the learning from previously experienced quantized states and actions to the continuous state-action space, all of which may not have been experienced before. The improved method showed smaller errors and improved learning of the optimal shape compared to the KNN.

  10. Metrication study for large space telescope

    NASA Technical Reports Server (NTRS)

    Creswick, F. A.; Weller, A. E.

    1973-01-01

    Various approaches which could be taken in developing a metric-system design for the Large Space Telescope were investigated, considering potential penalties on development cost and time, commonality with other satellite programs, and contribution to national goals for conversion to the metric system of units. Information on the problems, potential approaches, and impacts of metrication was collected from published reports on previous aerospace-industry metrication-impact studies and through numerous telephone interviews. The recommended approach to LST metrication formulated in this study calls for new components and subsystems to be designed in metric-module dimensions, but U.S. customary practice is allowed where U.S. metric standards and metric components are not available or would be unsuitable. Electrical/electronic-system design, which is presently largely metric, is considered exempt from further metrication. An important guideline is that metric design and fabrication should in no way compromise the effectiveness of the LST equipment.

  11. Modestobacter caceresii sp. nov., novel actinobacteria with an insight into their adaptive mechanisms for survival in extreme hyper-arid Atacama Desert soils.

    PubMed

    Busarakam, Kanungnid; Bull, Alan T; Trujillo, Martha E; Riesco, Raul; Sangal, Vartul; van Wezel, Gilles P; Goodfellow, Michael

    2016-06-01

    A polyphasic study was designed to determine the taxonomic provenance of three Modestobacter strains isolated from an extreme hyper-arid Atacama Desert soil. The strains, isolates KNN 45-1a, KNN 45-2b(T) and KNN 45-3b, were shown to have chemotaxonomic and morphological properties in line with their classification in the genus Modestobacter. The isolates had identical 16S rRNA gene sequences and formed a branch in the Modestobacter gene tree that was most closely related to the type strain of Modestobacter marinus (99.6% similarity). All three isolates were distinguished readily from Modestobacter type strains by a broad range of phenotypic properties, by qualitative and quantitative differences in fatty acid profiles and by BOX fingerprint patterns. The whole-genome sequence of isolate KNN 45-2b(T) showed 89.3% average nucleotide identity, 90.1% (SD: 10.97%) average amino acid identity and a digital DNA-DNA hybridization value of 42.4±3.1 against the genome sequence of M. marinus DSM 45201(T), values consistent with its assignment to a separate species. On the basis of all of these data, it is proposed that the isolates be assigned to the genus Modestobacter as Modestobacter caceresii sp. nov. with isolate KNN 45-2b(T) (CECT 9023(T)=DSM 101691(T)) as the type strain. Analysis of the whole-genome sequence of M. caceresii KNN 45-2b(T), with 4683 open reading frames and a genome size of ∼4.96 Mb, revealed the presence of genes and gene clusters that encode properties relevant to its adaptability to the harsh environmental conditions prevalent in extreme hyper-arid Atacama Desert soils.

  12. Non-metric chaotic inflation

    SciTech Connect

    Enqvist, Kari; Koivisto, Tomi; Rigopoulos, Gerasimos E-mail: T.S.Koivisto@astro.uio.no

    2012-05-01

    We consider inflation within the context of what is arguably the simplest non-metric extension of Einstein gravity. There, non-metricity is described by a single graviscalar field with a non-minimal kinetic coupling to the inflaton field Ψ, parameterized by a single parameter γ. There is a simple equivalent description in terms of a massless field and an inflaton with a modified potential. We discuss the implications of non-metricity for chaotic inflation and find that it significantly alters the inflaton dynamics for field values Ψ ≳ M_P/γ, dramatically changing the qualitative behaviour in this regime. In the equivalent single-field description this appears as a cuspy potential that forms a barrier beyond which the inflaton becomes a ghost field. This imposes an upper bound on the possible number of e-folds. For the simplest chaotic inflation models, the spectral index and the tensor-to-scalar ratio receive small corrections dependent on the non-metricity parameter. We also argue that significant post-inflationary non-metricity may be generated.

  13. Metrics for Occupations. Information Series No. 118.

    ERIC Educational Resources Information Center

    Peterson, John C.

    The metric system is discussed in this information analysis paper with regard to its history, a rationale for the United States' adoption of the metric system, a brief overview of the basic units of the metric system, examples of how the metric system will be used in different occupations, and recommendations for research and development. The…

  14. Metrics for measuring net-centric data strategy implementation

    NASA Astrophysics Data System (ADS)

    Kroculick, Joseph B.

    2010-04-01

    An enterprise data strategy outlines an organization's vision and objectives for improved collection and use of data. We propose generic metrics and quantifiable measures for each of the DoD Net-Centric Data Strategy (NCDS) data goals. Data strategy metrics can be adapted to the business processes of an enterprise and the needs of stakeholders in leveraging the organization's data assets to provide for more effective decision making. Generic metrics are applied to a specific application where logistics supply and transportation data is integrated across multiple functional groups. A dashboard presents a multidimensional view of the current progress toward a state where logistics data is shared in a timely and seamless manner among users, applications, and systems.

  15. Eye Tracking Metrics for Workload Estimation in Flight Deck Operation

    NASA Technical Reports Server (NTRS)

    Ellis, Kyle; Schnell, Thomas

    2010-01-01

    Flight decks of the future are being enhanced through improved avionics that adapt to both aircraft and operator state. Eye tracking allows for non-invasive analysis of pilot eye movements, from which a set of metrics can be derived to effectively and reliably characterize workload. This research identifies eye tracking metrics that correlate to aircraft automation conditions, and identifies the correlation of pilot workload to the same automation conditions. Saccade length was used as an indirect index of pilot workload: pilots in the fully automated condition were observed to have, on average, larger saccadic movements than in the guidance and manual flight conditions. The data set also provides a general model of human eye movement behavior, and thus ostensibly of visual attention distribution, in the cockpit for approach-to-land tasks with various levels of automation, by means of the same metrics used for workload algorithm development.
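    A saccade-length index of the kind described above is typically derived from successive fixation centers. A minimal sketch, using hypothetical fixation coordinates in degrees of visual angle (not the study's data):

    ```python
    import math

    def mean_saccade_length(fixations):
        """Mean Euclidean distance between consecutive fixation centers,
        a simple proxy for saccadic amplitude."""
        hops = [math.dist(a, b) for a, b in zip(fixations, fixations[1:])]
        return sum(hops) / len(hops)

    # Hypothetical fixation centers (x, y): two 5-degree saccades and one refixation
    scan_path = [(0, 0), (3, 4), (3, 4), (6, 8)]
    print(mean_saccade_length(scan_path))  # (5 + 0 + 5) / 3 ≈ 3.33
    ```

    A real pipeline would first segment raw gaze samples into fixations and saccades (e.g., by a velocity threshold), which this sketch assumes has already been done.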

  16. Synthesis Array Topology Metrics in Location Characterization

    NASA Astrophysics Data System (ADS)

    Shanmugha Sundaram, GA

    2015-08-01

    The Key Science Projects (KSPs) for the Square Kilometre Array (SKA) address fundamental mysteries in physics at the micro- and macro-cosm level, such as probing the Dark Ages and the Epoch of Reionization in an evolving Universe; galaxy evolution, cosmology, and dark energy; and the origin and evolution of cosmic magnetism. A suitable interfacing of these goals has to be achieved with an optimally designed array configuration, by means of a critical evaluation of the radio imaging capabilities and metrics. Of the two forerunner sites, Australia and South Africa, where pioneering advancements to state-of-the-art synthesis array radio astronomy instrumentation are being attempted in the form of pathfinders to the SKA, for its eventual deployment, a diversity of site-dependent topology and design metrics exists. Here, the discussion involves those KSPs that relate to galactic morphology and evolution, and explores their suitability as a scientific research goal from the perspective of the location-driven instrument design specification. Relative merits and adaptability with regard to either site are presented by invoking well-founded and established array-design and optimization principles built into a customized software tool.

  17. Distribution Metrics and Image Segmentation

    PubMed Central

    Georgiou, Tryphon; Michailovich, Oleg; Rathi, Yogesh; Malcolm, James; Tannenbaum, Allen

    2007-01-01

    The purpose of this paper is to describe certain alternative metrics for quantifying distances between distributions, and to explain their use and relevance in visual tracking. Besides the theoretical interest, such metrics may be used to design filters for image segmentation, that is for solving the key visual task of separating an object from the background in an image. The segmenting curve is represented as the zero level set of a signed distance function. Most existing methods in the geometric active contour framework perform segmentation by maximizing the separation of intensity moments between the interior and the exterior of an evolving contour. Here one can use the given distributional metric to determine a flow which minimizes changes in the distribution inside and outside the curve. PMID:18769529

  18. DLA Energy Biofuel Feedstock Metrics Study

    DTIC Science & Technology

    2012-12-11

    Metric 1: State invasiveness ranking, moderately/highly invasive; Metric 2: Genetically modified organism (GMO) hazard, Yes/No and Hazard Category; Metric 3: Species hybridization ... Across biofuel life-cycle stages (Stage #4: biofuel distribution; Stage #5: biofuel use), Metric 1 (state invasiveness ranking) scores Yes, Minimal, Minimal, No, No; Metric 2 (GMO hazard) scores Yes ... may utilize GMO microbial or microalgae species across the applicable biofuel life cycles (stages 1–3). The following consequence Metrics 4–6 then

  19. Kerr-Schild–Kundt metrics are universal

    NASA Astrophysics Data System (ADS)

    Gürses, Metin; Çağrı Şişman, Tahsin; Tekin, Bayram

    2017-04-01

    We define (non-Einsteinian) universal metrics as the metrics that solve the source-free covariant field equations of generic gravity theories. Here, extending the rather scarce family of universal metrics known in the literature, we show that the Kerr-Schild–Kundt class of metrics are universal. Besides being interesting on their own, these metrics can provide consistent backgrounds for quantum field theory at extremely high energies.

  20. Predicting persistence in the sediment compartment with a new automatic software based on the k-Nearest Neighbor (k-NN) algorithm.

    PubMed

    Manganaro, Alberto; Pizzo, Fabiola; Lombardo, Anna; Pogliaghi, Alberto; Benfenati, Emilio

    2016-02-01

    The ability of a substance to resist degradation and persist in the environment needs to be readily identified in order to protect the environment and human health. Many regulations require the assessment of persistence for substances commonly manufactured and marketed. Besides laboratory-based testing methods, in silico tools may be used to obtain a computational prediction of persistence. We present a new program to develop k-Nearest Neighbor (k-NN) models. The k-NN algorithm is a similarity-based approach that predicts the property of a substance from the experimental data for its most similar compounds. We employed this software to identify persistence in the sediment compartment. Data on half-life (HL) in sediment were obtained from different sources and, after careful data pruning, the final dataset, containing 297 organic compounds, was divided into four experimental classes. We developed several models giving satisfactory performances, with both training and test set accuracy ranging between 0.90 and 0.96. We finally selected one model which will be made available in the near future in the freely available software platform VEGA. This model offers a valuable in silico tool for fast and inexpensive screening.
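    The similarity-based prediction step the abstract describes can be sketched as a plain k-NN majority vote. The descriptor vectors and class labels below are invented for illustration, not taken from the 297-compound dataset:

    ```python
    import math
    from collections import Counter

    def knn_predict(train, query, k=3):
        """Classify `query` by majority vote among its k nearest
        training examples (Euclidean distance in descriptor space)."""
        nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
        votes = Counter(label for _, label in nearest)
        return votes.most_common(1)[0][0]

    # (descriptor vector, half-life class) pairs -- hypothetical values
    train = [((0.1, 0.2), "short-HL"), ((0.2, 0.1), "short-HL"),
             ((0.9, 0.8), "long-HL"), ((0.8, 0.9), "long-HL")]
    print(knn_predict(train, (0.85, 0.85)))  # long-HL
    ```

    In practice the distance would be computed over chemical descriptors, and the choice of k and the similarity threshold would be tuned on the training set.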

  1. Medical diagnosis of atherosclerosis from Carotid Artery Doppler Signals using principal component analysis (PCA), k-NN based weighting pre-processing and Artificial Immune Recognition System (AIRS).

    PubMed

    Latifoğlu, Fatma; Polat, Kemal; Kara, Sadik; Güneş, Salih

    2008-02-01

    In this study, we proposed a new medical diagnosis system based on principal component analysis (PCA), k-NN based weighting pre-processing, and an Artificial Immune Recognition System (AIRS) for the diagnosis of atherosclerosis from Carotid Artery Doppler Signals. The suggested system consists of four stages. First, in the feature extraction stage, we obtained the features related to atherosclerosis using Fast Fourier Transform (FFT) modeling and by calculating the maximum frequency envelope of sonograms. Second, in the dimensionality reduction stage, the 61 features were reduced to 4 features using PCA. Third, in the pre-processing stage, we weighted these 4 features using a k-NN based weighting scheme with different values of k. Finally, in the classification stage, the AIRS classifier was used to classify subjects as healthy or having atherosclerosis. A classification accuracy of 100% was obtained by the proposed system using 10-fold cross-validation. This success shows that the proposed system is a robust and effective system for the diagnosis of atherosclerosis.
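    The dimensionality reduction stage (61 features projected down to 4) amounts to projecting the centered data onto its top principal axes. A generic PCA sketch, using random data in place of the Doppler features:

    ```python
    import numpy as np

    def pca_reduce(X, n_components):
        """Project rows of X onto the top principal components,
        i.e. the leading right singular vectors of the centered data."""
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 61))   # e.g. 100 subjects x 61 extracted features
    Z = pca_reduce(X, 4)
    print(Z.shape)  # (100, 4)
    ```

    The number of retained components is usually chosen by the fraction of variance explained; the abstract does not state the criterion used, so 4 is taken as given.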

  2. An expert system based on principal component analysis, artificial immune system and fuzzy k-NN for diagnosis of valvular heart diseases.

    PubMed

    Sengur, Abdulkadir

    2008-03-01

    In the last two decades, the use of artificial intelligence methods in medical analysis has been increasing, mainly because the effectiveness of classification and detection systems has improved a great deal in helping medical experts with diagnosis. In this work, we investigate the use of principal component analysis (PCA), an artificial immune system (AIS) and fuzzy k-NN to distinguish normal from abnormal heart valves using Doppler heart sounds. The proposed heart valve disorder detection system is composed of three stages. The first stage is pre-processing: filtering, normalization and white de-noising. The second stage is feature extraction: wavelet packet decomposition was used, and wavelet entropy was then taken as the feature set; for reducing the complexity of the system, PCA was used for feature reduction. In the classification stage, AIS and fuzzy k-NN were used. To evaluate the performance of the proposed methodology, a comparative study was carried out using a data set containing 215 samples. The validity of the proposed method was measured by the sensitivity and specificity parameters; a 95.9% sensitivity and 96% specificity rate was obtained.

  3. Metrics correlation and analysis service (MCAS)

    SciTech Connect

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya; /Fermilab

    2009-05-01

    The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information 'pond' is disorganized, it is a difficult environment for business intelligence analysis, i.e., troubleshooting, incident investigation and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events generated by disjoint middleware.

  4. Metrics correlation and analysis service (MCAS)

    NASA Astrophysics Data System (ADS)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya

    2010-04-01

    The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information pool is disorganized, it is a difficult environment for business intelligence analysis i.e. troubleshooting, incident investigation, and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by loosely coupled or fully decoupled middleware.

  5. Separable metrics and radiating stars

    NASA Astrophysics Data System (ADS)

    Abebe, G. Z.; Maharaj, S. D.

    2017-01-01

    We study the junction condition relating the pressure to heat flux at the boundary of an accelerating and expanding spherically symmetric radiating star. We transform the junction condition to an ordinary differential equation by making a separability assumption on the metric functions in the space-time variables. The condition of separability on the metric functions yields several new exact solutions. A class of shear-free models is found which contains a linear equation of state and generalizes a previously obtained model. Four new shearing models are obtained; all the gravitational potentials can be written explicitly. A brief physical analysis indicates that the matter variables are well behaved.

  6. Thermodynamic Metrics and Optimal Paths

    SciTech Connect

    Sivak, David; Crooks, Gavin

    2012-05-08

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.

  7. The flexibility of optical metrics

    NASA Astrophysics Data System (ADS)

    Bittencourt, Eduardo; Pereira, Jonas P.; Smolyaninov, Igor I.; Smolyaninova, Vera N.

    2016-08-01

    We firstly revisit the importance, naturalness and limitations of the so-called optical metrics for describing the propagation of light rays in the limit of geometric optics. We then exemplify their flexibility and nontriviality in some nonlinear material media and in the context of nonlinear theories of the electromagnetism, both in the presence of curved backgrounds, where optical metrics could be flat and inaccessible regions for the propagation of photons could be conceived, respectively. Finally, we underline and discuss the relevance and potential applications of our analyses in a broad sense, ranging from material media to compact astrophysical systems.

  8. Evaluating Descriptive Metrics of the Human Cone Mosaic

    PubMed Central

    Cooper, Robert F.; Wilk, Melissa A.; Tarima, Sergey; Carroll, Joseph

    2016-01-01

    Purpose To evaluate how metrics used to describe the cone mosaic change in response to simulated photoreceptor undersampling (i.e., cell loss or misidentification). Methods Using an adaptive optics ophthalmoscope, we acquired images of the cone mosaic from the center of fixation to 10° along the temporal, superior, inferior, and nasal meridians in 20 healthy subjects. Regions of interest (n = 1780) were extracted at regular intervals along each meridian. Cone mosaic geometry was assessed using a variety of metrics: density, density recovery profile distance (DRPD), nearest neighbor distance (NND), intercell distance (ICD), farthest neighbor distance (FND), percentage of six-sided Voronoi cells, nearest neighbor regularity (NNR), number of neighbors regularity (NoNR), and Voronoi cell area regularity (VCAR). The "performance" of each metric was evaluated by determining the level of simulated loss necessary to obtain 80% statistical power. Results Of the metrics assessed, NND and DRPD were the least sensitive to undersampling, classifying mosaics that lost 50% of their coordinates as indistinguishable from normal. The NoNR was the most sensitive, detecting a significant deviation from normal with only a 10% cell loss. Conclusions The robustness of cone spacing metrics makes them unsuitable for reliably detecting small deviations from normal or for tracking small changes in the mosaic over time. In contrast, regularity metrics are more sensitive to diffuse loss and, therefore, better suited for detecting such changes, provided the fraction of misidentified cells is minimal. Combining metrics with a variety of sensitivities may provide a more complete picture of the integrity of the photoreceptor mosaic. PMID:27273598
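    For concreteness, the nearest neighbor distance (NND) metric discussed above can be sketched as the mean distance from each cone center to its closest neighbor. The coordinates below form a toy unit grid, not real mosaic data:

    ```python
    import math

    def mean_nnd(coords):
        """Mean nearest-neighbor distance over a list of (x, y) cone centers."""
        nnds = []
        for i, p in enumerate(coords):
            # distance to the closest other point
            nnds.append(min(math.dist(p, q) for j, q in enumerate(coords) if j != i))
        return sum(nnds) / len(nnds)

    # Cones on a perfect 3x3 unit grid: every nearest neighbor is 1 unit away
    grid = [(x, y) for x in range(3) for y in range(3)]
    print(mean_nnd(grid))  # 1.0
    ```

    Deleting coordinates from such a grid barely changes this mean, which illustrates why spacing metrics like NND are insensitive to simulated cell loss.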

  9. Quality Assessment of Sharpened Images: Challenges, Methodology, and Objective Metrics.

    PubMed

    Krasula, Lukas; Le Callet, Patrick; Fliegel, Karel; Klima, Milos

    2017-01-10

    Most of the effort in image quality assessment (QA) has so far been dedicated to the degradation of the image. However, there are also many algorithms in the image processing chain that can enhance the quality of an input image. These include procedures for contrast enhancement, deblurring, sharpening, up-sampling, denoising, transfer function compensation, etc. In this work, possible strategies for the quality assessment of sharpened images are investigated. This task is not trivial because sharpening techniques can increase the perceived quality, but can also introduce artifacts leading to a quality drop (over-sharpening). Here, a framework specifically adapted for the quality assessment of sharpened images and for objective metrics comparison in this context is introduced. The framework can be adopted in other quality assessment areas as well. The problem of selecting the correct procedure for subjective evaluation was addressed, and a subjective test on blurred, sharpened, and over-sharpened images was performed in order to demonstrate the use of the framework. The obtained ground-truth data were used for testing the suitability of state-of-the-art objective quality metrics for the assessment of sharpened images. The comparison was performed by a novel procedure using ROC analyses, which was found more appropriate for the task than standard methods. Furthermore, seven possible augmentations of the no-reference S3 metric adapted for sharpened images are proposed. The performance of the metric is significantly improved and is also superior to the rest of the tested quality criteria with respect to the subjective data.

  10. Trajectory Planning by Preserving Flexibility: Metrics and Analysis

    NASA Technical Reports Server (NTRS)

    Idris, Husni R.; El-Wakil, Tarek; Wing, David J.

    2008-01-01

    In order to support traffic management functions, such as mitigating traffic complexity, ground and airborne systems may benefit from preserving or optimizing trajectory flexibility. To help support this hypothesis, trajectory flexibility metrics were defined in previous work to represent a trajectory's robustness and adaptability to the risk of violating safety and traffic management constraints. In this paper these metrics are instantiated in the case of planning a trajectory with the heading degree of freedom. A metric estimation method is presented based on simplifying assumptions, namely discrete time and heading maneuvers. A case is analyzed to demonstrate the estimation method and its use in trajectory planning in a situation involving meeting a time constraint and avoiding loss of separation with nearby traffic. The case involves comparing path-stretch trajectories, in terms of the adaptability and robustness along each, deduced from a map of estimated flexibility metrics over the solution space. The case demonstrated anecdotally that preserving flexibility may help mitigate certain factors that contribute to traffic complexity, namely by reducing proximity and confrontation.

  11. Advanced metrics for network-centric naval operations

    NASA Astrophysics Data System (ADS)

    Perry, Walter L.; Bowden, Fred D. J.

    2003-07-01

    Defense organizations around the world are formulating new visions, strategies, and concepts that utilize emerging information-age technologies. Central among these is network-based operations. Measures and metrics are needed that allow analysts to link the effects of alternative network structures, operating procedures and command and control arrangements to combat outcomes. This paper reports on measures and mathematical metrics that begin to address this problem. Networks are assessed in terms of their complexity, their ability to adapt, and the collaboration opportunity they afford. The metrics measure the contributions of complexity to information flow, and the deleterious effects of information overload and disconfirming reports on overall network performance. In addition, they measure the contributions of collaboration to shared situational awareness in terms of the accuracy and precision of the information produced and the costs associated with an imbalance of the two. We posit a fixed network connecting a Naval Task Force's various platforms, and assess the ability of this network to support the range of missions required of the task force. The emphasis is not on connectivity, but rather on information flow and how well the network is able to adapt to alternative flow requirements. We assess the impact alternative network structures, operating procedures and command arrangements have on combat outcomes by applying the metrics to a cruise missile defense scenario.

  12. Guidelines for Teaching Metric Concepts.

    ERIC Educational Resources Information Center

    Wisconsin State Dept. of Public Instruction, Madison.

    The primary purpose of these guidelines is to provide teachers and other decision-makers with a suggested framework within which sound planning for metric education can be done. Student behavioral objectives are listed by topic. Each objective is coded to indicate grade level, topic, and objective number. A chart is provided to show a kindergarten…

  13. Metrical Phonology in Speech Production.

    ERIC Educational Resources Information Center

    Cooper, William E.; Eady, Stephen J.

    1986-01-01

    Describes several experiments which examined the basic claims of metrical phonology. The first two experiments examined the possible influences of stress clash in speech timing. The third and fourth experiments tested Hayes's (1984) analysis rule of quadrisyllabic meter; the fifth experiment included a basic test of the stress clash notion. (SED)

  14. Leading Gainful Employment Metric Reporting

    ERIC Educational Resources Information Center

    Powers, Kristina; MacPherson, Derek

    2016-01-01

    This chapter will address the importance of intercampus involvement in reporting of gainful employment student-level data that will be used in the calculation of gainful employment metrics by the U.S. Department of Education. The authors will discuss why building relationships within the institution is critical for effective gainful employment…

  15. Powerful Metrics: Strategic and Transformative

    ERIC Educational Resources Information Center

    Butterfield, Barbara

    2006-01-01

    To be a valuable partner at the strategic level, human resources can and should contribute to both institutional effectiveness measurement and workforce metrics. In this article, the author examines how to link HR initiatives with key institutional strategies, clarifies essential HR responsibilities for workforce results, explores return on human…

  16. Metric Units in Primary Schools.

    ERIC Educational Resources Information Center

    Lighthill, M. J.; And Others

    Although this pamphlet is intended as background material for teachers in English primary schools changing to the System International d'Unites (SI units), the form of the metric system being adopted by the United Kingdom, the educational implications of the change and the lists of apparatus suitable for use with children up to 14 years of age are…

  17. Improving an Imperfect Metric System

    ERIC Educational Resources Information Center

    Frasier, E. Lewis

    1974-01-01

    Suggests some improvements and additional units necessary for the International Metric System to expand its use to all measureable entities and defined quantities, especially in the measurement of time and angles. Included are tables of proposed unit systems in contrast with the presently available systems. (CC)

  18. Measuring in Metric. Student Version.

    ERIC Educational Resources Information Center

    Hawaii Univ., Honolulu. Curriculum Research and Development Group.

    This supplementary mathematics textbook is designed to teach pupils in the intermediate grades about the vocabulary and use of the metric system of measurement. The book limits exploration to the units of measurement for length, capacity, mass, temperature, area, and volume, with only the following prefixes considered: kilo, hecto, deka, deci,…

  20. Metrication in the Construction Industry

    ERIC Educational Resources Information Center

    Linden, Carl Vander

    1974-01-01

    Three groups appear to be in a position to lead the construction industry in adopting metrication: the building materials manufacturers industry associations, the architectural community, and the building code and standards organization. (A paper presented at Building Research Institute conference, Washington, D.C., November 27, 1973.) (Author/MLF)

  1. Guidelines for Teaching Metric Concepts.

    ERIC Educational Resources Information Center

    Wisconsin State Dept. of Public Instruction, Madison.

    The primary purpose of these guidelines is to provide teachers and other decision-makers with a suggested framework within which sound planning for metric education can be done. Student behavioral objectives are listed by topic. Each objective is coded to indicate grade level, topic, and objective number. A chart is provided to show a kindergarten…

  2. Metric-Free Distributional Comparisons.

    ERIC Educational Resources Information Center

    Haertel, Edward H.; And Others

    Two methods are presented for comparing distributions, such as achievement test score distributions, for distinctly different groups of persons in such a way that the comparison will not be influenced by the particular metric of the test being used. Both methods use percentile scores. One method, attributed to Flanagan, fits a straight line to the…

  3. Metrication and the Technical Teacher

    ERIC Educational Resources Information Center

    Irving, Michael

    1975-01-01

    The conclusion of the two-part feature on the SI metric (International System of Units) reviews the basics and some of the rules technical teachers need to know in order to prepare their students for the changing world. (Author)

  4. Metric Measurement: Grades K-8.

    ERIC Educational Resources Information Center

    Instructional Objectives Exchange, Los Angeles, CA.

    This collection is comprised of 63 objectives and corresponding sample test items for evaluation of students in grades K-8. Correct answers or criteria for judging the adequacy of student responses are provided. Major categories in the collection are: (1) preparing to use the metric system--decimal and fractional notation; (2) measurement--length,…

  6. Metrics in Education - Resource Materials.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Div. of Curriculum Development.

    This publication contains materials suitable for reproduction as transparencies or as classroom handouts. These metric materials may be used in a variety of occupational and practical arts courses. The format of the materials is in large print, some with humorous drawing; details of drawings and charts are easy to read. Introductory pages deal…

  7. Adaptive edge histogram descriptor for landmine detection using GPR

    NASA Astrophysics Data System (ADS)

    Frigui, Hichem; Fadeev, Aleksey; Karem, Andrew; Gader, Paul

    2009-05-01

    The Edge Histogram Detector (EHD) is a landmine detection algorithm for sensor data generated by ground penetrating radar (GPR). It uses edge histograms for feature extraction and a possibilistic K-Nearest Neighbors (K-NN) rule for confidence assignment. To reduce the computational complexity of the EHD and improve its generalization, the K-NN classifier uses a few prototypes that can capture the variations of the signatures within each class. Each of these prototypes is assigned a label in the class of mines and a label in the class of clutter to capture its degree of sharing among these classes. The EHD has been tested extensively. It has demonstrated excellent performance on large real world data sets, and has been implemented in real-time versions in hand-held and vehicle-mounted GPR. In this paper, we propose two modifications to the EHD to improve its performance and adaptability. First, instead of using a fixed threshold to decide if the edge at a certain location is strong enough, we use an adaptive threshold that is learned from the background surrounding the target. This modification makes the EHD more adaptive to different terrains and to mines buried at different depths. Second, we introduce an additional training component that tunes the prototype features and labels to different environments. Results on large and diverse GPR data collections show that the proposed adaptive EHD outperforms the baseline EHD. We also show that the edge threshold can vary significantly according to the edge type, alarm depth, and soil conditions.
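
    The background-learned edge threshold can be sketched in a few lines; the mean-plus-k-standard-deviations rule, the function names, and the value of k are illustrative assumptions here, not the paper's exact estimator:

```python
def adaptive_edge_threshold(background_strengths, k=2.0):
    # Mean + k standard deviations of background edge strengths;
    # the specific rule and the default k are illustrative assumptions.
    n = len(background_strengths)
    mean = sum(background_strengths) / n
    var = sum((s - mean) ** 2 for s in background_strengths) / n
    return mean + k * var ** 0.5

def strong_edges(edge_strengths, background_strengths, k=2.0):
    # Keep only edges at least as strong as the learned threshold.
    t = adaptive_edge_threshold(background_strengths, k)
    return [s >= t for s in edge_strengths]
```

    Because the threshold is re-estimated from the background around each candidate, the same detector tightens or relaxes automatically across terrains and burial depths, which is the adaptivity described above.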

  8. Semantic Metrics for Analysis of Software

    NASA Technical Reports Server (NTRS)

    Etzkorn, Letha H.; Cox, Glenn W.; Farrington, Phil; Utley, Dawn R.; Ghalston, Sampson; Stein, Cara

    2005-01-01

    A recently conceived suite of object-oriented software metrics focuses on semantic aspects of software, in contradistinction to traditional software metrics, which focus on syntactic aspects of software. Semantic metrics represent a more human-oriented view of software than do syntactic metrics. The semantic metrics of a given computer program are calculated by use of the output of a knowledge-based analysis of the program, and are substantially more representative of software quality and more readily comprehensible from a human perspective than are the syntactic metrics.

  9. A k-NN METHOD TO CLASSIFY RARE ASTRONOMICAL SOURCES: PHOTOMETRIC SEARCH OF BROWN DWARFS WITH SPITZER/IRAC

    SciTech Connect

    Marengo, Massimo; Sanchez, Mayly C. E-mail: mayly.sanchez@anl.gov

    2009-07-15

    We present a statistical method for the photometric search of rare astronomical sources based on the weighted k-Nearest Neighbors method. A metric is defined in a multidimensional color-magnitude space based only on the photometric properties of template sources and the photometric uncertainties of both templates and data, without the need to define ad hoc color and magnitude cuts which could bias the search. The metric is defined as a function of two parameters, the number of neighbors k and a threshold distance D_th, that can be optimized for maximum selection efficiency and completeness. We apply the method to the search of L and T dwarfs in the Spitzer Extragalactic First Look Survey and the Boötes field of the Spitzer Shallow Survey, as well as to the search of substellar mass companions around nearby stars. With a high level of completeness, we confirm the absence of late-T dwarfs detected in at least two bands in the First Look Survey, and only one in the Shallow Survey (previously discovered by Stern et al.). This result is in agreement with the expected statistics for late-T dwarfs. One L/early-T candidate is found in the First Look Survey, and three in the Shallow Survey, currently undergoing follow-up spectroscopic verification. Finally, we discuss the potential for brown dwarf searches with this method in the Spitzer warm mission Exploration Science programs.
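
    A minimal sketch of this kind of uncertainty-weighted k-NN selection; the chi-square-style weighting and the function names are assumptions about the exact form of the metric, which the paper defines in full:

```python
def photometric_distance(source, template, sigma_source, sigma_template):
    # Uncertainty-weighted Euclidean distance in color-magnitude space;
    # the chi-square-style weighting is an assumed form of the metric.
    d2 = 0.0
    for x, t, sx, st in zip(source, template, sigma_source, sigma_template):
        d2 += (x - t) ** 2 / (sx ** 2 + st ** 2)
    return d2 ** 0.5

def is_candidate(source, sigma_source, templates, template_sigmas, k, d_th):
    # Flag the source when its k-th nearest template lies within d_th.
    ds = sorted(photometric_distance(source, t, sigma_source, s)
                for t, s in zip(templates, template_sigmas))
    return ds[k - 1] <= d_th
```

    A source is accepted as a candidate when its k-th nearest template lies within the threshold distance, mirroring the two tunable parameters (k, D_th) described above.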

  10. Hi Metric Fans! We're the Metric Mice. Elementary Metric Project Awareness Booklet.

    ERIC Educational Resources Information Center

    Huber, Roland B.

    An overview and samples of some of the materials contained in the Elementary Metric Project's curriculum for grades 1-6 are presented. Information is given concerning the adoption of the program. The sample activity sheets deal with linear measurement, mass, volume, and temperature. (MP)

  11. Summer temperature metrics for predicting brook trout (Salvelinus fontinalis) distribution in streams

    USGS Publications Warehouse

    Parrish, Donna; Butryn, Ryan S.; Rizzo, Donna M.

    2012-01-01

    We developed a methodology to predict brook trout (Salvelinus fontinalis) distribution using summer temperature metrics as predictor variables. Our analysis used long-term fish and hourly water temperature data from the Dog River, Vermont (USA). Commonly used metrics (e.g., mean, maximum, maximum 7-day maximum) tend to smooth the data so information on temperature variation is lost. Therefore, we developed a new set of metrics (called event metrics) to capture temperature variation by describing the frequency, area, duration, and magnitude of events that exceeded a user-defined temperature threshold. We used thresholds of 16, 18, 20, and 22°C. We built linear discriminant models and tested and compared the event metrics against the commonly used metrics. Correct classification of the observations was 66% with event metrics and 87% with commonly used metrics. However, combined event and commonly used metrics correctly classified 92%. Of the four individual temperature thresholds, it was difficult to assess which threshold had the “best” accuracy. The 16°C threshold had slightly fewer misclassifications; however, the 20°C threshold had the fewest extreme misclassifications. Our method leveraged the volumes of existing long-term data and provided a simple, systematic, and adaptable framework for monitoring changes in fish distribution, specifically in the case of irregular, extreme temperature events.
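
    The event metrics above can be sketched as follows; the precise definitions of frequency, area, duration, and magnitude used by the authors are assumptions here:

```python
def event_metrics(temps, threshold):
    # Split the hourly series into maximal runs above the threshold.
    events, current = [], []
    for t in temps:
        if t > threshold:
            current.append(t)
        elif current:
            events.append(current)
            current = []
    if current:
        events.append(current)
    return {
        "frequency": len(events),                               # number of events
        "duration": sum(len(e) for e in events),                # total hours above
        "area": sum(t - threshold for e in events for t in e),  # degree-hours above
        "magnitude": max((max(e) - threshold for e in events), default=0),  # peak exceedance
    }
```

    Unlike a seasonal mean or maximum, these statistics preserve how often, how long, and how intensely the threshold was exceeded, which is the variation the event metrics were designed to capture.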

  12. Measure Metric: A Multi-State Consortium

    ERIC Educational Resources Information Center

    Dowling, Kenneth W.

    1977-01-01

    Describes the "Measure Metric" series of twelve fifteen-minute programs and related classroom materials for grades 5 and 6 for teaching the metric system and the International System of Units (SI). (SL)

  13. Hands-On Activities with Metrics

    ERIC Educational Resources Information Center

    McFee, Evan

    1978-01-01

    Suggestions for familiarizing elementary teachers with the use of the metric system are given. These include a "stair-steps" method of converting units within the metric system and estimation and measurement activities using familiar everyday objects. (MN)

  14. Do Your Students Measure Up Metrically?

    ERIC Educational Resources Information Center

    Taylor, P. Mark; Simms, Ken; Kim, Ok-Kyeong; Reys, Robert E.

    2001-01-01

    Examines released metric items from the Third International Mathematics and Science Study (TIMSS) and the 3rd and 4th grade results. Recommends refocusing instruction on the metric system to improve student performance in measurement. (KHR)

  15. Think Metric: It's All Around Us.

    ERIC Educational Resources Information Center

    Learning, 1988

    1988-01-01

    Class activities designed to instruct young students in the use of the metric system of measurement are illustrated on a larger poster. Ideas for measuring, thinking and writing in metric are presented for children of different ages. (JD)

  16. Measuring Sustainability: Deriving Metrics From Objectives (Presentation)

    EPA Science Inventory

    The definition of 'sustain', to keep in existence, provides some insight into the metrics that are required to measure sustainability and adequately respond to assure sustainability. Keeping something in existence implies temporal and spatial contexts and requires metrics that g...

  18. Fuzzy polynucleotide spaces and metrics.

    PubMed

    Nieto, Juan J; Torres, A; Georgiou, D N; Karakasidis, T E

    2006-04-01

    The study of genetic sequences is of great importance in biology and medicine. Mathematics is playing an important role in the study of genetic sequences and, generally, in bioinformatics. In this paper, we extend the work concerning the Fuzzy Polynucleotide Space (FPS) introduced in Torres, A., Nieto, J.J., 2003, The fuzzy polynucleotide space: basic properties, Bioinformatics 19(5), 587-592, and Nieto, J.J., Torres, A., Vazquez-Trasande, M.M., 2003, A metric space to study differences between polynucleotides, Appl. Math. Lett. 27, 1289-1294, by studying distances between nucleotides and some complete genomes using several metrics. We also present new results concerning the notions of similarity, difference and equality between polynucleotides. The results are encouraging since they demonstrate how the notions of distance and similarity between polynucleotides in the FPS can be employed in the analysis of genetic material.

  19. Metrics for Labeled Markov Systems

    NASA Technical Reports Server (NTRS)

    Desharnais, Josee; Jagadeesan, Radha; Gupta, Vineet; Panangaden, Prakash

    1999-01-01

    Partial Labeled Markov Chains are simultaneously generalizations of process algebra and of traditional Markov chains. They provide a foundation for interacting discrete probabilistic systems, the interaction being synchronization on labels as in process algebra. Existing notions of process equivalence are too sensitive to the exact probabilities of various transitions. This paper addresses contextual reasoning principles for reasoning about more robust notions of "approximate" equivalence between concurrent interacting probabilistic systems. The main results are as follows. We develop a family of metrics between partial labeled Markov chains to formalize the notion of distance between processes. We show that processes at distance zero are bisimilar. We describe a decision procedure to compute the distance between two processes. We show that reasoning about approximate equivalence can be done compositionally by showing that process combinators do not increase distance. Finally, we introduce an asymptotic metric to capture asymptotic properties of Markov chains, and show that parallel composition does not increase asymptotic distance.

  20. Health Metrics for Helminth infections

    PubMed Central

    2014-01-01

    Health metrics based on health-adjusted life years have become standard units for comparing the disease burden and treatment benefits of individual health conditions. The Disability-Adjusted Life Year (DALY) and the Quality-Adjusted Life Year (QALY) are the most frequently used in cost-effectiveness analyses in national and global health policy discussions for allocation of health care resources. While sometimes useful, both the DALY and QALY metrics have limitations in their ability to capture the full health impact of helminth infections and other ‘neglected tropical diseases’ (NTDs). Gaps in current knowledge of disease burden are identified, and interim approaches to disease burden assessment are discussed. PMID:24333545

  1. Object-oriented productivity metrics

    NASA Technical Reports Server (NTRS)

    Connell, John L.; Eller, Nancy

    1992-01-01

    Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.

  2. Metric reconstruction from Weyl scalars

    NASA Astrophysics Data System (ADS)

    Whiting, Bernard F.; Price, Larry R.

    2005-08-01

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most major advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding than one concerning just the Weyl curvature is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources—which are essential when the emitting masses are considered—and the failure to describe the ℓ = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations.

  3. A stationary q-metric

    NASA Astrophysics Data System (ADS)

    Toktarbay, S.; Quevedo, H.

    2014-10-01

    We present a stationary generalization of the static q-metric, the simplest generalization of the Schwarzschild solution that contains a quadrupole parameter. It possesses three independent parameters that are related to the mass, quadrupole moment and angular momentum. We investigate the geometric and physical properties of this exact solution of Einstein's vacuum equations, and show that it can be used to describe the exterior gravitational field of rotating, axially symmetric, compact objects.

  4. Marketing metrics for medical practices.

    PubMed

    Zahaluk, David; Baum, Neil

    2012-01-01

    There's a saying by John Wanamaker who pontificated, "Half the money I spend on advertising is wasted; the trouble is, I don't know which half". Today you have opportunities to determine which parts of your marketing efforts are effective and what is wasted. However, you have to measure your marketing results. This article will discuss marketing metrics and how to use them to get the best bang for your marketing buck.

  5. Multi-Metric Sustainability Analysis

    SciTech Connect

    Cowlin, Shannon; Heimiller, Donna; Macknick, Jordan; Mann, Margaret; Pless, Jacquelyn; Munoz, David

    2014-12-01

    A readily accessible framework that allows for evaluating impacts and comparing tradeoffs among factors in energy policy, expansion planning, and investment decision making is lacking. Recognizing this, the Joint Institute for Strategic Energy Analysis (JISEA) funded an exploration of multi-metric sustainability analysis (MMSA) to provide energy decision makers with a means to make more comprehensive comparisons of energy technologies. The resulting MMSA tool lets decision makers simultaneously compare technologies and potential deployment locations.

  6. Science and Technology Transition Metrics

    DTIC Science & Technology

    2004-03-01

    for Research", Science, Volume 277, 1 August 1997. Kostoff, R. N., "The Use and Misuse of Citation Analysis in Research Evaluation", Scientometrics. ... "The Metrics of Science and Technology", Scientometrics, 50:2, 353-361, February 2001. Kostoff, R. N., and Schaller, R. R., "Science and...", 37, 604-606, September 2001. Kostoff, R. N., "Citation Analysis for Research Performer Quality", Scientometrics, 53:1, 49-71, 2002.

  7. Sensory Metrics of Neuromechanical Trust.

    PubMed

    Softky, William; Benford, Criscillia

    2017-09-01

    Today digital sources supply a historically unprecedented component of human sensorimotor data, the consumption of which is correlated with poorly understood maladies such as Internet addiction disorder and Internet gaming disorder. Because both natural and digital sensorimotor data share common mathematical descriptions, one can quantify our informational sensorimotor needs using the signal processing metrics of entropy, noise, dimensionality, continuity, latency, and bandwidth. Such metrics describe in neutral terms the informational diet human brains require to self-calibrate, allowing individuals to maintain trusting relationships. With these metrics, we define the trust humans experience using the mathematical language of computational models, that is, as a primitive statistical algorithm processing finely grained sensorimotor data from neuromechanical interaction. This definition of neuromechanical trust implies that artificial sensorimotor inputs and interactions that attract low-level attention through frequent discontinuities and enhanced coherence will decalibrate a brain's representation of its world over the long term by violating the implicit statistical contract for which self-calibration evolved. Our hypersimplified mathematical understanding of human sensorimotor processing as multiscale, continuous-time vibratory interaction allows equally broad-brush descriptions of failure modes and solutions. For example, we model addiction in general as the result of homeostatic regulation gone awry in novel environments (sign reversal) and digital dependency as a sub-case in which the decalibration caused by digital sensorimotor data spurs yet more consumption of them. We predict that institutions can use these sensorimotor metrics to quantify media richness to improve employee well-being; that dyads and family-size groups will bond and heal best through low-latency, high-resolution multisensory interaction such as shared meals and reciprocated touch; and

  8. SubChlo: predicting protein subchloroplast locations with pseudo-amino acid composition and the evidence-theoretic K-nearest neighbor (ET-KNN) algorithm.

    PubMed

    Du, Pufeng; Cao, Shengjiao; Li, Yanda

    2009-11-21

    The chloroplast is a type of plant specific subcellular organelle. It is of central importance in several biological processes like photosynthesis and amino acid biosynthesis. Thus, understanding the function of chloroplast proteins is of significant value. Since the function of chloroplast proteins correlates with their subchloroplast locations, the knowledge of their subchloroplast locations can be very helpful in understanding their role in the biological processes. In the current paper, by introducing the evidence-theoretic K-nearest neighbor (ET-KNN) algorithm, we developed a method for predicting the protein subchloroplast locations. This is the first algorithm for predicting the protein subchloroplast locations. We have implemented our algorithm as an online service, SubChlo (http://bioinfo.au.tsinghua.edu.cn/subchlo). This service may be useful to the chloroplast proteome research.
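
    A simplified sketch in the spirit of Denoeux's evidence-theoretic K-NN, on which SubChlo builds; the mass function alpha*exp(-gamma*d^2), the parameter values, and the restriction to singleton-focused masses are illustrative assumptions:

```python
import math

def et_knn_predict(query, prototypes, labels, k=3, alpha=0.95, gamma=1.0):
    # Each of the k nearest prototypes supports its class with basic
    # belief mass alpha * exp(-gamma * d^2); masses are pooled with
    # Dempster's rule (closed form for singleton-focused masses).
    neighbours = sorted(
        (sum((a - b) ** 2 for a, b in zip(query, p)), lab)
        for p, lab in zip(prototypes, labels))[:k]
    classes = sorted(set(labels))
    masses = {c: [] for c in classes}
    for d2, lab in neighbours:
        masses[lab].append(alpha * math.exp(-gamma * d2))
    score = {}
    for c in classes:
        support = 1.0                      # prob. that no neighbour of class c fires
        for m in masses[c]:
            support *= 1.0 - m
        against = 1.0                      # prob. that no other-class neighbour fires
        for other in classes:
            if other != c:
                for m in masses[other]:
                    against *= 1.0 - m
        score[c] = (1.0 - support) * against   # unnormalised combined mass on {c}
    return max(score, key=score.get)
```

    Each neighbor acts as a piece of evidence whose weight decays with distance, and Dempster's rule pools the evidence before the class with the largest combined mass is chosen.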

  9. Visual metrics: discriminative power through flexibility.

    PubMed

    Janssen, T J; Blommaert, F J

    2000-01-01

    An important stage in visual processing is the quantification of optical attributes of the outside world. We argue that the metrics used for this quantification are flexible, and that this flexibility is exploited to optimise the discriminative power of the metrics. We derive mathematical expressions for such optimal metrics and show that they exhibit properties resembling well-known visual phenomena. To conclude, we discuss some of the implications of flexible metrics for visual identification.

  10. A normalized Levenshtein distance metric.

    PubMed

    Yujian, Li; Bo, Liu

    2007-06-01

    Although a number of normalized edit distances presented so far may offer good performance in some applications, none of them can be regarded as a genuine metric between strings because they do not satisfy the triangle inequality. Given two strings X and Y over a finite alphabet, this paper defines a new normalized edit distance between X and Y as a simple function of their lengths (|X| and |Y|) and the Generalized Levenshtein Distance (GLD) between them. The new distance can be easily computed through GLD with a complexity of O(|X|·|Y|) and it is a metric valued in [0, 1] under the condition that the weight function is a metric over the set of elementary edit operations with all costs of insertions/deletions having the same weight. Experiments using the AESA algorithm in handwritten digit recognition show that the new distance can generally provide similar results to some other normalized edit distances and may perform slightly better if the triangle inequality is violated in a particular data set.
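
    Under unit edit costs, the construction can be sketched as follows; alpha is taken to be the common insertion/deletion weight, and the closed form 2·GLD/(alpha·(|X|+|Y|)+GLD) is this sketch's reading of the paper's definition:

```python
def generalized_levenshtein(x, y, ins=1.0, dele=1.0, sub=1.0):
    # Generalized Levenshtein Distance with uniform weights,
    # standard dynamic programming in O(|X|*|Y|).
    m, n = len(x), len(y)
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i * dele
    for j in range(1, n + 1):
        d[0][j] = j * ins
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0.0 if x[i - 1] == y[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + dele,
                          d[i][j - 1] + ins,
                          d[i - 1][j - 1] + cost)
    return d[m][n]

def normalized_levenshtein(x, y, alpha=1.0):
    # Normalized edit distance in [0, 1]; alpha is the common
    # insertion/deletion weight (assumed formula, see lead-in).
    gld = generalized_levenshtein(x, y)
    if gld == 0:
        return 0.0
    return 2.0 * gld / (alpha * (len(x) + len(y)) + gld)
```

    The value is 0 for identical strings and reaches 1 when one string is empty and the other is not; e.g. normalized_levenshtein("abc", "abd") is 2/7.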

  11. Quasi-Einstein metrics on hypersurface families

    NASA Astrophysics Data System (ADS)

    Hall, Stuart James

    2013-02-01

    We construct quasi-Einstein metrics on some hypersurface families. The hypersurfaces are circle bundles over the product of Fano, Kähler-Einstein manifolds. The quasi-Einstein metrics are related to various gradient Kähler-Ricci solitons constructed by Dancer and Wang and some Hermitian, non-Kähler, Einstein metrics constructed by Wang and Wang on the same manifolds.

  12. Metrics for Evaluation of Student Models

    ERIC Educational Resources Information Center

    Pelanek, Radek

    2015-01-01

    Researchers use many different metrics for evaluation of performance of student models. The aim of this paper is to provide an overview of commonly used metrics, to discuss properties, advantages, and disadvantages of different metrics, to summarize current practice in educational data mining, and to provide guidance for evaluation of student…

  14. Properties of C-metric spaces

    NASA Astrophysics Data System (ADS)

    Croitoru, Anca; Apreutesei, Gabriela; Mastorakis, Nikos E.

    2017-09-01

    The subject of this paper belongs to the theory of approximate metrics [23]. An approximate metric on X is a real application defined on X × X that satisfies only a part of the metric axioms. In a recent paper [23], we introduced a new type of approximate metric, named C-metric: an application that satisfies only two metric axioms, symmetry and the triangle inequality. The remarkable fact in a C-metric space is that a topological structure induced by the C-metric can be defined. The innovative idea of this paper is that we obtain some convergence properties of a C-metric space in the absence of a metric. In this paper we investigate C-metric spaces. The paper is divided into four sections. Section 1 is the introduction. In Section 2 we recall some concepts and preliminary results. In Section 3 we present some properties of C-metric spaces, such as convergence properties, a canonical decomposition and a C-fixed point theorem. Finally, in Section 4 some conclusions are highlighted.

  15. Hybrid metric-Palatini stars

    NASA Astrophysics Data System (ADS)

    Danilǎ, Bogdan; Harko, Tiberiu; Lobo, Francisco S. N.; Mak, M. K.

    2017-02-01

    We consider the internal structure and the physical properties of specific classes of neutron, quark and Bose-Einstein condensate stars in the recently proposed hybrid metric-Palatini gravity theory, which is a combination of the metric and Palatini f(R) formalisms. It turns out that the theory is very successful in accounting for the observed phenomenology, since it unifies local constraints at the Solar System level and the late-time cosmic acceleration, even if the scalar field is very light. In this paper, we derive the equilibrium equations for a spherically symmetric configuration (mass continuity and Tolman-Oppenheimer-Volkoff) in the framework of the scalar-tensor representation of the hybrid metric-Palatini theory, and we investigate their solutions numerically for different equations of state of neutron and quark matter, by adopting for the scalar field potential a Higgs-type form. It turns out that the scalar-tensor definition of the potential can be represented as a Clairaut differential equation, and provides an explicit form for f(R) given by f(R) ≈ R + Λeff, where Λeff is an effective cosmological constant. Furthermore, stellar models, described by the stiff fluid, radiation-like, bag model and the Bose-Einstein condensate equations of state are explicitly constructed in both general relativity and hybrid metric-Palatini gravity, thus allowing an in-depth comparison between the predictions of these two gravitational theories. As a general result it turns out that for all the considered equations of state, hybrid gravity stars are more massive than their general relativistic counterparts. Furthermore, two classes of stellar models corresponding to two particular choices of the functional form of the scalar field (constant value, and logarithmic form, respectively) are also investigated. Interestingly enough, in the case of a constant scalar field the equation of state of the matter takes the form of the bag model equation of state describing

  16. Underwater target classification in changing environments using adaptive feature mapping schemes

    NASA Astrophysics Data System (ADS)

    Yao, De; Azimi-Sadjadi, Mahmood R.; Li, Donghui; Dobeck, Gerald J.

    2000-08-01

    A new adaptive feature mapping scheme is presented in this paper to cope with environmental and target signature changes in underwater target classification. A wavelet packet-based feature extraction scheme is used in conjunction with the linear prediction coding (LPC) scheme as the front-end processor. The core of the adaptive classification system is the adaptive feature mapping sub-system that minimizes the classification error of the classifier. The extracted feature vector is mapped by the resultant feature mapping matrix in such a way that the mapped version remains invariant to the environmental and sensory changes. The feedback to the adaptation mechanism is provided by a K-nearest neighbor (K-NN) classifier. In order to alleviate problems caused by poorly scaled features, a revised K-NN based on the scaled Euclidean distance was adopted. Two error criteria were used in the adaptive system: one is the least squares (LS) error criterion and the other is a 2D sigmoid cost function. These two criteria were combined to offer better performance. Test results on 40 kHz linear FM acoustic backscattered data collected for six different objects are presented. The effectiveness of the adaptive system vs. the non-adaptive system is demonstrated when the signal-to-reverberation ratio in the environment is varying.
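
    The revised K-NN with scaled Euclidean distance can be sketched as below; scaling each dimension by its training-set standard deviation is an assumption about the exact scaling used:

```python
def scaled_knn_classify(query, train_x, train_y, k=3):
    # Per-dimension scale: standard deviation over the training set
    # (dimensions with zero spread fall back to a scale of 1.0), so
    # poorly scaled features cannot dominate the distance.
    dims = len(query)
    n = len(train_x)
    scales = []
    for d in range(dims):
        col = [x[d] for x in train_x]
        mean = sum(col) / n
        var = sum((v - mean) ** 2 for v in col) / n
        scales.append(var ** 0.5 or 1.0)

    def dist(a, b):
        # Scaled (squared) Euclidean distance.
        return sum(((a[d] - b[d]) / scales[d]) ** 2 for d in range(dims))

    neighbours = sorted((dist(query, x), lab)
                        for x, lab in zip(train_x, train_y))[:k]
    votes = {}
    for _, lab in neighbours:
        votes[lab] = votes.get(lab, 0) + 1
    return max(votes, key=votes.get)
```

    Without the scaling, a feature spanning hundreds of units would swamp one spanning fractions of a unit, which is exactly the problem the revised K-NN addresses.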

  17. Considerations in Physiological Metric Selection for Online Detection of Operator State: A Case Study

    DTIC Science & Technology

    2016-07-17

    blood flow velocity (CBFV) [22], electroencephalography (EEG) [23], and eye tracking metrics [24]. Prominent dual process theories have proposed that...assessments to gauge fatigue in adaptive automation systems has focused on EEG metrics and event-related potential (ERP) analysis [26]. With the...onset of fatigue, EEG registers relatively reliable increases in slow wave activity, related to drowsiness and sleep, and alpha wave activity

  18. A new distribution metric for image segmentation

    NASA Astrophysics Data System (ADS)

    Sandhu, Romeil; Georgiou, Tryphon; Tannenbaum, Allen

    2008-03-01

    In this paper, we present a new distribution metric for image segmentation that arises as a result in prediction theory. Forming a natural geodesic, our metric quantifies "distance" for two density functionals as the standard deviation of the difference between logarithms of those distributions. Using level set methods, we incorporate an energy model based on the metric into the Geometric Active Contour framework. Moreover, we briefly provide a theoretical comparison between the popular Fisher Information metric, from which the Bhattacharyya distance originates, with the newly proposed similarity metric. In doing so, we demonstrate that segmentation results are directly impacted by the type of metric used. Specifically, we qualitatively compare the Bhattacharyya distance and our algorithm on the Kaposi Sarcoma, a pathology that infects the skin. We also demonstrate the algorithm on several challenging medical images, which further ensure the viability of the metric in the context of image segmentation.
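Read literally, the proposed metric is the standard deviation of the difference between the logarithms of two densities. A hedged sketch for discrete (histogram) densities follows; the uniform weighting over bins is an assumption and may differ from the paper's exact construction:

```python
import numpy as np

def log_std_distance(p, q, eps=1e-12):
    """Distance between two discrete densities p and q, read directly
    from the abstract's definition: the standard deviation of
    log(p) - log(q).  Uniform weighting over bins is assumed here;
    `eps` guards against log(0)."""
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    r = np.log(p + eps) - np.log(q + eps)
    return float(r.std())

# Identical distributions are at distance zero; reordered mass is not.
d_same = log_std_distance([0.2, 0.3, 0.5], [0.2, 0.3, 0.5])
d_diff = log_std_distance([0.2, 0.3, 0.5], [0.5, 0.3, 0.2])
```

Because the standard deviation ignores a constant offset, the metric is insensitive to a uniform rescaling of one density, which is consistent with comparing shapes of distributions rather than raw magnitudes.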

  19. Texture metric that predicts target detection performance

    NASA Astrophysics Data System (ADS)

    Culpepper, Joanne B.

    2015-12-01

    Two texture metrics based on gray level co-occurrence error (GLCE) are used to predict probability of detection and mean search time. The two texture metrics are local clutter metrics and are based on the statistics of GLCE probability distributions. The degree of correlation between various clutter metrics and the target detection performance of the nine military vehicles in complex natural scenes found in the Search_2 dataset are presented. Comparison is also made between four other common clutter metrics found in the literature: root sum of squares, Doyle, statistical variance, and target structure similarity. The experimental results show that the GLCE energy metric is a better predictor of target detection performance when searching for targets in natural scenes than the other clutter metrics studied.
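GLCE statistics build on the gray level co-occurrence matrix (GLCM). Below is a generic sketch of the GLCM energy statistic (angular second moment) for a quantized image; it illustrates the family of measures involved, not the paper's exact GLCE formulation:

```python
import numpy as np

def glcm_energy(img, levels=8, dx=1, dy=0):
    """Energy (angular second moment) of a gray level co-occurrence
    matrix for the pixel offset (dx, dy).  A generic texture statistic,
    not the paper's specific GLCE metric."""
    img = np.asarray(img)
    # Quantize to `levels` gray levels.
    q = (img.astype(float) / (img.max() + 1e-12) * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    for yy in range(h - dy):
        for xx in range(w - dx):
            glcm[q[yy, xx], q[yy + dy, xx + dx]] += 1
    glcm /= glcm.sum()                      # normalize to a probability
    return float((glcm ** 2).sum())

# A perfectly flat patch vs. a patterned one.
flat = np.ones((8, 8))
noisy = np.arange(64).reshape(8, 8) % 7
```

A uniform patch concentrates all co-occurrences in a single cell and so has maximal energy (1.0); cluttered texture spreads the mass across cells and lowers it, which is why energy can serve as a clutter measure.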

  20. Rapid cytometric antibiotic susceptibility testing utilizing adaptive multidimensional statistical metrics.

    PubMed

    Huang, Tzu-Hsueh; Ning, Xinghai; Wang, Xiaojian; Murthy, Niren; Tzeng, Yih-Ling; Dickson, Robert M

    2015-02-03

    Flow cytometry holds promise to accelerate antibiotic susceptibility determinations; however, without robust multidimensional statistical analysis, general discrimination criteria have remained elusive. In this study, a new statistical method, probability binning signature quadratic form (PB-sQF), was developed and applied to analyze flow cytometric data of bacterial responses to antibiotic exposure. Both sensitive lab strains (Escherichia coli and Pseudomonas aeruginosa) and a multidrug resistant, clinically isolated strain (E. coli) were incubated with the bacteria-targeted dye, maltohexaose-conjugated IR786, and each of many bactericidal or bacteriostatic antibiotics to identify changes induced around corresponding minimum inhibition concentrations (MIC). The antibiotic-induced damages were monitored by flow cytometry after 1-h incubation through forward scatter, side scatter, and fluorescence channels. The 3-dimensional differences between the flow cytometric data of the no-antibiotic treated bacteria and the antibiotic-treated bacteria were characterized by PB-sQF into a 1-dimensional linear distance. A 99% confidence level was established by statistical bootstrapping for each antibiotic-bacteria pair. For the susceptible E. coli strain, statistically significant increments from this 99% confidence level were observed from 1/16x MIC to 1x MIC for all the antibiotics. The same increments were recorded for P. aeruginosa, which has been reported to cause difficulty in flow-based viability tests. For the multidrug resistant E. coli, significant distances from control samples were observed only when an effective antibiotic treatment was utilized. Our results suggest that a rapid and robust antimicrobial susceptibility test (AST) can be constructed by statistically characterizing the differences between sample and control flow cytometric populations, even in a label-free scheme with scattered light alone. 
These distances vs paired controls coupled with rigorous statistical confidence limits offer a new path toward investigating initial biological responses, screening for drugs, and shortening time to result in antimicrobial sensitivity testing.

  1. Case-Based Behavior Adaptation Using an Inverse Trust Metric

    DTIC Science & Technology

    2014-06-01

    ...wheeled unmanned ground vehicle (UGV) and uses eBotwork’s built-in natural language processing (for interpreting user commands), locomotion, and path...Karray, F.; and Morckos, M. 2012. Modelling of robot attention demand in human-robot interaction using finite fuzzy state automata. In Interna

  2. Adapting Dyna-METRIC to Assess Non-Aircraft Systems.

    DTIC Science & Technology

    1986-05-01

    geographically separated TACS units deployed throughout Germany would improve the supply support and capability of the units. To accomplish this they...value in the "ordered" segment of the pipeline, which is an incorrect and undesirable result. Once again, AFLC/XRS investigated the problem and could

  3. Achieving effective landscape conservation: evolving demands adaptive metrics

    USDA-ARS?s Scientific Manuscript database

    Rapid changes in demographics and on- and off-farm land use limit the impacts of U.S. conservation programs and present particular challenges to future conservation efforts. The fragmentation of landscape through urban, suburban, and peri-urban development, coincident with demographic shifts, has s...

  4. Metric theory of nematoelastic shells.

    PubMed

    Pismen, L M

    2014-12-01

    We consider three-dimensional reshaping of a thin nematoelastic film upon nematic-isotropic transition in the field of a charge one topological defect, leading to either cone or anticone (d-cone) shells. The analysis is based on the relation between the shell metric and the tensor order parameter under the assumption of no elastic deformation and volume change. The shape of the shell can be modified by doping, creating cones with curved generatrices. Anticones necessarily have an even number of radial creases. The curvature singularity at the apex is resolved due to decay of the nematic order parameter at the defect core.

  5. Product Operations Status Summary Metrics

    NASA Technical Reports Server (NTRS)

    Takagi, Atsuya; Toole, Nicholas

    2010-01-01

    The Product Operations Status Summary Metrics (POSSUM) computer program provides a readable view into the state of the Phoenix Operations Product Generation Subsystem (OPGS) data pipeline. POSSUM provides a user interface that can search the data store, collect product metadata, and display the results in an easily-readable layout. It was designed with flexibility in mind for support in future missions. Flexibility over various data store hierarchies is provided through the disk-searching facilities of Marsviewer. This is a proven program that has been in operational use since the first day of the Phoenix mission.

  6. Comparing Resource Adequacy Metrics: Preprint

    SciTech Connect

    Ibanez, E.; Milligan, M.

    2014-09-01

    As the penetration of variable generation (wind and solar) increases around the world, there is an accompanying growing interest and importance in accurately assessing the contribution that these resources can make toward planning reserve. This contribution, also known as the capacity credit or capacity value of the resource, is best quantified by using a probabilistic measure of overall resource adequacy. In recognizing the variable nature of these renewable resources, there has been interest in exploring the use of reliability metrics other than loss of load expectation. In this paper, we undertake some comparisons using data from the Western Electricity Coordinating Council in the western United States.
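The baseline probabilistic adequacy measure mentioned here, loss of load expectation (LOLE), can be illustrated for a toy system by enumerating unit outage states. Real adequacy studies use capacity-outage convolution or Monte Carlo rather than enumeration, and the unit data below are hypothetical:

```python
from itertools import product

def lole(unit_caps, unit_fors, hourly_load):
    """Loss of load expectation: expected number of hours in which
    available capacity falls short of load, computed by exact
    enumeration of independent unit on/off states.  `unit_fors` are
    forced outage rates.  Feasible only for very small systems."""
    expected_short_hours = 0.0
    for load in hourly_load:
        p_short = 0.0
        for states in product([0, 1], repeat=len(unit_caps)):
            p, cap = 1.0, 0.0
            for s, c, f in zip(states, unit_caps, unit_fors):
                p *= (1 - f) if s else f    # availability vs. outage
                cap += c if s else 0.0
            if cap < load:
                p_short += p
        expected_short_hours += p_short
    return expected_short_hours

# Two 100 MW units, each with a 10% forced outage rate, serving a
# constant 150 MW load for 10 hours: shortfall unless both are up.
val = lole([100, 100], [0.1, 0.1], [150] * 10)
```

Per hour the shortfall probability is 1 - 0.9 x 0.9 = 0.19, so LOLE over 10 hours is 1.9 hours; the capacity value of an added resource is how much extra load can be served at the same LOLE.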

  7. Elliptic constructions of hyperkahler metrics

    NASA Astrophysics Data System (ADS)

    Ionas, Radu Aurelian

    In this dissertation we develop a twistor-theoretic method of constructing hyperkahler metrics from holomorphic functions and elliptic curves. We obtain, among other things, new results concerning the Atiyah-Hitchin manifold, asymptotically locally Euclidean spaces of type D_n and certain Swann bundles. For example, in the Atiyah-Hitchin case we derive in an explicit holomorphic coordinate basis closed-form formulas for the metric, the holomorphic symplectic form and all three Kahler potentials. The equation describing an asymptotically locally Euclidean space of type D_n is found to admit an algebraic formulation in terms of the group law on a Weierstrass cubic. This curve has the structure of a Cayley cubic for a pencil generated by two transversal plane conics, that is, it takes the form Y^2 = det(A + XB), where A and B are the defining 3 x 3 matrices of the conics. In this light, the equation can be interpreted as the closure condition for an elliptic billiard trajectory tangent to the conic B and bouncing into various conics of the pencil determined by the positions of the monopoles.

  8. A family of heavenly metrics

    NASA Astrophysics Data System (ADS)

    Nutku, Y.; Sheftel, M. B.

    2014-02-01

    This is a corrected and essentially extended version of the unpublished manuscript by Y Nutku and M Sheftel which contains new results. It is proposed to be published in honour of Y Nutku’s memory. All corrections and new results in sections 1, 2 and 4 are due to M Sheftel. We present new anti-self-dual exact solutions of the Einstein field equations with Euclidean and neutral (ultra-hyperbolic) signatures that admit only one rotational Killing vector. Such solutions of the Einstein field equations are determined by non-invariant solutions of Boyer-Finley (BF) equation. For the case of Euclidean signature such a solution of the BF equation was first constructed by Calderbank and Tod. Two years later, Martina, Sheftel and Winternitz applied the method of group foliation to the BF equation and reproduced the Calderbank-Tod solution together with new solutions for the neutral signature. In the case of Euclidean signature we obtain new metrics which asymptotically locally look like a flat space and have a non-removable singular point at the origin. In the case of ultra-hyperbolic signature there exist three inequivalent forms of metric. Only one of these can be obtained by analytic continuation from the Calderbank-Tod solution whereas the other two are new.

  9. Determining GPS average performance metrics

    NASA Technical Reports Server (NTRS)

    Moore, G. V.

    1995-01-01

    Analytic and semi-analytic methods are used to show that users of the GPS constellation can expect performance variations based on their location. Specifically, performance is shown to be a function of both altitude and latitude. These results stem from the fact that the GPS constellation is itself non-uniform. For example, GPS satellites are over four times as likely to be directly over Tierra del Fuego than over Hawaii or Singapore. Inevitable performance variations due to user location occur for ground, sea, air and space GPS users. These performance variations can be studied in an average relative sense. A semi-analytic tool which symmetrically allocates GPS satellite latitude belt dwell times among longitude points is used to compute average performance metrics. These metrics include average number of GPS vehicles visible, relative average accuracies in the radial, intrack and crosstrack (or radial, north/south, east/west) directions, and relative average PDOP or GDOP. The tool can be quickly changed to incorporate various user antenna obscuration models and various GPS constellation designs. Among other applications, tool results can be used in studies to: predict locations and geometries of best/worst case performance, design GPS constellations, determine optimal user antenna location and understand performance trends among various users.

  11. Compressed Sensing for Metrics Development

    NASA Astrophysics Data System (ADS)

    McGraw, R. L.; Giangrande, S. E.; Liu, Y.

    2012-12-01

    Models by their very nature tend to be sparse in the sense that they are designed, with a few optimally selected key parameters, to provide simple yet faithful representations of a complex observational dataset or computer simulation output. This paper seeks to apply methods from compressed sensing (CS), a new area of applied mathematics currently undergoing a very rapid development (see for example Candes et al., 2006), to FASTER needs for new approaches to model evaluation and metrics development. The CS approach will be illustrated for a time series generated using a few-parameter (i.e. sparse) model. A seemingly incomplete set of measurements, taken at a just few random sampling times, is then used to recover the hidden model parameters. Remarkably there is a sharp transition in the number of required measurements, beyond which both the model parameters and time series are recovered exactly. Applications to data compression, data sampling/collection strategies, and to the development of metrics for model evaluation by comparison with observation (e.g. evaluation of model predictions of cloud fraction using cloud radar observations) are presented and discussed in context of the CS approach. Cited reference: Candes, E. J., Romberg, J., and Tao, T. (2006), Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information, IEEE Transactions on Information Theory, 52, 489-509.
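The sharp-transition recovery phenomenon described can be reproduced with any standard l1 sparse-recovery solver. Below is a sketch using iterative soft thresholding (ISTA), a generic compressed-sensing routine; it is not specific to the FASTER work, and the problem sizes are illustrative:

```python
import numpy as np

def ista(A, y, lam=1e-3, n_iter=5000):
    """Iterative soft-thresholding (ISTA) for the l1-regularized least
    squares problem  min 0.5*||Ax - y||^2 + lam*||x||_1,  a standard
    sparse-recovery solver used to illustrate compressed sensing."""
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = x - A.T @ (A @ x - y) / L       # gradient step on the quadratic
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrinkage
    return x

rng = np.random.default_rng(0)
n, m, k = 64, 24, 3                         # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true                              # "incomplete" random measurements
x_hat = ista(A, y)
err = float(np.linalg.norm(x_hat - x_true))
```

With m = 24 random measurements of a 3-sparse length-64 signal, the recovery error is typically small; shrinking m below a threshold makes recovery fail abruptly, which is the sharp transition the abstract refers to.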

  12. Metrics for building performance assurance

    SciTech Connect

    Koles, G.; Hitchcock, R.; Sherman, M.

    1996-07-01

    This report documents part of the work performed in phase I of a Laboratory Directed Research and Development (LDRD) funded project entitled Building Performance Assurances (BPA). The focus of the BPA effort is to transform the way buildings are built and operated in order to improve building performance by facilitating or providing tools, infrastructure, and information. The efforts described herein focus on the development of metrics with which to evaluate building performance and for which information and optimization tools need to be developed. The classes of building performance metrics reviewed are (1) Building Services, (2) First Costs, (3) Operating Costs, (4) Maintenance Costs, and (5) Energy and Environmental Factors. The first category defines the direct benefits associated with buildings; the next three are different kinds of costs associated with providing those benefits; the last category includes concerns that are broader than direct costs and benefits to the building owner and building occupants. The level of detail given the various issues reflects the current state of knowledge in those scientific areas and the ability to determine that state of knowledge, rather than directly reflecting the importance of these issues; the report intentionally does not focus specifically on energy issues. The report describes work in progress, is intended as a resource, and can be used to indicate the areas needing more investigation. Other reports on BPA activities are also available.

  13. Jacobi-Maupertuis-Eisenhart metric and geodesic flows

    NASA Astrophysics Data System (ADS)

    Chanda, Sumanto; Gibbons, G. W.; Guha, Partha

    2017-03-01

    The Jacobi metric derived from the line element by one of the authors is shown to reduce to the standard formulation in the non-relativistic approximation. We obtain the Jacobi metric for various stationary metrics. Finally, the Jacobi-Maupertuis metric is formulated for time-dependent metrics by including the Eisenhart-Duval lift, known as the Jacobi-Eisenhart metric.
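For reference, the standard non-relativistic Jacobi-Maupertuis form, to which the paper's construction is said to reduce, can be written as follows (textbook convention; overall factor conventions vary between authors):

```latex
% Trajectories of fixed energy E in a potential V(x), on a configuration
% space with metric g_{ij}, are geodesics of the Jacobi metric
ds_J^2 \;=\; 2m\,\bigl(E - V(x)\bigr)\, g_{ij}\, dx^i\, dx^j .
```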

  14. 48 CFR 611.002-70 - Metric system implementation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... item is designated, produced and described in inch-pound values with soft metric values also shown for... of measurement sensitive processes and systems to the metric system. Soft metric means the result of... possible. Alternatives to hard metric are soft, dual and hybrid metric terms. The Metric Handbook...

  15. Comparison of conservation metrics in a case study of lemurs.

    PubMed

    Gudde, Renske; Venditti, Chris

    2016-12-01

    Conservation planning is important to protect species from going extinct now that natural habitats are decreasing owing to human activity and climate change. However, there is considerable controversy in choosing appropriate metrics to weigh the value of species and geographic regions. For example, the added value of phylogenetic conservation-selection criteria remains disputed because high correlations between them and the nonphylogenetic criteria of species richness have been reported. We evaluated the commonly used conservation metrics species richness, endemism, phylogenetic diversity (PD), and phylogenetic endemism (PE) in a case study on lemurs of Madagascar. This enabled us to identify the conservation target of each metric and consider how they may be used in future conservation planning. We also devised a novel metric that uses a phylogeny scaled according to the rate of phenotypic evolution as a proxy for a species' ability to adapt to change. High rates of evolution may indicate generalization or specialization. Both specialization and low rates of evolution may result in an inability to adapt to changing environments. We examined conservation priorities by using the inverse of the rate of body mass evolution to account for species with low rates of evolution. In line with previous work, we found high correlations among species richness and PD (r = 0.96), and endemism and PE (r = 0.82) in Malagasy lemurs. Phylogenetic endemism in combination with rates of evolution and their inverse prioritized grid cells containing highly endemic and specialized lemurs at risk of extinction, such as Avahi occidentalis and Lepilemur edwardsi, 2 endangered lemurs with high rates of phenotypic evolution and low-quality diets, and Hapalemur aureus, a critically endangered species with a low rate of body mass evolution and a diet consisting of very high doses of cyanide.
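Of the metrics compared, Faith's phylogenetic diversity (PD) is the easiest to sketch: the total branch length of the subtree spanning a set of taxa. The toy tree encoding below (parent and branch-length dictionaries) is purely illustrative, and rooted-versus-unrooted conventions for PD vary:

```python
def phylogenetic_diversity(parent, brlen, taxa):
    """Faith's PD: total branch length of the subtree connecting
    `taxa` up to the root.  `parent` maps each node to its parent,
    `brlen` maps each node to the length of the branch above it.
    Shared branches are counted once."""
    counted = set()
    total = 0.0
    for t in taxa:
        node = t
        while node is not None and node not in counted:
            counted.add(node)
            total += brlen.get(node, 0.0)
            node = parent.get(node)
    return total

# Tiny tree: root -> n1 -> (A, B), and root -> C.
parent = {"A": "n1", "B": "n1", "n1": "root", "C": "root"}
brlen = {"A": 1.0, "B": 2.0, "n1": 0.5, "C": 3.0}
pd_ab = phylogenetic_diversity(parent, brlen, ["A", "B"])     # 1 + 2 + 0.5
pd_all = phylogenetic_diversity(parent, brlen, ["A", "B", "C"])
```

Adding the divergent taxon C raises PD by its full 3.0 branch, which is why PD rewards evolutionary distinctiveness rather than raw species counts.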

  16. Multifractal Resilience Metrics for Complex Systems?

    NASA Astrophysics Data System (ADS)

    Schertzer, D. J.; Tchiguirinskaia, I.; Lovejoy, S.

    2011-12-01

    The term resilience has become extremely fashionable, especially for complex systems, whereas corresponding operational definitions have remained rather elusive (Carpenter et al. 2001). More precisely, the resilience assessment of man-made systems (from nuclear plants to cities) to geophysical extremes requires mathematically defined resilience metrics based on some conceptual definition, e.g. the often cited definition of "ecological resilience" (Hollings 1973): "the capacity of a system to absorb disturbance and reorganize while undergoing change so as to still retain essentially the same function, structure, identity, and feedbacks". Surprisingly, whereas it was acknowledged by Folke et al. (2010) that "multiscale resilience is fundamental for understanding the interplay between persistence and change, adaptability and transformability", the relation between resilience and scaling has not been much questioned; see however Peterson (2000). We argue that it is indispensable to go well beyond the attractor approach (Pimm and Lawton 1977; Collings and Wollkind 1990), as well as its extensions (Martin et al., 2011) into the framework of viability theory (Aubin 1991; Aubin et al. 2011), since both are limited to systems that are complex only in time. Scale symmetries are indispensable to reduce the space-time complexity by defining scale-independent observables, which are the singularities of the original, scale-dependent fields. These singularities make it possible to define across-scale resilience, instead of resilience at a given scale.

  17. Fighter agility metrics, research, and test

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.; Valasek, John; Eggold, David P.

    1990-01-01

    Proposed new metrics to assess fighter aircraft agility are collected and analyzed. A framework for classification of these new agility metrics is developed and applied. A completed set of transient agility metrics is evaluated with a high fidelity, nonlinear F-18 simulation provided by the NASA Dryden Flight Research Center. Test techniques and data reduction methods are proposed. A method of providing cuing information to the pilot during flight test is discussed. The sensitivity of longitudinal and lateral agility metrics to deviations from the pilot cues is studied in detail. The metrics are shown to be largely insensitive to reasonable deviations from the nominal test pilot commands. Instrumentation required to quantify agility via flight test is also considered. With one exception, each of the proposed new metrics may be measured with instrumentation currently available. Simulation documentation and user instructions are provided in an appendix.

  18. A Sensor-Independent Gust Hazard Metric

    NASA Technical Reports Server (NTRS)

    Stewart, Eric C.

    2001-01-01

    A procedure for calculating an intuitive hazard metric for gust effects on airplanes is described. The hazard metric is for use by pilots and is intended to replace subjective pilot reports (PIREPs) of the turbulence level. The hazard metric is composed of three numbers: the first describes the average airplane response to the turbulence, the second describes the positive peak airplane response to the gusts, and the third describes the negative peak airplane response to the gusts. The hazard metric is derived from any time history of vertical gust measurements and is thus independent of the sensor making the gust measurements. The metric is demonstrated for one simulated airplane encountering different types of gusts including those derived from flight data recorder measurements of actual accidents. The simulated airplane responses to the gusts compare favorably with the hazard metric.
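The three-number metric lends itself to a direct sketch. The synthetic response series below is illustrative, and the model that maps vertical gust measurements to airplane response (which makes the metric sensor-independent) is outside its scope:

```python
import numpy as np

def gust_hazard_metric(response):
    """The three-number hazard metric described above, from a time
    history of airplane response to gusts: (average response magnitude,
    positive peak response, negative peak response)."""
    r = np.asarray(response, dtype=float)
    return float(np.abs(r).mean()), float(r.max()), float(r.min())

# Synthetic response: mild oscillation with one sharp up-gust and one
# milder down-gust injected (values are hypothetical).
t = np.linspace(0.0, 10.0, 501)
resp = 0.1 * np.sin(2 * np.pi * 0.5 * t)
resp[200] = 0.8        # positive peak
resp[350] = -0.4       # negative peak
avg, pos_peak, neg_peak = gust_hazard_metric(resp)
```

Reporting the two peaks separately preserves the asymmetry of the encounter, which a single RMS turbulence number (or a subjective PIREP) would hide.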

  19. Differentiation of AmpC beta-lactamase binders vs. decoys using classification kNN QSAR modeling and application of the QSAR classifier to virtual screening

    NASA Astrophysics Data System (ADS)

    Hsieh, Jui-Hua; Wang, Xiang S.; Teotico, Denise; Golbraikh, Alexander; Tropsha, Alexander

    2008-09-01

    The use of inaccurate scoring functions in docking algorithms may result in the selection of compounds with high predicted binding affinity that nevertheless are known experimentally not to bind to the target receptor. Such falsely predicted binders have been termed `binding decoys'. We posed a question as to whether true binders and decoys could be distinguished based only on their structural chemical descriptors using approaches commonly used in ligand based drug design. We have applied the k-Nearest Neighbor ( kNN) classification QSAR approach to a dataset of compounds characterized as binders or binding decoys of AmpC beta-lactamase. Models were subjected to rigorous internal and external validation as part of our standard workflow and a special QSAR modeling scheme was employed that took into account the imbalanced ratio of inhibitors to non-binders (1:4) in this dataset. 342 predictive models were obtained with correct classification rate (CCR) for both training and test sets as high as 0.90 or higher. The prediction accuracy was as high as 100% (CCR = 1.00) for the external validation set composed of 10 compounds (5 true binders and 5 decoys) selected randomly from the original dataset. For an additional external set of 50 known non-binders, we have achieved the CCR of 0.87 using very conservative model applicability domain threshold. The validated binary kNN QSAR models were further employed for mining the NCGC AmpC screening dataset (69653 compounds). The consensus prediction of 64 compounds identified as screening hits in the AmpC PubChem assay disagreed with their annotation in PubChem but was in agreement with the results of secondary assays. At the same time, 15 compounds were identified as potential binders contrary to their annotation in PubChem. Five of them were tested experimentally and showed inhibitory activities in millimolar range with the highest binding constant Ki of 135 μM. Our studies suggest that validated QSAR models could complement
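The correct classification rate (CCR) used for these imbalanced binder/decoy sets is commonly computed as the mean of per-class accuracies (i.e. balanced accuracy); the sketch below assumes that definition and uses toy labels:

```python
def correct_classification_rate(y_true, y_pred):
    """CCR as the mean of per-class accuracies, so that a 1:4
    binder/decoy imbalance cannot be gamed by always predicting the
    majority class.  Assumed definition; conventions vary."""
    classes = sorted(set(y_true))
    accs = []
    for c in classes:
        idx = [i for i, t in enumerate(y_true) if t == c]
        hits = sum(1 for i in idx if y_pred[i] == c)
        accs.append(hits / len(idx))
    return sum(accs) / len(accs)

# 1:4 imbalanced toy set: two binders (1), eight decoys (0).
y_true = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0]
ccr_guess = correct_classification_rate(y_true, [0] * 10)  # majority guess
ccr_good = correct_classification_rate(y_true, y_true)     # perfect model
```

Always predicting the majority class reaches 80% raw accuracy here but only CCR = 0.5, which is why CCR rather than raw accuracy is the appropriate figure of merit for a 1:4 dataset.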

  20. Common Metrics for Human-Robot Interaction

    NASA Technical Reports Server (NTRS)

    Steinfeld, Aaron; Lewis, Michael; Fong, Terrence; Scholtz, Jean; Schultz, Alan; Kaber, David; Goodrich, Michael

    2006-01-01

    This paper describes an effort to identify common metrics for task-oriented human-robot interaction (HRI). We begin by discussing the need for a toolkit of HRI metrics. We then describe the framework of our work and identify important biasing factors that must be taken into consideration. Finally, we present suggested common metrics for standardization and a case study. Preparation of a larger, more detailed toolkit is in progress.

  1. Is publication rate an equal opportunity metric?

    PubMed

    Cameron, Elissa Z; Gray, Meeghan E; White, Angela M

    2013-01-01

    Publication quantity is frequently used as a ranking metric for employment, promotion, and grant success, and is considered an unbiased metric for comparing applicants. However, research suggests that women publish fewer papers, such that the measure may not be equitable. We suggest reasons for the disparity, and potential future remedies. Publication quality and impact provide more equitable metrics of research performance and should be stressed above publication quantity. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Metrics for antibody therapeutics development.

    PubMed

    Reichert, Janice M

    2010-01-01

    A wide variety of full-size monoclonal antibodies (mAbs) and therapeutics derived from alternative antibody formats can be produced through genetic and biological engineering techniques. These molecules are now filling the preclinical and clinical pipelines of every major pharmaceutical company and many biotechnology firms. Metrics for the development of antibody therapeutics, including averages for the number of candidates entering clinical study and development phase lengths for mAbs approved in the United States, were derived from analysis of a dataset of over 600 therapeutic mAbs that entered clinical study sponsored, at least in part, by commercial firms. The results presented provide an overview of the field and context for the evaluation of on-going and prospective mAb development programs. The expansion of therapeutic antibody use through supplemental marketing approvals and the increase in the study of therapeutics derived from alternative antibody formats are discussed.

  3. A Metric Conceptual Space Algebra

    NASA Astrophysics Data System (ADS)

    Adams, Benjamin; Raubal, Martin

    The modeling of concepts from a cognitive perspective is important for designing spatial information systems that interoperate with human users. Concept representations that are built using geometric and topological conceptual space structures are well suited for semantic similarity and concept combination operations. In addition, concepts that are more closely grounded in the physical world, such as many spatial concepts, have a natural fit with the geometric structure of conceptual spaces. Despite these apparent advantages, conceptual spaces are underutilized because existing formalizations of conceptual space theory have focused on individual aspects of the theory rather than the creation of a comprehensive algebra. In this paper we present a metric conceptual space algebra that is designed to facilitate the creation of conceptual space knowledge bases and inferencing systems. Conceptual regions are represented as convex polytopes and context is built in as a fundamental element. We demonstrate the applicability of the algebra to spatial information systems with a proof-of-concept application.

  4. Performance comparison of video quality metrics

    NASA Astrophysics Data System (ADS)

    Zoran, Kotevski; Pece, Mitrevski

    2010-02-01

    The development of digital video technology introduced a new approach to objective video quality estimation. There are basically two types of metrics for measuring the quality of digital video: purely mathematically defined metrics (DELTA, MSAD, MSE, SNR and PSNR), where the error is calculated as a difference between original and processed pixels, and metrics with characteristics similar to the human visual system (SSIM, NQI, VQM), where perceptual quality is also considered in the overall estimate. The metrics in the first group are the more technical ones; because perceived visual quality is more complex than pixel-error calculation, many examples show that their quality estimates are insufficiently accurate. The second group works differently, incorporating scene structure into the overall quality estimate. This paper is concerned with an experimental comparison of the performance of the Structural Similarity (SSIM) and Video Quality Metric (VQM) metrics for objective video quality estimation. For the purpose of this experiment, more than 300 short video sequences were prepared. Measurements on these sequences are used to chart each metric's response to common changes in processed video: changes in brightness, contrast, hue, saturation and noise. The paper pinpoints the key characteristics of each metric, concludes which performs better, and gives directions for improving objective video quality estimation.
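Of the purely mathematical metrics listed, MSE and PSNR are essentially one-liners; a minimal sketch follows (SSIM and VQM require the structural computations the paper compares and are not reproduced here):

```python
import numpy as np

def mse(ref, test):
    """Mean squared pixel error between a reference and processed frame."""
    ref = np.asarray(ref, dtype=float)
    test = np.asarray(test, dtype=float)
    return float(((ref - test) ** 2).mean())

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB; infinite for identical frames."""
    m = mse(ref, test)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

# Toy 8-bit "frames": a flat patch and the same patch with one
# corrupted pixel (values are illustrative).
ref = np.full((4, 4), 128.0)
noisy = ref.copy()
noisy[0, 0] += 16.0
```

A single corrupted pixel already shifts PSNR measurably, yet says nothing about whether the error is visible, which is exactly the shortcoming of the first group of metrics that the paper discusses.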

  5. Semantic Metrics for Object Oriented Design

    NASA Technical Reports Server (NTRS)

    Etzkorn, Lethe

    2003-01-01

    The purpose of this proposal is to research a new suite of object-oriented (OO) software metrics, called semantic metrics, that have the potential to help software engineers identify fragile, low-quality code sections much earlier in the development cycle than is possible with traditional OO metrics. With earlier and better fault detection, software maintenance will be less time consuming and expensive, and software reusability will be improved. Because it is less costly to correct faults found earlier than to correct faults found later in the software lifecycle, the overall cost of software development will be reduced. Semantic metrics can be derived from the knowledge base of a program understanding system. A program understanding system is designed to understand a software module. Once understanding is complete, the knowledge base contains digested information about the software module, on which various semantic metrics can be collected. This new kind of metric measures domain complexity, or the relationship of the software to its application domain, rather than implementation complexity, which is what traditional software metrics measure. A semantic metric will thus map much more closely to qualities humans are interested in, such as cohesion and maintainability, than is possible using traditional metrics, which are calculated using only syntactic aspects of software.

  6. Bounds for phylogenetic network space metrics.

    PubMed

    Francis, Andrew; Huber, Katharina T; Moulton, Vincent; Wu, Taoyang

    2017-08-23

    Phylogenetic networks are a generalization of phylogenetic trees that allow for representation of reticulate evolution. Recently, a space of unrooted phylogenetic networks was introduced, where such a network is a connected graph in which every vertex has degree 1 or 3 and whose leaf-set is a fixed set X of taxa. This space, denoted [Formula: see text], is defined in terms of two operations on networks-the nearest neighbor interchange and triangle operations-which can be used to transform any network with leaf set X into any other network with that leaf set. In particular, it gives rise to a metric d on [Formula: see text] which is given by the smallest number of operations required to transform one network in [Formula: see text] into another in [Formula: see text]. The metric generalizes the well-known NNI-metric on phylogenetic trees which has been intensively studied in the literature. In this paper, we derive a bound for the metric d as well as a related metric [Formula: see text] which arises when restricting d to the subset of [Formula: see text] consisting of all networks with [Formula: see text] vertices, [Formula: see text]. We also introduce two new metrics on networks-the SPR and TBR metrics-which generalize the metrics on phylogenetic trees with the same name and give bounds for these new metrics. We expect our results to eventually have applications to the development and understanding of network search algorithms.

  7. Fusion metrics for dynamic situation analysis

    NASA Astrophysics Data System (ADS)

    Blasch, Erik P.; Pribilski, Mike; Daughtery, Bryan; Roscoe, Brian; Gunsett, Josh

    2004-08-01

    To design information fusion systems, it is important to develop metrics as part of a test and evaluation strategy. In many cases, fusion systems are designed to (1) meet a specific set of user information needs (IN), (2) continuously validate information pedigree and updates, and (3) maintain this performance under changing conditions. A fusion system's performance is evaluated in many ways. However, developing a consistent set of metrics is important for standardization. For example, many track and identification metrics have been proposed for fusion analysis. To evaluate a complete fusion system performance, level 4 sensor management and level 5 user refinement metrics need to be developed simultaneously to determine whether or not the fusion system is meeting information needs. To describe fusion performance, the fusion community needs to agree on a minimum set of metrics for user assessment and algorithm comparison. We suggest that such a minimum set should include feasible metrics of accuracy, confidence, throughput, timeliness, and cost. These metrics can be computed as confidence (probability), accuracy (error), timeliness (delay), throughput (amount) and cost (dollars). In this paper, we explore an aggregate set of metrics for fusion evaluation and demonstrate with information need metrics for dynamic situation analysis.

  9. Similarity metrics for surgical process models.

    PubMed

    Neumuth, Thomas; Loebe, Frank; Jannin, Pierre

    2012-01-01

    The objective of this work is to introduce a set of similarity metrics for comparing surgical process models (SPMs). SPMs are progression models of surgical interventions that support quantitative analyses of surgical activities for systems engineering or process optimization. Five different similarity metrics are presented and proven. These metrics deal with several dimensions of process compliance in surgery, including granularity, content, time, order, and frequency of surgical activities. The metrics were experimentally validated using 20 clinical data sets each for cataract interventions, craniotomy interventions, and supratentorial tumor resections. The clinical data sets were controllably modified in simulations, which were iterated ten times, resulting in a total of 600 simulated data sets. The simulated data sets were subsequently compared to the original data sets to empirically assess the predictive validity of the metrics. We show that the results of the metrics for the surgical process models correlate significantly (p<0.001) with the induced modifications and that all metrics meet predictive validity. The clinical use of the metrics was demonstrated exemplarily by assessing the learning curves of observers during surgical process model acquisition. Measuring similarity between surgical processes is a complex task. However, metrics for computing the similarity between surgical process models are needed for many applications in the field of medical engineering. These metrics are essential whenever two SPMs need to be compared, such as during the evaluation of technical systems, the education of observers, or the determination of surgical strategies. These metrics are key figures that provide a solid base for medical decisions, such as during validation of sensor systems for use in operating rooms in the future. Copyright © 2011 Elsevier B.V. All rights reserved.

  10. Bimodal spectroscopic evaluation of ultra violet-irradiated mouse skin inflammatory and precancerous stages: instrumentation, spectral feature extraction/selection and classification (k-NN, LDA and SVM)

    NASA Astrophysics Data System (ADS)

    Díaz-Ayil, G.; Amouroux, M.; Blondel, W. C. P. M.; Bourg-Heckly, G.; Leroux, A.; Guillemin, F.; Granjon, Y.

    2009-07-01

    This paper deals with the development and application of in vivo spatially-resolved bimodal spectroscopy (AutoFluorescence AF and Diffuse Reflectance DR), to discriminate various stages of skin precancer in a preclinical model (UV-irradiated mouse): Compensatory Hyperplasia CH, Atypical Hyperplasia AH and Dysplasia D. A programmable instrumentation was developed for acquiring AF emission spectra using 7 excitation wavelengths: 360, 368, 390, 400, 410, 420 and 430 nm, and DR spectra in the 390-720 nm wavelength range. After various steps of intensity spectra preprocessing (filtering, spectral correction and intensity normalization), several sets of spectral characteristics were extracted and selected based on their discrimination power statistically tested for every pair-wise comparison of histological classes. Data reduction with Principal Components Analysis (PCA) was performed and 3 classification methods were implemented (k-NN, LDA and SVM), in order to compare diagnostic performance of each method. Diagnostic performance was studied and assessed in terms of sensitivity (Se) and specificity (Sp) as a function of the selected features, of the combinations of 3 different inter-fibers distances and of the numbers of principal components, such that: Se and Sp ≈ 100% when discriminating CH vs. others; Sp ≈ 100% and Se > 95% when discriminating Healthy vs. AH or D; Sp ≈ 74% and Se ≈ 63% for AH vs. D.
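
    The classification stage described above, PCA for data reduction followed by k-NN, can be sketched generically. The toy 5-D feature data, class labels, and parameters below are illustrative assumptions, not the study's spectra:

    ```python
    import numpy as np

    def pca_fit_transform(X, n_components):
        """Center the data and project it onto the top principal components (via SVD)."""
        mean = X.mean(axis=0)
        Xc = X - mean
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        comps = Vt[:n_components]          # rows are principal directions
        return Xc @ comps.T, mean, comps

    def knn_predict(X_train, y_train, x, k=3):
        """Classify x by majority vote among its k nearest training points (Euclidean)."""
        d = np.linalg.norm(X_train - x, axis=1)
        nearest = y_train[np.argsort(d)[:k]]
        vals, counts = np.unique(nearest, return_counts=True)
        return vals[np.argmax(counts)]

    # Toy stand-in for spectral features: two well-separated classes in 5-D.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 0.1, (10, 5)), rng.normal(1.0, 0.1, (10, 5))])
    y = np.array([0] * 10 + [1] * 10)
    Z, mean, comps = pca_fit_transform(X, n_components=2)
    query = (np.full(5, 0.95) - mean) @ comps.T   # project a new sample into PC space
    print(knn_predict(Z, y, query, k=3))          # the query lies in class 1's cluster
    ```

    Note that the query must be centered with the training mean and projected with the same components before the distance computation, mirroring how a fitted PCA is applied to unseen spectra.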

  11. Dielectric and ferroelectric properties of strain-relieved epitaxial lead-free KNN-LT-LS ferroelectric thin films on SrTiO3 substrates

    NASA Astrophysics Data System (ADS)

    Abazari, M.; Akdoǧan, E. K.; Safari, A.

    2008-05-01

    We report the growth of single-phase (K0.44,Na0.52,Li0.04)(Nb0.84,Ta0.10,Sb0.06)O3 thin films on SrRuO3 coated ⟨001⟩ oriented SrTiO3 substrates by using pulsed laser deposition. Films grown at 600°C under low laser fluence exhibit a ⟨001⟩ textured columnar grained nanostructure, which coalesces with increasing deposition temperature, leading to a uniform, fully epitaxial, highly stoichiometric film at 750°C. However, films deposited at lower temperatures exhibit compositional fluctuations, as verified by Rutherford backscattering spectroscopy. The epitaxial films of 400-600nm thickness have a room temperature relative permittivity of ˜750 and a loss tangent of ˜6% at 1kHz. The room temperature remnant polarization of the films is 4μC/cm2, while the saturation polarization is 7.1μC/cm2 at 24kV/cm and the coercive field is ˜7.3kV/cm. The results indicate that approximately 50% of the bulk permittivity and 20% of the bulk spontaneous polarization can be retained in submicron epitaxial KNN-LT-LS thin films. The conductivity of the films remains a challenge, as evidenced by the high loss tangent, leakage currents, and broad hysteresis loops.

  12. Scholarly Metrics Baseline: A Survey of Faculty Knowledge, Use, and Opinion about Scholarly Metrics

    ERIC Educational Resources Information Center

    DeSanto, Dan; Nichols, Aaron

    2017-01-01

    This article presents the results of a faculty survey conducted at the University of Vermont during academic year 2014-2015. The survey asked faculty about: familiarity with scholarly metrics, metric-seeking habits, help-seeking habits, and the role of metrics in their department's tenure and promotion process. The survey also gathered faculty…

  13. Elementary Metric Curriculum - Project T.I.M.E. (Timely Implementation of Metric Education). Part I.

    ERIC Educational Resources Information Center

    Community School District 18, Brooklyn, NY.

    This is a teacher's manual for an ISS-based elementary school course in the metric system. Behavioral objectives and student activities are included. The topics covered include: (1) linear measurement; (2) metric-decimal relationships; (3) metric conversions; (4) geometry; (5) scale drawings; and (6) capacity. This is the first of a two-part…

  15. Imbedding Locally Euclidean and Conformally Euclidean Metrics

    NASA Astrophysics Data System (ADS)

    Aleksandrov, V. A.

    1992-02-01

    The possibility of imbedding n-dimensional locally Euclidean metrics in the large in Rn is studied by means of the global inverse function theorem in the forms suggested by Hadamard, John, Levy and Plastock. The imbeddability of conformally Euclidean metrics is studied by means of a theorem of Zorich on the removability of an isolated singularity of a locally quasiconformal mapping.

  16. Sensitivity of landscape metrics to pixel size

    Treesearch

    J. D. Wickham; K. H. Riitters

    1995-01-01

    Analyses of diversity and evenness metrics using land cover data are becoming formalized in landscape ecology. Diversity and evenness metrics are dependent on the pixel size (scale) at which the data are collected. Aerial photography was interpreted for land cover and converted into four raster data sets with 4, 12, 28, and 80 m pixel sizes, representing pixel sizes...
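
    Diversity and evenness of land cover are commonly computed from class proportions, e.g. Shannon diversity H′ and Pielou's evenness. The sketch below uses hypothetical cover labels, not the study's data, to show how a coarser pixel size that absorbs a rare class lowers both values:

    ```python
    import math
    from collections import Counter

    def shannon_diversity(pixels):
        """Shannon diversity H' = -sum(p_i * ln p_i) over land-cover classes."""
        counts = Counter(pixels)
        n = len(pixels)
        return -sum((c / n) * math.log(c / n) for c in counts.values())

    def evenness(pixels):
        """Pielou's evenness J = H' / ln(S), where S is the number of classes."""
        s = len(set(pixels))
        return shannon_diversity(pixels) / math.log(s) if s > 1 else 0.0

    # Hypothetical example: resampling to larger pixels absorbs the rare class.
    fine = ["forest"] * 50 + ["water"] * 30 + ["urban"] * 20
    coarse = ["forest"] * 70 + ["water"] * 30   # 'urban' lost at coarse resolution
    print(round(shannon_diversity(fine), 3))    # 1.03
    print(round(shannon_diversity(coarse), 3))  # 0.611
    ```

    The drop from three classes to two is exactly the kind of scale dependence the study quantifies across its 4, 12, 28, and 80 m rasters.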

  17. A Complexity Metric for Automated Separation

    NASA Technical Reports Server (NTRS)

    Aweiss, Arwa

    2009-01-01

    A metric is proposed to characterize airspace complexity with respect to an automated separation assurance function. The Maneuver Option metric is a function of the number of conflict-free trajectory change options the automated separation assurance function is able to identify for each aircraft in the airspace at a given time. By aggregating the metric for all aircraft in a region of airspace, a measure of the instantaneous complexity of the airspace is produced. A six-hour simulation of Fort Worth Center air traffic was conducted to assess the metric. Results showed aircraft were twice as likely to be constrained in the vertical dimension than the horizontal one. By application of this metric, situations found to be most complex were those where level overflights and descending arrivals passed through or merged into an arrival stream. The metric identified high complexity regions that correlate well with current air traffic control operations. The Maneuver Option metric did not correlate with traffic count alone, a result consistent with complexity metrics for human-controlled airspace.

  18. Advanced Life Support System Value Metric

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.; Rasky, Daniel J. (Technical Monitor)

    1999-01-01

    The NASA Advanced Life Support (ALS) Program is required to provide a performance metric to measure its progress in system development. Extensive discussions within the ALS program have led to the following approach. The Equivalent System Mass (ESM) metric has been traditionally used and provides a good summary of the weight, size, and power cost factors of space life support equipment. But ESM assumes that all the systems being traded off exactly meet a fixed performance requirement, so that the value and benefit (readiness, performance, safety, etc.) of all the different systems designs are considered to be exactly equal. This is too simplistic. Actual system design concepts are selected using many cost and benefit factors and the system specification is defined after many trade-offs. The ALS program needs a multi-parameter metric including both the ESM and a System Value Metric (SVM). The SVM would include safety, maintainability, reliability, performance, use of cross cutting technology, and commercialization potential. Another major factor in system selection is technology readiness level (TRL), a familiar metric in ALS. The overall ALS system metric that is suggested is a benefit/cost ratio, SVM/[ESM + function (TRL)], with appropriate weighting and scaling. The total value is given by SVM. Cost is represented by higher ESM and lower TRL. The paper provides a detailed description and example application of a suggested System Value Metric and an overall ALS system metric.
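
    The suggested benefit/cost ratio SVM/[ESM + function(TRL)] can be illustrated with a toy calculation. The linear TRL penalty and all numbers below are hypothetical, since the abstract leaves the weighting and scaling unspecified:

    ```python
    def als_system_metric(svm, esm, trl, trl_weight=10.0, max_trl=9):
        """Benefit/cost ratio SVM / [ESM + f(TRL)].

        f(TRL) = trl_weight * (max_trl - trl) is an illustrative assumption:
        it raises the cost term for less mature technology, so lower TRL
        reduces the overall metric, matching the abstract's intent.
        """
        return svm / (esm + trl_weight * (max_trl - trl))

    # Two hypothetical designs with equal value (SVM) and equal mass cost (ESM):
    print(als_system_metric(svm=80.0, esm=150.0, trl=8))            # 0.5
    print(round(als_system_metric(svm=80.0, esm=150.0, trl=4), 2))  # 0.4 (immature)
    ```

    With equal SVM and ESM, the more mature design scores higher, reflecting that cost is represented by higher ESM and lower TRL.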

  19. Particle dynamics in the original Schwarzschild metric

    NASA Astrophysics Data System (ADS)

    Fimin, N. N.; Chechetkin, V. M.

    2016-04-01

    The properties of the original Schwarzschild metric for a point gravitating mass are considered. The laws of motion in the corresponding space-time are established, and the transition from the Schwarzschild metric to the metric of a "dusty universe" is studied. The dynamics of a system of particles in the post-Newtonian approximation are analyzed.

  20. Metric Measurement Activity Cards, Monograph No. 4.

    ERIC Educational Resources Information Center

    Bidwell, James K., Ed.

    This document introduces the metric measurement system to students in the elementary grades through ready-to-use activity cards covering metric concepts of length, area, and volume. The cards provide a minimal sequence of activities aimed at helping the student become familiar with each measurement concept; cards are ungraded and can be used as…

  1. Metrics for Automotive Merchandising, Petroleum Marketing.

    ERIC Educational Resources Information Center

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of students in automotive merchandising and petroleum marketing classes, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know…

  2. Handbook for Metric Usage (First Edition).

    ERIC Educational Resources Information Center

    American Home Economics Association, Washington, DC.

    Guidelines for changing to the metric system of measurement with regard to all phases of home economics are presented in this handbook. Topics covered include the following: (1) history of the metric system, (2) the International System of Units (SI): derived units of length, mass, time, and electric current; temperature; luminous intensity;…

  5. Metrics for Offset Printing Press Operation.

    ERIC Educational Resources Information Center

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of offset printing press operation students, this instructional package is one of six for the communication media occupations cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…

  6. Smart Grid Status and Metrics Report Appendices

    SciTech Connect

    Balducci, Patrick J.; Antonopoulos, Chrissi A.; Clements, Samuel L.; Gorrissen, Willy J.; Kirkham, Harold; Ruiz, Kathleen A.; Smith, David L.; Weimar, Mark R.; Gardner, Chris; Varney, Jeff

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  7. Program for implementing software quality metrics

    SciTech Connect

    Yule, H.P.; Riemer, C.A.

    1992-04-01

    This report describes a program by which the Veterans Benefit Administration (VBA) can implement metrics to measure the performance of automated data systems and demonstrate that they are improving over time. It provides a definition of quality, particularly with regard to software. Requirements for management and staff to achieve a successful metrics program are discussed. It lists the attributes of high-quality software, then describes the metrics or calculations that can be used to measure these attributes in a particular system. Case studies of some successful metrics programs used by business are presented. The report ends with suggestions on which metrics the VBA should use and the order in which they should be implemented.

  8. Guidelines for metrication at Lawrence Berkeley Laboratory

    SciTech Connect

    Not Available

    1993-07-01

    This document provides a set of guidelines for the metric transition process already under way at Lawrence Berkeley Laboratory. LBL has embarked upon this course in response to Section 5164 of the Trade and Competitiveness Act of 1988, Executive Order 12770 of 1991, and DOE Order 5900.2. The core provision of DOE Order 5900.2 is Section 7b, which states: "Metric usage shall be required except to the extent that such use is impractical, or is likely to cause significant inefficiencies to, or loss of markets by United States firms, or an inability of the Department to fulfill its responsibilities under the laws of the Federal Government and the United States." LBL's metrication policy is meant to comply with this requirement by aggressively fostering metrication. The purpose of these guidelines is to optimize the coherence and the cost-effectiveness of the metrication process.

  9. Metrics for border management systems.

    SciTech Connect

    Duggan, Ruth Ann

    2009-07-01

    There are as many unique and disparate manifestations of border systems as there are borders to protect. Border Security is a highly complex system analysis problem with global, regional, national, sector, and border element dimensions for land, water, and air domains. The complexity increases with the multiple, and sometimes conflicting, missions for regulating the flow of people and goods across borders, while securing them for national security. These systems include frontier border surveillance, immigration management and customs functions that must operate in a variety of weather, terrain, operational conditions, cultural constraints, and geopolitical contexts. As part of a Laboratory Directed Research and Development Project 08-684 (Year 1), the team developed a reference framework to decompose this complex system into international/regional, national, and border elements levels covering customs, immigration, and border policing functions. This generalized architecture is relevant to both domestic and international borders. As part of year two of this project (09-1204), the team determined relevant relative measures to better understand border management performance. This paper describes those relative metrics and how they can be used to improve border management systems.

  10. String Rearrangement Metrics: A Survey

    NASA Astrophysics Data System (ADS)

    Amir, Amihood; Levy, Avivit

    A basic assumption in traditional pattern matching is that the order of the elements in the given input strings is correct, while the description of the content, i.e. the description of the elements, may be erroneous. Motivated by questions that arise in Text Editing, Computational Biology, Bit Torrent and Video on Demand, and Computer Architecture, a new pattern matching paradigm was recently proposed by [2]. In this model, the pattern content remains intact, but the relative positions may change. Several papers followed the initial definition of the new paradigm. Each paper revealed new aspects in the world of string rearrangement metrics. This new unified view has already proven itself by enabling the solution of an open problem of the mathematician Cayley from 1849. It also gave better insight to problems that were already studied in different and limited situations, such as the behavior of different cost functions, and enabled deriving results for cost functions that were not yet sufficiently analyzed by previous research. At this stage, a general understanding of this new model is beginning to coalesce. The aim of this survey is to present an overview of this recent new direction of research, the problems, the methodologies, and the state-of-the-art.

  11. Terrain analysis from visibility metrics

    NASA Astrophysics Data System (ADS)

    Richbourg, Robert F.; Ray, Clark; Campbell, Larry L.

    1995-07-01

    Terrain analysis in support of planned military training or operations is a task which requires considerable training, skill, and experience. Military planners must synthesize knowledge of both their own and their expected adversary's tactics, weapons systems, and probable courses of action to determine key terrain, those portions of the terrain surface which have the most impact on the conduct of tactical operations. Many attributes of the actual terrain influence terrain analyses. These include elevation, intervisibility, vegetation cover, transportation networks, waterways, trafficability, soil types, and others. In some important areas of the world, the large set of attributes that influence terrain analysis is greatly reduced. Desert areas comprise one such areal class. As an example, a high resolution digital elevation model is sufficient to support most terrain analysis efforts for platoon and company operations in the US Marine Corps' dismounted infantry training area at 29 Palms, California. The digital elevation model allows an analyst to characterize each point in the model according to an approximate relative-visibility metric. Determination of key terrain, siting of probable defensive positions, and identification of highly concealed avenues of approach flow from examination of the resulting visibility model. These tactically significant areas can be used to conduct operations planning, perform DEM resolution studies, or help determine selective fidelity parameters for TIN modeling purposes.

  12. Metrics and Benchmarks for Visualization

    NASA Technical Reports Server (NTRS)

    Uselton, Samuel P.; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    What is a "good" visualization? How can the quality of a visualization be measured? How can one tell whether one visualization is "better" than another? I claim that the true quality of a visualization can only be measured in the context of a particular purpose. The same image generated from the same data may be excellent for one purpose and abysmal for another. A good measure of visualization quality will correspond to the performance of users in accomplishing the intended purpose, so the "gold standard" is user testing. As a user of visualization software (or at least a consultant to such users) I don't expect visualization software to have been tested in this way for every possible use. In fact, scientific visualization (as distinct from more "production oriented" uses of visualization) will continually encounter new data, new questions and new purposes; user testing can never keep up. Users need software they can trust, and advice on appropriate visualizations for particular purposes. Considering the following four processes, and their impact on visualization trustworthiness, reveals important work needed to create worthwhile metrics and benchmarks for visualization. These four processes are (1) complete system testing (user-in-loop), (2) software testing, (3) software design and (4) information dissemination. Additional information is contained in the original extended abstract.

  13. Bounded Linear Stability Margin Analysis of Nonlinear Hybrid Adaptive Control

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.; Boskovic, Jovan D.

    2008-01-01

    This paper presents a bounded linear stability analysis for a hybrid adaptive control that blends both direct and indirect adaptive control. Stability and convergence of nonlinear adaptive control are analyzed using an approximate linear equivalent system. A stability margin analysis shows that a large adaptive gain can lead to a reduced phase margin. This method can enable metrics-driven adaptive control whereby the adaptive gain is adjusted to meet stability margin requirements.

  14. Partial rectangular metric spaces and fixed point theorems.

    PubMed

    Shukla, Satish

    2014-01-01

    The purpose of this paper is to introduce the concept of partial rectangular metric spaces as a generalization of rectangular metric and partial metric spaces. Some properties of partial rectangular metric spaces and some fixed point results for quasitype contraction in partial rectangular metric spaces are proved. Some examples are given to illustrate the observed results.

  15. Foresters' Metric Conversions program (version 1.0). [Computer program]

    Treesearch

    Jefferson A. Palmer

    1999-01-01

    The conversion of scientific measurements has become commonplace in the fields of engineering, research, and forestry. Foresters' Metric Conversions is a Windows-based computer program that quickly converts user-defined measurements from English to metric and from metric to English. Foresters' Metric Conversions was derived from the publication "Metric...

  16. Fighter agility metrics. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.

    1990-01-01

    Fighter flying qualities and combat capabilities are currently measured and compared in terms relating to vehicle energy, angular rates and sustained acceleration. Criteria based on these measurable quantities have evolved over the past several decades and are routinely used to design aircraft structures, aerodynamics, propulsion and control systems. While these criteria, or metrics, have the advantage of being well understood, easily verified and repeatable during test, they tend to measure the steady state capability of the aircraft and not its ability to transition quickly from one state to another. Proposed new metrics to assess fighter aircraft agility are collected and analyzed. A framework for classification of these new agility metrics is developed and applied. A complete set of transient agility metrics is evaluated with a high fidelity, nonlinear F-18 simulation. Test techniques and data reduction methods are proposed. A method of providing cuing information to the pilot during flight test is discussed. The sensitivity of longitudinal and lateral agility metrics to deviations from the pilot cues is studied in detail. The metrics are shown to be largely insensitive to reasonable deviations from the nominal test pilot commands. Instrumentation required to quantify agility via flight test is also considered. With one exception, each of the proposed new metrics may be measured with instrumentation currently available.

  17. Launch Vehicle Production and Operations Cost Metrics

    NASA Technical Reports Server (NTRS)

    Watson, Michael D.; Neeley, James R.; Blackburn, Ruby F.

    2014-01-01

    Traditionally, launch vehicle cost has been evaluated based on $/Kg to orbit. This metric is calculated based on assumptions not typically met by a specific mission. These assumptions include the specified orbit whether Low Earth Orbit (LEO), Geostationary Earth Orbit (GEO), or both. The metric also assumes the payload utilizes the full lift mass of the launch vehicle, which is rarely true even with secondary payloads.1,2,3 Other approaches for cost metrics have been evaluated including unit cost of the launch vehicle and an approach to consider the full program production and operations costs.4 Unit cost considers the variable cost of the vehicle and the definition of variable costs are discussed. The full program production and operation costs include both the variable costs and the manufacturing base. This metric also distinguishes operations costs from production costs, including pre-flight operational testing. Operations costs also consider the costs of flight operations, including control center operation and maintenance. Each of these 3 cost metrics show different sensitivities to various aspects of launch vehicle cost drivers. The comparison of these metrics provides the strengths and weaknesses of each yielding an assessment useful for cost metric selection for launch vehicle programs.

  18. Altmetrics – a complement to conventional metrics

    PubMed Central

    Melero, Remedios

    2015-01-01

    Emerging article-level metrics do not exclude traditional metrics based on citations to the journal; rather, they complement them. Both can be employed in conjunction to offer a richer picture of an article's use, from the immediate to the long term. Article-level metrics (ALM) result from the aggregation of different data sources and the collection of content from multiple social network services. The sources used for aggregation can be broken down into five categories: usage, captures, mentions, social media and citations. Data sources depend on the tool, but they include classic citation-based indicators, academic social networks (Mendeley, CiteULike, Delicious) and social media (Facebook, Twitter, blogs, or YouTube, among others). Altmetrics is not synonymous with alternative metrics. Altmetrics are normally available early and allow the social impact of scholarly outputs to be assessed almost in real time. This paper briefly reviews the meaning of altmetrics and describes some of the existing tools used to apply these new metrics: Public Library of Science - Article-Level Metrics, Altmetric, Impactstory and Plum. PMID:26110028

  19. Advanced Life Support System Value Metric

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.; Arnold, James O. (Technical Monitor)

    1999-01-01

    The NASA Advanced Life Support (ALS) Program is required to provide a performance metric to measure its progress in system development. Extensive discussions within the ALS program have reached a consensus. The Equivalent System Mass (ESM) metric has been traditionally used and provides a good summary of the weight, size, and power cost factors of space life support equipment. But ESM assumes that all the systems being traded off exactly meet a fixed performance requirement, so that the value and benefit (readiness, performance, safety, etc.) of all the different systems designs are exactly equal. This is too simplistic. Actual system design concepts are selected using many cost and benefit factors and the system specification is then set accordingly. The ALS program needs a multi-parameter metric including both the ESM and a System Value Metric (SVM). The SVM would include safety, maintainability, reliability, performance, use of cross cutting technology, and commercialization potential. Another major factor in system selection is technology readiness level (TRL), a familiar metric in ALS. The overall ALS system metric that is suggested is a benefit/cost ratio, [SVM + TRL]/ESM, with appropriate weighting and scaling. The total value is the sum of SVM and TRL. Cost is represented by ESM. The paper provides a detailed description and example application of the suggested System Value Metric.

  1. Altmetrics - a complement to conventional metrics.

    PubMed

    Melero, Remedios

    2015-01-01

    Emerging article-level metrics do not exclude traditional metrics based on citations to the journal; rather, they complement them. Both can be employed in conjunction to offer a richer picture of an article's use, from the immediate to the long term. Article-level metrics (ALM) result from the aggregation of different data sources and the collection of content from multiple social network services. The sources used for aggregation can be broken down into five categories: usage, captures, mentions, social media and citations. Data sources depend on the tool, but they include classic citation-based indicators, academic social networks (Mendeley, CiteULike, Delicious) and social media (Facebook, Twitter, blogs, or YouTube, among others). Altmetrics is not synonymous with alternative metrics. Altmetrics are normally available early and allow the social impact of scholarly outputs to be assessed almost in real time. This paper briefly reviews the meaning of altmetrics and describes some of the existing tools used to apply these new metrics: Public Library of Science--Article-Level Metrics, Altmetric, Impactstory and Plum.

  3. SAPHIRE 8 Quality Assurance Software Metrics Report

    SciTech Connect

    Kurt G. Vedros

    2011-08-01

    The purpose of this review of software metrics is to examine the quality of the metrics gathered in the 2010 IV&V and to set an outline for updated metrics runs to be performed. We find from the review that the accepted quality standards presented in the initial SAPHIRE 8 Independent Verification and Validation (IV&V) of April 2010 are most easily maintained by continuing to utilize the tools used in that effort while adding a metric for bug tracking and resolution. A recommendation from the final IV&V was to continue periodic measurable metrics, such as McCabe's complexity measure, to ensure quality is maintained. The five software tools used to measure quality in the IV&V were CodeHealer, Coverage Validator, Memory Validator, Performance Validator, and Thread Validator. These are evaluated based on their capabilities. We attempted to run their latest revisions with the newer Delphi 2010 based SAPHIRE 8 code that has been developed and were successful with all of the Validator series of tools on small tests. Another recommendation from the IV&V was to incorporate a bug tracking and resolution metric. To improve our capability of producing this metric, we integrated our current web reporting system with the SpiraTest test management software purchased earlier this year to track requirements traceability.

  4. Metrics required for Power System Resilient Operations and Protection

    SciTech Connect

    Eshghi, K.; Johnson, B. K.; Rieger, C. G.

    2016-08-01

    Today’s complex grid involves many interdependent systems. Various layers of hierarchical control and communication systems are coordinated, both spatially and temporally, to achieve grid reliability. As new communication network based control system technologies are being deployed, the interconnected nature of these systems is becoming more complex. Deployment of smart grid concepts promises effective integration of renewable resources, especially if combined with energy storage. However, without a philosophical focus on resilience, a smart grid will potentially lead to higher magnitude and/or duration of disruptive events. The effectiveness of a resilient infrastructure depends upon its ability to anticipate, absorb, adapt to, and/or rapidly recover from a potentially catastrophic event. Future system operations can be enhanced with a resilient philosophy through architecting the complexity with state awareness metrics that recognize changing system conditions and provide for an agile and adaptive response. The starting point for metrics lies in first understanding the attributes of performance that will be qualified. In this paper, we will overview those attributes and describe how they will be characterized by designing a distributed agent that can be applied to the power grid.

  5. Assessment of proposed fighter agility metrics

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.; Valasek, John; Eggold, David P.; Downing, David R.

    1990-01-01

    This paper presents the results of an analysis of proposed metrics to assess fighter aircraft agility. A novel framework for classifying these metrics is developed and applied. A set of transient metrics intended to quantify the axial and pitch agility of fighter aircraft is evaluated with a high fidelity, nonlinear F-18 simulation. Test techniques and data reduction methods are proposed, and sensitivities to pilot-introduced errors during flight testing are investigated. Results indicate that the power onset and power loss parameters are promising candidates for quantifying axial agility, while maximum pitch-up and pitch-down rates are promising for quantifying pitch agility.

  6. Inspecting baby Skyrmions with effective metrics

    NASA Astrophysics Data System (ADS)

    Gibbons, G. W.; Goulart, E.

    2014-05-01

    In the present paper we investigate the causal structure of the baby Skyrme model using appropriate geometrical tools. We discuss several features of excitations propagating on top of background solutions and show that the evolution of high frequency waves is governed by a curved effective geometry. Examples are given for which the effective metric describes the interaction between waves and solitonic solutions such as kinks, antikinks, and hedgehogs. In particular, it is shown how violent processes involving the collisions of solitons and antisolitons may induce metrics which are not globally hyperbolic. We argue that it might be illuminating to calculate the effective metric as a diagnostic test for pathological regimes in numerical simulations.

  7. Metrics for comparison of crystallographic maps

    SciTech Connect

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Lunin, Vladimir Y.; Terwilliger, Thomas C.; Adams, Paul D.

    2014-10-01

    Numerical comparison of crystallographic contour maps is used extensively in structure solution and model refinement, analysis and validation. However, traditional metrics such as the map correlation coefficient (map CC, real-space CC or RSCC) sometimes contradict the results of visual assessment of the corresponding maps. This article explains such apparent contradictions and suggests new metrics and tools to compare crystallographic contour maps. The key to the new methods is rank scaling of the Fourier syntheses. The new metrics are complementary to the usual map CC and can be more helpful in map comparison, in particular when only some of their aspects, such as regions of high density, are of interest.
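
The rank-scaling idea can be illustrated with a minimal sketch: replace every grid value by its rank before computing the usual correlation coefficient, which makes the comparison insensitive to the value distribution (essentially a Spearman-type correlation; the article's actual procedure on Fourier syntheses is more elaborate). The arrays and noise level below are made up for illustration.

```python
import numpy as np

def rank_scale(grid):
    """Replace each value by its rank, scaled to [0, 1]."""
    flat = grid.ravel()
    ranks = np.empty(flat.size, dtype=float)
    ranks[np.argsort(flat)] = np.arange(flat.size)
    return (ranks / (flat.size - 1)).reshape(grid.shape)

def map_cc(a, b):
    """Plain Pearson correlation coefficient between two maps."""
    return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

# Toy "maps": one is a noisy copy of the other
rng = np.random.default_rng(0)
m1 = rng.normal(size=(8, 8, 8))
m2 = m1 + 0.5 * rng.normal(size=m1.shape)

cc_raw = map_cc(m1, m2)                          # conventional map CC
cc_rank = map_cc(rank_scale(m1), rank_scale(m2)) # CC after rank scaling
```

Comparing `cc_raw` with `cc_rank` shows how rank scaling de-emphasizes extreme density values while preserving the ordering of high-density regions.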

  8. Mental workload classification using heart rate metrics.

    PubMed

    Henelius, Andreas; Hirvonen, Kati; Holm, Anu; Korpela, Jussi; Muller, Kiti

    2009-01-01

    The ability of different short-term heart rate variability metrics to classify the level of mental workload (MWL) in 140 s segments was studied. Electrocardiographic data and event related potentials (ERPs), calculated from electroencephalographic data, were collected from 13 healthy subjects during the performance of a computerised cognitive multitask test with different task load levels. The amplitude of the P300 component of the ERPs was used as an objective measure of MWL. Receiver operating characteristics analysis (ROC) showed that the time domain metric of average interbeat interval length was the best-performing metric in terms of classification ability.

  9. Applying Sigma Metrics to Reduce Outliers.

    PubMed

    Litten, Joseph

    2017-03-01

    Sigma metrics can be used to predict assay quality, allowing easy comparison of instrument quality and predicting which tests will require minimal quality control (QC) rules to monitor the performance of the method. A Six Sigma QC program can result in fewer controls and fewer QC failures for methods with a sigma metric of 5 or better. The higher the number of methods with a sigma metric of 5 or better, the lower the costs for reagents, supplies, and control material required to monitor the performance of the methods. Copyright © 2016 Elsevier Inc. All rights reserved.
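
The sigma metric referred to here is conventionally computed in laboratory QC as (allowable total error − |bias|) / imprecision, all expressed in percent. A minimal sketch with made-up assay numbers:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric for an assay: (allowable total error - |bias|) / CV.

    All inputs in percent. Higher sigma means the method needs fewer
    QC rules to monitor; a sigma of 5 or better is the threshold cited above.
    """
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical assay: 10% allowable total error, 1% bias, 1.5% CV
sigma = sigma_metric(10.0, 1.0, 1.5)  # -> 6.0, i.e. minimal QC rules suffice
```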

  10. Metrics for comparison of crystallographic maps

    PubMed Central

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Lunin, Vladimir Y.; Terwilliger, Thomas C.; Adams, Paul D.

    2014-01-01

    Numerical comparison of crystallographic contour maps is used extensively in structure solution and model refinement, analysis and validation. However, traditional metrics such as the map correlation coefficient (map CC, real-space CC or RSCC) sometimes contradict the results of visual assessment of the corresponding maps. This article explains such apparent contradictions and suggests new metrics and tools to compare crystallographic contour maps. The key to the new methods is rank scaling of the Fourier syntheses. The new metrics are complementary to the usual map CC and can be more helpful in map comparison, in particular when only some of their aspects, such as regions of high density, are of interest. PMID:25286844

  11. Area Minimizing Discs in Metric Spaces

    NASA Astrophysics Data System (ADS)

    Lytchak, Alexander; Wenger, Stefan

    2017-03-01

    We solve the classical problem of Plateau in the setting of proper metric spaces. Precisely, we prove that among all disc-type surfaces with prescribed Jordan boundary in a proper metric space there exists an area minimizing disc which moreover has a quasi-conformal parametrization. If the space supports a local quadratic isoperimetric inequality for curves, we prove that such a solution is locally Hölder continuous in the interior and continuous up to the boundary. Our results generalize corresponding results of Douglas, Radó and Morrey from the setting of Euclidean space and Riemannian manifolds to that of proper metric spaces.

  12. Kerr metric in Bondi-Sachs form

    SciTech Connect

    Bishop, Nigel T.; Venter, Liebrecht R.

    2006-04-15

    A metric representing the Kerr geometry has been obtained by Pretorius and Israel. We make coordinate transformations on this metric, to bring it into Bondi-Sachs form. We investigate the behavior of the metric near the axis of symmetry and confirm elementary flatness, and we also confirm that it is asymptotic to the Bondi-Sachs form of the Schwarzschild geometry. The results obtained here are needed so that numerical relativity codes based on the characteristic formalism can be applied to a situation that contains a rotating black hole.

  13. Optimized Seizure Detection Algorithm: A Fast Approach for Onset of Epileptic in EEG Signals Using GT Discriminant Analysis and K-NN Classifier

    PubMed Central

    Rezaee, Kh.; Azizi, E.; Haddadnia, J.

    2016-01-01

    Background Epilepsy is a severe disorder of the central nervous system that predisposes the person to recurrent seizures. Fifty million people worldwide suffer from epilepsy; after Alzheimer’s and stroke, it is the third most widespread nervous disorder. Objective In this paper, an algorithm to detect the onset of epileptic seizures based on the analysis of brain electrical signals (EEG) is proposed. 844 hours of EEG were recorded from 23 pediatric patients consecutively, with 163 occurrences of seizures. Signals were collected from Children’s Hospital Boston with a sampling frequency of 256 Hz through 18 channels in order to assess epilepsy surgery. By selecting effective features from seizure and non-seizure signals of each individual and putting them into two categories, the proposed algorithm detects the onset of seizures quickly and with high sensitivity. Method In this algorithm, L-sec epochs of signals are displayed in the form of a third-order tensor in spatial, spectral and temporal spaces by applying the wavelet transform. Then, after applying general tensor discriminant analysis (GTDA) to the tensors and calculating the mapping matrix, feature vectors are extracted. GTDA increases the sensitivity of the algorithm by storing data without deleting them. Finally, K-nearest neighbors (KNN) is used to classify the selected features. Results The results of simulating the algorithm on a standard dataset show that the algorithm is capable of detecting 98 percent of seizures with an average delay of 4.7 seconds and an average error rate of three errors per 24 hours. Conclusion Today, the lack of an automated system to detect or predict seizure onset is strongly felt. PMID:27672628
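
The final classification step described above is a plain K-nearest-neighbours majority vote over feature vectors. A minimal sketch with synthetic stand-ins for the GTDA feature vectors (the wavelet/GTDA pipeline itself is not reproduced here; all numbers are illustrative):

```python
import numpy as np

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training vectors."""
    d = np.linalg.norm(train_X - x, axis=1)   # Euclidean distances to all vectors
    nearest = train_y[np.argsort(d)[:k]]      # labels of the k closest
    return int(np.bincount(nearest).argmax()) # majority vote

rng = np.random.default_rng(1)
# Synthetic feature clusters: class 0 = non-seizure, class 1 = seizure
X0 = rng.normal(0.0, 1.0, size=(50, 4))
X1 = rng.normal(3.0, 1.0, size=(50, 4))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

label = knn_predict(X, y, np.full(4, 3.0), k=5)  # query near the "seizure" cluster
```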

  14. Is it possible to predict long-term success with k-NN? Case study of four market indices (FTSE100, DAX, HANGSENG, NASDAQ)

    NASA Astrophysics Data System (ADS)

    Shi, Y.; Gorban, A. N.; Y Yang, T.

    2014-03-01

    This case study tests the possibility of prediction for the 'success' (or 'winner') components of four stock & shares market indices over a time period of three years, from 02-Jul-2009 to 29-Jun-2012. We compare their performance in two time frames: an initial frame of three months at the beginning (02/06/2009-30/09/2009) and a final three-month frame (02/04/2012-29/06/2012). To label the components, the average price ratio between the two time frames is computed and sorted in descending order. The average price ratio is defined as the ratio between the mean prices of the beginning and final time periods. The 'winner' components are the top one third of the total components in this order, meaning that their mean price in the final period is relatively higher than in the beginning period. The 'loser' components are the last one third, as they have relatively higher mean prices in the beginning period. We analyse whether there is any information about the winner-loser separation in the initial fragments of the daily closing-price log-return time series. Leave-one-out cross-validation with the k-NN algorithm is applied to the daily log-returns of the components, using a distance and proximity measure. The error analysis shows that for the HANGSENG and DAX indices there are clear signs that the probability of long-term success can be evaluated. The correlation distance matrix histograms and 2-D/3-D elastic maps generated with ViDaExpert show that the 'winner' components are closer to each other and that 'winner'/'loser' components are separable on elastic maps for the HANGSENG and DAX indices, while for the indices with a negative result there is no sign of separation.
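
The leave-one-out cross-validation with k-NN used in the study can be sketched as follows. The synthetic log-return series and plain Euclidean distance below are illustrative stand-ins, not the paper's data or correlation-based distances:

```python
import numpy as np

def loocv_error(X, y, k=3):
    """Leave-one-out error rate of a k-NN classifier (Euclidean distance)."""
    errors = 0
    for i in range(len(y)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                          # leave the test point out
        votes = y[np.argsort(d)[:k]]           # labels of k nearest neighbours
        if np.bincount(votes).argmax() != y[i]:
            errors += 1
    return errors / len(y)

rng = np.random.default_rng(2)
# Synthetic daily log-returns: 'winners' (label 1) drift up, 'losers' drift down
losers = rng.normal(-0.02, 0.01, size=(30, 60))
winners = rng.normal(0.02, 0.01, size=(30, 60))
X = np.vstack([losers, winners])
y = np.array([0] * 30 + [1] * 30)

err = loocv_error(X, y, k=5)  # low error indicates the separation is learnable
```

A low leave-one-out error on the initial fragments is exactly the kind of signal the study looks for in the HANGSENG and DAX components.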

  15. Use of KNN technique to improve the efficiency of SCE-UA optimisation method applied to the calibration of HBV Rainfall-Runoff model

    NASA Astrophysics Data System (ADS)

    Dakhlaoui, H.; Bargaoui, Z.

    2007-12-01

    The calibration of rainfall-runoff models can be viewed as an optimisation problem involving an objective function that measures model performance, expressed as a distance between observed and calculated discharges. Effectiveness (the ability to find the optimum) and efficiency (the cost, expressed as the number of objective function evaluations needed to reach the optimum) are the main criteria for choosing an optimisation method. SCE-UA is known as one of the most effective and efficient optimisation methods. In this work we tried to improve the efficiency of SCE-UA, in the case of the calibration of the HBV model, by using the KNN technique to estimate the objective function. After a number of SCE-UA iterations in which the objective function is evaluated by model simulation, a database of explored parameter sets and their respective objective function values is built up. Within this database it is proposed to estimate the objective function in further iterations by interpolation over the nearest neighbours in a normalised parameter space, with a weighted Euclidean distance. Weights are chosen proportional to the sensitivity of each parameter with respect to the objective function, which gives more importance to sensitive parameters. Model output is evaluated through the objective function RV = R2 - w|RD|, where R2 is the Nash-Sutcliffe coefficient of the discharges, w is a weight and RD is the relative bias. Applied to theoretical and practical cases in several catchments under different climatic conditions, Rottweil (Germany) and Tessa, Barbra, and Sejnane (Tunisia), the hybrid SCE-UA is about 20 to 30% more efficient than the original SCE-UA. By using other techniques, such as parameter space transformation and SCE-UA modification (2), an algorithm two to three times faster may be obtained.
    (1) Avi Ostfeld, Shani Salomons, "A hybrid genetic-instance learning algorithm for CE-QUAL-W2 calibration", Journal of Hydrology 310 (2005) 122-125. (2) Nitin Muttil and Shie-Yui Liong, "Improved robustness and Efficiency
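
The surrogate idea, estimating the objective function for a new candidate from previously explored parameter sets via sensitivity-weighted nearest-neighbour interpolation, can be sketched as below. The toy objective, inverse-distance weighting, and parameter counts are assumptions for illustration, not the authors' exact scheme:

```python
import numpy as np

def knn_surrogate(db_params, db_scores, candidate, weights, k=5):
    """Estimate the objective at `candidate` from its k nearest stored
    evaluations, using a weighted Euclidean distance in normalised
    parameter space. `weights` encodes per-parameter sensitivity
    (more sensitive parameter -> larger weight)."""
    diffs = (db_params - candidate) * weights
    d = np.linalg.norm(diffs, axis=1)
    idx = np.argsort(d)[:k]
    d_k = d[idx]
    if d_k[0] == 0.0:                 # exact hit already in the database
        return float(db_scores[idx[0]])
    w = 1.0 / d_k                     # inverse-distance interpolation weights
    return float(np.dot(w, db_scores[idx]) / w.sum())

rng = np.random.default_rng(3)
db = rng.uniform(0, 1, size=(200, 3))        # explored, normalised parameter sets
scores = np.sum((db - 0.5) ** 2, axis=1)     # toy objective, minimum at (0.5, 0.5, 0.5)

est = knn_surrogate(db, scores, np.array([0.5, 0.5, 0.5]), np.ones(3))
```

Each surrogate call replaces a full model simulation, which is where the reported 20-30% efficiency gain would come from.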

  16. On general (α,β)-metrics of Landsberg type

    NASA Astrophysics Data System (ADS)

    Zohrehvand, M.; Maleki, H.

    2016-05-01

    In this paper, we study a class of Finsler metrics defined by a Riemannian metric α and a one-form β, called general (α,β)-metrics. We prove that, under a certain condition, every Landsberg general (α,β)-metric is a Berwald metric. This shows that the hunt for a unicorn, one of the longest-standing open problems in Finsler geometry, cannot succeed in the class of general (α,β)-metrics.

  17. MPLS/VPN traffic engineering: SLA metrics

    NASA Astrophysics Data System (ADS)

    Cherkaoui, Omar; MacGibbon, Brenda; Blais, Michel; Serhrouchni, Ahmed

    2001-07-01

    Traffic engineering must be concerned with a broad definition of service that includes network availability, reliability and stability, as well as traditional traffic data on loss, throughput, delay and jitter. MPLS and Virtual Private Networks (VPNs) contribute significantly to security and Quality of Service (QoS) within communication networks, but there remains a need for metric measurement and evaluation. The purpose of this paper is to propose a methodology that gives a measure for LSP (Label Switched Path) metrics in MPLS VPN networks. We propose a statistical method for the evaluation of those metrics. Statistical methodology is very important in this type of study, since there is a large amount of data to consider. We use the notions of sample surveys, self-similar processes, linear regression, additive models and bootstrapping. The results obtained allow us to estimate the different metrics for such SLAs.

  18. Building a Metric City or Town.

    ERIC Educational Resources Information Center

    Alec, Rudi

    1980-01-01

    A metric measurement activity, construction of a portion of a "town" from paper, that provides concrete, semiconcrete, and abstract learning tasks and experiences, is described. Modifications of the activity for primary and intermediate students are suggested. (MK)

  19. Clean Cities Annual Metrics Report 2009 (Revised)

    SciTech Connect

    Johnson, C.

    2011-08-01

    Document provides Clean Cities coalition metrics about the use of alternative fuels; the deployment of alternative fuel vehicles, hybrid electric vehicles (HEVs), and idle reduction initiatives; fuel economy activities; and programs to reduce vehicle miles driven.

  20. A metric to search for relevant words

    NASA Astrophysics Data System (ADS)

    Zhou, Hongding; Slater, Gary W.

    2003-11-01

    We propose a new metric to evaluate and rank the relevance of words in a text. The method uses the density fluctuations of a word to compute an index that measures its degree of clustering. Highly significant words tend to form clusters, while common words are essentially uniformly spread in a text. If a word is not rare, the metric is stable when we move any individual occurrence of this word in the text. Furthermore, we prove that the metric always increases when words are moved to form larger clusters, or when several independent documents are merged. Using the Holy Bible as an example, we show that our approach reduces the significance of common words when compared to a recently proposed statistical metric.
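
The clustering intuition can be sketched with a simple statistic: the coefficient of variation of the gaps between successive occurrences of a word (the paper's actual normalisation differs, so this is an illustrative simplification). Evenly spread words score near 0, random (Poisson-like) placement near 1, and bursty topical words well above 1:

```python
import statistics

def clustering_index(positions, text_length):
    """Coefficient of variation of gaps between successive occurrences.

    `positions` are sorted word positions in a text of `text_length` words.
    Near 0: evenly spread; near 1: random placement; >> 1: clustered.
    """
    gaps = [b - a for a, b in zip(positions, positions[1:])]
    # Wrap-around gap so every occurrence contributes one following gap
    gaps.append(text_length - positions[-1] + positions[0])
    mean = statistics.fmean(gaps)
    sd = statistics.pstdev(gaps)
    return sd / mean

# A common word spread evenly vs. a topical word occurring in two bursts
uniform = clustering_index([i * 100 for i in range(50)], 5000)
bursty = clustering_index(list(range(25)) + list(range(4000, 4025)), 5000)
```

Ranking words by such an index pushes uniformly distributed function words down and clustered, topically significant words up, which is the behaviour the metric above is designed to capture.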

  1. Green Power Partnership Program Success Metrics

    EPA Pesticide Factsheets

    The U.S. EPA's Green Power Partnership is a voluntary program designed to reduce the environmental impact of electricity generation by promoting renewable energy. EPA evaluates partnership metrics annually to determine progress toward programmatic goals.

  2. Meaning Metrics: Measure, Mix, Manipulate, and Mold.

    ERIC Educational Resources Information Center

    Shaw, Jean M.

    1981-01-01

    Many learning activities designed to make metric measurement meaningful to children are described. The manufacture and use of play dough is featured. Several simple recipes for preparing dough are included. (MP)

  4. Effectively nonlocal metric-affine gravity

    NASA Astrophysics Data System (ADS)

    Golovnev, Alexey; Koivisto, Tomi; Sandstad, Marit

    2016-03-01

    In metric-affine theories of gravity such as the C-theories, the spacetime connection is associated to a metric that is nontrivially related to the physical metric. In this article, such theories are rewritten in terms of a single metric, and it is shown that they can be recast as effectively nonlocal gravity. With some assumptions, known ghost-free theories with nonsingular and cosmologically interesting properties may be recovered. Relations between different formulations are analyzed at both perturbative and nonperturbative levels, taking carefully into account subtleties with boundary conditions in the presence of integral operators in the action, and equivalences between theories related by nonlocal redefinitions of the fields are verified at the level of equations of motion. This suggests a possible geometrical interpretation of nonlocal gravity as an emergent property of non-Riemannian spacetime structure.

  5. Do metrical accents create illusory phenomenal accents?

    PubMed

    Repp, Bruno H

    2010-07-01

    In music that is perceived as metrically structured, events coinciding with the main beat are called metrically accented. Are these accents purely cognitive, or do they perhaps represent illusory increases in perceived loudness or duration, caused by heightened attention to main beats? In four separate tasks, musicians tried to detect a small actual increase or decrease in the loudness or duration of a single note in melodies comprising 12 notes. Musical notation prescribed a meter (6/8) implying a main beat coinciding with every third note. Effects of metrical accentuation on detection performance were found in all four tasks. However, they reflected primarily an increase in sensitivity to physical changes in main beat positions, likely to be due to enhanced attention. There was no evidence of biases indicating illusory phenomenal accents in those positions. By contrast, and independent of metrical structure, pitch accents due to pitch contour pivots were often mistaken for increases in loudness.

  6. Invariant metrics, contractions and nonlinear matrix equations

    NASA Astrophysics Data System (ADS)

    Lee, Hosoo; Lim, Yongdo

    2008-04-01

    In this paper we consider the semigroup generated by the self-maps on the open convex cone of positive definite matrices of translations, congruence transformations and matrix inversion that includes symplectic Hamiltonians and show that every member of the semigroup contracts any invariant metric distance inherited from a symmetric gauge function. This extends the results of Bougerol for the Riemannian metric and of Liverani-Wojtkowski for the Thompson part metric. A uniform upper bound of the Lipschitz contraction constant for a member of the semigroup is given in terms of the minimum eigenvalues of its determining matrices. We apply this result to a variety of nonlinear equations including Stein and Riccati equations for uniqueness and existence of positive definite solutions and find a new convergence analysis of iterative algorithms for the positive definite solution depending only on the least contraction coefficient for the invariant metric from the spectral norm.
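    The iterative algorithms mentioned at the end of the abstract can be sketched as a fixed-point iteration; the specific equation X = Q + AᵀX⁻¹A and the parameter choices below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def solve_matrix_equation(A, Q, iters=200):
    """Fixed-point iteration X_{k+1} = Q + A^T X_k^{-1} A for the
    nonlinear matrix equation X = Q + A^T X^{-1} A. For positive
    definite Q this map is a contraction of the kind the paper's
    result covers, so the iteration converges to the unique
    positive definite solution (illustrative sketch only)."""
    X = Q.copy()
    for _ in range(iters):
        X = Q + A.T @ np.linalg.solve(X, A)
    return X
```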

  7. Yet another family of diagonal metrics for de Sitter and anti-de Sitter spacetimes

    NASA Astrophysics Data System (ADS)

    Podolský, Jiří; Hruška, Ondřej

    2017-06-01

    In this work we present and analyze a new class of coordinate representations of de Sitter and anti-de Sitter spacetimes for which the metrics are diagonal and (typically) static and axially symmetric. Contrary to the well-known forms of these fundamental geometries, which usually correspond to a 1+3 foliation with a 3-space of constant spatial curvature, the new metrics are adapted to a 2+2 foliation, and are warped products of two 2-spaces of constant curvature. This new class of (anti-)de Sitter metrics depends on the value of the cosmological constant Λ and two discrete parameters +1, 0, -1 related to the curvature of the 2-spaces. The class admits 3 distinct subcases for Λ > 0 and 8 subcases for Λ < 0. We systematically study all these possibilities. In particular, we explicitly present the corresponding parametrizations of the (anti-)de Sitter hyperboloid, visualize the coordinate lines and surfaces within the global conformal cylinder, investigate their mutual relations, present some closely related forms of the metrics, and give transformations to the standard de Sitter and anti-de Sitter metrics. Using these results, we also provide a physical interpretation of B-metrics as exact gravitational fields of a tachyon.

  8. Enhancing U.S. Coast Guard Metrics

    DTIC Science & Technology

    2015-01-01

    International Ice Patrol—monitoring icebergs in the North Atlantic, a responsibility engendered by the Titanic disaster—is part of the marine safety ... Monitor and report North Atlantic iceberg conditions using fixed-wing aircraft and reports from ships as part of the International Ice Patrol ... proposed metrics provides an initial assessment of whether metrics meet the necessary criteria. The resulting metrics should, in theory, meet the necessary

  9. On Kerr-De Sitter Metric

    NASA Astrophysics Data System (ADS)

    Abbassi, Amir H.; Khosravi, Sh.; Abbassi, Amir M.

    We present our derivation of the Kerr-de Sitter metric in a proper comoving coordinate system. It asymptotically approaches the de Sitter metric in Robertson-Walker form. This has been done by considering a stationary axially-symmetric spacetime in which the motion of a particle is integrable, that is, the Hamilton-Jacobi and Klein-Gordon equations are separable. In this form it is asymptotically consistent with the comoving frame.

  10. Metric half-span model support system

    NASA Technical Reports Server (NTRS)

    Jackson, C. M., Jr.; Dollyhigh, S. M.; Shaw, D. S. (Inventor)

    1982-01-01

    A model support system used to support a model in a wind tunnel test section is described. The model comprises a metric, or measured, half-span supported by a nonmetric, or nonmeasured half-span which is connected to a sting support. Moments and forces acting on the metric half-span are measured without interference from the support system during a wind tunnel test.

  11. Autonomous Exploration Using an Information Gain Metric

    DTIC Science & Technology

    2016-03-01

    explore and map the unknown area. Exploration frontiers can be described as areas that lie on the boundary between known and unknown space. This ... ARL-TR-7638 ● MAR 2016, US Army Research Laboratory ... Autonomous Exploration Using an Information Gain Metric, by Nicholas C Fung, Jason M Gregory, and John G Rogers, Computational and

  12. Wave equation on spherically symmetric Lorentzian metrics

    SciTech Connect

    Bokhari, Ashfaque H.; Al-Dweik, Ahmad Y.; Zaman, F. D.; Kara, A. H.; Karim, M.

    2011-06-15

    The wave equation on a general spherically symmetric spacetime metric is constructed. Noether symmetries of the equation in terms of explicit functions of θ and φ are derived subject to certain differential constraints. By restricting the metric to the flat Friedmann case, the Noether symmetries of the wave equation are presented. Invertible transformations are constructed from a specific subalgebra of these Noether symmetries to convert the wave equation with variable coefficients to one with constant coefficients.
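    For reference, the wave equation on a general metric is the standard covariant d'Alembertian (this is the textbook definition, not a formula specific to this paper):

```latex
\Box u \;=\; \frac{1}{\sqrt{-g}}\,\partial_\mu\!\left(\sqrt{-g}\,g^{\mu\nu}\,\partial_\nu u\right) \;=\; 0
```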

  13. Software Metrics Useful Tools or Wasted Measurements

    DTIC Science & Technology

    1990-05-01

    shared by the developers of the field of software metrics. Capers Jones, Chairman of Software Productivity Research, Inc. and a noted pioneer in ... development efforts in terms of function points. That will give you a basis for measuring productivity. Capers Jones, chairman of Software ... Capers Jones, "Building a better metric," Computerworld Extra, 22 (June 20, 1988): 39. Allen J. Albrecht and John E. Gaffney, Jr., "Software Function

  14. GRC GSFC TDRSS Waveform Metrics Report

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.

    2013-01-01

    The report presents software metrics and porting metrics for the GGT Waveform. The porting was from a ground-based COTS SDR, the SDR-3000, to the CoNNeCT JPL SDR. The report does not address any of the Operating Environment (OE) software development, nor the original TDRSS waveform development at GSFC for the COTS SDR. With regard to STRS, the report presents compliance data and lessons learned.

  15. Reproducibility of graph metrics in FMRI networks.

    PubMed

    Telesford, Qawi K; Morgan, Ashley R; Hayasaka, Satoru; Simpson, Sean L; Barret, William; Kraft, Robert A; Mozolic, Jennifer L; Laurienti, Paul J

    2010-01-01

    The reliability of graph metrics calculated in network analysis is essential to the interpretation of complex network organization. These graph metrics are used to deduce the small-world properties in networks. In this study, we investigated the test-retest reliability of graph metrics from functional magnetic resonance imaging data collected for two runs in 45 healthy older adults. Graph metrics were calculated on data for both runs and compared using intraclass correlation coefficient (ICC) statistics and Bland-Altman (BA) plots. ICC scores describe the level of absolute agreement between two measurements and provide a measure of reproducibility. For mean graph metrics, ICC scores were high for clustering coefficient (ICC = 0.86), global efficiency (ICC = 0.83), path length (ICC = 0.79), and local efficiency (ICC = 0.75); the ICC score for degree was found to be low (ICC = 0.29). ICC scores were also used to generate reproducibility maps in brain space to test voxel-wise reproducibility for unsmoothed and smoothed data. Reproducibility was uniform across the brain for global efficiency and path length, but was only high in network hubs for clustering coefficient, local efficiency, and degree. BA plots were used to test the measurement repeatability of all graph metrics. All graph metrics fell within the limits for repeatability. Together, these results suggest that with the exception of degree, mean graph metrics are reproducible and suitable for clinical studies. Further exploration is warranted to better understand reproducibility across the brain on a voxel-wise basis.
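    The absolute-agreement ICC used here can be computed from a two-way ANOVA decomposition; a minimal sketch for two runs (the ICC(2,1) convention is an assumption, as the abstract does not name a specific form):

```python
import numpy as np

def icc_2_1(run1, run2):
    """Two-way random-effects, absolute-agreement ICC(2,1) for two runs."""
    Y = np.column_stack([run1, run2]).astype(float)
    n, k = Y.shape
    grand = Y.mean()
    # Mean squares from the two-way ANOVA decomposition
    ms_rows = k * np.sum((Y.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((Y.mean(axis=0) - grand) ** 2) / (k - 1)
    resid = Y - Y.mean(axis=1, keepdims=True) - Y.mean(axis=0, keepdims=True) + grand
    ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )
```

    Identical runs give an ICC of 1; measurement noise pulls the score below 1, which is what the per-metric ICC values in the abstract quantify.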

  16. Perceptual metrics of individuals with autism provide evidence for disinhibition

    PubMed Central

    Tannan, Vinay; Holden, Jameson K.; Zhang, Zheng; Baranek, Grace T.; Tommerdahl, Mark A.

    2012-01-01

    Adults with autism exhibit inhibitory deficits that are often manifested in behavioral modifications, such as repetitive behaviors, and/or sensory hyper-responsiveness. If such behaviors are the result of a generalized deficiency in inhibitory neurotransmission, then it stands to reason that deficits involving localized cortical-cortical interactions – such as in sensory discrimination tasks – could be detected and quantified. This study exemplifies a newly developed method for quantifying sensory testing metrics. Our novel sensory discrimination tests may provide (a) an effective means for biobehavioral assessment of deficits specific to autism and (b) an efficient and sensitive measure of change following treatment. The sensory discriminative capacity of 10 subjects with autism and 10 controls was compared both before and after short duration adapting stimuli. Specifically, vibrotactile amplitude discriminative capacity was obtained both in the presence and absence of 1 sec adapting stimuli that were delivered 1 sec prior to the comparison stimuli. Although adaptation had a pronounced effect on the amplitude discriminative capacity of the control subjects, little or no impact was observed on the sensory discriminative capacity of the subjects with autism. This lack of impact of the adapting stimuli on the responses of the subjects with autism was interpreted to be consistent with the reduced GABAergic mediated inhibition described in previous reports. One significant aspect of this study is that the methods could prove to be a useful and efficient way to detect specific neural deficits and monitor the efficacy of pharmacological or behavioral treatments in autism. PMID:19360672

  17. Evaluating Algorithm Performance Metrics Tailored for Prognostics

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai

    2009-01-01

    Prognostics has taken center stage in Condition Based Maintenance (CBM), where it is desired to estimate the Remaining Useful Life (RUL) of the system so that remedial measures may be taken in advance to avoid catastrophic events or unwanted downtimes. Validation of such predictions is an important but difficult proposition, and a lack of appropriate evaluation methods renders prognostics meaningless. Evaluation methods currently used in the research community are not standardized and in many cases do not sufficiently assess key performance aspects expected of a prognostics algorithm. In this paper we introduce several new evaluation metrics tailored for prognostics and show that they can effectively evaluate various algorithms as compared to other conventional metrics. Specifically, four algorithms, namely Relevance Vector Machine (RVM), Gaussian Process Regression (GPR), Artificial Neural Network (ANN), and Polynomial Regression (PR), are compared. These algorithms vary in complexity and in their ability to manage uncertainty around predicted estimates. Results show that the new metrics rank these algorithms in a different manner and, depending on the requirements and constraints, suitable metrics may be chosen. Beyond these results, these metrics offer ideas about how metrics suitable to prognostics may be designed so that the evaluation procedure can be standardized.

  18. Metrics for Offline Evaluation of Prognostic Performance

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai

    2010-01-01

    Prognostic performance evaluation has gained significant attention in the past few years. Currently, prognostics concepts lack standard definitions and suffer from ambiguous and inconsistent interpretations. This lack of standards is in part due to the varied end-user requirements for different applications, time scales, available information, domain dynamics, etc. to name a few. The research community has used a variety of metrics largely based on convenience and their respective requirements. Very little attention has been focused on establishing a standardized approach to compare different efforts. This paper presents several new evaluation metrics tailored for prognostics that were recently introduced and were shown to effectively evaluate various algorithms as compared to other conventional metrics. Specifically, this paper presents a detailed discussion on how these metrics should be interpreted and used. These metrics have the capability of incorporating probabilistic uncertainty estimates from prognostic algorithms. In addition to quantitative assessment they also offer a comprehensive visual perspective that can be used in designing the prognostic system. Several methods are suggested to customize these metrics for different applications. Guidelines are provided to help choose one method over another based on distribution characteristics. Various issues faced by prognostics and its performance evaluation are discussed followed by a formal notational framework to help standardize subsequent developments.

  19. On Applying the Prognostic Performance Metrics

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai

    2009-01-01

    Prognostics performance evaluation has gained significant attention in the past few years. As prognostics technology matures and more sophisticated methods for prognostic uncertainty management are developed, a standardized methodology for performance evaluation becomes extremely important to guide improvement efforts in a constructive manner. This paper is in continuation of previous efforts where several new evaluation metrics tailored for prognostics were introduced and were shown to effectively evaluate various algorithms as compared to other conventional metrics. Specifically, this paper presents a detailed discussion on how these metrics should be interpreted and used. Several shortcomings identified, while applying these metrics to a variety of real applications, are also summarized along with discussions that attempt to alleviate these problems. Further, these metrics have been enhanced to include the capability of incorporating probability distribution information from prognostic algorithms as opposed to evaluation based on point estimates only. Several methods have been suggested and guidelines have been provided to help choose one method over another based on probability distribution characteristics. These approaches also offer a convenient and intuitive visualization of algorithm performance with respect to some of these new metrics like prognostic horizon and alpha-lambda performance, and also quantify the corresponding performance while incorporating the uncertainty information.
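    One of the metrics named here, alpha-lambda performance, asks whether a prediction made at a given fraction λ of the system's life falls inside an α-band around the true RUL. A minimal sketch of the accuracy criterion (the 20% band is an assumed default, not a value from the paper):

```python
def alpha_lambda_accurate(rul_true, rul_pred, alpha=0.2):
    """True if the predicted RUL lies within +/- alpha * true RUL of the
    truth at the evaluation time; applied repeatedly as the system ages,
    this yields the alpha-lambda performance curve."""
    return abs(rul_pred - rul_true) <= alpha * rul_true
```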

  20. Experimental constraints on metric and non-metric theories of gravity

    NASA Technical Reports Server (NTRS)

    Will, Clifford M.

    1989-01-01

    Experimental constraints on metric and non-metric theories of gravitation are reviewed. Tests of the Einstein Equivalence Principle indicate that only metric theories of gravity are likely to be viable. Solar system experiments constrain the parameters of the weak field, post-Newtonian limit to be close to the values predicted by general relativity. Future space experiments will provide further constraints on post-Newtonian gravity.

  1. Baby universe metric equivalent to an interior black-hole metric

    NASA Astrophysics Data System (ADS)

    González-Díaz, Pedro F.

    1991-06-01

    It is shown that the maximally extended metric corresponding to a large wormhole is the unique possible wormhole metric whose baby universe sector is conformally equivalent to the maximal inextendible Kruskal metric corresponding to the interior region of a Schwarzschild black hole whose gravitational radius is half the wormhole neck radius. The physical implications of this result in the black hole evaporation process are discussed.

  2. Application of bilateral filtration with weight coefficients for similarity metric calculation in optical flow computation algorithm

    NASA Astrophysics Data System (ADS)

    Panin, S. V.; Titkov, V. V.; Lyubutin, P. S.; Chemezov, V. O.; Eremin, A. V.

    2016-11-01

    Application of the weight coefficients of the bilateral filter to determine a weighted similarity metric over image regions in an optical flow computation algorithm that employs 3-dimensional recursive search (3DRS) was investigated. By testing the algorithm on images taken from the public Middlebury benchmark database, the effectiveness of this weighted similarity metric for solving the image processing problem was demonstrated. Matching the parameter values used when calculating the weight coefficients, so as to take image texture features into account, was shown to be essential for higher noise resistance during vector field construction. An adaptation technique that avoids manual selection of these parameter values was proposed and its efficiency demonstrated.
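    The scheme described, bilateral-filter coefficients that combine spatial closeness with intensity similarity and then weight a block-matching cost, can be sketched as follows; the Gaussian kernels, parameter names, and the SAD cost are our assumptions, since the abstract does not spell out its exact formulas:

```python
import numpy as np

def bilateral_weights(patch, center, sigma_s, sigma_r):
    """Per-pixel weights: Gaussian in spatial distance from the patch
    center times Gaussian in intensity difference from the center value."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = h // 2, w // 2
    spatial = np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma_s ** 2))
    intensity = np.exp(-((patch - center) ** 2) / (2 * sigma_r ** 2))
    return spatial * intensity

def weighted_sad(p1, p2, weights):
    """Weighted sum-of-absolute-differences similarity metric between
    two patches, normalized by the total weight."""
    return np.sum(weights * np.abs(p1 - p2)) / np.sum(weights)
```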

  3. Investigating simulation-based metrics for characterizing linear iterative reconstruction in digital breast tomosynthesis.

    PubMed

    Rose, Sean D; Sanchez, Adrian A; Sidky, Emil Y; Pan, Xiaochuan

    2017-09-01

    Simulation-based image quality metrics are adapted and investigated for characterizing the parameter dependences of linear iterative image reconstruction for DBT. Three metrics based on a 2D DBT simulation are investigated: (1) a root-mean-square-error (RMSE) between the test phantom and reconstructed image, (2) a gradient RMSE where the comparison is made after taking a spatial gradient of both image and phantom, and (3) a region-of-interest (ROI) Hotelling observer (HO) for signal-known-exactly/background-known-exactly (SKE/BKE) and signal-known-exactly/background-known-statistically (SKE/BKS) detection tasks. Two simulation studies are performed using the aforementioned metrics, varying voxel aspect ratio and regularization strength for two types of Tikhonov-regularized least-squares optimization. The RMSE metrics are applied to a 2D test phantom with resolution bar patterns at varying angles, and the ROI-HO metric is applied to two tasks relevant to DBT: lesion detection, modeled by use of a large, low-contrast signal, and microcalcification detection, modeled by use of a small, high-contrast signal. The RMSE metric trends are compared with visual assessment of the reconstructed bar-pattern phantom. The ROI-HO metric trends are compared with 3D reconstructed images from ACR phantom data acquired with a Hologic Selenia Dimensions DBT system. Sensitivity of the image RMSE to mean pixel value is found to limit its applicability to the assessment of DBT image reconstruction. The image gradient RMSE is insensitive to mean pixel value and appears to track better with subjective visualization of the reconstructed bar-pattern phantom. The ROI-HO metric shows an increasing trend with regularization strength for both forms of Tikhonov-regularized least-squares; however, this metric saturates at intermediate regularization strength indicating a point of diminishing returns for signal detection. Visualization with the reconstructed ACR phantom images appear to show a
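    The abstract's observation that image RMSE is sensitive to the mean pixel value while gradient RMSE is not is easy to reproduce; a minimal 2D sketch (not the authors' code):

```python
import numpy as np

def rmse(recon, phantom):
    """Root-mean-square error between reconstruction and test phantom."""
    return np.sqrt(np.mean((recon - phantom) ** 2))

def gradient_rmse(recon, phantom):
    """RMSE after taking spatial gradients of both images; a constant
    offset in the reconstruction's mean pixel value cancels out."""
    diffs = [gr - gp for gr, gp in zip(np.gradient(recon), np.gradient(phantom))]
    return np.sqrt(np.mean(np.square(diffs)))
```

    A reconstruction that differs from the phantom only by a constant offset scores poorly under image RMSE but perfectly under gradient RMSE, which is the sensitivity the study reports.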

  4. Graphlet Based Metrics for the Comparison of Gene Regulatory Networks

    PubMed Central

    Martin, Alberto J. M.; Dominguez, Calixto; Contreras-Riquelme, Sebastián; Holmes, David S.; Perez-Acle, Tomas

    2016-01-01

    Understanding the control of gene expression remains one of the main challenges in the post-genomic era. Accordingly, a plethora of methods exists to identify variations in gene expression levels. These variations underlie almost all relevant biological phenomena, including disease and adaptation to environmental conditions. However, computational tools to identify how regulation changes are scarce. Regulation of gene expression is usually depicted in the form of a gene regulatory network (GRN). Structural changes in a GRN over time and conditions represent variations in the regulation of gene expression. Like other biological networks, GRNs are composed of basic building blocks called graphlets. As a consequence, two new metrics based on graphlets are proposed in this work: REConstruction Rate (REC) and REC Graphlet Degree (RGD). REC determines the rate of graphlet similarity between different states of a network and RGD identifies the subset of nodes with the highest topological variation. In other words, RGD discerns how the GRN was rewired. REC and RGD were used to compare the local structure of nodes in condition-specific GRNs obtained from gene expression data of Escherichia coli, forming biofilms and cultured in suspension. According to our results, most of the network local structure remains unaltered in the two compared conditions. Nevertheless, changes reported by RGD necessarily imply that a different cohort of regulators (i.e. transcription factors (TFs)) appear on the scene, shedding light on how the regulation of gene expression occurs when E. coli transits from suspension to biofilm. Consequently, we propose that both metrics REC and RGD should be adopted as a quantitative approach to conduct differential analyses of GRNs. A tool that implements both metrics is available as an on-line web server (http://dlab.cl/loto). PMID:27695050

  6. Characterizing Hurricane Tracks Using Multiple Statistical Metrics

    NASA Astrophysics Data System (ADS)

    Hui, K. L.; Emanuel, K.; Ravela, S.

    2015-12-01

    Historical tropical cyclone tracks reveal a wide range of shapes and speeds over different ocean basins. However, they have only been accurately recorded in the last few decades, limiting their representativeness to only a subset of possible tracks in a changing large-scale environment. Taking into account various climate conditions, synthetic tracks can be generated to produce a much larger sample of cyclone tracks to understand variability of cyclone activity and assess future changes. To evaluate how well the synthetic tracks capture the characteristics of the historical tracks, several statistical metrics have been developed to characterize and compare their shapes and movements. In one metric, the probability density functions of storm locations are estimated by modeling the position of the storms as a Markov chain. Another metric is constructed to capture the mutual information between two variables such as velocity and curvature. These metrics are then applied to the synthetic and historical tracks to determine if the latter are plausibly a subset of the former. Bootstrap sampling is used in applying the metrics to the synthetic tracks to accurately compare them with the historical tracks given the large sample size difference. If we confirm that the synthetic tracks capture the variability of the historical ones, high confidence intervals can be determined from the much larger set of synthetic tracks to look for highly unusual tracks and to assess their probability of occurrence.
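    Modeling storm position as a Markov chain, as in the first metric, amounts to estimating transition probabilities between discretized locations; a minimal sketch with hypothetical grid-cell labels standing in for positions along a track:

```python
from collections import Counter, defaultdict

def transition_probs(states):
    """Maximum-likelihood transition matrix of a first-order Markov chain
    estimated from a sequence of discretized states (e.g. grid cells
    visited by a cyclone track)."""
    counts = defaultdict(Counter)
    for s, t in zip(states, states[1:]):
        counts[s][t] += 1
    return {
        s: {t: c / sum(row.values()) for t, c in row.items()}
        for s, row in counts.items()
    }
```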

  7. Performance Metrics, Error Modeling, and Uncertainty Quantification

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Nearing, Grey S.; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Tang, Ling

    2016-01-01

    A common set of statistical metrics has been used to summarize the performance of models or measurements, the most widely used ones being bias, mean square error, and linear correlation coefficient. They assume linear, additive, Gaussian errors, and they are interdependent, incomplete, and incapable of directly quantifying uncertainty. The authors demonstrate that these metrics can be directly derived from the parameters of the simple linear error model. Since a correct error model captures the full error information, it is argued that the specification of a parametric error model should be an alternative to the metrics-based approach. The error-modeling methodology is applicable to both linear and nonlinear errors, while the metrics are only meaningful for linear errors. In addition, the error model expresses the error structure more naturally, and directly quantifies uncertainty. This argument is further explained by highlighting the intrinsic connections between the performance metrics, the error model, and the joint distribution between the data and the reference.
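    The claim that bias, MSE, and correlation are all derivable from the parameters of a simple linear error model y = a + b·x + ε can be checked numerically; the parameter values below are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(10.0, 2.0, 100_000)          # reference ("truth")
a, b, s = 1.5, 0.8, 0.5                     # error-model parameters
y = a + b * x + rng.normal(0.0, s, x.size)  # measurement under the model

# Metrics computed directly from the data
bias = np.mean(y - x)
mse = np.mean((y - x) ** 2)
corr = np.corrcoef(x, y)[0, 1]

# The same metrics derived from (a, b, s) and the reference moments alone
mu, var = x.mean(), x.var()
bias_m = a + (b - 1) * mu
mse_m = bias_m ** 2 + (b - 1) ** 2 * var + s ** 2
corr_m = b * np.sqrt(var) / np.sqrt(b ** 2 * var + s ** 2)
```

    The derived values agree with the directly computed metrics up to sampling noise, which illustrates why the error-model parameters carry the full information the metrics summarize.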

  8. Parameterized centrality metric for network analysis

    NASA Astrophysics Data System (ADS)

    Ghosh, Rumi; Lerman, Kristina

    2011-06-01

    A variety of metrics have been proposed to measure the relative importance of nodes in a network. One of these, alpha-centrality [P. Bonacich, Am. J. Sociol. 92, 1170 (1987)], measures the number of attenuated paths that exist between nodes. We introduce a normalized version of this metric and use it to study network structure, for example, to rank nodes and find community structure of the network. Specifically, we extend the modularity-maximization method for community detection to use this metric as the measure of node connectivity. Normalized alpha-centrality is a powerful tool for network analysis, since it contains a tunable parameter that sets the length scale of interactions. Studying how rankings and discovered communities change when this parameter is varied allows us to identify locally and globally important nodes and structures. We apply the proposed metric to several benchmark networks and show that it leads to better insights into network structure than alternative metrics.
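    Alpha-centrality counts attenuated paths between nodes; a minimal sketch of a normalized variant (unit-sum normalization is our convention here and may differ from the paper's):

```python
import numpy as np

def alpha_centrality(A, alpha, e=None):
    """Alpha-centrality x = (I - alpha * A^T)^{-1} e, summing paths
    attenuated by alpha per hop. alpha must be below 1/lambda_max(A)
    for the underlying series to converge; e is the exogenous status
    vector (uniform by default)."""
    n = A.shape[0]
    e = np.ones(n) if e is None else np.asarray(e, float)
    x = np.linalg.solve(np.eye(n) - alpha * A.T, e)
    return x / x.sum()
```

    Small alpha weighs only short paths (local importance); alpha near its convergence limit weighs long paths too (global importance), which is the tunable length scale the abstract refers to.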

  9. Homology-independent metrics for comparative genomics.

    PubMed

    Coutinho, Tarcisio José Domingos; Franco, Glória Regina; Lobo, Francisco Pereira

    2015-01-01

    A mainstream procedure to analyze the wealth of genomic data available nowadays is the detection of homologous regions shared across genomes, followed by the extraction of biological information from the patterns of conservation and variation observed in such regions. Although of pivotal importance, comparative genomic procedures that rely on homology inference are obviously not applicable if no homologous regions are detectable. This fact excludes a considerable portion of "genomic dark matter" with no significant similarity - and, consequently, no inferred homology to any other known sequence - from several downstream comparative genomic methods. In this review we compile several sequence metrics that do not rely on homology inference and can be used to compare nucleotide sequences and extract biologically meaningful information from them. These metrics comprise several compositional parameters calculated from sequence data alone, such as GC content, dinucleotide odds ratio, and several codon bias metrics. They also share other interesting properties, such as pervasiveness (patterns persist on smaller scales) and phylogenetic signal. We also cite examples where these homology-independent metrics have been successfully applied to support several bioinformatics challenges, such as taxonomic classification of biological sequences without homology inference. They were also used to detect higher-order patterns of interactions in biological systems, ranging from detecting coevolutionary trends between the genomes of viruses and their hosts to characterization of gene pools of entire microbial communities. We argue that, if correctly understood and applied, homology-independent metrics can add important layers of biological information in comparative genomic studies without prior homology inference.
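    Two of the compositional metrics listed, GC content and the dinucleotide odds ratio, take only a few lines to compute; a plain sketch (ignoring the reverse-complement symmetrization that some definitions of the odds ratio apply):

```python
from collections import Counter

def gc_content(seq):
    """Fraction of G and C bases in a nucleotide sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def dinucleotide_odds(seq):
    """Odds ratio rho(XY) = f(XY) / (f(X) * f(Y)) for each observed
    dinucleotide; values near 1 indicate no compositional bias."""
    seq = seq.upper()
    mono = Counter(seq)
    di = Counter(seq[i:i + 2] for i in range(len(seq) - 1))
    n, m = len(seq), len(seq) - 1
    return {
        xy: (di[xy] / m) / ((mono[xy[0]] / n) * (mono[xy[1]] / n))
        for xy in di
    }
```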

  10. Cleanroom Energy Efficiency: Metrics and Benchmarks

    SciTech Connect

    International SEMATECH Manufacturing Initiative; Mathew, Paul A.; Tschudi, William; Sartor, Dale; Beasley, James

    2010-07-07

    Cleanrooms are among the most energy-intensive types of facilities. This is primarily due to the cleanliness requirements that result in high airflow rates and system static pressures, as well as process requirements that result in high cooling loads. Various studies have shown that there is a wide range of cleanroom energy efficiencies and that facility managers may not be aware of how energy efficient their cleanroom facility can be relative to other cleanroom facilities with the same cleanliness requirements. Metrics and benchmarks are an effective way to compare one facility to another and to track the performance of a given facility over time. This article presents the key metrics and benchmarks that facility managers can use to assess, track, and manage their cleanroom energy efficiency or to set energy efficiency targets for new construction. These include system-level metrics such as air change rates, air handling W/cfm, and filter pressure drops. Operational data are presented from over 20 different cleanrooms that were benchmarked with these metrics and that are part of the cleanroom benchmark dataset maintained by Lawrence Berkeley National Laboratory (LBNL). Overall production efficiency metrics for cleanrooms in 28 semiconductor manufacturing facilities in the United States and recorded in the Fabs21 database are also presented.

  11. Vestibular influence on auditory metrical interpretation.

    PubMed

    Phillips-Silver, Jessica; Trainor, Laurel J

    2008-06-01

    When we move to music we feel the beat, and this feeling can shape the sound we hear. Previous studies have shown that when people listen to a metrically ambiguous rhythm pattern, moving the body on a certain beat (adults by actively bouncing themselves in synchrony with the experimenter, and babies by being bounced passively in the experimenter's arms) can bias their auditory metrical representation so that they interpret the pattern in a corresponding metrical form [Phillips-Silver, J., & Trainor, L. J. (2005). Feeling the beat: Movement influences infant rhythm perception. Science, 308, 1430; Phillips-Silver, J., & Trainor, L. J. (2007). Hearing what the body feels: Auditory encoding of rhythmic movement. Cognition, 105, 533-546]. The present studies show that in adults, as well as in infants, metrical encoding of rhythm can be biased by passive motion. Furthermore, because movement of the head alone affected auditory encoding whereas movement of the legs alone did not, we propose that vestibular input may play a key role in the effect of movement on auditory rhythm processing. We discuss possible cortical and subcortical sites for the integration of auditory and vestibular inputs that may underlie the interaction between movement and auditory metrical rhythm perception.

  12. Performance Metrics, Error Modeling, and Uncertainty Quantification

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Nearing, Grey S.; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Tang, Ling

    2016-01-01

    A common set of statistical metrics has been used to summarize the performance of models or measurements, the most widely used being bias, mean square error, and linear correlation coefficient. They assume linear, additive, Gaussian errors, and they are interdependent, incomplete, and incapable of directly quantifying uncertainty. The authors demonstrate that these metrics can be directly derived from the parameters of the simple linear error model. Since a correct error model captures the full error information, it is argued that the specification of a parametric error model should be an alternative to the metrics-based approach. The error-modeling methodology is applicable to both linear and nonlinear errors, while the metrics are only meaningful for linear errors. In addition, the error model expresses the error structure more naturally, and directly quantifies uncertainty. This argument is further explained by highlighting the intrinsic connections between the performance metrics, the error model, and the joint distribution between the data and the reference.
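
    A minimal sketch of the argument (hypothetical function names): fit the simple linear error model data = a + b * truth + eps, then recover bias, MSE, and correlation from the three parameters (a, b, sigma) rather than computing each metric separately:

```python
import statistics

def fit_linear_error_model(truth, data):
    """Fit data = a + b*truth + eps by least squares; return (a, b, sigma_eps)."""
    mx = statistics.fmean(truth)
    my = statistics.fmean(data)
    sxy = sum((x - mx) * (y - my) for x, y in zip(truth, data))
    sxx = sum((x - mx) ** 2 for x in truth)
    b = sxy / sxx
    a = my - b * mx
    resid = [y - (a + b * x) for x, y in zip(truth, data)]
    return a, b, statistics.pstdev(resid)

def metrics_from_model(a, b, sigma, truth):
    """Derive bias, MSE, and correlation from the error-model parameters.
    With e = data - truth = a + (b-1)*truth + eps:
      bias = a + (b-1)*mu_x
      MSE  = bias^2 + (b-1)^2 * var_x + sigma^2
      corr = b*sd_x / sqrt(b^2*var_x + sigma^2)
    """
    mx = statistics.fmean(truth)
    varx = statistics.pvariance(truth)
    bias = a + (b - 1) * mx
    mse = bias ** 2 + (b - 1) ** 2 * varx + sigma ** 2
    corr = b * varx ** 0.5 / (b ** 2 * varx + sigma ** 2) ** 0.5
    return bias, mse, corr
```

    The point of the paper is that the three parameters carry the full (linear, additive, Gaussian) error information, while any individual metric is a lossy summary of them.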

  13. Metrics for the NASA Airspace Systems Program

    NASA Technical Reports Server (NTRS)

    Smith, Jeremy C.; Neitzke, Kurt W.

    2009-01-01

    This document defines an initial set of metrics for use by the NASA Airspace Systems Program (ASP). ASP consists of the NextGen-Airspace Project and the NextGen-Airportal Project. The work in each project is organized along multiple, discipline-level Research Focus Areas (RFAs). Each RFA is developing future concept elements in support of the Next Generation Air Transportation System (NextGen), as defined by the Joint Planning and Development Office (JPDO). In addition, a single, system-level RFA is responsible for integrating concept elements across RFAs in both projects and for assessing system-wide benefits. The primary purpose of this document is to define a common set of metrics for measuring National Airspace System (NAS) performance before and after the introduction of ASP-developed concepts for NextGen as the system handles increasing traffic. The metrics are directly traceable to NextGen goals and objectives as defined by the JPDO and hence will be used to measure the progress of ASP research toward reaching those goals. The scope of this document is focused on defining a common set of metrics for measuring NAS capacity, efficiency, robustness, and safety at the system-level and at the RFA-level. Use of common metrics will focus ASP research toward achieving system-level performance goals and objectives and enable the discipline-level RFAs to evaluate the impact of their concepts at the system level.

  14. 48 CFR 611.002-70 - Metric system implementation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... specifications, standards, supplies and services. Hybrid system means the use of both traditional and hard metric... an application or meaning depending substantially on some measured quantity. For example, measurement... possible. Alternatives to hard metric are soft, dual and hybrid metric terms. The Metric Handbook...

  15. Codes in W*-Metric Spaces: Theory and Examples

    ERIC Educational Resources Information Center

    Bumgardner, Christopher J.

    2011-01-01

    We introduce a "W*"-metric space, which is a particular approach to non-commutative metric spaces where a "quantum metric" is defined on a von Neumann algebra. We generalize the notion of a quantum code and quantum error correction to the setting of finite dimensional "W*"-metric spaces, which includes codes and error correction for classical…

  16. 22 CFR 518.15 - Metric system of measurement.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... establish a date or dates in consultation with the Secretary of Commerce, when the metric system of... 22 Foreign Relations 2 2014-04-01 2014-04-01 false Metric system of measurement. 518.15 Section... ORGANIZATIONS Pre-Award Requirements § 518.15 Metric system of measurement. The Metric Conversion Act, as...

  17. 14 CFR 1260.115 - Metric system of measurement.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 5 2012-01-01 2012-01-01 false Metric system of measurement. 1260.115....115 Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205) declares that the metric system is the preferred measurement system for U.S...

  18. 22 CFR 518.15 - Metric system of measurement.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... a date or dates in consultation with the Secretary of Commerce, when the metric system of... 22 Foreign Relations 2 2013-04-01 2009-04-01 true Metric system of measurement. 518.15 Section 518... Pre-Award Requirements § 518.15 Metric system of measurement. The Metric Conversion Act, as amended by...

  19. 20 CFR 435.15 - Metric system of measurement.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 2 2013-04-01 2013-04-01 false Metric system of measurement. 435.15 Section..., AND COMMERCIAL ORGANIZATIONS Pre-Award Requirements § 435.15 Metric system of measurement. The Metric... metric system is the preferred measurement system for U.S. trade and commerce. The Act requires each...

  20. NASA education briefs for the classroom. Metrics in space

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The use of metric measurement in space is summarized for classroom use. Advantages of the metric system over the English measurement system are described. Some common metric units are defined, as are special units for astronomical study. International system unit prefixes and a conversion table of metric/English units are presented. Questions and activities for the classroom are recommended.

  1. 43 CFR 12.915 - Metric system of measurement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 1 2013-10-01 2013-10-01 false Metric system of measurement. 12.915... Requirements § 12.915 Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205) declares that the metric system is the preferred measurement...

  2. 29 CFR 95.15 - Metric system of measurement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 1 2014-07-01 2013-07-01 true Metric system of measurement. 95.15 Section 95.15 Labor... Requirements § 95.15 Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205), declares that the metric system is the preferred measurement...

  3. 34 CFR 74.15 - Metric system of measurement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 1 2011-07-01 2011-07-01 false Metric system of measurement. 74.15 Section 74.15... Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205) declares that the metric system is the preferred measurement system for U.S...

  4. 14 CFR § 1274.206 - Metric Conversion Act.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... metric system is the preferred measurement system for U.S. trade and commerce. NASA's policy with respect to the metric measurement system is stated in NPD 8010.2, Use of the Metric System of Measurement in... 14 Aeronautics and Space 5 2014-01-01 2014-01-01 false Metric Conversion Act. § 1274.206 Section...

  5. 2 CFR 215.15 - Metric system of measurement.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 2 Grants and Agreements 1 2011-01-01 2011-01-01 false Metric system of measurement. 215.15 Section... ORGANIZATIONS (OMB CIRCULAR A-110) Pre-Award Requirements § 215.15 Metric system of measurement. The Metric... metric system is the preferred measurement system for U.S. trade and commerce. The Act requires each...

  6. 20 CFR 435.15 - Metric system of measurement.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 2 2011-04-01 2011-04-01 false Metric system of measurement. 435.15 Section..., AND COMMERCIAL ORGANIZATIONS Pre-Award Requirements § 435.15 Metric system of measurement. The Metric... metric system is the preferred measurement system for U.S. trade and commerce. The Act requires each...

  7. The Metric System: America Measures Up. 1979 Edition.

    ERIC Educational Resources Information Center

    Anderson, Glen; Gallagher, Paul

    This training manual is designed to introduce and assist naval personnel in the conversion from the English system of measurement to the metric system of measurement. The book tells what the "move to metrics" is all about, and details why the change to the metric system is necessary. Individual chapters are devoted to how the metric system will…

  8. 20 CFR 435.15 - Metric system of measurement.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 2 2012-04-01 2012-04-01 false Metric system of measurement. 435.15 Section..., AND COMMERCIAL ORGANIZATIONS Pre-Award Requirements § 435.15 Metric system of measurement. The Metric... metric system is the preferred measurement system for U.S. trade and commerce. The Act requires each...

  9. 34 CFR 74.15 - Metric system of measurement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Metric system of measurement. 74.15 Section 74.15... Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205) declares that the metric system is the preferred measurement system for U.S...

  10. 49 CFR 19.15 - Metric system of measurement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 1 2011-10-01 2011-10-01 false Metric system of measurement. 19.15 Section 19.15... Requirements § 19.15 Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205), declares that the metric system is the preferred measurement...

  11. 49 CFR 19.15 - Metric system of measurement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 1 2014-10-01 2014-10-01 false Metric system of measurement. 19.15 Section 19.15... Requirements § 19.15 Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205), declares that the metric system is the preferred measurement...

  12. 34 CFR 74.15 - Metric system of measurement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 34 Education 1 2013-07-01 2013-07-01 false Metric system of measurement. 74.15 Section 74.15... Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205) declares that the metric system is the preferred measurement system for U.S...

  13. 22 CFR 518.15 - Metric system of measurement.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 22 Foreign Relations 2 2011-04-01 2009-04-01 true Metric system of measurement. 518.15 Section 518... Pre-Award Requirements § 518.15 Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205), declares that the metric system is the...

  14. 20 CFR 435.15 - Metric system of measurement.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 2 2014-04-01 2014-04-01 false Metric system of measurement. 435.15 Section..., AND COMMERCIAL ORGANIZATIONS Pre-Award Requirements § 435.15 Metric system of measurement. The Metric... metric system is the preferred measurement system for U.S. trade and commerce. The Act requires each...

  15. 22 CFR 518.15 - Metric system of measurement.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 22 Foreign Relations 2 2012-04-01 2009-04-01 true Metric system of measurement. 518.15 Section 518... Pre-Award Requirements § 518.15 Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205), declares that the metric system is the...

  16. 34 CFR 74.15 - Metric system of measurement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 34 Education 1 2014-07-01 2014-07-01 false Metric system of measurement. 74.15 Section 74.15... Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205) declares that the metric system is the preferred measurement system for U.S...

  17. 43 CFR 12.915 - Metric system of measurement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 1 2012-10-01 2011-10-01 true Metric system of measurement. 12.915... Requirements § 12.915 Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205) declares that the metric system is the preferred measurement...

  18. 14 CFR 1260.115 - Metric system of measurement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Metric system of measurement. 1260.115....115 Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205) declares that the metric system is the preferred measurement system for U.S...

  19. 2 CFR 215.15 - Metric system of measurement.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 2 Grants and Agreements 1 2013-01-01 2013-01-01 false Metric system of measurement. 215.15 Section... ORGANIZATIONS (OMB CIRCULAR A-110) Pre-Award Requirements § 215.15 Metric system of measurement. The Metric... metric system is the preferred measurement system for U.S. trade and commerce. The Act requires each...

  20. 49 CFR 19.15 - Metric system of measurement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 1 2012-10-01 2012-10-01 false Metric system of measurement. 19.15 Section 19.15... Requirements § 19.15 Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205), declares that the metric system is the preferred measurement...

  1. 43 CFR 12.915 - Metric system of measurement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 1 2011-10-01 2011-10-01 false Metric system of measurement. 12.915... Requirements § 12.915 Metric system of measurement. The Metric Conversion Act, as amended by the Omnibus Trade and Competitiveness Act (15 U.S.C. 205) declares that the metric system is the preferred measurement...

  2. 2 CFR 215.15 - Metric system of measurement.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 2 Grants and Agreements 1 2012-01-01 2012-01-01 false Metric system of measurement. 215.15 Section... ORGANIZATIONS (OMB CIRCULAR A-110) Pre-Award Requirements § 215.15 Metric system of measurement. The Metric... metric system is the preferred measurement system for U.S. trade and commerce. The Act requires each...

  3. Codes in W*-Metric Spaces: Theory and Examples

    ERIC Educational Resources Information Center

    Bumgardner, Christopher J.

    2011-01-01

    We introduce a "W*"-metric space, which is a particular approach to non-commutative metric spaces where a "quantum metric" is defined on a von Neumann algebra. We generalize the notion of a quantum code and quantum error correction to the setting of finite dimensional "W*"-metric spaces, which includes codes and error correction for classical…

  4. Metrics. A Basic Core Curriculum for Teaching Metrics to Vocational Students.

    ERIC Educational Resources Information Center

    Albracht, James; Simmons, A. D.

    This core curriculum contains five units for use in teaching metrics to vocational students. Included in the first unit are a series of learning activities to familiarize students with the terminology of metrics, including the prefixes and their values. Measures of distance and speed are covered. Discussed next are measures of volume used with…

  5. Elementary Metric Curriculum - Project T.I.M.E. (Timely Implementation of Metric Education). Part II.

    ERIC Educational Resources Information Center

    Community School District 18, Brooklyn, NY.

    This is the second part of a two-part teacher's manual for an ISS-based elementary school course in the metric system. Behavioral objectives and student activities are included. Topics include: (1) capacity; (2) calculation of volume and surface area of cylinders and cones; (3) mass; (4) temperature; and (5) metric conversions. (BB)

  6. Implementing the Metric System in Personal and Public Service Occupations. Metric Implementation Guide.

    ERIC Educational Resources Information Center

    Banks, Wilson P.; And Others

    Addressed to the personal and public service occupations teacher, this guide is intended to provide appropriate information, viewpoints, and attitudes regarding the metric system and to make suggestions regarding presentation of the material in the classroom. An introductory section on teaching suggestions emphasizes the need for a "think metric"…

  8. Metrication report to the Congress. 1991 activities and 1992 plans

    NASA Technical Reports Server (NTRS)

    1991-01-01

    During 1991, NASA approved a revised metric use policy and developed a NASA Metric Transition Plan. This Plan targets the end of 1995 for completion of NASA's metric initiatives. This Plan also identifies future programs that NASA anticipates will use the metric system of measurement. Field installations began metric transition studies in 1991 and will complete them in 1992. Half of NASA's Space Shuttle payloads for 1991, and almost all such payloads for 1992, have some metric-based elements. In 1992, NASA will begin assessing requirements for space-quality piece parts fabricated to U.S. metric standards, leading to development and qualification of high priority parts.

  9. Critical insights for a sustainability framework to address integrated community water services: Technical metrics and approaches.

    PubMed

    Xue, Xiaobo; Schoen, Mary E; Ma, Xin Cissy; Hawkins, Troy R; Ashbolt, Nicholas J; Cashdollar, Jennifer; Garland, Jay

    2015-06-15

    Planning for sustainable community water systems requires a comprehensive understanding and assessment of the integrated source-drinking-wastewater systems over their life-cycles. Although traditional life cycle assessment and similar tools (e.g. footprints and emergy) have been applied to elements of these water services (i.e. water resources, drinking water, stormwater or wastewater treatment alone), we argue for the importance of developing and combining the system-based tools and metrics in order to holistically evaluate the complete water service system based on the concept of integrated resource management. We analyze the strengths and weaknesses of key system-based tools and metrics, and discuss future directions to identify more sustainable municipal water services. Such efforts may include the need for novel metrics that address system adaptability to future changes and infrastructure robustness. Caution is also necessary when coupling fundamentally different tools so as to avoid misunderstanding and, consequently, misleading decision-making. Published by Elsevier Ltd.

  10. Deformation of Kahler metrics to Kahler-Einstein metrics on compact Kahler manifolds

    SciTech Connect

    Cao, H.D.

    1986-01-01

    Metric deformation methods are used to give a more geometric proof of the well known Calabi conjectures of Kahler geometry. More precisely, it is proven that given any closed (1,1) form on a compact Kahler manifold M, which represents the first Chern class of M, one can deform the initial metric by certain Hamilton's type equation to the limit Kahler metric which has the given (1,1) form as its Ricci form. In case the first Chern class of M is negative, the appropriate initial metric can also be deformed along the parabolic Einstein equation to the unique Kahler-Einstein metric. The method involves the study of the a priori estimates and the asymptotic behavior of the solutions to the evolution equations. It is remarked that the above Calabi conjectures were first proved by Yau in 1976 by solving certain complex Monge-Ampere equations.
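
    In local coordinates, the parabolic deformation referred to above is the (normalized) Kähler-Ricci flow; in the negative first Chern class case it can be written as

```latex
\frac{\partial}{\partial t}\, g_{i\bar{j}}(t) = -R_{i\bar{j}}(t) - g_{i\bar{j}}(t),
\qquad g_{i\bar{j}}(0) = (g_0)_{i\bar{j}},
```

    whose stationary points satisfy the Kähler-Einstein condition R_{i\bar{j}} = -g_{i\bar{j}}.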

  11. New Born-Infeld and Dp-brane actions under 2-metric and 3-metric prescriptions

    SciTech Connect

    Miao Yangang

    2007-04-15

    The parent action method is applied to the Born-Infeld and Dp-brane theories. Various new forms of Born-Infeld and Dp-brane actions are derived using this systematic approach, in which both the already known 2-metric and the newly proposed 3-metric prescriptions are considered. An auxiliary worldvolume tensor field, denoted by ω_{μν}, is introduced and can be treated as an additional worldvolume metric because it plays a role similar to that of the auxiliary worldvolume (also called intrinsic) metric γ_{μν}. Some properties of the new forms, such as duality, permutation, and Weyl invariance as a local worldvolume symmetry, are analyzed. In particular, a new symmetry, the double Weyl invariance, is discovered in the 3-metric forms.

  12. When is a metric not a metric? Remarks on direct curve comparison in bioequivalence studies.

    PubMed

    Jawień, Wojciech

    2009-06-01

    The majority of measures proposed to date for direct curve comparison in bioequivalence studies were investigated. These measures have often been called metrics, but in most cases this was incorrect in the mathematical sense. It was demonstrated, with a set of counter-examples, that the axioms of a metric are fulfilled only for the integral p-metric and some of its transforms. The Rescigno index and two other measures devised by Polli and McLean are the semi-metrics, lacking the triangle inequality, while others also lack symmetry. The use of the p-metric is therefore recommended, and statistical analysis is suggested as a point at which the scaling of differences might be carried out.
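
    The recommended integral p-metric is straightforward to compute from sampled concentration-time curves. A sketch (hypothetical function name; trapezoidal integration, assuming both curves share the same sampling times):

```python
def p_metric(t, c1, c2, p=2):
    """Integral p-metric (integral of |c1 - c2|^p dt) ** (1/p),
    approximated by the trapezoidal rule over sample times t."""
    total = 0.0
    for i in range(len(t) - 1):
        dt = t[i + 1] - t[i]
        f0 = abs(c1[i] - c2[i]) ** p
        f1 = abs(c1[i + 1] - c2[i + 1]) ** p
        total += 0.5 * (f0 + f1) * dt
    return total ** (1.0 / p)
```

    Unlike the semi-metrics discussed in the abstract, this distance satisfies all metric axioms: identity, symmetry, and the triangle inequality.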

  13. Pragmatic quality metrics for evolutionary software development models

    NASA Technical Reports Server (NTRS)

    Royce, Walker

    1990-01-01

    Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.

  14. Extraretinal signal metrics in multiple-saccade sequences.

    PubMed

    Collins, Thérèse

    2010-12-06

    Executing sequences of memory-guided movements requires combining sensory information with information about previously made movements. In the oculomotor system, extraretinal information must be combined with stored visual information about target location. The use of extraretinal signals in oculomotor planning can be probed in the double-step task. Using this task and a multiple-step version, the present study examined whether an extraretinal signal was used on every trial, whether its metrics represented desired or actual eye displacement, and whether it was best characterized as a direct estimate of orbital eye position or a vector representation of eye displacement. The results show that accurate information, including saccadic adaptation, about the first saccade is used to plan the second saccade. Furthermore, with multiple saccades, endpoint variability increases with the number of saccades. Controls ruled out that this was due to the perceptual or memory requirements of storing several target locations. Instead, each memory-guided movement depends on an internal copy of an executed movement, which may present a small discrepancy with the actual movement. Increasing the number of estimates increases the variability because this small discrepancy accumulates over several saccades. Such accumulation is compatible with a corollary discharge signal carrying metric information about saccade vectors.
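
    The accumulation claim can be illustrated with a toy Monte Carlo (hypothetical function name and noise parameters): if each saccade's internal copy carries independent noise, endpoint standard deviation grows roughly as the square root of the number of saccades:

```python
import random
import statistics

def endpoint_std(n_saccades, noise_sd=0.5, trials=20000, seed=42):
    """Endpoint error is the sum of one noisy internal estimate per saccade;
    return the standard deviation of the simulated endpoint errors."""
    rng = random.Random(seed)
    endpoints = [
        sum(rng.gauss(0.0, noise_sd) for _ in range(n_saccades))
        for _ in range(trials)
    ]
    return statistics.pstdev(endpoints)
```

    Under these assumptions a four-saccade sequence ends about twice as variable as a single saccade, consistent with the small per-movement discrepancy accumulating across the sequence.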

  15. Evaluating Search Engine Relevance with Click-Based Metrics

    NASA Astrophysics Data System (ADS)

    Radlinski, Filip; Kurup, Madhu; Joachims, Thorsten

    Automatically judging the quality of retrieval functions based on observable user behavior holds promise for making retrieval evaluation faster, cheaper, and more user centered. However, the relationship between observable user behavior and retrieval quality is not yet fully understood. In this chapter, we expand upon Radlinski et al. (How does clickthrough data reflect retrieval quality? In Proceedings of the ACM Conference on Information and Knowledge Management (CIKM), 43-52, 2008), presenting a sequence of studies investigating this relationship for an operational search engine on the arXiv.org e-print archive. We find that none of the eight absolute usage metrics we explore (including the number of clicks observed, the frequency with which users reformulate their queries, and how often result sets are abandoned) reliably reflect retrieval quality for the sample sizes we consider. However, we find that paired experiment designs adapted from sensory analysis produce accurate and reliable statements about the relative quality of two retrieval functions. In particular, we investigate two paired comparison tests that analyze clickthrough data from an interleaved presentation of ranking pairs, and find that both give accurate and consistent results. We conclude that both paired comparison tests give substantially more accurate and sensitive evaluation results than the absolute usage metrics in our domain.
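
    A minimal sketch of the interleaved-presentation idea behind such paired comparison tests (team-draft interleaving; hypothetical function names, not the authors' exact implementation):

```python
import random

def team_draft_interleave(rank_a, rank_b, rng):
    """Merge two rankings; each slot remembers which ranker ('A'/'B')
    contributed it, so clicks can later be credited to a team."""
    all_docs = set(rank_a) | set(rank_b)
    interleaved, teams, used = [], [], set()
    count = {"A": 0, "B": 0}
    ranks = {"A": rank_a, "B": rank_b}
    while len(used) < len(all_docs):
        # The team with fewer picks goes next; a coin flip breaks ties.
        if count["A"] < count["B"]:
            team = "A"
        elif count["B"] < count["A"]:
            team = "B"
        else:
            team = "A" if rng.random() < 0.5 else "B"
        doc = next((d for d in ranks[team] if d not in used), None)
        if doc is None:  # preferred ranker exhausted; take from the other
            team = "B" if team == "A" else "A"
            doc = next((d for d in ranks[team] if d not in used), None)
        used.add(doc)
        interleaved.append(doc)
        teams.append(team)
        count[team] += 1
    return interleaved, teams

def credit_clicks(teams, clicked_slots):
    """Count clicks per team; the team with more clicks 'wins' the query."""
    a = sum(1 for i in clicked_slots if teams[i] == "A")
    b = sum(1 for i in clicked_slots if teams[i] == "B")
    return a, b
```

    Aggregating wins over many queries yields the relative preference between the two retrieval functions, which the chapter finds far more sensitive than absolute click counts.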

  16. Enhanced Accident Tolerant LWR Fuels: Metrics Development

    SciTech Connect

    Shannon Bragg-Sitton; Lori Braase; Rose Montgomery; Chris Stanek; Robert Montgomery; Lance Snead; Larry Ott; Mike Billone

    2013-09-01

    The Department of Energy (DOE) Fuel Cycle Research and Development (FCRD) Advanced Fuels Campaign (AFC) is conducting research and development on enhanced Accident Tolerant Fuels (ATF) for light water reactors (LWRs). This mission emphasizes the development of novel fuel and cladding concepts to replace the current zirconium alloy-uranium dioxide (UO2) fuel system. The overall mission of the ATF research is to develop advanced fuels/cladding with improved performance, reliability and safety characteristics during normal operations and accident conditions, while minimizing waste generation. The initial effort will focus on implementation in operating reactors or reactors with design certifications. To initiate the development of quantitative metrics for ATF, an LWR Enhanced Accident Tolerant Fuels Metrics Development Workshop was held in October 2012 in Germantown, MD. This paper summarizes the outcome of that workshop and the current status of metrics development for LWR ATF.

  17. Evaluation metric of an image understanding result

    NASA Astrophysics Data System (ADS)

    Hemery, Baptiste; Laurent, Helene; Emile, Bruno; Rosenberger, Christophe

    2015-01-01

    Image processing algorithms include methods that process images from their acquisition to the extraction of useful information for a given application. Among interpretation algorithms, some are designed to detect, localize, and identify one or several objects in an image. The problem addressed is the evaluation of the interpretation results of an image or a video given an associated ground truth. Challenges are multiple, such as the comparison of algorithms, evaluation of an algorithm during its development, or the definition of its optimal settings. We propose a new metric for evaluating the interpretation result of an image. The advantage of the proposed metric is to evaluate a result by taking into account the quality of the localization, recognition, and detection of objects of interest in the image. Several parameters allow us to change the behavior of this metric for a given application. Its behavior has been tested on a large database and showed interesting results.
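
    One simple way to realize such a metric (purely illustrative weights and names; the published metric is more elaborate): combine the three component scores, each normalized to [0, 1], with application-dependent weights:

```python
def interpretation_score(localization, recognition, detection,
                         w_loc=0.4, w_rec=0.3, w_det=0.3):
    """Weighted combination of localization, recognition, and detection
    quality; the weights are the tunable, application-dependent parameters."""
    assert abs(w_loc + w_rec + w_det - 1.0) < 1e-9
    return w_loc * localization + w_rec * recognition + w_det * detection
```

    Shifting weight toward, say, localization tailors the same evaluation framework to applications where precise object position matters more than label accuracy.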

  18. Validation of Metrics as Error Predictors

    NASA Astrophysics Data System (ADS)

    Mendling, Jan

    In this chapter, we test the validity of metrics that were defined in the previous chapter for predicting errors in EPC business process models. In Section 5.1, we provide an overview of how the analysis data is generated. Section 5.2 describes the sample of EPCs from practice that we use for the analysis. Here we discuss a disaggregation by the EPC model group and by error as well as a correlation analysis between metrics and error. Based on this sample, we calculate a logistic regression model for predicting error probability with the metrics as input variables in Section 5.3. In Section 5.4, we then test the regression function for an independent sample of EPC models from textbooks as a cross-validation. Section 5.5 summarizes the findings.
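
    The logistic regression step can be sketched as follows (toy data with a single metric as predictor; hypothetical values, plain stochastic gradient descent):

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=5000):
    """Fit P(error | x) = sigmoid(w*x + b) by stochastic gradient descent."""
    w = b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            w -= lr * (p - y) * x
            b -= lr * (p - y)
    return w, b

def predict(w, b, x):
    """Predicted error probability for a model with metric value x."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

# Hypothetical sample: one complexity metric per model, and whether
# the model was found to contain an error.
metric = [1, 2, 3, 4, 6, 7, 8, 9]
error = [0, 0, 0, 0, 1, 1, 1, 1]
w, b = fit_logistic(metric, error)
```

    In the chapter the regression uses several metrics jointly as input variables; the one-variable version above only illustrates the shape of the model.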

  19. Metric learning for automatic sleep stage classification.

    PubMed

    Phan, Huy; Do, Quan; Do, The-Luan; Vu, Duc-Lung

    2013-01-01

    We introduce in this paper a metric learning approach for automatic sleep stage classification based on single-channel EEG data. We show that by learning a global metric from training data instead of using the default Euclidean metric, the k-nearest neighbor classification rule outperforms state-of-the-art methods on the Sleep-EDF dataset under various classification settings. The overall accuracies for the Awake/Sleep and 4-class classification settings are 98.32% and 94.49%, respectively. Furthermore, this superior accuracy is achieved by performing classification in a low-dimensional feature space derived from the time and frequency domains, without the need for artifact removal as a preprocessing step.
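
    The abstract does not specify the metric learning algorithm, so the sketch below uses a simple stand-in: a Mahalanobis (inverse-covariance) matrix learned from the training data, plugged into a k-nearest-neighbor vote in place of the Euclidean metric. Function names and the choice of metric are assumptions for illustration only:

```python
import numpy as np

def learn_metric(X):
    """Stand-in 'learned' global metric: Mahalanobis matrix M = inverse
    covariance, so distances are measured in a whitened feature space."""
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
    return np.linalg.inv(cov)

def knn_predict(X_train, y_train, x, M, k=3):
    """k-nearest-neighbor vote under the metric d(a, b)^2 = (a-b)^T M (a-b)."""
    diff = X_train - x
    d2 = np.einsum('ij,jk,ik->i', diff, M, diff)   # squared distances
    nearest = np.argsort(d2)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]
```

    A discriminatively learned metric (rather than plain whitening) is what gives the approach its edge, but the classification step is exactly this: replace the Euclidean distance in kNN with the learned quadratic form.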

  20. An informative confidence metric for ATR.

    SciTech Connect

    Bow, Wallace Johnston Jr.; Richards, John Alfred; Bray, Brian Kenworthy

    2003-03-01

    Automatic or assisted target recognition (ATR) is an important application of synthetic aperture radar (SAR). Most ATR researchers have focused on the core problem of declaration-that is, detection and identification of targets of interest within a SAR image. For ATR declarations to be of maximum value to an image analyst, however, it is essential that each declaration be accompanied by a reliability estimate or confidence metric. Unfortunately, the need for a clear and informative confidence metric for ATR has generally been overlooked or ignored. We propose a framework and methodology for evaluating the confidence in an ATR system's declarations and competing target hypotheses. Our proposed confidence metric is intuitive, informative, and applicable to a broad class of ATRs. We demonstrate that seemingly similar ATRs may differ fundamentally in the ability-or inability-to identify targets with high confidence.

  1. Metric Learning for Hyperspectral Image Segmentation

    NASA Technical Reports Server (NTRS)

    Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca

    2011-01-01

    We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. This transform defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
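
    A minimal sketch of the idea: standard multiclass LDA via the within- and between-class scatter matrices yields a linear transform W, and the induced metric d(a, b) = ||W(a - b)|| can then drive segmentation. This is the textbook construction, not the authors' exact implementation:

```python
import numpy as np

def lda_transform(X, y, n_components):
    """Multiclass LDA: find directions maximizing between-class scatter
    relative to within-class scatter. The returned rows of W define the
    learned metric d(a, b) = ||W @ (a - b)||."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))   # within-class scatter
    Sb = np.zeros((d, d))   # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
    # top eigenvectors of Sw^{-1} Sb (small ridge keeps Sw invertible)
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_components]].T
```

    In the segmentation setting, pixel spectra would be projected through W before computing the pairwise similarities used by the graph-based segmenter.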

  2. Development of testing metrics for military robotics

    NASA Astrophysics Data System (ADS)

    Resendes, Raymond J.

    1993-05-01

    The use of robotics or unmanned systems offers significant benefits to the military user by enhancing mobility, logistics, material handling, command and control, reconnaissance, and protection. The evaluation and selection process for the procurement of an unmanned robotic system involves comparison of performance and physical characteristics such as operating environment, application, payloads and performance criteria. Testing an unmanned system for operation in an unstructured environment using emerging technologies, which have not yet been fully tested, presents unique challenges for the testing community. Standard metrics, test procedures, terminologies, and methodologies simplify comparison of different systems. A procedure was developed to standardize the test and evaluation process for UGVs. This procedure breaks the UGV into three components: the platform, the payload, and the command and control link. Standardized metrics were developed for these components which permit unbiased comparison of different systems. The development of these metrics and their application will be presented.

  3. Social Metrics Applied to Smart Tourism

    NASA Astrophysics Data System (ADS)

    Cervantes, O.; Gutiérrez, E.; Gutiérrez, F.; Sánchez, J. A.

    2016-09-01

    We present a strategy for making productive use of semantically-related social data, drawn from a user-centered semantic network, to help users (tourists and citizens in general) discover cultural heritage, points of interest, and available services in a smart city. These data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness, and eccentricity. These metrics are useful for discovering relevant nodes within the network, which can be interpreted as suggestions (venues or services) for users. We describe the semantic network, built on a graph model, as well as the social metric algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.
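
    For intuition, here is a sketch of the classic shortest-path versions of two of the centrality notions named above (the paper uses their flow-based variants, which replace shortest paths with maximum flow). Pure BFS on an unweighted adjacency-dict graph:

```python
from collections import deque

def bfs_distances(graph, source):
    """Hop distances from source in an unweighted graph {node: [neighbors]}."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def closeness(graph, node):
    """Closeness centrality: inverse of the mean distance to all other nodes."""
    dist = bfs_distances(graph, node)
    total = sum(d for v, d in dist.items() if v != node)
    return (len(graph) - 1) / total if total else 0.0

def eccentricity(graph, node):
    """Eccentricity: greatest distance from node to any other node."""
    return max(bfs_distances(graph, node).values())
```

    Nodes with high closeness (or, in the paper, high flow closeness) are the ones that surface as recommendation candidates.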

  4. Computing and Using Metrics in the ADS

    NASA Astrophysics Data System (ADS)

    Henneken, E. A.; Accomazzi, A.; Kurtz, M. J.; Grant, C. S.; Thompson, D.; Luker, J.; Chyla, R.; Holachek, A.; Murray, S. S.

    2015-04-01

    Finding measures for research impact, be it for individuals, institutions, instruments, or projects, has gained a lot of popularity. There are more papers written than ever on new impact measures, and problems with existing measures are being pointed out on a regular basis. Funding agencies require impact statistics in their reports, job candidates incorporate them in their resumes, and publication metrics have even been used in at least one recent court case. To support this need for research impact indicators, the SAO/NASA Astrophysics Data System (ADS) has developed a service that provides a broad overview of various impact measures. In this paper we discuss how the ADS can be used to quench the thirst for impact measures. We will also discuss a couple of the lesser-known indicators in the metrics overview and the main issues to be aware of when compiling publication-based metrics in the ADS, namely author name ambiguity and citation incompleteness.

  5. Safety and Mission Assurance Performance Metric

    NASA Technical Reports Server (NTRS)

    Holsomback, Jerry; Kuo, Fred; Wade, Jim

    2005-01-01

    The safety and mission assurance (S&MA) performance metric is a method that provides a process through which the managers of a large, complex program can readily understand and assess the accepted risk, the problems, and the associated reliability of the program. Conceived for original use in helping to assure the safety and success of the International Space Station (ISS) program, the S&MA performance metric also can be applied to other large and complex programs and projects. The S&MA-performance-metric data products comprise one or more tables (possibly also one or more graphs) that succinctly display all of the information relevant (and no information that is irrelevant) to management decisions that must be made to assure the safety and success of a program or project, thereby facilitating such decisions.

  6. Interior solution for the Kerr metric

    NASA Astrophysics Data System (ADS)

    Hernandez-Pastora, J. L.; Herrera, L.

    2017-01-01

    A recently presented general procedure to find static and axially symmetric interior solutions to the Einstein equations is extended to the stationary case and applied to find an interior solution for the Kerr metric. The solution, which is generated by an anisotropic fluid, verifies the energy conditions for a wide range of values of the parameters and matches smoothly to the Kerr solution, thereby representing a globally regular model describing a nonspherical and rotating source of gravitational field. In the spherically symmetric limit, our model converges to the well-known incompressible perfect fluid solution. The keystone of our approach is an ansatz that allows us to define the interior metric in terms of the exterior metric functions evaluated at the boundary of the source. The physical variables of the energy-momentum tensor are calculated explicitly, as is the geometry of the source in terms of the relativistic multipole moments.

  7. Landscape pattern metrics and regional assessment

    USGS Publications Warehouse

    O'Neill, R. V.; Riitters, K.H.; Wickham, J.D.; Jones, K.B.

    1999-01-01

    The combination of remote imagery data, geographic information systems software, and landscape ecology theory provides a unique basis for monitoring and assessing large-scale ecological systems. The unique feature of the work has been the need to develop and interpret quantitative measures of spatial pattern-the landscape indices. This article reviews what is known about the statistical properties of these pattern metrics and suggests some additional metrics based on island biogeography, percolation theory, hierarchy theory, and economic geography. Assessment applications of this approach have required interpreting the pattern metrics in terms of specific environmental endpoints, such as wildlife and water quality, and research into how to represent synergistic effects of many overlapping sources of stress.

  8. Holographic computations of the quantum information metric

    NASA Astrophysics Data System (ADS)

    Trivella, Andrea

    2017-05-01

    In this paper we show how the quantum information metric can be computed holographically using a perturbative approach. In particular, when the deformation of the conformal field theory state is induced by a scalar operator, the corresponding bulk configuration reduces to a scalar field perturbatively probing the background. We study two concrete examples: a CFT ground state deformed by a primary operator, and a thermofield double state in d = 2 deformed by a marginal operator. Finally, we generalize the bulk construction to the case of a multidimensional parameter space and show that the quantum information metric coincides with the metric of the non-linear sigma model for the corresponding scalar fields.

  9. Metrics for comparison of crystallographic maps

    DOE PAGES

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Lunin, Vladimir Y.; ...

    2014-10-01

    Numerical comparison of crystallographic contour maps is used extensively in structure solution and model refinement, analysis and validation. However, traditional metrics such as the map correlation coefficient (map CC, real-space CC or RSCC) sometimes contradict the results of visual assessment of the corresponding maps. This article explains such apparent contradictions and suggests new metrics and tools to compare crystallographic contour maps. The key to the new methods is rank scaling of the Fourier syntheses. The new metrics are complementary to the usual map CC and can be more helpful in map comparison, in particular when only some of their aspects, such as regions of high density, are of interest.

  10. Adapting to life: ocean biogeochemical modelling and adaptive remeshing

    NASA Astrophysics Data System (ADS)

    Hill, J.; Popova, E. E.; Ham, D. A.; Piggott, M. D.; Srokosz, M.

    2014-05-01

    An outstanding problem in biogeochemical modelling of the ocean is that many of the key processes occur intermittently at small scales, such as the sub-mesoscale, that are not well represented in global ocean models. This is partly due to their failure to resolve sub-mesoscale phenomena, which play a significant role in vertical nutrient supply. Simply increasing the resolution of the models may be an inefficient computational solution to this problem. An approach based on recent advances in adaptive mesh computational techniques may offer an alternative. Here the first steps in such an approach are described, using the example of a simple vertical-column (quasi-1-D) ocean biogeochemical model. We present a novel method of simulating ocean biogeochemical behaviour on a vertically adaptive computational mesh, where the mesh changes in response to the biogeochemical and physical state of the system throughout the simulation. We show that the model reproduces the general physical and biological behaviour at three ocean stations (India, Papa and Bermuda) as compared to a high-resolution fixed-mesh simulation and to observations. The use of an adaptive mesh does not increase the computational error, but reduces the number of mesh elements by a factor of 2-3. Unlike in previous work, the adaptivity metric used is flexible, and we show that capturing the physical behaviour of the model is paramount to achieving a reasonable solution. Adding biological quantities to the adaptivity metric further refines the solution. We then show the potential of this method in two case studies where we change the adaptivity metric used to determine the varying mesh sizes in order to capture the dynamics of chlorophyll at Bermuda and sinking detritus at Papa. We therefore demonstrate that adaptive meshes may provide a suitable numerical technique for simulating seasonal or transient biogeochemical behaviour at high vertical resolution whilst minimising the number of elements in the mesh.

  11. Information on the metric system and related fields

    NASA Technical Reports Server (NTRS)

    Lange, E.

    1976-01-01

    This document contains about 7,600 references on the metric system and conversion to the metric system. These references include all known documents on the metric system as of December 1975, the month of enactment of the Metric Conversion Act of 1975. This bibliography includes books, reports, articles, presentations, periodicals, legislation, motion pictures, TV series, film strips, slides, posters, wall charts, education and training courses, addresses for information, and sources for metric materials and services. A comprehensive index is provided.

  12. Einstein Finsler metrics and killing vector fields on Riemannian manifolds

    NASA Astrophysics Data System (ADS)

    Cheng, XinYue; Shen, ZhongMin

    2017-01-01

    In this paper, we use a Killing form on a Riemannian manifold to construct a class of Finsler metrics. We find equations that characterize Einstein metrics within this class. In particular, we construct a family of Einstein metrics on $S^3$ with ${\rm Ric} = 2F^2$, ${\rm Ric} = 0$ and ${\rm Ric} = -2F^2$, respectively. This family of metrics provides an important class of Finsler metrics in dimension three whose Ricci curvature is a constant but whose flag curvature is not.

  13. A neural net-based approach to software metrics

    NASA Technical Reports Server (NTRS)

    Boetticher, G.; Srinivas, Kankanahalli; Eichmann, David A.

    1992-01-01

    Software metrics provide an effective method for characterizing software. Metrics have traditionally been composed through the definition of an equation. This approach is limited by the fact that it requires all the interrelationships among the parameters to be fully understood. This paper explores an alternative, neural network approach to modeling metrics. Experiments performed on two widely accepted metrics, McCabe and Halstead, indicate that the approach is sound, thus serving as the groundwork for further exploration into the analysis and design of software metrics.
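
    The McCabe metric mentioned above is a good example of the equation-defined style the paper contrasts with neural networks: cyclomatic complexity is V(G) = E - N + 2P for a control-flow graph with E edges, N nodes, and P connected components. A minimal worked example (the if/else graph below is illustrative):

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's cyclomatic complexity V(G) = E - N + 2P."""
    return len(edges) - len(nodes) + 2 * components

# Control-flow graph of a single if/else:
# entry -> cond -> {then, else} -> exit
nodes = ["entry", "cond", "then", "else", "exit"]
edges = [("entry", "cond"), ("cond", "then"), ("cond", "else"),
         ("then", "exit"), ("else", "exit")]
# V(G) = 5 - 5 + 2 = 2: one decision point, complexity 2
```
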

  14. Analytical Methods for Exoplanet Imaging Detection Metrics

    NASA Astrophysics Data System (ADS)

    Garrett, Daniel; Savransky, Dmitry

    2017-01-01

    When designing or simulating exoplanet-finding missions, a selection metric must be used to choose which target stars will be observed. For direct imaging missions, the metric is a function of the planet-star separation and flux ratio as constrained by the instrument's inner and outer working angles and contrast. We present analytical methods for the calculation of two detection metrics: completeness and depth of search. While Monte Carlo methods have typically been used to determine these detection metrics, implementing analytical methods in simulation or early-stage design yields quicker, more accurate calculations. Completeness is the probability of detecting a planet belonging to the planet population of interest. This metric requires assumptions about the planet population: probability density functions are assumed for the planetary parameters of semi-major axis, eccentricity, geometric albedo, and planetary radius. Planet-star separation and difference in brightness magnitude (or contrast) are written as functions of these parameters. A change of variables yields a joint probability density function of planet-star separation and difference in brightness magnitude, which is marginalized subject to the constraints of the instrument to give the probability of detecting a planet belonging to the population of interest. Depth of search for direct imaging is the sum, over a target list, of the probabilities of detecting a planet of given semi-major axis and planetary radius with a given instrument. This metric does not depend on assumed planet-population parameter distributions. A two-dimensional grid of probabilities is generated for each star in the target list; the probability at each point in the grid is found by marginalizing a probability density function of contrast, given constant values of semi-major axis and planetary radius, subject to the constraints of the instrument.
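
    The Monte Carlo baseline the abstract contrasts with can be sketched in a few lines. The distributions below are toy assumptions (log-uniform semi-major axis, circular orbits, isotropic orientation), and a full calculation would also marginalize over eccentricity, albedo, radius, and the contrast limit:

```python
import numpy as np

def monte_carlo_completeness(n=100_000, iwa=0.1, owa=1.0, seed=0):
    """Toy Monte Carlo completeness: fraction of randomly drawn planets whose
    projected planet-star separation (arbitrary angular units) falls between
    the instrument's inner (iwa) and outer (owa) working angles."""
    rng = np.random.default_rng(seed)
    a = 10 ** rng.uniform(-1.5, 0.5, n)      # semi-major axis, ~0.03 to ~3
    theta = rng.uniform(0, 2 * np.pi, n)     # orbital phase
    cos_i = rng.uniform(-1, 1, n)            # isotropic inclination
    # projected separation of a circular orbit viewed at inclination i
    s = a * np.sqrt(np.cos(theta) ** 2 + (np.sin(theta) * cos_i) ** 2)
    detected = (s > iwa) & (s < owa)
    return detected.mean()
```

    The analytical approach in the paper replaces this sampling with a change of variables and a direct marginalization of the joint density over the instrument's observable region.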

  15. Metrics of carotid plaque-surface morphology

    NASA Astrophysics Data System (ADS)

    Yim, Peter J.; Demarco, J. Kevin

    2005-04-01

    Studies of the coronary and carotid arteries have found that plaques with irregular surfaces are more likely to produce cardiac infarction and stroke, respectively. The aim of this project was the development of methods for quantifying the irregularity of the plaque surface. Three metrics for quantifying surface irregularity were developed that are insensitive to variability of vessel diameter: (1) ratio of surface area to square-root of volume (RSASRV), (2) mean of the absolute value of the minor principal curvature (MAVMPC), and (3) radial variation within vessel cross sections (RVWVCS). For computing RVWVCS, a vessel axis was determined by Ordered Region Growing Skeletonization; RVWVCS is the within-group mean-square error of the distance of the surface to the vessel axis, where the vertices are grouped according to their match to the closest point on the vessel axis. These metrics are applied to a triangulated surface of the carotid artery in the vicinity of the stenosis. The surface was reconstructed from contrast-enhanced magnetic resonance angiography by the Isosurface Deformable Model. The stenotic region was selected by manual placement of a 2-cm-long bounding box around the region, excluding the external carotid artery if necessary. The metrics were applied to three carotid arteries with a moderate degree of stenosis. These three cases exhibited mild, moderate, and severe plaque-surface irregularity, respectively, as determined by visual impression. The ranking of the irregularity of the carotid arteries was in 100% agreement with visual impression for all three metrics. All three metrics should be given further consideration for quantification of plaque-surface irregularity.
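
    The first metric (RSASRV) is straightforward to compute on any closed, consistently oriented triangle mesh: total face area divided by the square root of the enclosed volume, with the volume obtained as a sum of signed tetrahedra. A minimal sketch (not the authors' code; mesh handling details are assumed):

```python
import numpy as np

def surface_area(verts, faces):
    """Total area of a triangulated surface (half cross-product per face)."""
    a = 0.0
    for i, j, k in faces:
        a += 0.5 * np.linalg.norm(np.cross(verts[j] - verts[i],
                                           verts[k] - verts[i]))
    return a

def enclosed_volume(verts, faces):
    """Signed volume of a closed, consistently oriented triangle mesh,
    summed as signed tetrahedra against the origin."""
    return sum(np.dot(verts[i], np.cross(verts[j], verts[k])) / 6.0
               for i, j, k in faces)

def rsasrv(verts, faces):
    """Ratio of surface area to square-root of volume: rougher,
    more irregular surfaces score higher for a given enclosed volume."""
    return surface_area(verts, faces) / np.sqrt(abs(enclosed_volume(verts, faces)))
```

    Applied to the reconstructed carotid segment, a larger RSASRV indicates more surface per unit of enclosed volume, i.e. a more irregular plaque surface.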

  16. Including absorption in Gordon's optical metric

    NASA Astrophysics Data System (ADS)

    Chen, B.; Kantowski, R.

    2009-05-01

    We show that Gordon’s optical metric on a curved spacetime can be generalized to include absorption by allowing the metric to become complex. We demonstrate its use in the realm of geometrical optics by giving three simple examples. We use one of these examples to compute corrected distance-redshift relations for Friedman-Lemaître-Robertson-Walker models in which the cosmic fluid has an associated complex index of refraction that represents grey extinction. We then fit this corrected Hubble curve to the type Ia supernovae data and provide a possible explanation (other than dark energy) of the deviation of these observations from dark matter predictions.

  17. Geons and the quantum information metric

    NASA Astrophysics Data System (ADS)

    Sinamuli, Musema; Mann, Robert B.

    2017-07-01

    We investigate the proposed duality between a quantum information metric in a CFT_{d+1} and the volume of a maximum time slice in the dual AdS_{d+2} for topological geons. Examining the specific cases of Bañados-Teitelboim-Zanelli (BTZ) black holes and planar Schwarzschild-anti-de Sitter black holes, along with their geon counterparts, we find that the proposed duality relation for geons is the same apart from a factor of 4. The information metric therefore provides a probe of the topology of the bulk spacetime.

  18. Jacobi-Maupertuis metric and Kepler equation

    NASA Astrophysics Data System (ADS)

    Chanda, Sumanto; Gibbons, Gary William; Guha, Partha

    This paper studies the application of the Jacobi-Eisenhart lift, Jacobi metric and Maupertuis transformation to the Kepler system. We start by reviewing fundamentals and the Jacobi metric. Then we study various ways to apply the lift to Kepler-related systems: first as conformal description and Bohlin transformation of Hooke’s oscillator, second in contact geometry and third in Houri’s transformation [T. Houri, Liouville integrability of Hamiltonian systems and spacetime symmetry (2016), www.geocities.jp/football_physician/publication.html], coupled with Milnor’s construction [J. Milnor, On the geometry of the Kepler problem, Am. Math. Mon. 90 (1983) 353-365] with eccentric anomaly.

  19. Stability and integration over Bergman metrics

    NASA Astrophysics Data System (ADS)

    Klevtsov, Semyon; Zelditch, Steve

    2014-07-01

    We study partition functions of random Bergman metrics, with the actions defined by a class of geometric functionals known as `stability functions'. We introduce a new stability invariant — the critical value of the coupling constant — defined as the minimal coupling constant for which the partition function converges. It measures the minimal degree of stability of geodesic rays in the space of Bergman metrics, with respect to the action. We calculate this critical value when the action is the ν-balancing energy, and show that on a Riemann surface of genus h.

  20. Static spherical metrics: a geometrical approach

    NASA Astrophysics Data System (ADS)

    Tiwari, A. K.; Maharaj, S. D.; Narain, R.

    2017-08-01

    There exist several solution generating algorithms for static spherically symmetric metrics. Here we use the geometrical approach of Lie point symmetries to solve the condition of pressure isotropy by finding the associated five-dimensional Lie algebra of symmetry generators. For the non-Abelian subalgebras the underlying equation is solved to obtain a general solution. Contained within this class are vacuum models, constant density models, metrics with linear equations of state and the Buchdahl representation of the polytrope with index five. For a different particular symmetry generator the condition of pressure isotropy is transformed to a Riccati equation which admits particular solutions.