Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... 0584-AD65 School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP) was published on... the Office of Management and Budget (OMB) cleared the associated information collection requirements (ICR) on...
An adhered-particle analysis system based on concave points
NASA Astrophysics Data System (ADS)
Wang, Wencheng; Guan, Fengnian; Feng, Lin
2018-04-01
Particles that adhere together influence image analysis in computer vision systems. In this paper, a method based on concave points is designed. First, a corner detection algorithm is adopted to obtain a rough estimate of potential concave points after image segmentation. Then, the method computes the area ratio of the candidates to accurately localize the final separation points. Finally, it uses the separation points of each particle and the neighboring pixels to estimate the original particles before adhesion and provides estimated profile images. The experimental results show that this approach provides good results that match the human visual cognitive mechanism.
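As a rough illustration of the two-stage idea in this abstract (candidate concave points, then an area-ratio screen), the following sketch uses OpenCV convexity defects in place of the paper's corner detector; the window size and area-ratio threshold are illustrative assumptions, not the authors' values.

```python
import cv2
import numpy as np

def find_concave_candidates(binary_img, area_ratio_thresh=0.9):
    """Candidate separation (concave) points on adhered-particle blobs:
    convexity defects give rough candidates, and a local area ratio
    screens them. OpenCV 4 return convention assumed for findContours."""
    contours, _ = cv2.findContours(binary_img, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    candidates = []
    for cnt in contours:
        hull = cv2.convexHull(cnt, returnPoints=False)
        defects = cv2.convexityDefects(cnt, hull)
        if defects is None:
            continue
        for s, e, f, depth in defects[:, 0]:
            x, y = cnt[f][0]
            # Concave points sit where the blob fills clearly less of a
            # small window than a convex boundary would.
            win = binary_img[max(y - 5, 0):y + 6, max(x - 5, 0):x + 6]
            if win.mean() / 255.0 < area_ratio_thresh:
                candidates.append((int(x), int(y)))
    return candidates
```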
Inertial navigation sensor integrated motion analysis for autonomous vehicle navigation
NASA Technical Reports Server (NTRS)
Roberts, Barry; Bhanu, Bir
1992-01-01
Recent work on INS integrated motion analysis is described. Results were obtained with a maximally passive system of obstacle detection (OD) for ground-based vehicles and rotorcraft. The OD approach involves motion analysis of imagery acquired by a passive sensor in the course of vehicle travel to generate range measurements to world points within the sensor FOV. INS data and scene analysis results are used to enhance interest point selection, the matching of the interest points, and the subsequent motion-based computations, tracking, and OD. The most important lesson learned from the research described here is that the incorporation of inertial data into the motion analysis program greatly improves the analysis and makes the process more robust.
Automatic pole-like object modeling via 3D part-based analysis of point cloud
NASA Astrophysics Data System (ADS)
He, Liu; Yang, Haoxiang; Huang, Yuchun
2016-10-01
Pole-like objects, including trees, lampposts and traffic signs, are an indispensable part of urban infrastructure. With the advance of vehicle-based laser scanning (VLS), massive point clouds of roadside urban areas have become widely used in 3D digital city modeling. Based on the property that different pole-like objects have various canopy parts but similar trunk parts, this paper proposes a 3D part-based shape analysis to robustly extract, identify and model pole-like objects. The proposed method includes 3D clustering and recognition of trunks, voxel growing, and part-based 3D modeling. After preprocessing, a trunk center is identified as a point that has a local density peak and the largest minimum inter-cluster distance. Starting from the trunk centers, the remaining points are iteratively clustered to the same center as their nearest point with higher density. To eliminate noisy points, cluster borders are refined by trimming boundary outliers. Then, candidate trunks are extracted from the clustering results in three orthogonal planes by shape analysis. Voxel growing recovers the complete pole-like objects regardless of overlay. Finally, the entire trunk, branch and crown parts are analyzed to obtain seven feature parameters, which are used to model the three parts respectively and to obtain a single part-assembled 3D model. The proposed method is tested using a VLS-based point cloud of Wuhan University, China, which includes many kinds of trees, lampposts and other pole-like posts under different occlusions and overlays. Experimental results show that the proposed method can extract the exact attributes and model the roadside pole-like objects efficiently.
NASA Technical Reports Server (NTRS)
Hou, Gene J.-W; Newman, Perry A. (Technical Monitor)
2004-01-01
A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The minimum distance associated with the MPP provides a measurement of safety probability, which can be obtained by approximate probability integration methods such as FORM or SORM. The reliability sensitivity equations are derived first in this paper, based on the derivatives of the optimal solution. Examples are provided later to demonstrate the use of these derivatives for better reliability analysis and reliability-based design optimization (RBDO).
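To make the MPP idea concrete: in standard normal space, the MPP is the point on the limit-state surface g(u) = 0 closest to the origin, and that distance is the FORM reliability index β. A minimal sketch, with a placeholder limit-state function that is not from the paper:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def g(u):
    # Illustrative limit-state function (an assumption, not the paper's).
    return 3.0 - u[0] - 0.5 * u[1] ** 2

# MPP search: minimize ||u||^2 subject to g(u) = 0.
res = minimize(lambda u: u @ u, x0=np.ones(2),
               constraints={"type": "eq", "fun": g})
beta = np.linalg.norm(res.x)              # reliability index
print("MPP:", res.x, "beta:", beta, "FORM Pf:", norm.cdf(-beta))
```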
Point-based and model-based geolocation analysis of airborne laser scanning data
NASA Astrophysics Data System (ADS)
Sefercik, Umut Gunes; Buyuksalih, Gurcan; Jacobsen, Karsten; Alkan, Mehmet
2017-01-01
Airborne laser scanning (ALS) is one of the most effective remote sensing technologies providing precise three-dimensional (3-D) dense point clouds. A large ALS digital surface model (DSM) covering the whole Istanbul province was analyzed by point-based and model-based comprehensive statistical approaches. The point-based analysis was performed using checkpoints on flat areas. The model-based approaches were implemented in two steps: strip-to-strip comparison of overlapping ALS DSMs individually in three subareas, and comparison of the merged ALS DSMs with terrestrial laser scanning (TLS) DSMs in four other subareas. In the model-based approach, the standard deviation of height and the normalized median absolute deviation (NMAD) were used as accuracy indicators, combined with their dependency on terrain inclination. The results demonstrate that terrain roughness has a strong impact on the vertical accuracy of ALS DSMs. The relative horizontal shifts determined, and partially improved, by merging the overlapping strips and by comparing the ALS and TLS data were found not to be negligible. The analysis of the ALS DSM in relation to the TLS DSM allowed us to determine the characteristics of the DSM in detail.
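The normalized median absolute deviation used above is a robust alternative to the standard deviation for DSM height differences; a one-line sketch:

```python
import numpy as np

def nmad(dh):
    """NMAD of height differences dh; 1.4826 scales the MAD so that it
    matches the standard deviation for normally distributed errors."""
    return 1.4826 * np.median(np.abs(dh - np.median(dh)))
```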
NASA Astrophysics Data System (ADS)
Kim, Namkug; Seo, Joon Beom; Heo, Jeong Nam; Kang, Suk-Ho
2007-03-01
The study was conducted to develop a simple model for more robust lung registration of volumetric CT data, which is essential for various clinical lung analysis applications, including lung nodule matching in follow-up CT studies and semi-quantitative assessment of lung perfusion. The purpose of this study is to find the most effective reference point and geometric model based on lung motion analysis from CT data sets obtained at full inspiration (In.) and expiration (Ex.). Ten pairs of CT data sets from normal subjects obtained at full In. and Ex. were used. Two radiologists were requested to mark 20 points representing the subpleural point of the central axis in each segment. The apex, hilar point, and center of inertia (COI) of each unilateral lung were proposed as reference points. To evaluate the optimal expansion point, non-linear optimization without constraints was employed. The objective function is the sum of distances from the candidate point x to the lines formed by the corresponding points between In. and Ex. Using the non-linear optimization, the optimal point was evaluated and compared between reference points. The average distance between the optimal point and each line segment revealed that the balloon model was more suitable for explaining lung expansion. This lung motion analysis, based on vector analysis and non-linear optimization, shows that a balloon model centered on the COI of the lung is the most effective geometric model for explaining lung expansion during breathing.
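The objective described here (the summed distance from a candidate point to the lines joining corresponding inspiration/expiration landmarks) can be minimized directly; a minimal sketch, assuming landmark arrays of shape (N, 3):

```python
import numpy as np
from scipy.optimize import minimize

def dist_to_line(x, p, q):
    """Distance from x to the line through landmark pair (p, q)."""
    d = (q - p) / np.linalg.norm(q - p)
    v = x - p
    return np.linalg.norm(v - (v @ d) * d)

def optimal_expansion_point(pts_in, pts_ex):
    """Point minimizing the summed distance to all In./Ex. lines."""
    cost = lambda x: sum(dist_to_line(x, p, q)
                         for p, q in zip(pts_in, pts_ex))
    return minimize(cost, x0=pts_in.mean(axis=0)).x
```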
Congruence analysis of point clouds from unstable stereo image sequences
NASA Astrophysics Data System (ADS)
Jepping, C.; Bethmann, F.; Luhmann, T.
2014-06-01
This paper deals with the correction of the exterior orientation parameters of stereo image sequences over deformed free-form surfaces without control points. Such an imaging situation can occur, for example, during photogrammetric car crash test recordings, where onboard high-speed stereo cameras are used to measure 3D surfaces. As a result of such measurements, 3D point clouds of deformed surfaces are generated for a complete stereo sequence. The first objective of this research focuses on the development and investigation of methods for the detection of corresponding spatial and temporal tie points within the stereo image sequences (by stereo image matching and 3D point tracking) that are robust enough to handle occlusions and other disturbances reliably. The second objective is the analysis of object deformations in order to detect stable areas (congruence analysis). For this purpose a RANSAC-based method for congruence analysis has been developed. This process is based on the sequential transformation of randomly selected point groups from one epoch to another using a 3D similarity transformation. The paper gives a detailed description of the congruence analysis. The approach has been tested successfully on synthetic and real image data.
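A hedged sketch of the RANSAC-style congruence step described above: random point groups fit a 3D similarity transform (Umeyama's least-squares solution) from one epoch to the next, and the transform supported by the most points marks the stable (congruent) area. Sample size, iteration count and tolerance are illustrative assumptions, not the paper's values.

```python
import numpy as np

def similarity_transform(A, B):
    """Least-squares similarity transform B ~ s*R*A + t (Umeyama)."""
    muA, muB = A.mean(0), B.mean(0)
    Ac, Bc = A - muA, B - muB
    U, S, Vt = np.linalg.svd(Bc.T @ Ac / len(A))
    D = np.eye(3)
    D[2, 2] = np.sign(np.linalg.det(U @ Vt))
    R = U @ D @ Vt
    s = (S * np.diag(D)).sum() / Ac.var(0).sum()
    return s, R, muB - s * R @ muA

def congruent_points(P0, P1, n_iter=500, tol=0.002, sample=4, seed=0):
    """Largest point set consistent with a single epoch-to-epoch
    similarity transform; the remainder is treated as deformation."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(P0), bool)
    for _ in range(n_iter):
        idx = rng.choice(len(P0), sample, replace=False)
        s, R, t = similarity_transform(P0[idx], P1[idx])
        resid = np.linalg.norm(P1 - (s * (R @ P0.T).T + t), axis=1)
        if (resid < tol).sum() > best.sum():
            best = resid < tol
    return best
```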
Towards semi-automatic rock mass discontinuity orientation and set analysis from 3D point clouds
NASA Astrophysics Data System (ADS)
Guo, Jiateng; Liu, Shanjun; Zhang, Peina; Wu, Lixin; Zhou, Wenhui; Yu, Yinan
2017-06-01
Obtaining accurate information on rock mass discontinuities for deformation analysis and the evaluation of rock mass stability is important. Obtaining measurements for high and steep zones with the traditional compass method is difficult. Photogrammetry, three-dimensional (3D) laser scanning and other remote sensing methods have gradually become mainstream methods. In this study, a method that is based on a 3D point cloud is proposed to semi-automatically extract rock mass structural plane information. The original data are pre-treated prior to segmentation by removing outlier points. The next step is to segment the point cloud into different point subsets. Various parameters, such as the normal, dip/direction and dip, can be calculated for each point subset after obtaining the equation of the best fit plane for the relevant point subset. A cluster analysis (a point subset that satisfies some conditions and thus forms a cluster) is performed based on the normal vectors by introducing the firefly algorithm (FA) and the fuzzy c-means (FCM) algorithm. Finally, clusters that belong to the same discontinuity sets are merged and coloured for visualization purposes. A prototype system is developed based on this method to extract the points of the rock discontinuity from a 3D point cloud. A comparison with existing software shows that this method is feasible. This method can provide a reference for rock mechanics, 3D geological modelling and other related fields.
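The clustering step pairs fuzzy c-means with a firefly-algorithm initialisation; the sketch below implements plain FCM on unit normal vectors, with random initialisation standing in for the FA step, so it approximates one stage of the paper's pipeline rather than reproducing it.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means. X: (N, 3) unit normals; returns cluster
    centres and hard labels from the final membership matrix."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)
    for _ in range(iters):
        Um = U ** m
        centres = Um @ X / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None] - centres[:, None], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1))     # standard FCM membership update
        U /= U.sum(axis=0)
    return centres, U.argmax(axis=0)
```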
Teacher training for secondary education from the point of view of practice
NASA Astrophysics Data System (ADS)
Hai, Tuong Duy; Huong, Nguyen Thanh
2018-01-01
The article presents a point of view on teacher training based on an analysis of disciplinary teaching and learning practices in high school. Based on analyses of the teaching faculty and of students' learning processes in high school disciplines, it offers guidelines for the ongoing training of secondary teachers so that they can adapt to educational practice.
A new method to identify the foot of continental slope based on an integrated profile analysis
NASA Astrophysics Data System (ADS)
Wu, Ziyin; Li, Jiabiao; Li, Shoujun; Shang, Jihong; Jin, Xiaobin
2017-06-01
A new method is proposed to automatically identify the foot of the continental slope (FOS) based on the integrated analysis of topographic profiles. Using the extremum points of the second derivative and the Douglas-Peucker algorithm, it simplifies the topographic profiles and then calculates the second derivative of both the original profiles and the D-P profiles. Seven steps are proposed to simplify the original profiles. Multiple identification methods are proposed to determine the FOS points, based on the gradient, water depth and second-derivative values of the data points, as well as on the concavity/convexity, continuity and segmentation of the topographic profiles. The method comprehensively and intelligently analyzes the topographic profiles and their derived slopes, second derivatives and D-P profiles, and on this basis it can assess the essential properties of every single data point in a profile. Furthermore, it removes the concave points of the curve and implements six FOS judgment criteria.
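One ingredient of the method is easy to sketch: along a bathymetric profile, FOS candidates lie at extrema of the second derivative (maximum gradient change). The snippet shows only that step; the Douglas-Peucker simplification and the six judgment criteria are not reproduced here.

```python
import numpy as np

def fos_candidates(dist_km, depth_m, top_n=5):
    """Indices of the strongest second-derivative extrema along a
    depth profile, taken as FOS candidates."""
    slope = np.gradient(depth_m, dist_km)
    curvature = np.gradient(slope, dist_km)
    return sorted(np.argsort(-np.abs(curvature))[:top_n])
```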
Change-point analysis data of neonatal diffusion tensor MRI in preterm and term-born infants.
Wu, Dan; Chang, Linda; Akazawa, Kentaro; Oishi, Kumiko; Skranes, Jon; Ernst, Thomas; Oishi, Kenichi
2017-06-01
The data presented in this article are related to the research article entitled "Mapping the Critical Gestational Age at Birth that Alters Brain Development in Preterm-born Infants using Multi-Modal MRI" (Wu et al., 2017) [1]. Brain immaturity at birth poses critical neurological risks in preterm-born infants. We used a novel change-point model to analyze the critical gestational age at birth (GAB) that could affect postnatal development, based on diffusion tensor MRI (DTI) acquired from 43 preterm and 43 term-born infants in 126 brain regions. In the corresponding research article, we presented a change-point analysis of fractional anisotropy (FA) and mean diffusivity (MD) measurements in these infants. In this article, we offer the relative changes of axial and radial diffusivities (AD and RD) in relation to the change of FA and the FA-based change points, and we also provide the AD- and RD-based change-point results.
NASA Astrophysics Data System (ADS)
Yang, Hongxin; Su, Fulin
2018-01-01
We propose a moving target analysis algorithm using speeded-up robust features (SURF) and regular moments in inverse synthetic aperture radar (ISAR) image sequences. In our study, we first extract interest points from ISAR image sequences by SURF. Different from traditional feature point extraction methods, SURF-based feature points are invariant to scattering intensity, target rotation, and image size. Then, we employ a bilateral feature registering model to match these feature points. The feature registering scheme can not only search for the isotropic feature points that link the image sequences but also reduce erroneous matching pairs. After that, the target centroid is detected by regular moments. Finally, a cost function based on the correlation coefficient is adopted to analyze the motion information. Experimental results based on simulated and real data validate the effectiveness and practicability of the proposed method.
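A minimal sketch of the SURF matching and moment-based centroid stages, assuming an OpenCV build that ships the nonfree xfeatures2d module; the paper's bilateral feature registering model is approximated here by a symmetric cross-check match.

```python
import cv2

def match_isar_frames(img1, img2, hessian=400):
    """SURF keypoints + mutual nearest-neighbour matching between two
    ISAR frames (grayscale uint8 images)."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian)
    k1, d1 = surf.detectAndCompute(img1, None)
    k2, d2 = surf.detectAndCompute(img2, None)
    # crossCheck keeps only mutually nearest pairs, mimicking the
    # bilateral (two-direction) registering idea.
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    return k1, k2, sorted(matcher.match(d1, d2),
                          key=lambda m: m.distance)

def centroid(img):
    """Target centroid from regular (geometric) image moments."""
    m = cv2.moments(img)
    return m["m10"] / m["m00"], m["m01"] / m["m00"]
```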
NASA Astrophysics Data System (ADS)
Anderson, R. B.; Finch, N.; Clegg, S.; Graff, T.; Morris, R. V.; Laura, J.
2017-06-01
We present a Python-based library and graphical interface for the analysis of point spectra. The tool is being developed with a focus on methods used for ChemCam data, but is flexible enough to handle spectra from other instruments.
Site Selection and Resource Allocation of Oil Spill Emergency Base for Offshore Oil Facilities
NASA Astrophysics Data System (ADS)
Li, Yunbin; Liu, Jingxian; Wei, Lei; Wu, Weihuang
2018-02-01
Based on an analysis of historical data on oil spill accidents in the Bohai Sea, this paper discretizes the oil spill sources into a limited number of spill points. According to the probability of oil spill risk, the demand for salvage forces at each oil spill point is evaluated. For the specific locations of the rescue bases around the Bohai Sea, a cost-benefit analysis is conducted to determine the total disaster cost for each rescue base. Based on the relationship between the oil spill points and the rescue sites, a multi-objective optimization location model for oil spill rescue bases in the Bohai Sea region is established. A genetic algorithm is used to solve the optimization problem and to determine the optimal configuration of emergency rescue bases and the allocation ratio of emergency resources.
Better Assessment Science Integrating Point and Non-point Sources (BASINS)
Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) is a multipurpose environmental analysis system designed to help regional, state, and local agencies perform watershed- and water quality-based studies.
Business Planning in the Light of Neuro-fuzzy and Predictive Forecasting
NASA Astrophysics Data System (ADS)
Chakrabarti, Prasun; Basu, Jayanta Kumar; Kim, Tai-Hoon
In this paper we point out gain sensing based on forecasting techniques. We present an idea of neural-network-based gain forecasting. The sequence of gain patterns is also verified using statistical analysis of fuzzy value assignment. The paper further suggests realizing a stable gain condition using K-means clustering from data mining. A new concept of 3D-based gain sensing is pointed out, and the paper also discusses what type of trend analysis can be observed for probabilistic gain prediction.
Pointing error analysis of Risley-prism-based beam steering system.
Zhou, Yuan; Lu, Yafei; Hei, Mo; Liu, Guangcan; Fan, Dapeng
2014-09-01
Based on the vector form of Snell's law, ray tracing is performed to quantify the pointing errors of Risley-prism-based beam steering systems induced by component errors, prism orientation errors, and assembly errors. Case examples are given to elucidate the pointing error distributions in the field of regard and to evaluate the allowances of the error sources for a given pointing accuracy. It is found that the assembly errors of the second prism result in larger pointing errors than those of the first prism. The pointing errors induced by prism tilt depend on the tilt direction. The allowances of bearing tilt and prism tilt are almost identical if the same pointing accuracy is required. All conclusions can provide a theoretical foundation for practical work.
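The vector form of Snell's law that underlies such ray tracing is standard; a minimal sketch of the single-surface refraction step (the paper's prism geometry and error bookkeeping are not reproduced):

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit ray d at a surface with unit normal n (oriented
    against d), passing from index n1 into n2; returns None on total
    internal reflection."""
    r = n1 / n2
    c = -np.dot(n, d)
    k = 1.0 - r * r * (1.0 - c * c)
    if k < 0:
        return None
    return r * d + (r * c - np.sqrt(k)) * n
```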
Kim, Hak-Jin; Kim, Bong Chul; Kim, Jin-Geun; Zhengguo, Piao; Kang, Sang Hoon; Lee, Sang-Hwy
2014-03-01
The objective of this study was to determine the reliable midsagittal (MS) reference plane in practical ways for the three-dimensional craniofacial analysis on three-dimensional computed tomography images. Five normal human dry skulls and 20 normal subjects without any dysmorphoses or asymmetries were used. The accuracies and stability on repeated plane construction for almost every possible candidate MS plane based on the skull base structures were examined by comparing the discrepancies in distances and orientations from the reference points and planes of the skull base and facial bones on three-dimensional computed tomography images. The following reference points of these planes were stable, and their distribution was balanced: nasion and foramen cecum at the anterior part of the skull base, sella at the middle part, and basion and opisthion at the posterior part. The candidate reference planes constructed using the aforementioned reference points were thought to be reliable for use as an MS reference plane for the three-dimensional analysis of maxillofacial dysmorphosis.
NASA Astrophysics Data System (ADS)
Fu, Rongxin; Li, Qi; Zhang, Junqi; Wang, Ruliang; Lin, Xue; Xue, Ning; Su, Ya; Jiang, Kai; Huang, Guoliang
2016-10-01
Label-free point mutation detection is particularly important in biomedical research and clinical diagnosis, since gene mutations occur naturally and can cause highly fatal diseases. In this paper, a label-free and highly sensitive approach is proposed for point mutation detection based on hyperspectral interferometry. A hybridization strategy is designed to discriminate a single-base substitution with sequence-specific DNA ligase. Double-strand structures form only if the added oligonucleotides are perfectly paired to the probe sequence. The proposed approach makes full use of the inherent conformation of double-strand DNA molecules on the substrate, and a spectrum analysis method is established to resolve the sub-nanoscale thickness variation, which enables highly sensitive mutation detection. The limit of detection reaches 4 pg/mm² according to the experimental results. Detection of a lung cancer gene point mutation was demonstrated, proving the high selectivity and multiplex analysis capability of the proposed biosensor.
Object recognition and localization from 3D point clouds by maximum-likelihood estimation
NASA Astrophysics Data System (ADS)
Dantanarayana, Harshana G.; Huntley, Jonathan M.
2017-08-01
We present an algorithm based on maximum-likelihood analysis for the automated recognition of objects, and estimation of their pose, from 3D point clouds. Surfaces segmented from depth images are used as the features, unlike 'interest point'-based algorithms which normally discard such data. Compared to the 6D Hough transform, it has negligible memory requirements, and is computationally efficient compared to iterative closest point algorithms. The same method is applicable to both the initial recognition/pose estimation problem as well as subsequent pose refinement through appropriate choice of the dispersion of the probability density functions. This single unified approach therefore avoids the usual requirement for different algorithms for these two tasks. In addition to the theoretical description, a simple 2 degrees of freedom (d.f.) example is given, followed by a full 6 d.f. analysis of 3D point cloud data from a cluttered scene acquired by a projected fringe-based scanner, which demonstrated an RMS alignment error as low as 0.3 mm.
Dewettinck, T; Van Houtte, E; Geenens, D; Van Hege, K; Verstraete, W
2001-01-01
To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was used to guarantee hygienically safe drinking water production. Taking into account the literature data on the removal efficiencies of the proposed advanced treatment steps with regard to enteric viruses and protozoa and after setting high quality limits based on the recent progress in quantitative risk assessment, the critical control points (CCPs) and points of attention (POAs) were identified. Based on the HACCP analysis a specific monitoring strategy was developed which focused on the control of these CCPs and POAs.
Temporally-Constrained Group Sparse Learning for Longitudinal Data Analysis in Alzheimer’s Disease
Jie, Biao; Liu, Mingxia; Liu, Jun
2016-01-01
Sparse learning has been widely investigated for analysis of brain images to assist the diagnosis of Alzheimer’s disease (AD) and its prodromal stage, i.e., mild cognitive impairment (MCI). However, most existing sparse learning-based studies only adopt cross-sectional analysis methods, where the sparse model is learned using data from a single time-point. Actually, multiple time-points of data are often available in brain imaging applications, which can be used in some longitudinal analysis methods to better uncover the disease progression patterns. Accordingly, in this paper we propose a novel temporally-constrained group sparse learning method aiming for longitudinal analysis with multiple time-points of data. Specifically, we learn a sparse linear regression model by using the imaging data from multiple time-points, where a group regularization term is first employed to group the weights for the same brain region across different time-points together. Furthermore, to reflect the smooth changes between data derived from adjacent time-points, we incorporate two smoothness regularization terms into the objective function, i.e., one fused smoothness term which requires that the differences between two successive weight vectors from adjacent time-points should be small, and another output smoothness term which requires the differences between outputs of two successive models from adjacent time-points should also be small. We develop an efficient optimization algorithm to solve the proposed objective function. Experimental results on ADNI database demonstrate that, compared with conventional sparse learning-based methods, our proposed method can achieve improved regression performance and also help in discovering disease-related biomarkers. PMID:27093313
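One way to write the objective described in this abstract, hedged as a plausible form rather than the authors' exact notation (w_t are the time-point weight vectors, W_r collects the weights of region r across time points):

```latex
\min_{W}\ \sum_{t=1}^{T} \big\lVert y_t - X_t w_t \big\rVert_2^2
\;+\; \lambda_1 \sum_{r=1}^{R} \lVert W_r \rVert_2
\;+\; \lambda_2 \sum_{t=1}^{T-1} \lVert w_{t+1} - w_t \rVert_2^2
\;+\; \lambda_3 \sum_{t=1}^{T-1} \lVert X_{t+1} w_{t+1} - X_t w_t \rVert_2^2
```

The λ₁ term is the group (region-across-time) regularizer, the λ₂ term the fused smoothness penalty on successive weight vectors, and the λ₃ term the output smoothness penalty on successive model outputs.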
NASA Astrophysics Data System (ADS)
Lague, D.
2014-12-01
High Resolution Topographic (HRT) datasets are predominantly stored and analyzed as 2D raster grids of elevations (i.e., Digital Elevation Models). Raster grid processing is common in GIS software and benefits from a large library of fast algorithms dedicated to geometrical analysis, drainage network computation and topographic change measurement. Yet, all instruments or methods currently generating HRT datasets (e.g., ALS, TLS, SFM, stereo satellite imagery) output natively 3D unstructured point clouds that are (i) non-regularly sampled, (ii) incomplete (e.g., submerged parts of river channels are rarely measured), and (iii) include 3D elements (e.g., vegetation, vertical features such as river banks or cliffs) that cannot be accurately described in a DEM. Interpolating the raw point cloud onto a 2D grid generally results in a loss of position accuracy, spatial resolution and in more or less controlled interpolation. Here I demonstrate how studying earth surface topography and processes directly on native 3D point cloud datasets offers several advantages over raster based methods: point cloud methods preserve the accuracy of the original data, can better handle the evaluation of uncertainty associated to topographic change measurements and are more suitable to study vegetation characteristics and steep features of the landscape. In this presentation, I will illustrate and compare Point Cloud based and Raster based workflows with various examples involving ALS, TLS and SFM for the analysis of bank erosion processes in bedrock and alluvial rivers, rockfall statistics (including rockfall volume estimate directly from point clouds) and the interaction of vegetation/hydraulics and sedimentation in salt marshes. These workflows use 2 recently published algorithms for point cloud classification (CANUPO) and point cloud comparison (M3C2) now implemented in the open source software CloudCompare.
Fourier Spectroscopy: A Simple Analysis Technique
ERIC Educational Resources Information Center
Oelfke, William C.
1975-01-01
Presents a simple method of analysis in which the student can integrate, point by point, any interferogram to obtain its Fourier transform. The manual technique requires no special equipment and is based on relationships that most undergraduate physics students can derive from the Fourier integral equations. (Author/MLH)
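The point-by-point integration described here amounts to a numerical cosine transform of the interferogram; a short sketch using the trapezoidal rule:

```python
import numpy as np

def spectrum_from_interferogram(x, I, wavenumbers):
    """B(k) = integral of I(x) cos(2*pi*k*x) dx, evaluated point by
    point as in the manual technique: x = path-difference samples,
    I = interferogram values, wavenumbers = k grid for the spectrum."""
    return np.array([np.trapz(I * np.cos(2 * np.pi * k * x), x)
                     for k in wavenumbers])
```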
Mestdagh, Inge; Bonicelli, Bernard; Laplana, Ramon; Roettele, Manfred
2009-01-01
Based on the results and lessons learned from the TOPPS project (Training the Operators to prevent Pollution from Point Sources), a proposal on a sustainable strategy to avoid point source pollution from Plant Protection Products (PPPs) was made. Within this TOPPS project (2005-2008), stakeholders were interviewed and research and analysis were done in 6 pilot catchment areas (BE, FR, DE, DK, IT, PL). Next, a repeated survey of operators' perceptions and opinions was carried out to measure changes resulting from TOPPS activities, and good and bad practices were defined based on the Best Management Practices (risk analysis). The aim of the proposal is to suggest a strategy, considering the differences between countries, that can be implemented at Member State level in order to avoid PPP pollution of water through point sources. The methodology used for the up-scaling proposal consists of an analysis of the current situation, a gap analysis, a consistency analysis and organisational structures for implementation. The up-scaling proposal focuses on the behaviour of the operators and on the equipment and infrastructure available to the operators. The proposal defines implementation structures to support correct behaviour through the development and updating of Best Management Practices (BMPs) and through the transfer and implementation of these BMPs. The proposal also defines requirements for the improvement of equipment and infrastructure based on the defined key factors related to point source pollution. It also contains cost estimates for technical and infrastructure upgrades to comply with BMPs.
Stiletto, R; Röthke, M; Schäfer, E; Lefering, R; Waydhas, Ch
2006-10-01
Patient safety has become one of the major aspects of clinical management in recent years. The crucial point of research has been focused on malpractice. In contrast to economic processes in non-medical fields, the analysis of errors during in-patient treatment has been neglected. Patient risk management can be defined as a structured procedure in a clinical unit with the aim of reducing harmful events. A risk point model was created based on a Delphi process and founded on the DIVI data register. The risk point model was evaluated in clinically working ICU departments participating in the register database. The results of the risk point evaluation will be integrated in the next database update. This might be a step towards improving the reliability of the register for measuring quality assessment in the ICU.
Wiggins, Paul A
2015-07-21
This article describes the application of a change-point algorithm to the analysis of stochastic signals in biological systems whose underlying state dynamics consist of transitions between discrete states. Applications of this analysis include molecular-motor stepping, fluorophore bleaching, electrophysiology, particle and cell tracking, detection of copy number variation by sequencing, tethered-particle motion, etc. We present a unified approach to the analysis of processes whose noise can be modeled by Gaussian, Wiener, or Ornstein-Uhlenbeck processes. To fit the model, we exploit explicit, closed-form algebraic expressions for maximum-likelihood estimators of model parameters and estimated information loss of the generalized noise model, which can be computed extremely efficiently. We implement change-point detection using the frequentist information criterion (which, to our knowledge, is a new information criterion). The frequentist information criterion specifies a single, information-based statistical test that is free from ad hoc parameters and requires no prior probability distribution. We demonstrate this information-based approach in the analysis of simulated and experimental tethered-particle-motion data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
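For flavour, a single-change-point version of the Gaussian-mean case: scan every split, score by residual sums of squares, and accept the split only if the likelihood gain beats a penalty. The BIC-like penalty below is an ad hoc stand-in purely for illustration; the paper's frequentist information criterion is precisely what replaces such ad hoc choices.

```python
import numpy as np

def best_single_changepoint(y):
    """Maximum-likelihood single change point in the mean of a
    Gaussian signal y; returns the split index or None."""
    n = len(y)
    best_rss, best_k = np.inf, None
    for k in range(2, n - 1):
        rss = (((y[:k] - y[:k].mean()) ** 2).sum()
               + ((y[k:] - y[k:].mean()) ** 2).sum())
        if rss < best_rss:
            best_rss, best_k = rss, k
    rss0 = ((y - y.mean()) ** 2).sum()
    # 2 * log-likelihood gain vs. a BIC-like cost for 2 extra params.
    return best_k if n * np.log(rss0 / best_rss) > 2 * np.log(n) else None
```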
Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis
NASA Technical Reports Server (NTRS)
Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.
2017-01-01
This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
Extrapolation of Functions of Many Variables by Means of Metric Analysis
NASA Astrophysics Data System (ADS)
Kryanev, Alexandr; Ivanov, Victor; Romanova, Anastasiya; Sevastianov, Leonid; Udumyan, David
2018-02-01
The paper considers the problem of extrapolating functions of several variables. It is assumed that the values of a function of m variables are given at a finite number of points in some domain D of the m-dimensional space. It is required to restore the value of the function at points outside the domain D. The paper proposes a fundamentally new method for extrapolating functions of several variables, which uses the interpolation scheme of metric analysis. To solve the extrapolation problem, a two-stage scheme based on metric analysis methods is proposed. In the first stage, using metric analysis, the function is interpolated at points of the domain D belonging to the straight-line segment connecting the center of the domain D with the point M at which the value of the function is to be restored. In the second stage, based on an autoregression model and metric analysis, the function values are predicted along the above straight-line segment beyond the domain D up to the point M. A numerical example demonstrates the efficiency of the method under consideration.
NASA Astrophysics Data System (ADS)
Zhang, Hao; Yuan, Yan; Su, Lijuan; Huang, Fengzhen; Bai, Qing
2016-09-01
The Risley-prism-based light beam steering apparatus delivers superior pointing accuracy and is used in imaging LIDAR and imaging microscopes. A general model for the pointing error analysis of Risley prisms, based on ray direction deviation in light refraction, is proposed in this paper. The model captures incident beam deviation, assembly deflections, and prism rotational error. We first derive the transmission matrices of the model. Then, the independent and cumulative effects of the different errors are analyzed through the model. An accuracy study of the model shows that the prediction deviation of the pointing error for each error source is less than 4.1×10⁻⁵° when the error amplitude is 0.1°. Detailed analyses indicate that the error sources affect the pointing accuracy to varying degrees, the major source being incident beam deviation. Prism tilt has a relatively large effect on the pointing accuracy when the prism tilts in the principal section. The cumulative-effect analyses of multiple errors show that the pointing error can be reduced by tuning the bearing tilt in the same direction. The cumulative effect of rotational error is relatively large when the difference between the two prism rotation angles equals 0 or π, and relatively small when the difference equals π/2. These results can help to uncover the error distribution and aid in the measurement calibration of Risley-prism systems.
Coarse-to-fine markerless gait analysis based on PCA and Gauss-Laguerre decomposition
NASA Astrophysics Data System (ADS)
Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Carli, Marco; Neri, Alessandro; D'Alessio, Tommaso
2005-04-01
Human movement analysis is generally performed with marker-based systems, which allow the trajectories of markers placed on specific points of the human body to be reconstructed with high accuracy. Marker-based systems, however, show some drawbacks that can be overcome by video systems applying markerless techniques. In this paper, a computer vision technique specifically designed for the detection and tracking of relevant body points is presented. It is based on the Gauss-Laguerre decomposition, and a principal component analysis (PCA) technique is used to circumscribe the region of interest. Results obtained on both synthetic and experimental tests show a significant reduction of the computational costs, with no significant reduction of the tracking accuracy.
Registration algorithm of point clouds based on multiscale normal features
NASA Astrophysics Data System (ADS)
Lu, Jun; Peng, Zhongtao; Su, Hang; Xia, GuiHua
2015-01-01
The point cloud registration technology for obtaining a three-dimensional digital model is widely applied in many areas. To improve the accuracy and speed of point cloud registration, a registration method based on multiscale normal vectors is proposed. The proposed method mainly includes three parts: the selection of key points, the calculation of feature descriptors, and the determination and optimization of correspondences. First, key points are selected from the point cloud based on the changes in magnitude of the multiscale curvatures obtained using principal component analysis. Then a feature descriptor for each key point is proposed, which consists of 21 elements based on multiscale normal vectors and curvatures. The correspondences between a pair of point clouds are determined according to the descriptor similarity of key points in the source and target point clouds. Correspondences are optimized by using a random sampling consistency algorithm and clustering technology. Finally, singular value decomposition is applied to the optimized correspondences so that the rigid transformation matrix between the two point clouds is obtained. Experimental results show that the proposed point cloud registration algorithm has a faster calculation speed, higher registration accuracy, and better antinoise performance.
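The key-point selection step can be sketched as follows: the "surface variation" λ₀/(λ₀+λ₁+λ₂) from a local PCA is computed at several neighbourhood radii, and points whose variation changes strongly across scales are kept. The radii and threshold are illustrative assumptions, not the paper's values.

```python
import numpy as np
from scipy.spatial import cKDTree

def surface_variation(pts, radius):
    """Per-point curvature proxy from PCA of the local neighbourhood."""
    tree = cKDTree(pts)
    out = np.zeros(len(pts))
    for i, p in enumerate(pts):
        nb = pts[tree.query_ball_point(p, radius)]
        if len(nb) >= 4:
            lam = np.linalg.eigvalsh(np.cov((nb - nb.mean(0)).T))
            out[i] = lam[0] / lam.sum()   # smallest eigenvalue ratio
    return out

def key_points(pts, radii=(0.05, 0.1, 0.2), thresh=0.02):
    """Indices of points whose curvature changes strongly across scales."""
    v = np.stack([surface_variation(pts, r) for r in radii])
    return np.where(np.ptp(v, axis=0) > thresh)[0]
```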
Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M
2009-06-01
This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international experts meetings and the recommendations of scientific societies. This guide also uses the Hazard Analysis and Critical Control Point principles proposed by Codex Alimentarius and emphasises effective verifying measures, microbiological controls of the process and the corrective actions when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.
NASA Astrophysics Data System (ADS)
Bonduel, M.; Bassier, M.; Vergauwen, M.; Pauwels, P.; Klein, R.
2017-11-01
The use of Building Information Modeling (BIM) for existing buildings based on point clouds is increasing. Standardized geometric quality assessment of the BIMs is needed to make them more reliable and thus reusable for future users. First, available literature on the subject is studied. Next, an initial proposal for a standardized geometric quality assessment is presented. Finally, this method is tested and evaluated with a case study. The number of specifications on BIM relating to existing buildings is limited. The Levels of Accuracy (LOA) specification of the USIBD provides definitions and suggestions regarding geometric model accuracy, but lacks a standardized assessment method. A deviation analysis is found to be dependent on (1) the used mathematical model, (2) the density of the point clouds and (3) the order of comparison. Results of the analysis can be graphical and numerical. An analysis on macro (building) and micro (BIM object) scale is necessary. On macro scale, the complete model is compared to the original point cloud and vice versa to get an overview of the general model quality. The graphical results show occluded zones and non-modeled objects respectively. Colored point clouds are derived from this analysis and integrated in the BIM. On micro scale, the relevant surface parts are extracted per BIM object and compared to the complete point cloud. Occluded zones are extracted based on a maximum deviation. What remains is classified according to the LOA specification. The numerical results are integrated in the BIM with the use of object parameters.
Datum Feature Extraction and Deformation Analysis Method Based on Normal Vector of Point Cloud
NASA Astrophysics Data System (ADS)
Sun, W.; Wang, J.; Jin, F.; Liang, Z.; Yang, Y.
2018-04-01
To address the lack of applicable analysis methods when applying three-dimensional laser scanning technology to deformation monitoring, an efficient method for extracting datum features and analysing deformation based on the normal vectors of a point cloud is proposed. First, a kd-tree is used to establish the topological relations. Datum points are detected by tracking the point cloud normal vectors determined from the normal vectors of local planes. Then, cubic B-spline curve fitting is performed on the datum points. Finally, the datum elevation and the inclination angle of each radial point are calculated according to the fitted curve, and the deformation information is analyzed. The proposed approach was verified on a real large-scale tank data set captured with a terrestrial laser scanner in a chemical plant. The results show that the method can obtain complete information about the monitored object quickly and comprehensively, and accurately reflects the deformation of the datum features.
Tipping point analysis of atmospheric oxygen concentration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livina, V. N.; Forbes, A. B.; Vaz Martins, T. M.
2015-03-15
We apply tipping point analysis to nine observational oxygen concentration records around the globe, analyse their dynamics and perform projections under possible future scenarios, leading to oxygen deficiency in the atmosphere. The analysis is based on statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the observed data using Bayesian and wavelet techniques.
NASA Astrophysics Data System (ADS)
Nemoto, Mitsutaka; Nomura, Yukihiro; Hanaoka, Shohei; Masutani, Yoshitaka; Yoshikawa, Takeharu; Hayashi, Naoto; Yoshioka, Naoki; Ohtomo, Kuni
Anatomical point landmarks, as the most primitive form of anatomical knowledge, are useful for medical image understanding. In this study, we propose a detection method for anatomical point landmarks based on appearance models, which include the gray-level statistical variations at the landmarks and their surrounding areas. The models are built from the results of a principal component analysis (PCA) of sample data sets. In addition, we employ a generative learning method that transforms the ROIs of the sample data. We evaluated our method on 24 data sets of body trunk CT images and obtained an average sensitivity of 95.8 ± 7.3% across 28 landmarks.
Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method
Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu
2016-01-01
A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system having the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, that resulted in damages in the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud to cloud change analysis demonstrating the potential of the new method for structural analysis. PMID:28029121
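The core of the baseline method is registration-free because baseline lengths are invariant under rigid motion; a minimal sketch, assuming arrays of corresponding feature points per epoch and an illustrative tolerance:

```python
import numpy as np
from itertools import combinations

def baseline_changes(pts_a, pts_b, tol=0.005):
    """Compare lengths of all baselines (point pairs) between two
    epochs. pts_a, pts_b: (N, 3) corresponding feature points; tol in
    the same units (here metres) flags a changed baseline."""
    changed = []
    for i, j in combinations(range(len(pts_a)), 2):
        la = np.linalg.norm(pts_a[i] - pts_a[j])
        lb = np.linalg.norm(pts_b[i] - pts_b[j])
        if abs(lb - la) > tol:
            changed.append((i, j, lb - la))
    return changed
```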
NASA Astrophysics Data System (ADS)
Delgado, Carlos; Cátedra, Manuel Felipe
2018-05-01
This work presents a technique that significantly relaxes the computational requirements of full-wave electromagnetic simulations based on the Method of Moments. A ray-tracing analysis of the geometry is performed in order to extract the critical points with significant contributions. These points are then used to generate a reduced mesh, considering the regions of the geometry that surround each critical point and taking into account the electrical path followed from the source. The electromagnetic analysis of the reduced mesh produces very accurate results while requiring a fraction of the resources that the conventional analysis would use.
NASA Astrophysics Data System (ADS)
Rueda, Sylvia; Udupa, Jayaram K.
2011-03-01
Landmark based statistical object modeling techniques, such as Active Shape Model (ASM), have proven useful in medical image analysis. Identification of the same homologous set of points in a training set of object shapes is the most crucial step in ASM, which has encountered challenges such as (C1) defining and characterizing landmarks; (C2) ensuring homology; (C3) generalizing to n > 2 dimensions; (C4) achieving practical computations. In this paper, we propose a novel global-to-local strategy that attempts to address C3 and C4 directly and works in R^n. The 2D version starts from two initial corresponding points determined in all training shapes via a method α, and subsequently by subdividing the shapes into connected boundary segments by a line determined by these points. A shape analysis method β is applied on each segment to determine a landmark on the segment. This point introduces more pairs of points, the lines defined by which are used to further subdivide the boundary segments. This recursive boundary subdivision (RBS) process continues simultaneously on all training shapes, maintaining synchrony of the level of recursion, and thereby keeping correspondence among generated points automatically by the correspondence of the homologous shape segments in all training shapes. The process terminates when no subdividing lines are left to be considered that indicate (as per method β) that a point can be selected on the associated segment. Examples of α and β are presented based on (a) distance; (b) Principal Component Analysis (PCA); and (c) the novel concept of virtual landmarks.
Zero-moment point determination of worst-case manoeuvres leading to vehicle wheel lift
NASA Astrophysics Data System (ADS)
Lapapong, S.; Brown, A. A.; Swanson, K. S.; Brennan, S. N.
2012-01-01
This paper proposes a method to evaluate vehicle rollover propensity based on a frequency-domain representation of the zero-moment point (ZMP). Unlike other rollover metrics such as the static stability factor, which is based on the steady-state behaviour, and the load transfer ratio, which requires the calculation of tyre forces, the ZMP is based on a simplified kinematic model of the vehicle and the analysis of the contact point of the vehicle relative to the edge of the support polygon. Previous work has validated the use of the ZMP experimentally in its ability to predict wheel lift in the time domain. This work explores the use of the ZMP in the frequency domain to allow a chassis designer to understand how operating conditions and vehicle parameters affect rollover propensity. The ZMP analysis is then extended to calculate worst-case sinusoidal manoeuvres that lead to untripped wheel lift, and the analysis is tested across several vehicle configurations and compared with that of the standard Toyota J manoeuvre.
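For reference, a commonly used planar approximation of the lateral ZMP for a rigid vehicle body (a textbook form, not necessarily the authors' exact kinematic model):

```latex
y_{\mathrm{zmp}} \;=\; y_{c} \;-\; \frac{z_{c}\,\ddot{y}_{c}}{g + \ddot{z}_{c}},
\qquad \text{wheel lift indicated when } \lvert y_{\mathrm{zmp}} \rvert > \tfrac{t_w}{2}
```

where (y_c, z_c) locates the centre of gravity, the double dots denote its lateral and vertical accelerations, and t_w is the track width defining the support polygon.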
Sampled control stability of the ESA instrument pointing system
NASA Astrophysics Data System (ADS)
Thieme, G.; Rogers, P.; Sciacovelli, D.
Stability analysis and simulation results are presented for the ESA Instrument Pointing System (IPS) that is to be used in Spacelab's second launch. Two IPS plant dynamic models were used in the ESA and NASA activities: one is based on six interconnected rigid bodies that represent the IPS and its payload, while the other follows the NASA practice of defining an IPS-Spacelab 2 plant configuration through a structural finite element model, which is then used to generate modal data for various pointing directions. In both cases, the IPS dynamic plant model is truncated, then discretized at the sampling frequency, and interfaced to a PID-based control law. A stability analysis has been carried out in the discrete domain for various instrument pointing directions, taking into account suitable parameter variation ranges. A number of time simulations are presented.
Point clouds segmentation as base for as-built BIM creation
NASA Astrophysics Data System (ADS)
Macher, H.; Landes, T.; Grussenmeyer, P.
2015-08-01
In this paper, a three-step segmentation approach is proposed in order to create 3D models from point clouds acquired by TLS inside buildings. The three scales of segmentation are floors, rooms and the planes composing the rooms. First, floor segmentation is performed based on an analysis of the point distribution along the Z axis. Then, for each floor, room segmentation is achieved considering a slice of the point cloud at ceiling level. Finally, planes are segmented for each room, and the planes corresponding to ceilings and floors are identified. The results of each step are analysed and potential improvements are proposed. Based on the segmented point clouds, the creation of as-built BIM is considered in a future-work section: not only is a classification of planes into several categories proposed, but the potential use of point clouds acquired outside buildings is also considered.
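The floor-segmentation step lends itself to a short sketch: horizontal structures show up as peaks in the histogram of point heights. The bin size and peak threshold below are assumptions for illustration, not the paper's values.

```python
import numpy as np

def floor_levels(z, bin_size=0.1, min_frac=0.02):
    """Candidate floor/ceiling heights: local maxima of the Z histogram
    that contain at least min_frac of all points. z: 1-D array of
    point heights in metres."""
    bins = np.arange(z.min(), z.max() + bin_size, bin_size)
    hist, edges = np.histogram(z, bins=bins)
    strong = hist > min_frac * len(z)
    idx = [i for i in range(1, len(hist) - 1)
           if strong[i] and hist[i] >= hist[i - 1] and hist[i] >= hist[i + 1]]
    return edges[idx] + bin_size / 2
```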
[Analysis and research on cleaning points of HVAC systems in public places].
Yang, Jiaolan; Han, Xu; Chen, Dongqing; Jin, Xin; Dai, Zizhu
2010-03-01
To analyze the cleaning points of HVAC systems and to provide a scientific basis for regulating the cleaning of HVAC systems. Based on the results of surveys of HVAC system cleaning around China over the past three years, we analyze the cleaning points of HVAC systems from various aspects: the major health risk factors of HVAC systems; the strategy for formulating HVAC system cleaning; the cleaning methods and acceptance points for the air ducts and parts of HVAC systems; on-site and individual protection; waste treatment and the cleaning of removed equipment; inspection of the cleaning results; video records; and the final acceptance of the cleaning. An analysis of the major health risk factors of HVAC systems and of the strategy for formulating their cleaning is given. Specific methods for cleaning the air ducts, machine units, air ports, coil pipes and water cooling towers of HVAC systems, the acceptance points of HVAC systems, and the requirements for the final cleaning acceptance report are proposed. By analyzing the cleaning points of HVAC systems and proposing corresponding measures, this study provides a basis for the scientific and regular conduct of HVAC system cleaning, a novel technical service, and lays a foundation for revising the existing cleaning regulations, which may generate technical and social benefits.
Mark Spencer; Kevin O' Hara
2006-01-01
Phytophthora ramorum is a major source of tanoak (Lithocarpus densiflorus) mortality in the tanoak/redwood (Sequoia sempervirens) forests of central California. This study presents a spatial analysis of the spread of the disease using second-order point pattern and GIS analyses. Our data set includes four plots...
A systems engineering analysis of three-point and four-point wind turbine drivetrain configurations
Guo, Yi; Parsons, Tyler; Dykes, Katherine; ...
2016-08-24
This study compares the impact of drivetrain configuration on the mass and capital cost of a series of wind turbines ranging from 1.5 MW to 5.0 MW power ratings for both land-based and offshore applications. The analysis is performed with a new physics-based drivetrain analysis and sizing tool, Drive Systems Engineering (DriveSE), which is part of the Wind-Plant Integrated System Design & Engineering Model. DriveSE uses physics-based relationships to size all major drivetrain components according to given rotor loads simulated based on International Electrotechnical Commission design load cases. The model's sensitivity to input loads that contain a high degree of variability was analyzed. Aeroelastic simulations are used to calculate the rotor forces and moments imposed on the drivetrain for each turbine design. DriveSE is then used to size all of the major drivetrain components for each turbine for both three-point and four-point configurations. The simulation results quantify the trade-offs in mass and component costs for the different configurations. On average, a 16.7% decrease in total nacelle mass can be achieved when using a three-point drivetrain configuration, resulting in a 3.5% reduction in turbine capital cost. This analysis is driven by extreme loads and does not consider fatigue. Thus, the effects of configuration choices on reliability and serviceability are not captured. Furthermore, a first-order estimate of the sizing, dimensioning and costing of major drivetrain components is made, which can be used in larger system studies that consider trade-offs between subsystems such as the rotor, drivetrain and tower.
NASA Astrophysics Data System (ADS)
Zhang, Zhen; Chen, Siqing; Zheng, Huadong; Sun, Tao; Yu, Yingjie; Gao, Hongyue; Asundi, Anand K.
2017-06-01
Computer holography has made notable progress in recent years. The point-based method and the slice-based method are the chief algorithms for generating holograms in holographic display. Although both methods have been validated numerically and optically, the differences in imaging quality between them have not been specifically analyzed. In this paper, we analyze the imaging quality of computer-generated phase holograms generated by point-based Fresnel zone plates (PB-FZP), the point-based Fresnel diffraction algorithm (PB-FDA) and the slice-based Fresnel diffraction algorithm (SB-FDA). The calculation formulas and hologram generation with the three methods are demonstrated. In order to suppress speckle noise, sequential phase-only holograms are generated in our work. Numerically and experimentally reconstructed images are also presented. By comparing imaging quality, the merits and drawbacks of the three methods are analyzed, and conclusions are drawn.
NASA Technical Reports Server (NTRS)
Hung, J. C.
1980-01-01
The pointing control of a microwave antenna of the Satellite Power System was investigated, emphasizing: (1) the SPS antenna pointing error sensing method; (2) a rigid-body pointing control design; and (3) approaches for modeling the flexible-body characteristics of the solar collector. Accuracy requirements for the antenna pointing control consist of a mechanical pointing control accuracy of three arc-minutes and an electronic phased-array pointing accuracy of three arc-seconds. Results, based on the factors considered in the current analysis, show that the three arc-minute overall pointing control accuracy can be achieved in practice.
van Haaften, Rachel I M; Luceri, Cristina; van Erk, Arie; Evelo, Chris T A
2009-06-01
Omics technology used for large-scale measurements of gene expression is rapidly evolving. This work pointed out the need for extensive bioinformatics analysis for array quality assessment before and after gene expression clustering and pathway analysis. A study focused on the effect of red wine polyphenols on rat colon mucosa was used to test the impact of quality control and normalisation steps on the biological conclusions. The integration of data visualization, pathway analysis and clustering revealed an artifact problem that was solved with an adapted normalisation. We propose a possible point-to-point standard analysis procedure, based on a combination of clustering and data visualization, for the analysis of microarray data.
Paladino, Ombretta; Moranda, Arianna; Seyedsalehi, Mahdi
2017-01-01
A procedure for assessing harbour pollution by heavy metals and PAH and the possible sources of contamination is proposed. The procedure is based on a ratio-matching method applied to the results of principal component analysis (PCA), and it allows discrimination between point and nonpoint sources. The approach can be adopted when many pollution sources, both inside the harbour and outside but close to it, may contribute to a very narrow coastal ecosystem, and it was used to identify the possible point sources of contamination in a Mediterranean harbour (Port of Vado, Savona, Italy). 235 sediment samples were collected at 81 sampling points during four monitoring campaigns, and 28 chemicals were searched for within the collected samples. PCA of the total samples allowed the assessment of 8 main possible point sources, while the refining ratio-matching identified 1 sampling point as a possible PAH source, 2 sampling points as Cd point sources, and 3 sampling points as C > 12 point sources. By map analysis it was possible to assess two internal sources of pollution directly related to terminal activity. The study continues a previous work aimed at assessing Savona-Vado Harbour pollution levels and suggests strategies to regulate harbour activities.
Image Registration Algorithm Based on Parallax Constraint and Clustering Analysis
NASA Astrophysics Data System (ADS)
Wang, Zhe; Dong, Min; Mu, Xiaomin; Wang, Song
2018-01-01
To resolve the problems of slow computation and low matching accuracy in image registration, a new image registration algorithm based on a parallax constraint and clustering analysis is proposed. Firstly, the Harris corner detection algorithm is used to extract the feature points of the two images. Secondly, the Normalized Cross Correlation (NCC) function is used to perform approximate matching of the feature points, yielding the initial feature pairs. Then, according to the parallax constraint condition, the initial feature pairs are preprocessed by the K-means clustering algorithm, which removes feature point pairs with obvious errors from the approximate matching step. Finally, the Random Sample Consensus (RANSAC) algorithm is adopted to optimize the feature points and obtain the final matching result, realizing fast and accurate image registration. The experimental results show that the proposed image registration algorithm can improve matching accuracy while ensuring the real-time performance of the algorithm.
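A minimal Python sketch of this pipeline, assuming OpenCV and scikit-learn; the patch size, NCC acceptance threshold, cluster count, and the choice of the dominant displacement cluster as the parallax constraint are illustrative assumptions, not the authors' exact settings:

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans

def ncc(a, b):
    # normalized cross-correlation of two equal-size patches
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else -1.0

def register(img1, img2, win=7, n_clusters=3, ncc_thresh=0.8):
    g1, g2 = [cv2.cvtColor(i, cv2.COLOR_BGR2GRAY) for i in (img1, img2)]
    # Harris-based corner extraction in both images
    p1 = cv2.goodFeaturesToTrack(g1, 500, 0.01, 10, useHarrisDetector=True).reshape(-1, 2)
    p2 = cv2.goodFeaturesToTrack(g2, 500, 0.01, 10, useHarrisDetector=True).reshape(-1, 2)
    pairs = []
    for x1, y1 in p1:
        a = g1[int(y1) - win:int(y1) + win + 1, int(x1) - win:int(x1) + win + 1]
        if a.shape != (2 * win + 1, 2 * win + 1):
            continue                        # corner too close to the border
        best, best_pt = ncc_thresh, None
        for x2, y2 in p2:
            b = g2[int(y2) - win:int(y2) + win + 1, int(x2) - win:int(x2) + win + 1]
            if b.shape != a.shape:
                continue
            score = ncc(a.astype(float), b.astype(float))
            if score > best:
                best, best_pt = score, (x2, y2)
        if best_pt is not None:
            pairs.append((x1, y1, *best_pt))
    pairs = np.array(pairs, dtype=np.float32)
    # parallax constraint: cluster the displacement vectors and keep the
    # dominant cluster, discarding obviously wrong initial matches
    disp = pairs[:, 2:4] - pairs[:, 0:2]
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(disp)
    keep = labels == np.bincount(labels).argmax()
    # RANSAC refinement of the surviving matches
    H, _ = cv2.findHomography(pairs[keep, 0:2], pairs[keep, 2:4], cv2.RANSAC, 3.0)
    return H
```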
Zamunér, Antonio R; Catai, Aparecida M; Martins, Luiz E B; Sakabe, Daniel I; Da Silva, Ester
2013-01-01
The second heart rate (HR) turn point has been extensively studied; however, few studies have determined the first HR turn point. Also, the use of mathematical and statistical models for determining changes in the dynamic characteristics of physiological variables during an incremental cardiopulmonary test has been suggested. To determine the first turn point by analysis of HR, surface electromyography (sEMG), and carbon dioxide output (VCO2) using two mathematical models and to compare the results to those of the visual method. Ten sedentary middle-aged men (53.9 ± 3.2 years old) were submitted to cardiopulmonary exercise testing on an electromagnetic cycle ergometer until exhaustion. Ventilatory variables, HR, and sEMG of the vastus lateralis were obtained in real time. Three methods were used to determine the first turn point: 1) visual analysis based on loss of parallelism between VCO2 and oxygen uptake (VO2); 2) the linear-linear model, based on fitting the curves to the set of VCO2 data (Lin-LinVCO2); 3) a bi-segmental linear regression of Hinkley's algorithm applied to HR (HMM-HR), VCO2 (HMM-VCO2), and sEMG data (HMM-RMS). There were no differences between workload, HR, and ventilatory variable values at the first ventilatory turn point as determined by the five studied parameters (p>0.05). The Bland-Altman plot showed an even distribution of the visual analysis method with Lin-LinVCO2, HMM-HR, HMM-VCO2, and HMM-RMS. The proposed mathematical models were effective in determining the first turn point since they detected the linear pattern change and the deflection point of VCO2, HR responses, and sEMG.
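The bi-segmental fit can be illustrated with an exhaustive least-squares breakpoint scan; this is a simplified stand-in for Hinkley's algorithm, and the synthetic workload-HR data are hypothetical:

```python
import numpy as np

def bisegmental_turn_point(x, y):
    """Exhaustive two-segment least-squares fit: return the breakpoint index
    minimizing the summed residual error of the two fitted lines."""
    best_k, best_sse = None, np.inf
    for k in range(2, len(x) - 2):              # each segment needs >= 3 points
        sse = 0.0
        for xs, ys in ((x[:k + 1], y[:k + 1]), (x[k:], y[k:])):
            coef = np.polyfit(xs, ys, 1)
            sse += ((np.polyval(coef, xs) - ys) ** 2).sum()
        if sse < best_sse:
            best_sse, best_k = sse, k
    return best_k

# hypothetical incremental-test data: HR response steepens above ~110 W
workload = np.arange(20, 201, 10, dtype=float)
hr = np.where(workload < 110, 80 + 0.3 * workload, 47 + 0.6 * workload)
hr += np.random.default_rng(0).normal(0.0, 1.5, hr.size)
k = bisegmental_turn_point(workload, hr)
print("estimated first turn point near", workload[k], "W")
```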
Dynamic analysis of suspension cable based on vector form intrinsic finite element method
NASA Astrophysics Data System (ADS)
Qin, Jian; Qiao, Liang; Wan, Jiancheng; Jiang, Ming; Xia, Yongjun
2017-10-01
A vector finite element method is presented for the dynamic analysis of cable structures, based on the vector form intrinsic finite element (VFIFE) and the mechanical properties of suspension cables. Firstly, the suspension cable is discretized into elements by space points, and the mass and external forces of the cable are lumped at the space points. The structural form of the cable is described by the positions of the space points at different times. The equations of motion for the space points are established according to Newton's second law. Then, the element internal forces between the space points are derived from the flexible truss structure. Finally, the motion equations of the space points are solved by the central difference method with a reasonable time-integration step. The tangential tension of the bearing rope in a test ropeway with moving concentrated loads is calculated and compared with experimental data. The results show that the tangential tension of the suspension cable with moving loads is consistent with the experimental data. The method has high computational precision and meets the requirements of engineering application.
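The central-difference time stepping described above can be sketched in a few lines; the force callback standing in for gravity plus the VFIFE element internal forces is a placeholder assumption:

```python
import numpy as np

def central_difference(x0, v0, mass, force, dt, n_steps):
    """Explicit central-difference integration of the space-point motion
    equations m * x'' = F(x, t); `force` is a placeholder callback that in a
    full VFIFE model would return gravity plus element internal forces."""
    x_prev = x0 - dt * v0                 # fictitious step at t = -dt
    x = x0.copy()
    history = [x0.copy()]
    for k in range(n_steps):
        x_next = 2.0 * x - x_prev + (dt ** 2) * force(x, k * dt) / mass
        x_prev, x = x, x_next
        history.append(x.copy())
    return np.array(history)

# e.g. a single 1 kg point under gravity only (hypothetical check case)
traj = central_difference(np.zeros(3), np.zeros(3), 1.0,
                          lambda x, t: np.array([0.0, 0.0, -9.81]), 0.01, 100)
```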
Modeling and analysis of pinhole occulter experiment
NASA Technical Reports Server (NTRS)
Ring, J. R.
1986-01-01
The objectives were to improve pointing control system implementation by converting the dynamic compensator from a continuous-domain representation to a discrete one; to determine pointing stability sensitivities to sensor and actuator errors by adding sensor and actuator error models to TREETOPS and by developing an error budget for meeting pointing stability requirements; and to determine pointing performance for alternate mounting bases (the space station, for example).
NASA Astrophysics Data System (ADS)
Sun, Z.; Xu, Y.; Hoegner, L.; Stilla, U.
2018-05-01
In this work, we propose a classification method designed for the labeling of MLS point clouds, with detrended geometric features extracted from the points of a supervoxel-based local context. To analyze complex 3D urban scenes, the acquired points of the scene should be tagged with individual labels of different classes. Thus, assigning a unique label to the points of an object that belong to the same category plays an essential role in the entire 3D scene analysis workflow. Although plenty of studies in this field have been reported, this remains a challenging task. Specifically, in this work: 1) a novel geometric feature extraction method, detrending the redundant and non-salient information in the local context, is proposed and shown to be effective for extracting local geometric features from the 3D scene; 2) instead of using individual points as the basic element, the supervoxel-based local context is designed to encapsulate the geometric characteristics of points, providing a flexible and robust solution for feature extraction; 3) experiments using a complex urban scene with manually labeled ground truth are conducted, and the performance of the proposed method with respect to different methods is analyzed. With the testing dataset, we obtained an overall accuracy of 0.92 for assigning eight semantic classes.
Meier, Benjamin Mason; Hodge, James G; Gebbie, Kristine M
2009-03-01
Given the public health importance of law modernization, we undertook a comparative analysis of policy efforts in 4 states (Alaska, South Carolina, Wisconsin, and Nebraska) that have considered public health law reform based on the Turning Point Model State Public Health Act. Through national legislative tracking and state case studies, we investigated how the Turning Point Act's model legal language has been considered for incorporation into state law and analyzed key facilitating and inhibiting factors for public health law reform. Our findings provide the practice community with a research base to facilitate further law reform and inform future scholarship on the role of law as a determinant of the public's health.
Investigating the Accuracy of Point Clouds Generated for Rock Surfaces
NASA Astrophysics Data System (ADS)
Seker, D. Z.; Incekara, A. H.
2016-12-01
Point clouds produced by different techniques are widely used to model rocks and to obtain properties of rock surfaces such as roughness, volume and area. These point clouds can be generated by laser scanning or by close-range photogrammetry. Laser scanning is the most common method: the laser scanner produces a 3D point cloud at regular intervals. In close-range photogrammetry, a point cloud can be produced from photographs taken under appropriate conditions, depending on developing hardware and software technology; much photogrammetric software, open source or not, currently supports point cloud generation. The two methods are close to each other in terms of accuracy: with a qualified digital camera and laser scanner, sufficient accuracy in the mm and cm range can be obtained. With both methods, field work is completed in less time than with conventional techniques. In close-range photogrammetry, any part of a rock surface can be completely represented owing to overlapping oblique photographs. In contrast to the proximity of the data, the two methods differ considerably in cost. In this study, whether point clouds produced from photographs can be used instead of point clouds produced by a laser scanner is investigated. For this purpose, rock surfaces with complex and irregular shapes located on the İstanbul Technical University Ayazaga Campus were selected as the study object. The selected object is a mixture of different rock types and consists of both partly weathered and fresh parts. The study was performed on a 30 m x 10 m part of the rock surface. 2D (area-based) and 3D (volume-based) analyses were performed for several regions selected from the point clouds of the surface models. The analyses showed that the two point clouds are similar and can be used as alternatives to each other. This proved that point clouds produced from photographs, which are economical and can be generated in less time, can be used in several studies instead of point clouds produced by a laser scanner.
Lardeux, Frédéric; Torrico, Gino; Aliaga, Claudia
2016-07-04
In ELISAs, sera of individuals infected by Trypanosoma cruzi show absorbance values above a cut-off value. The cut-off is generally computed by means of formulas that need absorbance readings of negative (and sometimes positive) controls, which are included in the titer plates amongst the unknown samples. When no controls are available, other techniques should be employed, such as change-point analysis. The method was applied to Bolivian dog sera processed by ELISA to diagnose T. cruzi infection. In each titer plate, the change-point analysis estimated a step point which correctly discriminated between known positive and known negative sera, unlike some of the six usual cut-off formulas tested. For analysing the ELISA results, the change-point method was as good as the usual cut-off formula of the form "mean + 3 standard deviations of the negative controls". Change-point analysis is therefore an efficient alternative for analysing ELISA absorbance values when no controls are available.
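Both cut-off strategies are easy to sketch; the maximum-gap step-point estimator below is a crude illustration of the control-free idea, not the exact change-point method used in the study:

```python
import numpy as np

def cutoff_mean_3sd(neg_controls):
    # classical formula, usable only when negative controls are on the plate
    return np.mean(neg_controls) + 3.0 * np.std(neg_controls, ddof=1)

def cutoff_step_point(absorbances):
    """Control-free alternative: sort the plate's absorbance values and put
    the cut-off at the largest jump between consecutive values, a crude
    step-point estimate of where negatives end and positives begin."""
    a = np.sort(np.asarray(absorbances, dtype=float))
    i = np.argmax(np.diff(a))
    return 0.5 * (a[i] + a[i + 1])
```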
Identification and Analysis of National Airspace System Resource Constraints
NASA Technical Reports Server (NTRS)
Smith, Jeremy C.; Marien, Ty V.; Viken, Jeffery K.; Neitzke, Kurt W.; Kwa, Tech-Seng; Dollyhigh, Samuel M.; Fenbert, James W.; Hinze, Nicolas K.
2015-01-01
This analysis is the deliverable for the Airspace Systems Program, Systems Analysis Integration and Evaluation Project Milestone for the Systems and Portfolio Analysis (SPA) focus area SPA.4.06 Identification and Analysis of National Airspace System (NAS) Resource Constraints and Mitigation Strategies. "Identify choke points in the current and future NAS. Choke points refer to any areas in the en route, terminal, oceanic, airport, and surface operations that constrain actual demand in current and projected future operations. Use the Common Scenarios based on Transportation Systems Analysis Model (TSAM) projections of future demand developed under SPA.4.04 Tools, Methods and Scenarios Development. Analyze causes, including operational and physical constraints." The NASA analysis is complementary to a NASA Research Announcement (NRA) "Development of Tools and Analysis to Evaluate Choke Points in the National Airspace System" Contract # NNA3AB95C awarded to Logistics Management Institute, Sept 2013.
2015-09-01
[Report documentation page residue; recoverable subject terms: tactical networks, data reduction, high-performance computing, data analysis, big data.]
Some analysis on the diurnal variation of rainfall over the Atlantic Ocean
NASA Technical Reports Server (NTRS)
Gill, T.; Perng, S.; Hughes, A.
1981-01-01
Data collected from the GARP Atlantic Tropical Experiment (GATE) were examined. The data were collected from 10,000 grid points arranged as a 100 x 100 array; each grid cell covered a 4 square km area. The amount of rainfall was measured every 15 minutes during the experiment periods using C-band radars. Two types of analyses were performed on the data: analysis of diurnal variation at each grid point based on the rainfall averages at noon and at midnight, and time series analysis on selected grid points based on the hourly averages of rainfall. Since there is no known distribution model which best describes the rainfall amount, nonparametric methods were used to examine the diurnal variation. The Kolmogorov-Smirnov test was used to test whether the rainfalls at noon and at midnight have the same statistical distribution. The Wilcoxon signed-rank test was used to test whether the noon rainfall is heavier than, equal to, or lighter than the midnight rainfall. These tests were done on each of the 10,000 grid points at which data are available.
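The two nonparametric tests map directly onto SciPy; the gamma-distributed rainfall samples below are placeholders for the per-grid-point noon and midnight averages:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
noon = rng.gamma(2.0, 1.5, size=90)        # placeholder noon averages
midnight = rng.gamma(2.0, 1.2, size=90)    # placeholder midnight averages

# do noon and midnight rainfall share the same distribution?
ks_stat, ks_p = stats.ks_2samp(noon, midnight)

# paired one-sided test: is noon rainfall heavier than midnight rainfall?
w_stat, w_p = stats.wilcoxon(noon, midnight, alternative="greater")
print(f"KS p = {ks_p:.3f}, Wilcoxon p = {w_p:.3f}")
```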
Thin-plate spline analysis of the cranial base in subjects with Class III malocclusion.
Singh, G D; McNamara, J A; Lozanoff, S
1997-08-01
The role of the cranial base in the emergence of Class III malocclusion is not fully understood. This study determines deformations that contribute to a Class III cranial base morphology, employing thin-plate spline analysis on lateral cephalographs. A total of 73 children of European-American descent aged between 5 and 11 years with Class III malocclusion were compared with an equivalent group of subjects with a normal, untreated, Class I molar occlusion. The cephalographs were traced, checked and subdivided into seven age- and sex-matched groups. Thirteen points on the cranial base were identified and digitized. The datasets were scaled to an equivalent size, and statistical analysis indicated significant differences between average Class I and Class III cranial base morphologies for each group. Thin-plate spline analysis indicated that both affine (uniform) and non-affine transformations contribute toward the total spline for each average cranial base morphology at each age group analysed. For non-affine transformations, Partial warps 10, 8 and 7 had high magnitudes, indicating large-scale deformations affecting Bolton point, basion, pterygo-maxillare, Ricketts' point and articulare. In contrast, high eigenvalues associated with Partial warps 1-3, indicating localized shape changes, were found at tuberculum sellae, sella, and the frontonasomaxillary suture. It is concluded that large spatial-scale deformations affect the occipital complex of the cranial base and sphenoidal region, in combination with localized distortions at the frontonasal suture. These deformations may contribute to reduced orthocephalization or deficient flattening of the cranial base antero-posteriorly that, in turn, leads to the formation of a Class III malocclusion.
NASA Astrophysics Data System (ADS)
Yun, Wanying; Lu, Zhenzhou; Jiang, Xian
2018-06-01
To efficiently execute variance-based global sensitivity analysis, the law of total variance over successive non-overlapping intervals is first proved, and on this basis an efficient space-partition sampling-based approach is proposed in this paper. By partitioning the sample points of the output into different subsets according to the different inputs, the proposed approach can efficiently evaluate all the main effects concurrently from one group of sample points. In addition, there is no need to optimize the partition scheme in the proposed approach. The maximum length of the subintervals decreases as the number of sample points of the model input variables increases, which guarantees the convergence condition of the space-partition approach. Furthermore, a new interpretation of the partition idea is given from the perspective of the variance ratio function. Finally, three test examples and one engineering application are employed to demonstrate the accuracy, efficiency and robustness of the proposed approach.
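A minimal sketch of the space-partition estimation of main effects, assuming equal-count bins as the non-overlapping intervals; the bin count and the Ishigami-style test function are illustrative choices, not the paper's examples:

```python
import numpy as np

def main_effects(X, y, n_bins=50):
    """First-order (main-effect) indices from a single sample set: partition
    each input's samples into equal-count bins (non-overlapping intervals)
    and apply the law of total variance, S_i ~ Var(E[Y|X_i]) / Var(Y)."""
    n, d = X.shape
    grand, var_y = y.mean(), y.var()
    s = np.empty(d)
    for i in range(d):
        order = np.argsort(X[:, i])
        bins = np.array_split(y[order], n_bins)
        between = sum(b.size * (b.mean() - grand) ** 2 for b in bins)
        s[i] = between / (n * var_y)
    return s

# Ishigami-style check (illustrative test function)
rng = np.random.default_rng(0)
X = rng.uniform(-np.pi, np.pi, size=(100_000, 3))
y = np.sin(X[:, 0]) + 7.0 * np.sin(X[:, 1]) ** 2 \
    + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0])
print(main_effects(X, y))   # roughly [0.31, 0.44, 0.0]
```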
Proposals for Standardizing and Improving the Policy of Adding Points on the Entrance Exam
ERIC Educational Resources Information Center
Yuhong, Deng
2013-01-01
This article reviews policies for adding points on the College Entrance Examination. It analyzes the rationales and specific implementation strategies of various policies for adding points on the entrance exam, as well as their advantages and pitfalls. Based on these observations and analysis, the author also offers policy recommendations on the…
Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing
NASA Technical Reports Server (NTRS)
Ordaz, Irian
2011-01-01
Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes, as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.
Fuel freeze-point investigations. Final report, September 1982-March 1984
DOE Office of Scientific and Technical Information (OSTI.GOV)
Desmarais, L.A.; Tolle, F.F.
1984-07-01
The objective of this program was to conduct a detailed assessment of the low-temperature environment to which USAF aircraft are exposed, for the purpose of defining a maximum acceptable fuel freeze point and also any operational changes required with the use of a high freeze-point fuel. A previous study of B-52, C-141, and KC-135 operational missions indicated that the -58 C freeze-point specification was too conservative. Based on recommendations resulting from the previous program, several improvements in the method of analysis were made, such as: expansion of the atmospheric temperature data base, the addition of ground temperature analysis, the addition of fuel-freezing analysis to the one-dimensional fuel-temperature computer program, and the examination of heat transfer in external fuel tanks, such as pylon or tip tanks. The B-52, C-141, and KC-135 missions were analyzed again, along with the operational missions of two tactical airplanes, the A-10 and F-15; -50 C was determined to be the maximum allowable freeze point for a general-purpose USAF aviation turbine fuel. Higher freeze points can be tolerated if the probability of operational interference is acceptably low or if operational changes can be made. Study of atmospheric temperatures encountered on the missions of the five study aircraft indicates that a maximum freeze point of -48 C would not likely create any operational difficulties in Northern Europe.
Wang, Yunsheng; Weinacker, Holger; Koch, Barbara
2008-01-01
A procedure for both vertical canopy structure analysis and 3D single-tree modelling based on a Lidar point cloud is presented in this paper. The whole research area is segmented into small study cells by a raster net. For each cell, a normalized point cloud, whose point heights represent the absolute heights of the ground objects, is generated from the original Lidar raw point cloud. The main tree canopy layers and the height ranges of the layers are detected according to a statistical analysis of the height distribution probability of the normalized raw points. For the 3D modelling of individual trees, individual trees are detected and delineated not only from the top canopy layer but also from the sub-canopy layer. The normalized points are resampled into a local voxel space. A series of horizontal 2D projection images at different height levels is then generated with respect to the voxel space. Tree crown regions are detected from the projection images. Individual trees are then extracted by means of a pre-order forest traversal process through all the tree crown regions at the different height levels. Finally, 3D tree crown models of the extracted individual trees are reconstructed. With further analyses of the 3D models of individual tree crowns, important parameters such as crown height range, crown volume and crown contours at different height levels can be derived. PMID:27879916
Reachability Analysis for Base Placement in Mobile Manipulators
NASA Technical Reports Server (NTRS)
Seraji, H.
1994-01-01
This paper addresses the problem of base placement for mobile robots, and proposes a simple off-line solution to determine the appropriate base locations from which the robot can reach a target point.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stastna, A., E-mail: astastna@gmail.com; Sachlova, S.; Pertold, Z.
2012-03-15
Various microscopic techniques (cathodoluminescence, polarizing and electron microscopy) were combined with image analysis with the aim to determine a) the modal composition and degradation features within concrete, and b) the petrographic characteristics and the geological types (rocks, and their provenance) of the aggregates. Concrete samples were taken from five different portions of Highway Nos. D1, D11, and D5 (the Czech Republic). Coarse and fine aggregates were found to be primarily composed of volcanic, plutonic, metamorphic and sedimentary rocks, as well as of quartz and feldspar aggregates of variable origins. The alkali-silica reaction was observed to be the main degradation mechanism, based upon the presence of microcracks and alkali-silica gels in the concrete. Use of cathodoluminescence enabled the identification of the source materials of the quartz aggregates, based upon their CL characteristics (i.e., color, intensity, microfractures, deformation, and zoning), which is difficult to distinguish only employing polarizing and electron microscopy. Highlights: ASR in concrete pavements on Highway Nos. D1, D5 and D11 (Czech Republic). Cathodoluminescence was combined with various microscopic techniques and image analysis. ASR was attributed to aggregates. Source materials of aggregates were identified based on cathodoluminescence characteristics. Quartz comes from different volcanic, plutonic and metamorphic parent rocks.
Phoebe L. Zarnetske; Thomas C., Jr. Edwards; Gretchen G. Moisen
2007-01-01
Estimating species likelihood of occurrence across extensive landscapes is a powerful management tool. Unfortunately, available occurrence data for landscape-scale modeling is often lacking and usually only in the form of observed presences. Ecologically based pseudo-absence points were generated from within habitat envelopes to accompany presence-only data in habitat...
ERIC Educational Resources Information Center
Shimizu, Hirofumi; Yoon, Soyoung; McDonough, Christopher S.
2010-01-01
We taught seven preschoolers with developmental disabilities to point-and-click with a computer mouse. The computer-based training program consisted of three parts, based on a task analysis of the behavioral prerequisites to point-and-click. Training 1 was designed to shape moving the mouse. Training 2 was designed to build eye-hand coordination…
NASA Astrophysics Data System (ADS)
Micheletti, Natan; Tonini, Marj; Lane, Stuart N.
2017-02-01
Acquisition of high-density point clouds using terrestrial laser scanners (TLSs) has become commonplace in geomorphic science. The derived point clouds are often interpolated onto regular grids and the grids compared to detect change (i.e. erosion and deposition/advancement movements). This procedure is necessary for some applications (e.g. digital terrain analysis), but it inevitably leads to a certain loss of potentially valuable information contained within the point clouds. In the present study, an alternative methodology for geomorphological analysis and feature detection from point clouds is proposed. It rests on the use of Density-Based Spatial Clustering of Applications with Noise (DBSCAN), applied to TLS data for a rock glacier front slope in the Swiss Alps. The proposed method allowed movements to be detected and isolated directly from the point clouds, which yields accuracies in the subsequent computation of volumes that depend only on the actual registered distance between points. We demonstrate that these values are more conservative than volumes computed with the traditional DEM comparison. The results are illustrated for the summer of 2015, a season of enhanced geomorphic activity associated with exceptionally high temperatures.
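A minimal sketch of the cluster-based workflow, assuming scikit-learn and SciPy; the change threshold, DBSCAN parameters, file names, and the two-epoch nearest-neighbour change detection are assumptions for illustration, not the study's calibrated values:

```python
import numpy as np
from scipy.spatial import cKDTree
from sklearn.cluster import DBSCAN

# two TLS epochs as (N, 3) arrays; file names are hypothetical
pts_t0 = np.load("epoch_t0.npy")
pts_t1 = np.load("epoch_t1.npy")

# change candidates: epoch-t1 points farther than 5 cm (assumed threshold)
# from their nearest epoch-t0 neighbour
dist, _ = cKDTree(pts_t0).query(pts_t1)
candidates = pts_t1[dist > 0.05]

# isolate coherent movement features directly in the point cloud
labels = DBSCAN(eps=0.10, min_samples=10).fit_predict(candidates)
for lab in sorted(set(labels) - {-1}):          # -1 is DBSCAN noise
    cluster = candidates[labels == lab]
    print(f"feature {lab}: {len(cluster)} points")
```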
Adhikari, S; Biswas, A; Bandyopadhyay, T K; Ghosh, P D
2014-06-01
Pointed gourd (Trichosanthes dioica Roxb.) is an economically important cucurbit and is extensively propagated through vegetative means, viz. vine and root cuttings. As the accessions are poorly characterized, it is important at the beginning of a breeding programme to discriminate among available genotypes to establish the level of genetic diversity. The genetic diversity of 10 pointed gourd races, referred to as accessions, was evaluated. DNA profiles were generated using 10 sequence-independent RAPD markers. A total of 58 scorable loci were observed, of which 18 (31.03%) were considered polymorphic. Genetic diversity parameters [average and effective number of alleles, Shannon's index, percent polymorphism, Nei's gene diversity, polymorphic information content (PIC)] for RAPD, along with UPGMA clustering based on Jaccard's coefficient, were estimated. The UPGMA dendrogram constructed from the RAPD analysis showed that the 10 pointed gourd accessions grouped in a single cluster and may represent members of one heterotic group. RAPD analysis showed promise as an effective tool for estimating genetic polymorphism in different accessions of pointed gourd.
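The Jaccard-based UPGMA clustering maps directly onto SciPy; the random 10 x 58 band-presence matrix below is a placeholder for the actual RAPD scores:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import dendrogram, linkage

# rows: accessions; columns: RAPD loci scored 1 (band present) or 0 (absent)
rng = np.random.default_rng(0)
bands = rng.integers(0, 2, size=(10, 58))       # placeholder band matrix

d = pdist(bands, metric="jaccard")              # 1 - Jaccard coefficient
tree = linkage(d, method="average")             # UPGMA = average linkage
info = dendrogram(tree, no_plot=True,
                  labels=[f"acc{i + 1}" for i in range(10)])
print(info["ivl"])                              # leaf order of the dendrogram
```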
Research study on stabilization and control: Modern sampled data control theory
NASA Technical Reports Server (NTRS)
Kuo, B. C.; Singh, G.; Yackel, R. A.
1973-01-01
A numerical analysis of spacecraft stability parameters was conducted. The analysis is based on a digital approximation by point-by-point state comparison. The technique used is that of approximating a continuous-data system by a sampled-data model through comparison of the states of the two systems. Application of the method to the digital redesign of the simplified one-axis dynamics of the Skylab is presented.
Advancing School-Based Interventions through Economic Analysis
ERIC Educational Resources Information Center
Olsson, Tina M.; Ferrer-Wreder, Laura; Eninger, Lilianne
2014-01-01
Commentators interested in school-based prevention programs point to the importance of economic issues for the future of prevention efforts. Many of the processes and aims of prevention science are dependent upon prevention resources. Although economic analysis is an essential tool for assessing resource use, the attention given economic analysis…
Capacity Estimation Model for Signalized Intersections under the Impact of Access Point
Zhao, Jing; Li, Peng; Zhou, Xizhao
2016-01-01
The Highway Capacity Manual 2010 provides various factors to adjust the base saturation flow rate for the capacity analysis of signalized intersections. No factor, however, is considered for the potential change in signalized intersection capacity caused by an access point close to the signalized intersection. This paper presents a theoretical model to estimate lane group capacity at signalized intersections with consideration of the effects of access points. Two scenarios of access point location, upstream or downstream of the signalized intersection, and the impacts of six types of access traffic flow are taken into account. The proposed capacity model was validated based on VISSIM simulation. Results of extensive numerical analysis reveal the substantial impact of the access point on capacity, which has an inverse correlation with both the number of major-street lanes and the distance between the intersection and the access point. Moreover, among the six types of access traffic flow, access traffic flow 1 (right-turning traffic from the major street), flow 4 (left-turning traffic from the access point), and flow 5 (left-turning traffic from the major street) have a more significant effect on lane group capacity than the others. Some guidance on the mitigation of the negative effect is provided for practitioners. PMID:26726998
Development and Demonstration of an Ada Test Generation System
NASA Technical Reports Server (NTRS)
1996-01-01
In this project we have built a prototype system that performs Feasible Path Analysis on Ada programs: given a description of a set of control flow paths through a procedure and a predicate at a program point, feasible path analysis determines if there is input data which causes execution to flow down some path in the collection, reaching the point so that the predicate is true. Feasible path analysis can be applied to program testing, program slicing, array bounds checking, and other forms of anomaly checking. FPA is central to most applications of program analysis. But, because this problem is formally unsolvable, syntactic-based approximations are used in its place. For example, in dead-code analysis the problem is to determine if there are any input values which cause execution to reach a specified program point. Instead an approximation to this problem is computed: determine whether there is a control flow path from the start of the program to the point. This syntactic approximation is efficiently computable and conservative: if there is no such path the program point is clearly unreachable, but if there is such a path, the analysis is inconclusive, and the code is assumed to be live. Such conservative analysis too often yields unsatisfactory results because the approximation is too weak. As another example, consider data flow analysis. A du-pair is a pair of program points such that the first point is a definition of a variable and the second point a use, and for which there exists a definition-free path from the definition to the use. The sharper, semantic definition of a du-pair requires that there be a feasible definition-free path from the definition to the use. A compiler using du-pairs for detecting dead variables may miss optimizations by not considering feasibility. Similarly, a program analyzer computing program slices to merge parallel versions may report conflicts where none exist. In the context of software testing, feasibility analysis plays an important role in identifying testing requirements which are infeasible. This is especially true for data flow testing and modified condition/decision coverage. Our system uses in an essential way symbolic analysis and theorem proving technology, and we believe this work represents one of the few successful uses of a theorem prover working in a completely automatic fashion to solve a problem of practical interest. We believe this work anticipates an important trend away from purely syntactic-based methods for program analysis to semantic methods based on symbolic processing and inference technology. Other results demonstrating the practical use of automatic inference are being reported in hardware verification, although there are significant differences between the hardware work and ours. However, what is common and important is that general-purpose theorem provers are being integrated with more special-purpose decision procedures to solve problems in analysis and verification. We are pursuing commercial opportunities for this work, and will use and extend the work in other projects we are engaged in. Ultimately we would like to rework the system to analyze C, C++, or Java as a key step toward commercialization.
New clinical insights for transiently evoked otoacoustic emission protocols.
Hatzopoulos, Stavros; Grzanka, Antoni; Martini, Alessandro; Konopka, Wieslaw
2009-08-01
The objective of the study was to optimize the region of a time-frequency analysis and then investigate any stable patterns in the time-frequency structure of otoacoustic emissions in a population of 152 healthy adults sampled over one year. TEOAE recordings were collected from 302 ears in subjects presenting normal hearing and normal impedance values. The responses were analyzed by the Wigner-Ville distribution (WVD). The TF region of analysis was optimized by examining the energy content of various rectangular and triangular TF regions. The TEOAE components from the initial recordings and the recordings 12 months later were compared in the optimized TF region. The best region for TF analysis was identified with base point 1 at 2.24 ms and 2466 Hz, base point 2 at 6.72 ms and 2466 Hz, and the top point at 2.24 ms and 5250 Hz. Correlation indices from the optimized TF region were significantly higher than the traditional indices in the selected time window. An analysis of the TF data within a 12-month period indicated an 85% TEOAE component similarity in 90% of the tested subjects.
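A compact discrete Wigner-Ville distribution can be written with NumPy and SciPy; this is a plain (unsmoothed) WVD sketch, and the frequency scaling convention is an assumption of the implementation:

```python
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x, fs=1.0):
    """Discrete Wigner-Ville distribution of a real signal: rows are
    frequency bins from 0 to fs/2, columns are time samples."""
    z = hilbert(np.asarray(x, dtype=float))     # analytic signal
    n = len(z)
    W = np.zeros((n, n))
    for t in range(n):
        tau_max = min(t, n - 1 - t)
        tau = np.arange(-tau_max, tau_max + 1)
        acf = np.zeros(n, dtype=complex)        # instantaneous autocorrelation
        acf[tau % n] = z[t + tau] * np.conj(z[t - tau])
        W[:, t] = np.fft.fft(acf).real
    freqs = np.arange(n) * fs / (2.0 * n)
    return W, freqs
```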
Sanchez Sorzano, Carlos Oscar; Alvarez-Cabrera, Ana Lucia; Kazemi, Mohsen; Carazo, Jose María; Jonić, Slavica
2016-04-26
Single-particle electron microscopy (EM) has been shown to be very powerful for studying structures and associated conformational changes of macromolecular complexes. In the context of analyzing conformational changes of complexes, distinct EM density maps obtained by image analysis and three-dimensional (3D) reconstruction are usually analyzed in 3D for interpretation of structural differences. However, graphic visualization of these differences based on a quantitative analysis of elastic transformations (deformations) among density maps has not been done yet due to a lack of appropriate methods. Here, we present an approach that allows such visualization. This approach is based on statistical analysis of distances among elastically aligned pairs of EM maps (one map is deformed to fit the other map), and results in visualizing EM maps as points in a lower-dimensional distance space. The distances among points in the new space can be analyzed in terms of clusters or trajectories of points related to potential conformational changes. The results of the method are shown with synthetic and experimental EM maps at different resolutions. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Efficient Analysis of Mass Spectrometry Data Using the Isotope Wavelet
NASA Astrophysics Data System (ADS)
Hussong, Rene; Tholey, Andreas; Hildebrandt, Andreas
2007-09-01
Mass spectrometry (MS) has become today's de-facto standard for high-throughput analysis in proteomics research. Its applications range from toxicity analysis to MS-based diagnostics. Often, the time spent on the MS experiment itself is significantly less than the time necessary to interpret the measured signals, since the amount of data can easily exceed several gigabytes. In addition, automated analysis is hampered by baseline artifacts, chemical as well as electrical noise, and an irregular spacing of data points. Thus, filtering techniques originating from signal and image analysis are commonly employed to address these problems. Unfortunately, smoothing, base-line reduction, and in particular a resampling of data points can affect important characteristics of the experimental signal. To overcome these problems, we propose a new family of wavelet functions based on the isotope wavelet, which is hand-tailored for the analysis of mass spectrometry data. The resulting technique is theoretically well-founded and compares very well with standard peak picking tools, since it is highly robust against noise spoiling the data, but at the same time sufficiently sensitive to detect even low-abundant peptides.
Metabolomics has become well-established for studying chemical contaminant-induced alterations to normal biological function. For example, the literature contains a wealth of laboratory-based studies involving analysis of samples from organisms exposed to individual chemical toxi...
Portable point-of-care blood analysis system for global health (Conference Presentation)
NASA Astrophysics Data System (ADS)
Dou, James J.; Aitchison, James Stewart; Chen, Lu; Nayyar, Rakesh
2016-03-01
In this paper we present a portable blood analysis system based on a disposable cartridge and a hand-held reader. The platform can perform all the sample preparation, detection and waste collection required to complete a clinical test. In order to demonstrate the utility of this approach, a CD4 T cell enumeration was carried out, and a handheld, point-of-care CD4 T cell system was developed based on this platform. In particular, we describe a pneumatic, active pumping method to control the on-chip fluidic actuation. Reagents for the CD4 T cell counting assay were dried on a reagent plug to eliminate the need for cold-chain storage when used in the field. A micromixer based on the active fluidic actuation was designed to complete sample staining with the fluorescent dyes dried on the reagent plugs. A novel image detection and analysis algorithm was developed to detect and track the flight of target particles and cells during each analysis. The handheld, point-of-care CD4 testing system was benchmarked against a clinical flow cytometer, and the experimental results closely matched those of flow cytometry. The same platform can be further expanded into a bead-array detection system in which other types of biomolecules, such as proteins, can be detected using the same detection scheme.
Frequency analysis of gaze points with CT colonography interpretation using eye gaze tracking system
NASA Astrophysics Data System (ADS)
Tsutsumi, Shoko; Tamashiro, Wataru; Sato, Mitsuru; Okajima, Mika; Ogura, Toshihiro; Doi, Kunio
2017-03-01
It is important to investigate the eye-tracking gaze points of experts in order to assist trainees in understanding the image interpretation process. We investigated gaze points during CT colonography (CTC) interpretation and analyzed the difference in gaze points between experts and trainees. In this study, we attempted to understand how trainees can be brought to the level achieved by experts in viewing CTC. We used an eye gaze point sensing system, Gazefineder (JVCKENWOOD Corporation, Tokyo, Japan), which can detect the pupil point and corneal reflection point by dark-pupil eye tracking. This system provides gaze point images and Excel file data. The subjects were radiological technologists, both experienced and inexperienced in reading CTC. We performed observer studies in reading virtual pathology images and examined the observers' image interpretation process using gaze point data. Furthermore, we examined eye-tracking frequency analysis using the Fast Fourier Transform (FFT). We were able to understand the difference in gaze points between experts and trainees by use of the frequency analysis. The results for the trainee contained large amounts of both high-frequency and low-frequency components, whereas both components for the expert were relatively low. Regarding the amount of eye movement every 0.02 seconds, we found that the expert tended to interpret images slowly and calmly, while the trainee moved their eyes quickly and looked over wide areas. We can assess the difference in gaze points on CTC between experts and trainees by use of the eye gaze point sensing system and frequency analysis. The potential improvement in CTC interpretation for trainees can be evaluated using gaze point data.
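A minimal sketch of the frequency analysis, assuming a 50 Hz tracker (one sample per 0.02 s) and taking the FFT of the frame-to-frame gaze displacement; the exact preprocessing used in the study may differ:

```python
import numpy as np

def gaze_spectrum(gx, gy, fs=50.0):
    """Amplitude spectrum of the frame-to-frame gaze displacement; fs is the
    assumed tracker sampling rate in Hz (50 Hz = one sample per 0.02 s)."""
    step = np.hypot(np.diff(gx), np.diff(gy))   # eye movement per sample
    step = step - step.mean()                   # remove the DC component
    amp = np.abs(np.fft.rfft(step))
    freq = np.fft.rfftfreq(step.size, d=1.0 / fs)
    return freq, amp

# experts would be expected to show weaker high-frequency content than trainees
```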
Rutzinger, Martin; Höfle, Bernhard; Hollaus, Markus; Pfeifer, Norbert
2008-01-01
Airborne laser scanning (ALS) is a remote sensing technique well-suited for 3D vegetation mapping and structure characterization because the emitted laser pulses are able to penetrate small gaps in the vegetation canopy. The backscattered echoes from the foliage, woody vegetation, the terrain, and other objects are detected, leading to a cloud of points. Higher echo densities (>20 echoes/m2) and additional classification variables from full-waveform (FWF) ALS data, namely echo amplitude, echo width and information on multiple echoes from one shot, offer new possibilities in classifying the ALS point cloud. Currently FWF sensor information is hardly used for classification purposes. This contribution presents an object-based point cloud analysis (OBPA) approach, combining segmentation and classification of the 3D FWF ALS points, designed to detect tall vegetation in urban environments. The definition of tall vegetation includes trees and shrubs, but excludes grassland and herbage. In the applied procedure, FWF ALS echoes are segmented by a seeded region growing procedure. All echoes, sorted in descending order by their surface roughness, are used as seed points. Segments are grown based on echo width homogeneity. Next, segment statistics (mean, standard deviation, and coefficient of variation) are calculated by aggregating echo features such as amplitude and surface roughness. For classification, a rule base is derived automatically from a training area using a statistical classification tree. To demonstrate our method we present data from three sites with around 500,000 echoes each. The accuracy of the classified vegetation segments is evaluated for two independent validation sites. In a point-wise error assessment, where the classification is compared with manually classified 3D points, completeness and correctness better than 90% are reached for the validation sites. In comparison to many other algorithms, the proposed 3D point classification works directly on the original measurements, i.e. the acquired points. Gridding of the data, a process inherently coupled with loss of data and precision, is not necessary. The 3D properties provide especially good separability of building and terrain points, respectively, when they are occluded by vegetation. PMID:27873771
Fast ground filtering for TLS data via Scanline Density Analysis
NASA Astrophysics Data System (ADS)
Che, Erzhuo; Olsen, Michael J.
2017-07-01
Terrestrial Laser Scanning (TLS) efficiently collects 3D information based on lidar (light detection and ranging) technology. TLS has been widely used in topographic mapping, engineering surveying, forestry, industrial facilities, cultural heritage, and so on. Ground filtering is a common procedure in lidar data processing, which separates the point cloud data into ground points and non-ground points. Effective ground filtering is helpful for subsequent procedures such as segmentation, classification, and modeling. Numerous ground filtering algorithms have been developed for Airborne Laser Scanning (ALS) data. However, many of these are error prone when applied to TLS data because of its different angle of view and highly variable resolution. Further, many ground filtering techniques are limited in application within challenging topography and experience difficulty coping with objects such as short vegetation, steep slopes, and so forth. Lastly, due to the large size of point cloud data, operations such as data traversal, multiple iterations, and neighbor searching significantly affect the computational efficiency. In order to overcome these challenges, we present an efficient ground filtering method for TLS data via scanline density analysis, which is very fast because it exploits the grid structure storing the TLS data. The process first separates the ground candidates, density features, and unidentified points based on an analysis of point density within each scanline. Second, a region growing using the scan pattern is performed to cluster the ground candidates and further refine the ground points (clusters). In the experiments, the effectiveness, parameter robustness, and efficiency of the proposed method are demonstrated with datasets collected from an urban scene and a natural scene, respectively.
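A hedged sketch of the per-scanline idea: on ground, consecutive returns stay closely and regularly spaced, so long dense runs can be flagged as ground candidates. The spacing threshold and minimum run length are illustrative assumptions, and this omits the density features and region-growing refinement described above:

```python
import numpy as np

def ground_candidates(scanline_xyz, spacing_thresh=0.05, min_run=20):
    """Flag ground candidates in one TLS scanline: on near-horizontal ground
    consecutive returns stay closely and regularly spaced, while off-ground
    returns produce sparse, jumpy spacing."""
    d = np.linalg.norm(np.diff(scanline_xyz, axis=0), axis=1)
    dense = np.r_[d < spacing_thresh, False]    # point i close to point i+1
    mask = np.zeros(len(scanline_xyz), dtype=bool)
    start = None
    for i, flag in enumerate(dense):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_run:            # keep only long dense runs
                mask[start:i + 1] = True
            start = None
    return mask
```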
Topological photonic crystal with ideal Weyl points
NASA Astrophysics Data System (ADS)
Wang, Luyang; Jian, Shao-Kai; Yao, Hong
Weyl points in three-dimensional photonic crystals behave as monopoles of Berry flux in momentum space. Here, based on symmetry analysis, we show that a minimal number of symmetry-related Weyl points can be realized in time-reversal invariant photonic crystals. We propose to realize these ``ideal'' Weyl points in modified double-gyroid photonic crystals, which is confirmed by our first-principle photonic band-structure calculations. Photonic crystals with ideal Weyl points are qualitatively advantageous in applications such as angular and frequency selectivity, broadband invisibility cloaking, and broadband 3D-imaging.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muralidhar, K Raja; Komanduri, K
2014-06-01
Purpose: The objective of this work is to present a mechanism for calculating inflection points on profiles at various depths and field sizes, together with a study of the percentage of dose at the inflection points for various field sizes and depths for 6XFFF and 10XFFF energy profiles. Methods: Percentage of dose was plotted against inflection point position, and, using a polynomial function, the authors formulated equations for calculating the exact inflection point on the profiles of 6XFFF and 10XFFF energies for all field sizes and at various depths. Results: In a flattening-filter-free radiation beam, unlike in flattened beams, the dose at the inflection point of the profile decreases as field size increases for 10XFFF, whereas in 6XFFF the dose at the inflection point initially increases up to 10x10 cm2 and then decreases. The polynomial function was fitted for both FFF beams for all field sizes and depths. For small fields of less than 5x5 cm2 the inflection point and FWHM are almost the same, and hence the analysis can be done just as for flattened beams. A change of 10% in dose can change the field width by 1 mm. Conclusion: In the present study, deriving equations from the polynomial function to define the inflection point is a precise and accurate way to obtain the inflection-point dose on any FFF beam profile at any depth to within 1%. Corrections can be made in future studies based on data from multiple machines. A brief study was also performed to evaluate the inflection point positions with respect to dose in FFF energies for various field sizes and depths for 6XFFF and 10XFFF energy profiles.
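A minimal sketch of locating the penumbra inflection points from a polynomial fit, following the derivative-of-polynomial idea above; the polynomial order and the steepest-slope selection rule are assumptions:

```python
import numpy as np

def penumbra_inflection_points(position, dose, order=9):
    """Fit the measured FFF profile with a polynomial and take the inflection
    points as real roots of its second derivative inside the scanned range;
    on each side of the beam axis, keep the root with the steepest slope."""
    p = np.polyfit(position, dose, order)
    d1, d2 = np.polyder(p, 1), np.polyder(p, 2)
    r = np.roots(d2)
    r = r[np.isreal(r)].real
    r = r[(r > position.min()) & (r < position.max())]

    def steepest(side):
        if side.size == 0:
            return None
        return side[np.argmax(np.abs(np.polyval(d1, side)))]

    return steepest(r[r < 0.0]), steepest(r[r >= 0.0])
```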
Space Subdivision in Indoor Mobile Laser Scanning Point Clouds Based on Scanline Analysis.
Zheng, Yi; Peter, Michael; Zhong, Ruofei; Oude Elberink, Sander; Zhou, Quan
2018-06-05
Indoor space subdivision is an important aspect of scene analysis that provides essential information for many applications, such as indoor navigation and evacuation route planning. Until now, most proposed scene understanding algorithms have been based on whole point clouds, which has led to complicated operations, high computational loads and low processing speeds. This paper presents novel methods to efficiently extract the locations of openings (e.g., doors and windows) and to subdivide space by analyzing scanlines. An opening detection method is demonstrated that analyses the local geometric regularity in scanlines to refine the extracted openings. Moreover, a space subdivision method based on the extracted openings and the scanning system trajectory is described. Finally, the opening detection and space subdivision results are saved as point cloud labels which will be used for further investigations. The method has been tested on a real dataset collected by ZEB-REVO. The experimental results validate the completeness and correctness of the proposed method for different indoor environments and scanning paths.
NASA Astrophysics Data System (ADS)
Koparan, Timur; Güven, Bülent
2015-07-01
The point of this study is to determine the effect of a project-based learning approach on 8th grade secondary-school students' statistical literacy levels for data representation. To achieve this goal, a test consisting of 12 open-ended questions was developed in accordance with the views of experts. Seventy 8th grade secondary-school students, 35 in the experimental group and 35 in the control group, took this test twice, once before the application and once after it. All raw scores were converted into linear points using the Winsteps 3.72 modelling program, which performs Rasch analysis; t-tests and an ANCOVA analysis were then carried out on the linear points. Based on the findings, it was concluded that the project-based learning approach increases students' level of statistical literacy for data representation. Students' levels of statistical literacy before and after the application were shown through the obtained person-item maps.
Araki, Ryoko; Mizutani, Eiji; Hoki, Yuko; Sunayama, Misato; Wakayama, Sayaka; Nagatomo, Hiroaki; Kasama, Yasuji; Nakamura, Miki; Wakayama, Teruhiko; Abe, Masumi
2017-05-01
Induced pluripotent stem cells hold great promise for regenerative medicine but point mutations have been identified in these cells and have raised serious concerns about their safe use. We generated nuclear transfer embryonic stem cells (ntESCs) from both mouse embryonic fibroblasts (MEFs) and tail-tip fibroblasts (TTFs) and by whole genome sequencing found fewer mutations compared with iPSCs generated by retroviral gene transduction. Furthermore, TTF-derived ntESCs showed only a very small number of point mutations, approximately 80% less than the number observed in iPSCs generated using retrovirus. Base substitution profile analysis confirmed this greatly reduced number of point mutations. The point mutations in iPSCs are therefore not a Yamanaka factor-specific phenomenon but are intrinsic to genome reprogramming. Moreover, the dramatic reduction in point mutations in ntESCs suggests that most are not essential for genome reprogramming. Our results suggest that it is feasible to reduce the point mutation frequency in iPSCs by optimizing various genome reprogramming conditions. We conducted whole genome sequencing of ntES cells derived from MEFs or TTFs. We thereby succeeded in establishing TTF-derived ntES cell lines with far fewer point mutations. Base substitution profile analysis of these clones also indicated a reduced point mutation frequency, moving from a transversion-predominance to a transition-predominance. Stem Cells 2017;35:1189-1196. © 2017 AlphaMed Press.
Paper focuses on trading schemes in which regulated point sources are allowed to avoid upgrading their pollution control technology to meet water quality-based effluent limits if they pay for equivalent (or greater) reductions in nonpoint source pollution.
Liu, Bao; Fan, Xiaoming; Huo, Shengnan; Zhou, Lili; Wang, Jun; Zhang, Hui; Hu, Mei; Zhu, Jianhua
2011-12-01
A method was established to analyse overlapped chromatographic peaks based on the chromatographic-spectral data detected by a diode-array ultraviolet detector. In the method, the three-dimensional data were first de-noised and normalized; secondly, the differences and clustering analysis of the spectra at different time points were calculated; then the purity of the whole chromatographic peak was analysed and the region was sought out in which the spectra of different time points were stable. The feature spectra were extracted from the spectrum-stable region as the basic foundation. The nonnegative least-squares method was chosen to separate the overlapped peaks and obtain the flow curve based on the feature spectrum. The three-dimensional divided chromatographic-spectrum peak could be obtained by matrix operations of the feature spectra with the flow curve. The results showed that this method could separate the overlapped peaks.
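A minimal sketch of the separation step, assuming Python with SciPy: once feature spectra have been extracted from the spectrum-stable regions, each time slice of the diode-array data is decomposed by nonnegative least squares into component contributions (the array shapes and function name are illustrative).

```python
import numpy as np
from scipy.optimize import nnls

def resolve_overlapped_peaks(data, feature_spectra):
    """Decompose overlapped peaks into per-component elution ('flow') curves.

    data: (n_times, n_wavelengths) de-noised, normalized DAD matrix.
    feature_spectra: (n_components, n_wavelengths) spectra taken from the
        spectrum-stable region of each peak.
    """
    A = feature_spectra.T                  # wavelengths x components
    curves = np.empty((data.shape[0], A.shape[1]))
    for t, spectrum in enumerate(data):
        curves[t], _ = nnls(A, spectrum)   # nonnegative least squares per slice
    # the 3D peak of component k is the outer product of its flow curve
    # with its feature spectrum:
    peaks = [np.outer(curves[:, k], feature_spectra[k])
             for k in range(A.shape[1])]
    return curves, peaks
```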
Warrick, P A; Precup, D; Hamilton, E F; Kearney, R E
2007-01-01
To develop a singular-spectrum analysis (SSA) based change-point detection algorithm applicable to fetal heart rate (FHR) monitoring to improve the detection of deceleration events. We present a method for decomposing a signal into near-orthogonal components via the discrete cosine transform (DCT) and apply this in a novel online manner to change-point detection based on SSA. The SSA technique forms models of the underlying signal that can be compared over time; models that are sufficiently different indicate signal change points. To adapt the algorithm to deceleration detection where many successive similar change events can occur, we modify the standard SSA algorithm to hold the reference model constant under such conditions, an approach that we term "base-hold SSA". The algorithm is applied to a database of 15 FHR tracings that have been preprocessed to locate candidate decelerations and is compared to the markings of an expert obstetrician. Of the 528 true and 1285 false decelerations presented to the algorithm, the base-hold approach improved on standard SSA, reducing the number of missed decelerations from 64 to 49 (21.9%) while maintaining the same reduction in false-positives (278). The standard SSA assumption that changes are infrequent does not apply to FHR analysis where decelerations can occur successively and in close proximity; our base-hold SSA modification improves detection of these types of event series.
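The base-hold idea can be conveyed with a toy sketch (Python assumed; the DCT-coefficient window model, window size, and threshold below are illustrative stand-ins for the paper's SSA machinery): the reference model is frozen while successive windows keep differing from it, so runs of similar change events are not absorbed into the reference.

```python
import numpy as np
from scipy.fft import dct

def base_hold_change_points(signal, win=64, n_coef=8, thresh=1.0):
    """Flag change points by comparing a held reference model against
    successive window models; all parameters are illustrative."""
    signal = np.asarray(signal, float)
    ref = None
    change_points = []
    for start in range(0, len(signal) - win + 1, win):
        model = dct(signal[start:start + win], norm='ortho')[:n_coef]
        if ref is None:
            ref = model                      # first window seeds the reference
        elif np.linalg.norm(model - ref) > thresh:
            change_points.append(start)      # change: hold the reference fixed
        else:
            ref = model                      # no change: update the reference
    return change_points
```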
Point-by-point compositional analysis for atom probe tomography.
Stephenson, Leigh T; Ceguerra, Anna V; Li, Tong; Rojhirunsakool, Tanaporn; Nag, Soumya; Banerjee, Rajarshi; Cairney, Julie M; Ringer, Simon P
2014-01-01
This new alternate approach to data processing for analyses that traditionally employed grid-based counting methods is necessary because it removes a user-imposed coordinate system that not only limits an analysis but also may introduce errors. We have modified the widely used "binomial" analysis for APT data by replacing grid-based counting with coordinate-independent nearest neighbour identification, improving the measurements and the statistics obtained and allowing quantitative analysis of smaller datasets and of datasets from non-dilute solid solutions. It also allows better visualisation of compositional fluctuations in the data. Our modifications include: using spherical k-atom blocks identified by each detected atom's first k nearest neighbours; 3D data visualisation of block composition and nearest neighbour anisotropy; and using z-statistics to directly compare experimental and expected composition curves. Similar modifications may be made to other grid-based counting analyses (contingency table, Langer-Bar-on-Miller, sinusoidal model) and could be instrumental in developing novel data visualisation options.
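For illustration, a short sketch of the k-nearest-neighbour block composition (Python with SciPy assumed; k = 100 and the array layout are illustrative choices, not values prescribed by the paper):

```python
import numpy as np
from scipy.spatial import cKDTree

def block_solute_fractions(coords, is_solute, k=100):
    """Composition of the spherical k-atom block centred on every detected
    atom, replacing grid-based counting.

    coords: (n, 3) reconstructed atom positions.
    is_solute: boolean array of length n marking solute atoms.
    """
    coords = np.asarray(coords, float)
    is_solute = np.asarray(is_solute, bool)
    tree = cKDTree(coords)
    # query k + 1 neighbours: the nearest hit is the atom itself
    _, idx = tree.query(coords, k=k + 1)
    return is_solute[idx].mean(axis=1)   # solute fraction per block
```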
The personal shopper – a pilot randomized trial of grocery store-based dietary advice
Lewis, K H; Roblin, D W; Leo, M; Block, J P
2015-01-01
The objective of this study was to test the feasibility and preliminary efficacy of a store-based dietary education intervention against traditional clinic-based advice. Patients with obesity (n = 55, mean [standard deviation, SD] age 44.3 [9.2] years, 64% women, 87% non-Hispanic Black) were randomized to receive dietary counselling either in a grocery store or a clinic. Change between groups (analysis of covariance) was assessed for outcomes including dietary quality (Healthy Eating Index-2005 [0-100 points]) and nutritional knowledge (0-65-point knowledge scale). Both groups reported improved diet quality at the end of the study. Grocery participants had greater increases in knowledge (mean [SD] change = 5.7 [6.1] points) than clinic participants (mean [SD] change = 3.2 [4.0] points) (P = 0.04). Participants enjoyed the store-based sessions. Grocery store-based visits offer a promising approach for dietary counselling. PMID:25873139
NASA Astrophysics Data System (ADS)
Xu, Y.; Sun, Z.; Boerner, R.; Koch, T.; Hoegner, L.; Stilla, U.
2018-04-01
In this work, we report a novel way of generating ground truth datasets for analyzing point clouds from different sensors and for the validation of algorithms. Instead of directly labeling a large number of 3D points, which requires time-consuming manual work, a multi-resolution 3D voxel grid for the testing site is generated. Then, with the help of a set of basic labeled points from the reference dataset, we can generate a 3D labeled space of the entire testing site at different resolutions. Specifically, an octree-based voxel structure is applied to voxelize the annotated reference point cloud, by which all the points are organized in 3D grids of multiple resolutions. When automatically annotating new testing point clouds, a voting-based approach is applied to the labeled points within the multi-resolution voxels in order to assign a semantic label to the 3D space represented by each voxel. Lastly, robust line- and plane-based fast registration methods are developed for aligning point clouds obtained via various sensors. Benefiting from the labeled 3D spatial information, we can easily create new annotated 3D point clouds from different sensors of the same scene directly by considering the labels of the 3D space in which the points are located, which is convenient for the validation and evaluation of algorithms related to point cloud interpretation and semantic segmentation.
Dew inspired breathing-based detection of genetic point mutation visualized by naked eye
Xie, Liping; Wang, Tongzhou; Huang, Tianqi; Hou, Wei; Huang, Guoliang; Du, Yanan
2014-01-01
A novel label-free method based on breathing-induced vapor condensation was developed for detection of genetic point mutation. The dew-inspired detection was realized by integration of target-induced DNA ligation with rolling circle amplification (RCA). The vapor condensation induced by breathing transduced the RCA-amplified variances in DNA contents into visible contrast. The image could be recorded by a cell phone for further or even remote analysis. This green assay offers a naked-eye-reading method potentially applied for point-of-care liver cancer diagnosis in resource-limited regions. PMID:25199907
1984-11-30
fluxes have been processed into a computer data base, ready for further analysis. This data base has been the starting point for several of the above... distance from the point of observation. One very common distribution consists of field-aligned ions at energies below several keV, with more energetic...
Image registration with uncertainty analysis
Simonson, Katherine M [Cedar Crest, NM
2011-03-22
In an image registration method, edges are detected in a first image and a second image. A percentage of edge pixels in a subset of the second image that are also edges in the first image shifted by a translation is calculated. A best registration point is calculated based on a maximum percentage of edges matched. In a predefined search region, all registration points other than the best registration point are identified that are not significantly worse than the best registration point according to a predetermined statistical criterion.
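A compact sketch of the scoring step (Python/NumPy assumed; the exhaustive integer-shift search, wrap-around borders, and search radius are simplifications, and the patent's statistical criterion for identifying near-equivalent registration points is omitted): each candidate translation is scored by the percentage of edge pixels in the second image that are also edges in the shifted first image.

```python
import numpy as np

def best_registration(edges1, edges2, search=10):
    """Return (dy, dx, score) for the translation that matches the highest
    percentage of edge pixels. edges1/edges2 are boolean edge maps of equal
    shape; np.roll wraps at the borders, a simplification for brevity."""
    best = (0, 0, -1.0)
    n_edges = edges2.sum()
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(edges1, (dy, dx), axis=(0, 1))
            score = np.logical_and(shifted, edges2).sum() / n_edges
            if score > best[2]:
                best = (dy, dx, score)
    return best
```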
Visualizing nD Point Clouds as Topological Landscape Profiles to Guide Local Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oesterling, Patrick; Heine, Christian; Weber, Gunther H.
2012-05-04
Analyzing high-dimensional point clouds is a classical challenge in visual analytics. Traditional techniques, such as projections or axis-based techniques, suffer from projection artifacts, occlusion, and visual complexity. We propose to split data analysis into two parts to address these shortcomings. First, a structural overview phase abstracts data by its density distribution. This phase performs topological analysis to support accurate and non-overlapping presentation of the high-dimensional cluster structure as a topological landscape profile. Utilizing a landscape metaphor, it presents clusters and their nesting as hills whose height, width, and shape reflect cluster coherence, size, and stability, respectively. A second local analysis phase utilizes this global structural knowledge to select individual clusters or point sets for further, localized data analysis. Focusing on structural entities significantly reduces visual clutter in established geometric visualizations and permits a clearer, more thorough data analysis. In conclusion, this analysis complements the global topological perspective and enables the user to study subspaces or geometric properties, such as shape.
NASA Astrophysics Data System (ADS)
Tomljenovic, Ivan; Tiede, Dirk; Blaschke, Thomas
2016-10-01
In the past two decades Object-Based Image Analysis (OBIA) established itself as an efficient approach for the classification and extraction of information from remote sensing imagery and, increasingly, from non-image based sources such as Airborne Laser Scanner (ALS) point clouds. ALS data is represented in the form of a point cloud with recorded multiple returns and intensities. In our work, we combined OBIA with ALS point cloud data in order to identify and extract buildings as 2D polygons representing roof outlines in a top-down mapping approach. We performed rasterization of the ALS data into a height raster for the purpose of the generation of a Digital Surface Model (DSM) and a derived Digital Elevation Model (DEM). Further objects were generated in conjunction with point statistics from the linked point cloud. With the use of class modelling methods, we generated the final target class of objects representing buildings. The approach was developed for a test area in Biberach an der Riß (Germany). In order to point out the possibilities of adaptation-free transferability to another data set, the algorithm was applied 'as is' to the ISPRS Benchmarking data set of Toronto (Canada). The obtained results show high accuracies for the initial study area (thematic accuracies of around 98%, geometric accuracy of above 80%). The very high performance within the ISPRS Benchmark without any modification of the algorithm and without any adaptation of parameters is particularly noteworthy.
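As a sketch of the rasterization step, the following assumes Python/NumPy and keeps the highest return per grid cell to build the height raster (cell size and function name are illustrative; DEM derivation and class modelling are outside the sketch). A normalized DSM (DSM minus DEM) then isolates above-ground objects such as the building roofs targeted here.

```python
import numpy as np

def rasterize_dsm(points, cell=1.0):
    """Rasterize an ALS point cloud (n, 3) of x, y, z into a DSM raster by
    keeping the highest return per cell; empty cells stay NaN."""
    points = np.asarray(points, float)
    x, y, z = points.T
    col = ((x - x.min()) / cell).astype(int)
    row = ((y - y.min()) / cell).astype(int)
    dsm = np.full((row.max() + 1, col.max() + 1), np.nan)
    for r, c, h in zip(row, col, z):
        if np.isnan(dsm[r, c]) or h > dsm[r, c]:
            dsm[r, c] = h
    return dsm
```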
Analysis and design of wedge projection display system based on ray retracing method.
Lee, Chang-Kun; Lee, Taewon; Sung, Hyunsik; Min, Sung-Wook
2013-06-10
A design method for the wedge projection display system based on the ray retracing method is proposed. To analyze the principle of image formation on the inclined surface of the wedge-shaped waveguide, a bundle of rays is retraced from an imaging point on the inclined surface to the aperture of the waveguide. As a consequence of ray retracing, we obtain the incident conditions of the ray, such as the position and the angle at the aperture, which provide clues for image formation. To describe the image formation, the concept of the equivalent imaging point is proposed, which is the intersection where the incident rays are extended over the space regardless of the refraction and reflection in the waveguide. Since the initial value of the rays arriving at the equivalent imaging point corresponds to that of the rays converging into the imaging point on the inclined surface, the image formation can be visualized by calculating the equivalent imaging point over the entire inclined surface. Then, we can find image characteristics, such as size, position, and degree of blur, by analyzing the distribution of the equivalent imaging points, and design the optimized wedge projection system by attaching the prism structure at the aperture. The simulation results show the feasibility of the ray retracing analysis and characterize the numerical relation between the waveguide parameters and the aperture structure for the on-axis configuration. The experimental results verify the designed system based on the proposed method.
Enhancing the Impact of Quality Points in Interteaching
ERIC Educational Resources Information Center
Rosales, Rocío; Soldner, James L.; Crimando, William
2014-01-01
Interteaching is a classroom instruction approach based on behavioral principles that offers increased flexibility to instructors. There are several components of interteaching that may contribute to its demonstrated efficacy. In a prior analysis of one of these components, the quality points contingency, no significant difference was reported in…
Campbell, Jared M; Umapathysivam, Kandiah; Xue, Yifan; Lockwood, Craig
2015-12-01
Clinicians and other healthcare professionals need access to summaries of evidence-based information in order to provide effective care to their patients at the point-of-care. Evidence-based practice (EBP) point-of-care resources have been developed and are available online to meet this need. This study aimed to develop a comprehensive list of available EBP point-of-care resources and evaluate their processes and policies for the development of content, in order to provide a critical analysis based upon rigor, transparency and measures of editorial quality to inform healthcare providers and promote quality improvement amongst publishers of EBP resources. A comprehensive and systematic search (Pubmed, CINAHL, and Cochrane Central) was undertaken to identify available EBP point-of-care resources, defined as "web-based medical compendia specifically designed to deliver predigested, rapidly accessible, comprehensive, periodically updated, and evidence-based information (and possibly also guidance) to clinicians." A pair of investigators independently extracted information on general characteristics, content presentation, editorial quality, evidence-based methodology, and breadth and volume. Twenty-seven summary resources were identified, of which 22 met the predefined inclusion criteria for EBP point-of-care resources, and 20 could be accessed for description and assessment. Overall, the upper quartile of EBP point-of-care providers was assessed to be UpToDate, Nursing Reference Centre, Mosby's Nursing Consult, BMJ Best Practice, and JBI COnNECT+. The choice of which EBP point-of-care resources are suitable for an organization is a decision that depends heavily on the unique requirements of that organization and the resources it has available. However, the results presented in this study should enable healthcare providers to make that assessment in a clear, evidence-based manner, and provide a comprehensive list of the available options. © 2015 Sigma Theta Tau International.
Monitoring urban subsidence based on SAR interferometric point target analysis
Zhang, Y.; Zhang, Jiahua; Gong, W.; Lu, Z.
2009-01-01
Interferometric point target analysis (IPTA) is one of the latest developments in radar interferometric processing. It is achieved by analyzing the interferometric phases of individual point targets, which are discrete and present temporally stable backscattering characteristics, in long temporal series of interferometric SAR images. This paper analyzes the interferometric phase model of point targets and then addresses two key issues within the IPTA process. Firstly, a spatial searching method is proposed to unwrap the interferometric phase difference between two neighboring point targets. The height residual error and linear deformation rate of each point target can then be calculated when a global reference point with known height correction and deformation history is chosen. Secondly, a spatial-temporal filtering scheme is proposed to further separate the atmospheric phase and nonlinear deformation phase from the residual interferometric phase. Finally, an experiment with the developed IPTA methodology is conducted over the Suzhou urban area. In total, 38 ERS-1/2 SAR scenes are analyzed, and deformation information for 3,546 point targets over the time span 1992-2002 is generated. The IPTA-derived deformation shows very good agreement with the published result, which demonstrates that the IPTA technique can be developed into an operational tool to map ground subsidence over urban areas.
The value of spatial analysis for tracking supply for family planning: the case of Kinshasa, DRC.
Hernandez, Julie H; Akilimali, Pierre; Kayembe, Patrick; Dikamba, Nelly; Bertrand, Jane
2016-10-01
While geographic information systems (GIS) are frequently used to research accessibility issues for healthcare services around the world, sophisticated spatial analysis protocols and outputs often prove inappropriate and unsustainable for supporting evidence-based programme strategies in resource-constrained environments. This article examines how simple, open-source and interactive GIS tools have been used to locate family planning (FP) service delivery points in Kinshasa (Democratic Republic of Congo), to identify underserved areas and potential locations for new service points, and to support advocacy for FP programmes. Using smartphone-based data collection applications (OpenDataKit), we conducted two surveys of FP facilities supported by partner organizations in 2012 and 2013 and used the results to assess gaps in FP services coverage, using both the ratio of facilities per population and distance-based accessibility criteria. The cartographic outputs included both static analysis maps and interactive Google Earth displays, and sought to support advocacy and evidence-based planning for the placement of new service points. These maps, at the scale of Kinshasa or of each of the 35 health zones that cover the city, garnered wide interest, from the operational level of the health zones' Chief Medical Officers, who were consulted to contribute field knowledge on potential new service delivery points, to the FP programme officers at the Ministry of Health, who could use the maps to inform resource allocation decisions throughout the city. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
2.5D multi-view gait recognition based on point cloud registration.
Tang, Jin; Luo, Jian; Tjahjadi, Tardi; Gao, Yan
2014-03-28
This paper presents a method for modeling a 2.5-dimensional (2.5D) human body and extracting gait features for identifying the human subject. To achieve view-invariant gait recognition, a multi-view synthesizing method based on point cloud registration (MVSM) is proposed to generate multi-view training galleries. The concept of a density- and curvature-based Color Gait Curvature Image is introduced to map 2.5D data onto a 2D space to enable data dimension reduction by discrete cosine transform and 2D principal component analysis. Gait recognition is achieved via a 2.5D view-invariant gait recognition method based on point cloud registration. Experimental results on the in-house database captured by a Microsoft Kinect camera show a significant performance gain when using MVSM.
NASA Technical Reports Server (NTRS)
Lovelace, Jeffrey J.; Cios, Krzysztof J.; Roth, Don J.; Cao, Wei N.
2001-01-01
Post-Scan Interactive Data Display (PSIDD) III is a user-oriented Windows-based system that facilitates the display and comparison of ultrasonic contact measurement data obtained at NASA Glenn Research Center's Ultrasonic Nondestructive Evaluation measurement facility. The system is optimized to compare ultrasonic measurements made at different locations within a material or at different stages of material degradation. PSIDD III provides complete analysis of the primary waveforms in the time and frequency domains along with the calculation of several frequency-dependent properties, including phase velocity and attenuation coefficient, and several frequency-independent properties, like the cross-correlation velocity. The system allows image generation on all the frequency-dependent properties at any available frequency (limited by the bandwidth used in the scans) and on any of the frequency-independent properties. From ultrasonic contact scans, areas of interest on an image can be studied with regard to underlying raw waveforms and derived ultrasonic properties by simply selecting the point on the image. The system offers various modes of in-depth comparison between scan points. Up to five scan points can be selected for comparative analysis at once. The system was developed with Borland Delphi software (Visual Pascal) and is based on an SQL database. It is ideal for the classification of material properties or the location of microstructure variations in materials. Along with the ultrasonic contact measurement software that it is partnered with, this system is technology-ready and can be transferred to users worldwide.
Wang, Gang; Wang, Yalin
2017-02-15
In this paper, we propose a heat kernel based regional shape descriptor that may be capable of better exploiting volumetric morphological information than other available methods, thereby improving statistical power on brain magnetic resonance imaging (MRI) analysis. The mechanism of our analysis is driven by the graph spectrum and the heat kernel theory, to capture the volumetric geometry information in the constructed tetrahedral meshes. In order to capture profound brain grey matter shape changes, we first use the volumetric Laplace-Beltrami operator to determine the point pair correspondence between white-grey matter and CSF-grey matter boundary surfaces by computing the streamlines in a tetrahedral mesh. Secondly, we propose multi-scale grey matter morphology signatures to describe the transition probability by random walk between the point pairs, which reflects the inherent geometric characteristics. Thirdly, a point distribution model is applied to reduce the dimensionality of the grey matter morphology signatures and generate the internal structure features. With the sparse linear discriminant analysis, we select a concise morphology feature set with improved classification accuracies. In our experiments, the proposed work outperformed the cortical thickness features computed by FreeSurfer software in the classification of Alzheimer's disease and its prodromal stage, i.e., mild cognitive impairment, on publicly available data from the Alzheimer's Disease Neuroimaging Initiative. The multi-scale and physics based volumetric structure feature may bring stronger statistical power than some traditional methods for MRI-based grey matter morphology analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
Manufacturing Analysis | Energy Analysis | NREL
Analyses examine manufacturing at the national, state, and community levels. Solar photovoltaic manufacturing cost analysis examines the regional competitiveness of solar photovoltaic manufacturing, pointing to access to capital as a critical component, and how the scale of rare-material-based photovoltaic (PV) technology deployment may influence the United States.
NASA Technical Reports Server (NTRS)
1972-01-01
The tug design and performance data base for the economic analysis of space tug operation are presented. A compendium of the detailed design and performance information from the data base is developed. The design data are parametric across a range of reusable space tug sizes. The performance curves are generated for selected point designs of expendable orbit injection stages and reusable tugs. Data are presented in the form of graphs for various modes of operation.
Sahu, P P
2008-02-10
A thermally tunable erbium-doped fiber amplifier (EDFA) gain-equalizer filter based on a compact point-symmetric cascaded Mach-Zehnder (CMZ) coupler is presented together with its mathematical model; the analysis shows the device to be polarization dependent due to the stress anisotropy caused by local heating during the thermo-optic phase change. A thermo-optic delay line structure with a stress-releasing groove is proposed and designed to reduce the polarization-dependent characteristics of the high-index-contrast point-symmetric delay line structure of the device. Thermal analysis using an implicit finite difference method shows that the temperature gradients of the proposed structure, which mainly cause the release of stress anisotropy, are approximately nine times larger than those of the conventional structure. It is also seen that the EDFA gain-equalized spectrum obtained using the point-symmetric CMZ device based on the proposed structure is almost polarization independent.
NASA Astrophysics Data System (ADS)
Liu, Youshan; Teng, Jiwen; Xu, Tao; Badal, José
2017-05-01
The mass-lumped method avoids the cost of inverting the mass matrix and simultaneously maintains spatial accuracy by adopting additional interior integration points, known as cubature points. To date, such points are only known analytically in tensor domains, such as quadrilateral or hexahedral elements. Thus, the diagonal-mass-matrix spectral element method (SEM) in non-tensor domains always relies on numerically computed interpolation points or quadrature points. However, only the cubature points for degrees 1 to 6 are known, which is the reason that we have developed a p-norm-based optimization algorithm to obtain higher-order cubature points. In this way, we obtain and tabulate new cubature points with all positive integration weights for degrees 7 to 9. The dispersion analysis illustrates that the dispersion relation determined from the new optimized cubature points is comparable to that of the mass and stiffness matrices obtained by exact integration. Simultaneously, the Lebesgue constant for the new optimized cubature points indicates its surprisingly good interpolation properties. As a result, such points provide both good interpolation properties and integration accuracy. The Courant-Friedrichs-Lewy (CFL) numbers are tabulated for the conventional Fekete-based triangular spectral element (TSEM), the TSEM with exact integration, and the optimized cubature-based TSEM (OTSEM). A complementary study demonstrates the spectral convergence of the OTSEM. A numerical example conducted on a half-space model demonstrates that the OTSEM improves the accuracy by approximately one order of magnitude compared to the conventional Fekete-based TSEM. In particular, the accuracy of the 7th-order OTSEM is even higher than that of the 14th-order Fekete-based TSEM. Furthermore, the OTSEM produces a result that can compete in accuracy with the quadrilateral SEM (QSEM). The high accuracy of the OTSEM is also tested with a non-flat topography model. In terms of computational efficiency, the OTSEM is more efficient than the Fekete-based TSEM, although it is slightly costlier than the QSEM when a comparable numerical accuracy is required.
Point Cloud Based Change Detection - an Automated Approach for Cloud-based Services
NASA Astrophysics Data System (ADS)
Collins, Patrick; Bahr, Thomas
2016-04-01
The fusion of stereo photogrammetric point clouds with LiDAR data or terrain information derived from SAR interferometry has a significant potential for 3D topographic change detection. In the present case study latest point cloud generation and analysis capabilities are used to examine a landslide that occurred in the village of Malin in Maharashtra, India, on 30 July 2014, and affected an area of ca. 44,000 m2. It focuses on Pléiades high resolution satellite imagery and the Airbus DS WorldDEM™ as a product of the TanDEM-X mission. This case study was performed using the COTS software package ENVI 5.3. Integration of custom processes and automation is supported by IDL (Interactive Data Language). Thus, ENVI analytics runs via the object-oriented and IDL-based ENVITask API. The pre-event topography is represented by the WorldDEM™ product, delivered with a raster of 12 m x 12 m and based on the EGM2008 geoid (called pre-DEM). For the post-event situation a Pléiades 1B stereo image pair of the affected AOI was obtained. The ENVITask "GeneratePointCloudsByDenseImageMatching" was implemented to extract passive point clouds in LAS format from the panchromatic stereo datasets: • A dense image-matching algorithm is used to identify corresponding points in the two images. • A block adjustment is applied to refine the 3D coordinates that describe the scene geometry. • Additionally, the WorldDEM™ was input to constrain the range of heights in the matching area, and subsequently the length of the epipolar line. The "PointCloudFeatureExtraction" task was executed to generate the post-event digital surface model from the photogrammetric point clouds (called post-DEM). Post-processing consisted of the following steps: • Adding the geoid component (EGM 2008) to the post-DEM. • Pre-DEM reprojection to the UTM Zone 43N (WGS-84) coordinate system and resizing. • Subtraction of the pre-DEM from the post-DEM. • Filtering and threshold-based classification of the DEM difference to analyze the surface changes in 3D. The automated point cloud generation and analysis introduced here can be embedded in virtually any existing geospatial workflow for operational applications. Three integration options were implemented in this case study: • Integration within any ArcGIS environment whether deployed on the desktop, in the cloud, or online. Execution uses a customized ArcGIS script tool. A Python script file retrieves the parameters from the user interface and runs the precompiled IDL code. That IDL code is used to interface between the Python script and the relevant ENVITasks. • Publishing the point cloud processing tasks as services via the ENVI Services Engine (ESE). ESE is a cloud-based image analysis solution to publish and deploy advanced ENVI image and data analytics to existing enterprise infrastructures. For this purpose the entire IDL code can be encapsulated in a single ENVITask. • Integration in an existing geospatial workflow using the Python-to-IDL Bridge. This mechanism allows calling IDL code within Python on a user-defined platform. The results of this case study allow a 3D estimation of the topographic changes within the tectonically active and anthropogenically invaded Malin area after the landslide event. Accordingly, the point cloud analysis was correlated successfully with modelled displacement contours of the slope.
Based on optical satellite imagery, such point clouds of high precision and density distribution can be obtained in a few minutes to support the operational monitoring of landslide processes.
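The differencing and classification steps reduce to a few lines; the sketch below assumes Python/NumPy, co-registered grids in a common vertical datum, and an illustrative 2 m change threshold (not a value from the study).

```python
import numpy as np

def classify_dem_change(pre_dem, post_dem, min_change=2.0):
    """Threshold-based classification of a DEM difference:
    +1 = accumulation, -1 = loss, 0 = no significant change."""
    diff = post_dem - pre_dem
    classes = np.zeros(diff.shape, dtype=int)
    classes[diff > min_change] = 1        # deposited material
    classes[diff < -min_change] = -1      # removed material (e.g. scarp)
    return classes
```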
Funding Ohio Community Colleges: An Analysis of the Performance Funding Model
ERIC Educational Resources Information Center
Krueger, Cynthia A.
2013-01-01
This study examined Ohio's community college performance funding model that is based on seven student success metrics. A percentage of the regular state subsidy is withheld from institutions; funding is earned back based on the three-year average of success points achieved in comparison to other community colleges in the state. Analysis of…
ERIC Educational Resources Information Center
Park, Sanghoon
2017-01-01
This paper reports the findings of a comparative analysis of online learner behavioral interactions, time-on-task, attendance, and performance at different points throughout a semester (beginning, during, and end) based on two online courses: one course offering authentic discussion-based learning activities and the other course offering authentic…
Williams, Abimbola Onigbanjo; Makinde, Olusesan Ayodeji; Ojo, Mojisola
2016-01-01
Multidrug-resistant tuberculosis (MDR-TB) and extensively drug-resistant tuberculosis (XDR-TB) have emerged as significant public health threats worldwide. This systematic review and meta-analysis aimed to compare the effects of community-based treatment with traditional hospitalization in improving treatment success rates among MDR-TB and XDR-TB patients in the 27 MDR-TB high-burden countries (HBC). We searched PubMed, Cochrane, Lancet, Web of Science, the International Journal of Tuberculosis and Lung Disease, and the Centre for Reviews and Dissemination (CRD) for studies on community-based treatment, traditional hospitalization, and MDR-TB and XDR-TB from the 27 MDR-TB HBC. Data on treatment success and failure rates were extracted from retrospective and prospective cohort studies and a case-control study. Sensitivity analysis, subgroup analyses, and meta-regression analysis were used to explore bias and potential sources of heterogeneity. The final sample included 16 studies involving 3,344 patients from nine countries: Bangladesh, China, Ethiopia, Kenya, India, South Africa, the Philippines, Russia, and Uzbekistan. Based on a random-effects model, we observed a higher treatment success rate with community-based treatment (point estimate = 0.68, 95% CI: 0.59 to 0.76, p < 0.01) than with traditional hospitalization (point estimate = 0.57, 95% CI: 0.44 to 0.69, p < 0.01). A lower treatment failure rate of 7% was observed with community-based treatment (point estimate = 0.07, 95% CI: 0.03 to 0.10; p < 0.01) compared to traditional hospitalization (point estimate = 0.188, 95% CI: 0.10 to 0.28; p < 0.01). In the subgroup analysis, studies without HIV co-infected patients, with directly observed therapy short course-plus (DOTS-Plus) implemented throughout therapy, with treatment duration > 18 months, and with regimens of more than five drugs reported higher treatment success rates. In the meta-regression model, age of patients, adverse events, treatment duration, and loss to follow-up explain some of the heterogeneity of treatment effects between studies. Community-based management improved treatment outcomes. A mix of interventions with DOTS-Plus throughout therapy and treatment duration > 18 months, as well as strategies in place for loss to follow-up and adverse events, should be considered in MDR-TB and XDR-TB interventions, as they positively influenced treatment success.
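For readers unfamiliar with how such pooled point estimates arise, a didactic sketch of DerSimonian-Laird random-effects pooling of proportions follows (Python/NumPy assumed; this is the generic estimator, not the authors' exact software or data).

```python
import numpy as np

def pooled_proportion_dl(p, n):
    """DerSimonian-Laird random-effects pooled proportion with a 95% CI.
    p: per-study success proportions (strictly between 0 and 1);
    n: per-study sample sizes."""
    p, n = np.asarray(p, float), np.asarray(n, float)
    v = p * (1 - p) / n                        # within-study variances
    w = 1 / v                                  # fixed-effect weights
    p_fixed = np.sum(w * p) / np.sum(w)
    q = np.sum(w * (p - p_fixed) ** 2)         # Cochran's Q heterogeneity
    k = len(p)
    tau2 = max(0.0, (q - (k - 1)) /
               (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_re = 1 / (v + tau2)                      # random-effects weights
    p_re = np.sum(w_re * p) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    return p_re, (p_re - 1.96 * se, p_re + 1.96 * se)
```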
Thermal analysis and microstructural characterization of Mg-Al-Zn system alloys
NASA Astrophysics Data System (ADS)
Król, M.; Tański, T.; Sitek, W.
2015-11-01
The influence of Zn amount and solidification rate on the characteristic temperatures of the evolution of magnesium dendrites during solidification at different cooling rates (0.6-2.5 °C/s) was examined by thermal derivative analysis (TDA). The dendrite coherency point (DCP) is determined with a novel approach based on the second derivative of the cooling curve. Solidification behavior was examined via the one-thermocouple thermal analysis method. Microstructures were characterized by optical light microscopy, scanning electron microscopy and energy-dispersive X-ray spectroscopy. These studies showed that use of the d2T/dt2 versus time curve provides for analysis of the dendrite coherency point.
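A toy reading of that determination in Python/NumPy (the solidification window and the extremum criterion are assumptions for illustration; TDA practice involves additional smoothing and baseline treatment):

```python
import numpy as np

def dendrite_coherency_point(time, temp, t_start, t_end):
    """Locate a candidate DCP as the extremum of d2T/dt2 within a
    user-supplied solidification window [t_start, t_end]."""
    time = np.asarray(time, float)
    temp = np.asarray(temp, float)
    dT = np.gradient(temp, time)       # first derivative (TDA curve)
    d2T = np.gradient(dT, time)        # second derivative
    idx = np.where((time >= t_start) & (time <= t_end))[0]
    i = idx[np.argmax(np.abs(d2T[idx]))]
    return time[i], temp[i]
```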
Wu, Dan; Faria, Andreia V; Younes, Laurent; Mori, Susumu; Brown, Timothy; Johnson, Hans; Paulsen, Jane S; Ross, Christopher A; Miller, Michael I
2017-10-01
Huntington's disease (HD) is an autosomal dominant neurodegenerative disorder that progressively affects motor, cognitive, and emotional functions. Structural MRI studies have demonstrated brain atrophy beginning many years prior to clinical onset (the "premanifest" period), but the order and pattern of brain structural changes have not been fully characterized. In this study, we investigated brain regional volumes and diffusion tensor imaging (DTI) measurements in premanifest HD, aiming to determine (1) the extent of MRI changes in a large number of structures across the brain by atlas-based analysis, and (2) the initiation points of structural MRI changes in these brain regions. We adopted a novel multivariate linear regression model to detect the inflection points at which the MRI changes begin (namely, "change-points"), with respect to the CAG-age product (CAP, an indicator of extent of exposure to the effects of CAG repeat expansion). We used approximately 300 T1-weighted and DTI datasets from premanifest HD and control subjects in the PREDICT-HD study, with atlas-based whole brain segmentation and change-point analysis. The results indicated a distinct topology of structural MRI changes: the change-points of the volumetric measurements suggested a central-to-peripheral pattern of atrophy from the striatum to the deep white matter, and the change-points of DTI measurements indicated the earliest changes in mean diffusivity in the deep white matter and posterior white matter. While interpretation needs to be cautious given the cross-sectional nature of the data, these findings suggest a spatial and temporal pattern of spread of structural changes within the HD brain. Hum Brain Mapp 38:5035-5050, 2017. © 2017 Wiley Periodicals, Inc.
A method for automatic feature points extraction of human vertebrae three-dimensional model
NASA Astrophysics Data System (ADS)
Wu, Zhen; Wu, Junsheng
2017-05-01
A method for automatic extraction of the feature points of the human vertebrae three-dimensional model is presented. Firstly, the statistical model of vertebrae feature points is established based on the results of manual vertebrae feature points extraction. Then anatomical axial analysis of the vertebrae model is performed according to the physiological and morphological characteristics of the vertebrae. Using the axial information obtained from the analysis, a projection relationship between the statistical model and the vertebrae model to be extracted is established. According to the projection relationship, the statistical model is matched with the vertebrae model to get the estimated position of the feature point. Finally, by analyzing the curvature in the spherical neighborhood with the estimated position of feature points, the final position of the feature points is obtained. According to the benchmark result on multiple test models, the mean relative errors of feature point positions are less than 5.98%. At more than half of the positions, the error rate is less than 3% and the minimum mean relative error is 0.19%, which verifies the effectiveness of the method.
Topological photonic crystal with equifrequency Weyl points
NASA Astrophysics Data System (ADS)
Wang, Luyang; Jian, Shao-Kai; Yao, Hong
2016-06-01
Weyl points in three-dimensional photonic crystals behave as monopoles of Berry flux in momentum space. Here, based on general symmetry analysis, we show that a minimal number of four symmetry-related (consequently equifrequency) Weyl points can be realized in time-reversal invariant photonic crystals. We further propose an experimentally feasible way to modify double-gyroid photonic crystals to realize four equifrequency Weyl points, which is explicitly confirmed by our first-principles photonic band-structure calculations. Remarkably, photonic crystals with equifrequency Weyl points are qualitatively advantageous in applications including angular selectivity, frequency selectivity, invisibility cloaking, and three-dimensional imaging.
Fulzele, Punit; Baliga, Sudhindra; Thosar, Nilima; Pradhan, Debaprya
2011-01-01
Aims: Evaluation of calcium ion and hydroxyl ion release and pH levels in various calcium hydroxide based intracanal medicaments. Objective: The purpose of this study was to evaluate calcium and hydroxyl ion release and pH levels of calcium hydroxide based products, namely, RC Cal, Metapex, calcium hydroxide with distilled water, along with the new gutta-percha points with calcium hydroxide. Materials and Methods: The materials were inserted in polyethylene tubes and immersed in deionized water. The pH variation and Ca++ and OH- release were monitored periodically for 1 week. Statistical Analysis Used: Statistical analysis was carried out using one-way analysis of variance and Tukey's post hoc tests with PASW Statistics version 18 software to compare statistical differences. Results: After 1 week, calcium hydroxide with distilled water and RC Cal raised the pH to 12.7 and 11.8, respectively, while a small change was observed for Metapex and calcium hydroxide gutta-percha points. The calcium released after 1 week was 15.36 mg/dL from RC Cal, followed by 13.04, 1.296, and 3.064 mg/dL from calcium hydroxide with sterile water, Metapex and calcium hydroxide gutta-percha points, respectively. Conclusions: Calcium hydroxide with sterile water and RC Cal pastes liberate significantly more calcium and hydroxyl ions and raise the pH higher than Metapex and calcium hydroxide gutta-percha points. PMID:22346155
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdolmaleki, Amir, E-mail: abdolmaleki@cc.iut.ac.ir; Nanotechnology and Advanced Materials Institute, Isfahan University of Technology, Isfahan 84156-83111, Islamic Republic of Iran; Mallakpour, Shadpour, E-mail: mallak@cc.iut.ac.ir
Highlights: • A novel biodegradable and nanostructured PAEI based on two amino acids was synthesized. • ZnO nanoparticles were modified via two different silane coupling agents. • PAEI/modified ZnO BNCs were synthesized through ultrasound irradiation. • ZnO particles were dispersed homogeneously in the PAEI matrix on the nanoscale. • The effect of ZnO nanoparticles on the properties of the synthesized polymer was examined. -- Abstract: A novel biodegradable and nanostructured poly(amide-ester-imide) (PAEI) based on two different amino acids was synthesized via direct polycondensation of biodegradable N,N′-bis[2-(methyl-3-(4-hydroxyphenyl)propanoate)]isophthaldiamide and N,N′-(pyromellitoyl)-bis-L-phenylalanine diacid. The resulting polymer was characterized by FT-IR, ¹H NMR, specific rotation, elemental analysis, thermogravimetric analysis (TGA), differential scanning calorimetry (DSC), X-ray diffraction (XRD) and field emission scanning electron microscopy (FE-SEM) analysis. The synthesized polymer showed good thermal stability with a nanoscale spherical structure. Then PAEI/ZnO bionanocomposites (BNCs) were fabricated via interaction of pure PAEI and ZnO nanoparticles. The surface of ZnO was modified with two different silane coupling agents. The PAEI/ZnO BNCs were studied and characterized by FT-IR, XRD, UV/vis, FE-SEM and TEM. The TEM and FE-SEM results indicated that the nanoparticles were dispersed homogeneously in the PAEI matrix on the nanoscale. Furthermore, the effect of ZnO nanoparticles on the thermal stability of the polymer was investigated with TGA and DSC techniques.
Plaza-Leiva, Victoria; Gomez-Ruiz, Jose Antonio; Mandow, Anthony; García-Cerezo, Alfonso
2017-03-15
Improving the effectiveness of spatial shape feature classification from 3D lidar data is highly relevant because such classification is widely used as a fundamental step towards higher-level scene understanding challenges of autonomous vehicles and terrestrial robots. In this sense, computing neighborhoods for points in dense scans becomes a costly process for both training and classification. This paper proposes a new general framework for implementing and comparing different supervised learning classifiers with a simple voxel-based neighborhood computation, where points in each non-overlapping voxel in a regular grid are assigned to the same class by considering features within a support region defined by the voxel itself. The contribution provides offline training and online classification procedures as well as five alternative feature vector definitions based on principal component analysis for scatter, tubular and planar shapes. Moreover, the feasibility of this approach is evaluated by implementing a neural network (NN) method previously proposed by the authors as well as three other supervised learning classifiers found in scene processing methods: support vector machines (SVM), Gaussian processes (GP), and Gaussian mixture models (GMM). A comparative performance analysis is presented using real point clouds from both natural and urban environments and two different 3D rangefinders (a tilting Hokuyo UTM-30LX and a Riegl). Classification performance metrics and processing time measurements confirm the benefits of the NN classifier and the feasibility of voxel-based neighborhood computation.
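The PCA-based shape descriptors can be sketched compactly (Python/NumPy assumed; these three classic eigenvalue ratios stand in for, and do not reproduce, the paper's five feature-vector definitions):

```python
import numpy as np

def voxel_shape_features(points):
    """Eigenvalue-based descriptors of the points supporting one voxel.
    points: (n, 3) array with n >= 3."""
    points = np.asarray(points, float)
    centered = points - points.mean(axis=0)
    # covariance eigenvalues sorted l1 >= l2 >= l3, normalized to sum to 1
    lam = np.sort(np.linalg.eigvalsh(np.cov(centered.T)))[::-1]
    lam = lam / lam.sum()
    return {
        "linearity":  (lam[0] - lam[1]) / lam[0],  # tubular shapes
        "planarity":  (lam[1] - lam[2]) / lam[0],  # planar shapes
        "sphericity": lam[2] / lam[0],             # scatter
    }
```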
From Invention to Innovation: Risk Analysis to Integrate One Health Technology in the Dairy Farm.
Lombardo, Andrea; Boselli, Carlo; Amatiste, Simonetta; Ninci, Simone; Frazzoli, Chiara; Dragone, Roberto; De Rossi, Alberto; Grasso, Gerardo; Mantovani, Alberto; Brajon, Giovanni
2017-01-01
Current Hazard Analysis and Critical Control Points (HACCP) approaches mainly fit the food industry, while their application in primary food production is still rudimentary. The European food safety framework calls for science-based support of the primary producers' mandate for legal, scientific, and ethical responsibility in the food supply. The multidisciplinary and interdisciplinary project ALERT pivots on the development of a technological invention (the BEST platform) and the application of its measurable (bio)markers, as well as scientific advances in risk analysis, at strategic points of the milk chain for time- and cost-effective early identification of unwanted and/or unexpected events of both microbiological and toxicological nature. Health-oriented innovation is complex and subject to multiple variables. Through field activities in a dairy farm in central Italy, we explored individual components of the dairy farm system to overcome concrete challenges in the application of translational science in real life and (veterinary) public health. Based on an HACCP-like approach in animal production, the farm characterization focused on points of particular attention (POPAs) and critical control points to draw a farm management decision tree under the One Health view (environment, animal health, food safety). The analysis was based on the integrated use of checklists (environment; agricultural and zootechnical practices; animal health and welfare) and laboratory analyses of well water, feed and silage, individual fecal samples, and bulk milk. The understanding of complex systems is a condition for accomplishing true innovation through new technologies. BEST is a detection and monitoring system in support of production security, quality and safety: a grid of its (bio)markers can find direct application at critical points for early identification of potential hazards or anomalies. HACCP-like self-monitoring in primary production is feasible, as is the biomonitoring of live food-producing animals as a sentinel population for One Health.
Bingemann, Dieter; Allen, Rachel M.
2012-01-01
We describe a model-free statistical method to analyze dual-channel photon arrival trajectories from single-molecule spectroscopy and identify break points in the intensity ratio. Photons are binned with a short bin size to calculate the logarithm of the intensity ratio for each bin. Stochastic photon counting noise leads to a near-normal distribution of this logarithm, and the standard Student's t-test is used to find statistically significant changes in this quantity. In stochastic simulations we determine the significance threshold for the t-test's p-value at a given level of confidence. We test the method's sensitivity and accuracy, finding that the analysis reliably locates break points with significant changes in the intensity ratio with little or no error in realistic trajectories with large numbers of small change points, while still identifying a large fraction of the frequent break points with small intensity changes. Based on these results we present an approach to estimate confidence intervals for the identified break point locations and recommend a bin size to choose for the analysis. The method proves powerful and reliable in the analysis of simulated and actual data of single-molecule reorientation in a glassy matrix. PMID:22837704
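A minimal sketch of the core test (Python with SciPy assumed): binned counts from the two channels give a log intensity ratio, and candidate split points are screened with a two-sample t-test. The fixed p-value threshold and the exhaustive scan are illustrative simplifications of the simulation-calibrated threshold and recursive segmentation one would use in practice.

```python
import numpy as np
from scipy import stats

def ratio_break_points(counts_a, counts_b, p_threshold=1e-4):
    """Screen every candidate split of a binned dual-channel trajectory
    for a significant change in log(intensity ratio)."""
    counts_a = np.asarray(counts_a, float)
    counts_b = np.asarray(counts_b, float)
    # 0.5 offset regularizes empty bins before taking the logarithm
    log_ratio = np.log((counts_a + 0.5) / (counts_b + 0.5))
    hits = []
    for i in range(2, len(log_ratio) - 2):
        t_stat, p = stats.ttest_ind(log_ratio[:i], log_ratio[i:])
        if p < p_threshold:
            hits.append((i, p))
    return hits
```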
NASA Technical Reports Server (NTRS)
Wang, Yeou-Fang; Schrock, Mitchell; Baldwin, John R.; Borden, Charles S.
2010-01-01
The Ground Resource Allocation and Planning Environment (GRAPE 1.0) is a Web-based, collaborative team environment based on the Microsoft SharePoint platform, which provides Deep Space Network (DSN) resource planners tools and services for sharing information and performing analysis.
Sugiyama, Takehiro; Steers, William Neil; Wenger, Neil S; Duru, Obidiugwu Kenrik; Mangione, Carol M
2015-03-22
There is a paucity of evidence supporting the effectiveness of diabetes self-management education (DSME) in improving mental health-related quality of life (HRQoL) for African Americans and Latinos. Also, among studies supporting the favorable effects of DSME on mental HRQoL, the direct effect of DSME that is independent of improved glycemic control has never been investigated. The objectives of this study were to investigate the effect of a community-based DSME intervention targeting empowerment on mental HRQoL and to determine whether the effect is direct or mediated by glycemic control. We conducted secondary analyses of data from the Diabetes Self-Care Study, a randomized controlled trial of a community-based DSME intervention. Study participants (n = 516) were African Americans and Latinos 55 years or older with poorly controlled diabetes (HbA1c ≥ 8.0%) recruited from senior centers and churches in Los Angeles. The intervention group received six weekly small-group self-care sessions based on the empowerment model. The control group received six lectures on unrelated geriatrics topics. The primary outcome variable in this secondary analysis was the change in Mental Component Summary score (MCS-12) from the SF-12 Health Survey between baseline and six-month follow-up. We used the change in HbA1c during the study period as the main mediator of interest in our causal mediation analysis. Additionally, possible mediation via social support and perceived empowerment attributable to the program was examined. MCS-12 increased by 1.4 points on average in the intervention group and decreased by 0.2 points in the control group (difference-in-change: 1.6 points, 95% CI: 0.1 to 3.2). In the causal mediation analysis, the intervention had a direct effect on MCS-12 improvement (1.7 points, 95% CI: 0.2 to 3.2) with no indirect effects mediated via HbA1c change (-0.1 points, 95% CI: -0.4 to 0.1), social support (0.1 points), or perception of empowerment (0.1 points). This Diabetes Self-Care Study empowerment intervention had a modest positive impact on mental HRQoL that was not mediated by improvement in glycemic control, social support, or perception of empowerment. This favorable effect on mental HRQoL may be a separate clinical advantage of this DSME intervention. ClinicalTrials.gov NCT00263835.
Structural Analysis of Single-Point Mutations Given an RNA Sequence: A Case Study with RNAMute
NASA Astrophysics Data System (ADS)
Churkin, Alexander; Barash, Danny
2006-12-01
We introduce here for the first time the RNAMute package, a pattern-recognition-based utility to perform mutational analysis and detect vulnerable spots within an RNA sequence that affect structure. Mutations in these spots may lead to a structural change that directly relates to a change in functionality. Previously, the concept was tried on RNA genetic control elements called "riboswitches" and other known RNA switches, without an organized utility that analyzes all single-point mutations and can be further expanded. The RNAMute package allows a comprehensive categorization, given an RNA sequence that has functional relevance, by exploring the patterns of all single-point mutants. For illustration, we apply the RNAMute package on an RNA transcript for which individual point mutations were shown experimentally to inactivate spectinomycin resistance in Escherichia coli. Functional analysis of mutations on this case study was performed experimentally by creating a library of point mutations using PCR and screening to locate those mutations. With the availability of RNAMute, preanalysis can be performed computationally before conducting an experiment.
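The core loop of such an analysis can be approximated in a few lines (a sketch assuming the ViennaRNA Python bindings are available; RNAMute itself additionally categorizes mutants by coarse-grained representations of the predicted structures, which is not reproduced here):

    import RNA  # ViennaRNA Python bindings

    def single_point_mutants(seq):
        # Yield every single-point mutant of an RNA sequence.
        for i, base in enumerate(seq):
            for alt in "ACGU":
                if alt != base:
                    yield f"{base}{i + 1}{alt}", seq[:i] + alt + seq[i + 1:]

    def vulnerable_spots(seq, min_distance=10):
        # Flag mutations whose predicted secondary structure differs strongly
        # from the wild type (the distance threshold is an assumption).
        wt_struct, _ = RNA.fold(seq)
        hits = []
        for name, mutant in single_point_mutants(seq):
            mut_struct, _ = RNA.fold(mutant)
            if RNA.bp_distance(wt_struct, mut_struct) >= min_distance:
                hits.append(name)
        return hits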
Brasil, Albert Vincent Berthier; Teles, Alisson R; Roxo, Marcelo Ricardo; Schuster, Marcelo Neutzling; Zauk, Eduardo Ballverdu; Barcellos, Gabriel da Costa; Costa, Pablo Ramon Fruett da; Ferreira, Nelson Pires; Kraemer, Jorge Luiz; Ferreira, Marcelo Paglioli; Gobbato, Pedro Luis; Worm, Paulo Valdeci
2016-10-01
To analyze the cumulative effect of risk factors associated with early major complications in postoperative spine surgery. Retrospective analysis of 583 surgically-treated patients. Early "major" complications were defined as those that may lead to permanent detrimental effects or require further significant intervention. A balanced risk score was built using multiple logistic regression. Ninety-two early major complications occurred in 76 patients (13%). Age > 60 years and surgery of three or more levels proved to be significant independent risk factors in the multivariate analysis. The balanced scoring system was defined as: 0 points (no risk factor), 2 points (1 factor) or 4 points (2 factors). The incidence of early major complications in each category was 7% (0 points), 15% (2 points) and 29% (4 points) respectively. This balanced scoring system, based on two risk factors, represents an important tool for both surgical indication and for patient counseling before surgery.
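The resulting scoring rule is simple enough to state in code (a direct transcription of the numbers above, for illustration only; it is not a validated clinical tool):

    def early_complication_risk(age_years, n_levels):
        # Two independent risk factors: age > 60 and surgery of >= 3 levels.
        score = (2 if age_years > 60 else 0) + (2 if n_levels >= 3 else 0)
        observed_rate = {0: 0.07, 2: 0.15, 4: 0.29}[score]
        return score, observed_rate

    print(early_complication_risk(67, 4))  # -> (4, 0.29)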
Tabb, Keri L.; Hellwege, Jacklyn N.; Palmer, Nicholette D.; Dimitrov, Latchezar; Sajuthi, Satria; Taylor, Kent D.; NG, Maggie C.Y.; Hawkins, Gregory A.; Chen, Yii-Der Ida; Brown, W. Mark; McWilliams, David; Williams, Adrienne; Lorenzo, Carlos; Norris, Jill M.; Long, Jirong; Rotter, Jerome I.; Curran, Joanne E.; Blangero, John; Wagenknecht, Lynne E.; Langefeld, Carl D.; Bowden, Donald W.
2017-01-01
Family-based methods are a potentially powerful tool to identify trait-defining genetic variants in extended families, particularly when used to complement conventional association analysis. We utilized two-point linkage analysis and single variant association analysis to evaluate whole exome sequencing (WES) data from 1,205 Hispanic Americans (78 families) from the Insulin Resistance Atherosclerosis Family Study. WES identified 211,612 variants above the minor allele frequency threshold of ≥0.005. These variants were tested for linkage and/or association with 50 cardiometabolic traits after quality control checks. Two-point linkage analysis yielded 10,580,600 LOD scores with 1,148 LOD scores ≥3, 183 LOD scores ≥4, and 29 LOD scores ≥5. The maximal novel LOD score was 5.50 for rs2289043:T>C, in UNC5C, with subcutaneous adipose tissue volume. Association analysis identified 13 variants attaining genome-wide significance (p < 5×10^-8), with the strongest association between rs651821:C>T in APOA5 and triglyceride levels (p = 3.67×10^-10). Overall, there was a 5.2-fold increase in the number of informative variants detected by WES compared to exome chip analysis in this population, nearly 30% of which were novel variants relative to dbSNP build 138. Thus, integration of results from two-point linkage and single-variant association analysis from WES data enabled identification of novel signals potentially contributing to cardiometabolic traits. PMID:28067407
[Analysis of ancient literature on baliao points for pelvic floor diseases].
Liu, Hairong; Zhang, Jianbin
2016-12-12
The relationship between baliao points and pelvic floor diseases was explored based on a review of the ancient literature on these acupoints' targeted diseases. It is considered that baliao points are applied to treat various pelvic floor diseases and symptoms of different systems. Each point has a similar function but with unique features. Shangliao (BL 31) is mainly used to treat gynecologic diseases; Ciliao (BL 32) and Zhongliao (BL 33), urologic and reproductive system diseases; Zhongliao (BL 33) and Xialiao (BL 34), reproductive system and anorectal diseases.
Determining the Number of Clusters in a Data Set Without Graphical Interpretation
NASA Technical Reports Server (NTRS)
Aguirre, Nathan S.; Davies, Misty D.
2011-01-01
Cluster analysis is a data mining technique that is meant to simplify the process of classifying data points. The basic clustering process requires an input of data points and the number of clusters wanted. The clustering algorithm will then pick C starting points for the clusters, which can be either random spatial points or random data points. It then assigns each data point to the nearest C point, where "nearest" usually means Euclidean distance, but some algorithms use another criterion. The next step is determining whether the clustering arrangement thus found is within a certain tolerance. If it falls within this tolerance, the process ends. Otherwise the C points are adjusted based on how many data points are in each cluster, and the steps repeat until the algorithm converges.
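The loop described above is the classic k-means iteration; a minimal sketch (assumed details: random data points as initial centers, convergence measured by center movement):

    import numpy as np

    def kmeans(points, c, tol=1e-6, seed=0):
        rng = np.random.default_rng(seed)
        centers = points[rng.choice(len(points), size=c, replace=False)]
        while True:
            # Assign every point to its nearest center (Euclidean distance).
            d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            # Re-center each cluster on the mean of its members.
            new_centers = centers.copy()
            for k in range(c):
                members = points[labels == k]
                if len(members):
                    new_centers[k] = members.mean(axis=0)
            if np.linalg.norm(new_centers - centers) < tol:  # converged
                return labels, new_centers
            centers = new_centers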
2.5D Multi-View Gait Recognition Based on Point Cloud Registration
Tang, Jin; Luo, Jian; Tjahjadi, Tardi; Gao, Yan
2014-01-01
This paper presents a method for modeling a 2.5-dimensional (2.5D) human body and extracting the gait features for identifying the human subject. To achieve view-invariant gait recognition, a multi-view synthesizing method based on point cloud registration (MVSM) to generate multi-view training galleries is proposed. The concept of a density and curvature-based Color Gait Curvature Image is introduced to map 2.5D data onto a 2D space to enable data dimension reduction by discrete cosine transform and 2D principal component analysis. Gait recognition is achieved via a 2.5D view-invariant gait recognition method based on point cloud registration. Experimental results on the in-house database captured by a Microsoft Kinect camera show a significant performance gain when using MVSM. PMID:24686727
NASA Astrophysics Data System (ADS)
Majkráková, Miroslava; Papčo, Juraj; Zahorec, Pavol; Droščák, Branislav; Mikuška, Ján; Marušiak, Ivan
2016-09-01
The vertical reference system in the Slovak Republic is realized by the National Levelling Network (NLN). Normal heights according to Molodensky were introduced as the reference heights in the NLN in 1957. Since then, the gravity correction, which is necessary to determine the reference heights in the NLN, has been obtained by interpolation either from the simple or the complete Bouguer anomalies. We refer to this method as the "original" one. Currently, the method based on geopotential numbers is the preferred way to unify the European levelling networks. The core of this article is an analysis of different approaches to gravity determination and their application to the calculation of geopotential numbers at the points of the NLN. The first method is based on the calculation of gravity at levelling points from the interpolated values of the complete Bouguer anomaly using the CBA2G_SK software. The second method is based on the global geopotential model EGM2008 improved by the Residual Terrain Model (RTM) approach. The calculated gravity is used to determine the normal heights according to Molodensky along parts of the levelling lines around the EVRF2007 datum point EH-V. Pitelová (UELN-1905325) and the levelling line of the 2nd order NLN to Kráľova hoľa Mountain (the highest point measured by levelling). The results from our analysis illustrate that the method based on the interpolated value of gravity is the better method for gravity determination when measured gravity is not available. It was shown that this method is suitable for the determination of geopotential numbers and reference heights in the Slovak national levelling network at points where gravity is not observed directly. We also demonstrated the necessity of using a precise RTM for the refinement of the results derived solely from EGM2008.
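For reference, the standard definitions connecting these quantities (textbook geodesy, not formulas specific to this paper) are:

\[
C_P \;=\; W_0 - W_P \;=\; \int_0^P g\,dn \;\approx\; \sum_i \bar{g}_i\,\Delta n_i,
\qquad
H^{*}_P \;=\; \frac{C_P}{\bar{\gamma}},
\]

where \(C_P\) is the geopotential number of point \(P\), \(g\) is gravity along the levelling line, \(\Delta n_i\) are the measured levelling increments, and \(\bar{\gamma}\) is the mean normal gravity along the normal plumb line, giving the Molodensky normal height \(H^{*}_P\).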
NASA Astrophysics Data System (ADS)
Feng, Guixiang; Ming, Dongping; Wang, Min; Yang, Jianyu
2017-06-01
Scale problems are a major source of concern in the field of remote sensing. Since remote sensing is a complex technological system, the connotation of scale and of scale effects in remote sensing is not yet fully understood. Thus, this paper first introduces the connotations of pixel-based scale and summarizes the general understanding of the pixel-based scale effect. Pixel-based scale effect analysis is essential for choosing appropriate remote sensing data and proper processing parameters. Fractal dimension is a useful measure for analyzing pixel-based scale. However, traditional fractal dimension calculation does not consider the impact of spatial resolution, so the change of the scale effect with spatial resolution cannot be clearly reflected. Therefore, this paper proposes to use spatial resolution as the modified scale parameter of two fractal methods to further analyze the pixel-based scale effect. To verify the results of the two modified methods (MFBM, the Modified Windowed Fractal Brownian Motion method based on the surface area, and MDBM, the Modified Windowed Double Blanket Method), an existing scale effect analysis method (the information entropy method) is used for evaluation. Six sub-regions of building areas and farmland areas were cut out from QuickBird images to be used as the experimental data. The results of the experiment show that both the fractal dimension and the information entropy present the same trend with decreasing spatial resolution, and some inflection points appear at the same feature scales. Further analysis shows that these feature scales (corresponding to the inflection points) are related to the actual sizes of the geo-objects, which results in fewer mixed pixels in the image, and these inflection points are significantly indicative of the observed features. Therefore, the experimental results indicate that the modified fractal methods are effective in reflecting the pixel-based scale effect in remote sensing data and helpful for analyzing the observation scale from different aspects. This research will ultimately benefit remote sensing data selection and application.
A general stagnation-point convective heating equation for arbitrary gas mixtures
NASA Technical Reports Server (NTRS)
Sutton, K.; Graves, R. A., Jr.
1971-01-01
The stagnation-point convective heat transfer to an axisymmetric blunt body for arbitrary gases in chemical equilibrium was investigated. The gases considered were base gases of nitrogen, oxygen, hydrogen, helium, neon, argon, carbon dioxide, ammonia, and methane and 22 gas mixtures composed of the base gases. Enthalpies ranged from 2.3 to 116.2 MJ/kg, pressures ranged from 0.001 to 100 atmospheres, and the wall temperatures were 300 and 1111 K. A general equation for the stagnation-point convective heat transfer in base gases and gas mixtures was derived and is a function of the mass fraction, the molecular weight, and a transport parameter of the base gases. The relation compares well with present boundary-layer computer results and with other analytical and experimental results. In addition, the analysis verified that the convective heat transfer in gas mixtures can be determined from a summation relation involving the heat transfer coefficients of the base gases. The basic technique developed for the prediction of stagnation-point convective heating to an axisymmetric blunt body could be applied to other heat transfer problems.
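The abstract does not reproduce the equation itself, but correlations of this family commonly take the form q = K·sqrt(ρ/R_n)·V³. A toy sketch under that assumption (the constant below is the widely quoted Sutton-Graves value for Earth air; the paper's summation relation over base-gas coefficients for mixtures is not reproduced here):

    import math

    K_AIR = 1.7415e-4  # Sutton-Graves constant for air, kg^0.5 / m

    def stagnation_heating(rho, velocity, nose_radius, k=K_AIR):
        # Stagnation-point convective heat flux in W/m^2 from the assumed
        # q = K * sqrt(rho / R_n) * V^3 form (not the paper's full equation).
        return k * math.sqrt(rho / nose_radius) * velocity**3

    # Blunt body at 7.5 km/s, ~40 km altitude (rho ~ 4e-3 kg/m^3), 1 m nose:
    print(stagnation_heating(4e-3, 7500.0, 1.0))  # ~4.6e6 W/m^2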
NASA Technical Reports Server (NTRS)
Lovelace, Jeffrey J.; Cios, Krzysztof J.; Roth, Don J.; Cao, Wei
2000-01-01
Post-Scan Interactive Data Display (PSIDD) III is a user-oriented Windows-based system that facilitates the display and comparison of ultrasonic contact data. The system is optimized to compare ultrasonic measurements made at different locations within a material or at different stages of material degradation. PSIDD III provides complete analysis of the primary waveforms in the time and frequency domains, along with the calculation of several frequency-dependent properties, including the phase velocity and attenuation coefficient, and several frequency-independent properties, such as the cross-correlation velocity. The system allows image generation for all of the frequency-dependent properties at any available frequency (limited by the bandwidth used in the scans) and for any of the frequency-independent properties. From ultrasonic contact scans, areas of interest on an image can be studied with regard to the underlying raw waveforms and derived ultrasonic properties by simply selecting the point on the image. The system offers various modes of in-depth comparison between scan points; up to five scan points can be selected for comparative analysis at once. The system was developed with Borland Delphi software (Visual Pascal) and is based on a SQL database. It is ideal for classification of material properties or location of microstructure variations in materials.
Performance analysis of a dual-tree algorithm for computing spatial distance histograms
Chen, Shaoping; Tu, Yi-Cheng; Xia, Yuni
2011-01-01
Many scientific and engineering fields produce large volume of spatiotemporal data. The storage, retrieval, and analysis of such data impose great challenges to database systems design. Analysis of scientific spatiotemporal data often involves computing functions of all point-to-point interactions. One such analytics, the Spatial Distance Histogram (SDH), is of vital importance to scientific discovery. Recently, algorithms for efficient SDH processing in large-scale scientific databases have been proposed. These algorithms adopt a recursive tree-traversing strategy to process point-to-point distances in the visited tree nodes in batches, thus require less time when compared to the brute-force approach where all pairwise distances have to be computed. Despite the promising experimental results, the complexity of such algorithms has not been thoroughly studied. In this paper, we present an analysis of such algorithms based on a geometric modeling approach. The main technique is to transform the analysis of point counts into a problem of quantifying the area of regions where pairwise distances can be processed in batches by the algorithm. From the analysis, we conclude that the number of pairwise distances that are left to be processed decreases exponentially with more levels of the tree visited. This leads to the proof of a time complexity lower than the quadratic time needed for a brute-force algorithm and builds the foundation for a constant-time approximate algorithm. Our model is also general in that it works for a wide range of point spatial distributions, histogram types, and space-partitioning options in building the tree. PMID:21804753
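For contrast with the tree-based algorithms analyzed above, the brute-force SDH baseline they improve upon is straightforward (minimal sketch):

    import numpy as np
    from scipy.spatial.distance import pdist

    def spatial_distance_histogram(points, bucket_width, r_max):
        # Histogram of all N*(N-1)/2 pairwise distances -- the O(N^2)
        # computation the dual-tree algorithm largely avoids.
        distances = pdist(points)
        edges = np.arange(0.0, r_max + bucket_width, bucket_width)
        counts, _ = np.histogram(distances, bins=edges)
        return counts, edges

    pts = np.random.default_rng(1).random((2000, 3))
    counts, edges = spatial_distance_histogram(pts, 0.05, 3**0.5)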
A proposed method for world weightlifting championships team selection.
Chiu, Loren Z F
2009-08-01
The caliber of competitors at the World Weightlifting Championships (WWC) has increased greatly over the past 20 years. As the WWC are the primary qualifiers for Olympic slots (1996 to present), it is imperative for a nation to select team members who will finish with a high placing and score team points. Previous selection methods were based on a simple percentage system. Analysis of the results from the 2006 and 2007 WWC indicates a curvilinear trend in each weight class, suggesting a simple percentage system will not maximize the number of team points earned. To maximize team points, weightlifters should be selected based on their potential to finish in the top 25. A 5-tier ranking system is proposed that should ensure the athletes with the greatest potential to score team points are selected.
Tipping point analysis of ocean acoustic noise
NASA Astrophysics Data System (ADS)
Livina, Valerie N.; Brouwer, Albert; Harris, Peter; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen
2018-02-01
We apply tipping point analysis to a large record of ocean acoustic data to identify the main components of the acoustic dynamical system and study possible bifurcations and transitions of the system. The analysis is based on a statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the data using time-series techniques. We analyse long-term and seasonal trends, system states and acoustic fluctuations to reconstruct a one-dimensional stochastic equation to approximate the acoustic dynamical system. We apply potential analysis to acoustic fluctuations and detect several changes in the system states in the past 14 years. These are most likely caused by climatic phenomena. We analyse trends in sound pressure level within different frequency bands and hypothesize a possible anthropogenic impact on the acoustic environment. The tipping point analysis framework provides insight into the structure of the acoustic data and helps identify its dynamic phenomena, correctly reproducing the probability distribution and scaling properties (power-law correlations) of the time series.
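Potential analysis of this kind typically reconstructs an effective potential from the empirical distribution of the fluctuations, U(x) ∝ −ln p(x), and counts its wells as system states. A minimal sketch of that idea (not the authors' implementation; the kernel density estimate and smoothing choices are assumptions):

    import numpy as np
    from scipy.stats import gaussian_kde
    from scipy.signal import argrelmin

    def potential_wells(fluctuations):
        # Effective potential U(x) = -log p(x) estimated by KDE; its local
        # minima approximate the states of the underlying stochastic system.
        x = np.linspace(fluctuations.min(), fluctuations.max(), 512)
        p = gaussian_kde(fluctuations)(x)
        u = -np.log(p + 1e-12)
        wells = argrelmin(u, order=10)[0]
        return x[wells], x, u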
NASA Astrophysics Data System (ADS)
Javadi, Maryam; Shahrabi, Jamal
2014-03-01
The problems of facility location and the allocation of demand points to facilities are crucial research issues in spatial data analysis and urban planning. It is very important for organizations and governments to locate their resources and facilities optimally and to manage resources efficiently, so that all demand points are covered and all needs are met. Most recent studies that solve facility location problems by spatial clustering have used the Euclidean distance between two points as the dissimilarity function. Natural obstacles, such as mountains and rivers, can have drastic impacts on the distance that needs to be traveled between two geographical locations. While calculating the distance between various supply chain entities (including facilities and demand points), it is necessary to take such obstacles into account to obtain better and more realistic location-allocation results. In this article, new models are presented for locating urban facilities while considering geographical obstacles. In these models, three new distance functions are proposed. The first function is based on shortest-path analysis in a linear network, called the SPD function. The other two functions, namely PD and P2D, are based on algorithms for robot geometry and route-based robot navigation in the presence of obstacles. The models were implemented in ArcGIS Desktop 9.2 software using the Visual Basic programming language and were evaluated using synthetic and real data sets. The overall performance was evaluated based on the sum of distances from demand points to their corresponding facilities. Because the distances between demand points and facilities are more realistic under the proposed functions, the results indicate the desired model quality in terms of allocating points to centers and logistics cost, showing promising improvements in allocation, logistics costs, and response time. It can also be inferred from this study that the P2D-based and SPD-based models yield similar results in terms of facility location and demand allocation, with the P2D-based model showing better execution time than the SPD-based model. Considering logistics costs, facility location and response time, the P2D-based model is an appropriate choice for the urban facility location problem in the presence of geographical obstacles.
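A network-distance allocation in the spirit of the SPD function can be sketched with a shortest-path library (illustrative only; the node names and the edge attribute "length" are assumptions):

    import networkx as nx

    def allocate_demands(graph, facilities, demands, weight="length"):
        # Assign each demand node to its nearest facility by shortest-path
        # distance over the street network rather than Euclidean distance.
        allocation = {}
        for d in demands:
            dist = nx.single_source_dijkstra_path_length(graph, d, weight=weight)
            allocation[d] = min((f for f in facilities if f in dist),
                                key=dist.get, default=None)
        return allocation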
Automated analysis of plethysmograms for functional studies of hemodynamics
NASA Astrophysics Data System (ADS)
Zatrudina, R. Sh.; Isupov, I. B.; Gribkov, V. Yu.
2018-04-01
The most promising method for the quantitative determination of cardiovascular tone indicators and of cerebral hemodynamics indicators is impedance plethysmography. Accurate determination of these indicators requires correct identification of the characteristic points in the thoracic and cranial impedance plethysmograms, respectively. An algorithm for automatic analysis of these plethysmograms is presented. The algorithm is based on the fixed temporal relationships between the phases of the cardiac cycle and the characteristic points of the plethysmogram. The proposed algorithm does not require estimation of initial data or selection of processing parameters. Use of the method on healthy subjects showed a very low detection error for the characteristic points.
NASA Astrophysics Data System (ADS)
Fernandez Galarreta, J.; Kerle, N.; Gerke, M.
2015-06-01
Structural damage assessment is critical after disasters but remains a challenge. Many studies have explored the potential of remote sensing data, but limitations of vertical data persist. Oblique imagery has been identified as more useful, though the multi-angle imagery also adds a new dimension of complexity. This paper addresses damage assessment based on multi-perspective, overlapping, very high resolution oblique images obtained with unmanned aerial vehicles (UAVs). 3-D point-cloud assessment for the entire building is combined with detailed object-based image analysis (OBIA) of façades and roofs. This research focuses not on automatic damage assessment, but on creating a methodology that supports the often ambiguous classification of intermediate damage levels, aiming at producing comprehensive per-building damage scores. We identify completely damaged structures in the 3-D point cloud, and for all other cases provide the OBIA-based damage indicators to be used as auxiliary information by damage analysts. The results demonstrate the usability of the 3-D point-cloud data to identify major damage features. Also the UAV-derived and OBIA-processed oblique images are shown to be a suitable basis for the identification of detailed damage features on façades and roofs. Finally, we also demonstrate the possibility of aggregating the multi-perspective damage information at building level.
System-based strategies for p53 recovery.
Azam, Muhammad Rizwan; Fazal, Sahar; Ullah, Mukhtar; Bhatti, Aamer I
2018-06-01
The authors propose a novel systems-theory-based drug design approach for the p53 pathway. The pathway is taken as a dynamic system represented by an ordinary-differential-equation-based mathematical model. Using control engineering practices, system analysis and subsequent controller design are performed for the re-activation of wild-type p53. p53 revival is discussed for both modes of operation, i.e. sustained and oscillatory. To define the problem in the control system paradigm, the existing mathematical model is modified to incorporate the effect of Nutlin. Attractor point analysis is carried out to select a suitable domain of attraction. A two-loop negative feedback control strategy is devised to drag the system trajectories to the attractor point and to regulate the cellular concentration of Nutlin, respectively. An integrated framework is constituted to incorporate the pharmacokinetic effects of Nutlin in the cancerous cells. Bifurcation analysis is also performed on the p53 model to establish the conditions for p53 oscillation.
On the distribution of saliency.
Berengolts, Alexander; Lindenbaum, Michael
2006-12-01
Detecting salient structures is a basic task in perceptual organization. Saliency algorithms typically mark edge-points with some saliency measure, which grows with the length and smoothness of the curve on which these edge-points lie. Here, we propose a modified saliency estimation mechanism that is based on probabilistically specified grouping cues and on curve length distributions. In this framework, the Shashua and Ullman saliency mechanism may be interpreted as a process for detecting the curve with maximal expected length. Generalized types of saliency naturally follow. We propose several specific generalizations (e.g., gray-level-based saliency) and rigorously derive the limitations on generalized saliency types. We then carry out a probabilistic analysis of expected length saliencies. Using ergodicity and asymptotic analysis, we derive the saliency distributions associated with the main curves and with the rest of the image. We then extend this analysis to finite-length curves. Using the derived distributions, we derive the optimal threshold on the saliency for discriminating between figure and background and bound the saliency-based figure-from-ground performance.
Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.
2016-01-01
A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
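The payoff of analytic derivatives is easy to demonstrate on a toy function standing in for an engine-cycle output (a hypothetical model for illustration, not the Pycycle API):

    import numpy as np

    def cycle_output(x):
        # Toy smooth model standing in for, e.g., thrust vs. two design vars.
        return np.sin(x[0]) * np.exp(-x[1] ** 2) + x[0] * x[1]

    def grad_analytic(x):
        # Exact derivatives of the toy model.
        return np.array([
            np.cos(x[0]) * np.exp(-x[1] ** 2) + x[1],
            -2.0 * x[1] * np.sin(x[0]) * np.exp(-x[1] ** 2) + x[0],
        ])

    def grad_fd(f, x, h=1e-6):
        # Forward-difference approximation: noisier and costlier, which is
        # what analytic derivatives let a gradient-based optimizer avoid.
        g = np.zeros_like(x)
        for i in range(len(x)):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - f(x)) / h
        return g

    x0 = np.array([0.7, 0.3])
    print(grad_analytic(x0), grad_fd(cycle_output, x0))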
GENERAL: Bursting Ca2+ Oscillations and Synchronization in Coupled Cells
NASA Astrophysics Data System (ADS)
Ji, Quan-Bao; Lu, Qi-Shao; Yang, Zhuo-Qin; Duan, Li-Xia
2008-11-01
A mathematical model proposed by Grubelnik et al. [Biophys. Chem. 94 (2001) 59] is employed to study the physiological role of mitochondria and the cytosolic proteins in generating complex Ca2+ oscillations. Intracellular bursting calcium oscillations of point-point, point-cycle and two-folded limit cycle types are observed and explanations are given based on fast/slow dynamical analysis, especially for the point-cycle and two-folded limit cycle types, which have not been reported before. Furthermore, synchronization of coupled bursters of Ca2+ oscillations via gap junctions and the effect of bursting types on the synchronization of coupled cells are studied. It is argued that bursting oscillations of point-point type may be superior to those of point-cycle type for achieving synchronization.
Filtering Airborne LIDAR Data by AN Improved Morphological Method Based on Multi-Gradient Analysis
NASA Astrophysics Data System (ADS)
Li, Y.
2013-05-01
The technology of airborne Light Detection And Ranging (LIDAR) is capable of acquiring dense and accurate 3D geospatial data. Although many related efforts have been made in the last few years, LIDAR data filtering is still a challenging task, especially for areas with high relief or hybrid geographic features. In order to address bare-ground extraction from LIDAR point clouds of complex landscapes, a novel morphological filtering algorithm based on multi-gradient analysis is proposed in terms of the characteristics of the LIDAR data distribution. Firstly, the point cloud is organized by an index mesh. Then, the multi-gradient of each point is calculated using the morphological method, and objects are removed gradually by iteratively choosing points to undergo an improved opening operation constrained by the multi-gradient. Fifteen samples provided by ISPRS Working Group III/3 are employed to test the proposed filtering algorithm. These samples include environments that may cause filtering difficulty. Experimental results show that the proposed filtering algorithm adapts well to various scenes, including urban and rural areas. Omission error, commission error and total error can be simultaneously kept within a relatively small interval. The algorithm efficiently removes object points while preserving ground points to a great degree.
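A stripped-down morphological ground filter conveys the flavor of the opening operation involved (a simplified sketch on a minimum-Z raster; the paper's index mesh and multi-gradient constraints are not reproduced, and the cell size, window and threshold are assumptions):

    import numpy as np
    from scipy.ndimage import grey_opening

    def filter_ground(points, cell=1.0, window=5, dz_max=0.4):
        # Rasterize to a minimum-Z grid, apply a grey-scale opening to strip
        # above-ground objects, and keep points close to the opened surface.
        xy0 = points[:, :2].min(axis=0)
        ij = np.floor((points[:, :2] - xy0) / cell).astype(int)
        zmin = np.full(ij.max(axis=0) + 1, np.inf)
        np.minimum.at(zmin, (ij[:, 0], ij[:, 1]), points[:, 2])
        zmin[np.isinf(zmin)] = np.max(zmin[~np.isinf(zmin)])  # fill empty cells
        opened = grey_opening(zmin, size=window)
        keep = points[:, 2] - opened[ij[:, 0], ij[:, 1]] < dz_max
        return points[keep]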
Reconstruction and analysis of hybrid composite shells using meshless methods
NASA Astrophysics Data System (ADS)
Bernardo, G. M. S.; Loja, M. A. R.
2017-06-01
The importance of research on viable models to predict the behaviour of structures, which may in some cases possess complex geometries, is growing in different scientific areas, ranging from civil and mechanical engineering to architecture and biomedical devices. In these cases, the research effort to find an efficient approach for fitting laser scanning point clouds to a desired surface has been increasing, opening the possibility of modelling the features of as-built/as-is structures and components. However, combining surface reconstruction with the implementation of a structural analysis model is not a trivial task. Although there are works focusing on those phases separately, there is still an effective need for approaches able to interconnect them efficiently. Therefore, achieving a representative geometric model that can subsequently be submitted to structural analysis on a similar platform is a fundamental step in establishing an expeditious processing workflow. In the present work, an integrated methodology is presented, based on the use of meshless approaches, to reconstruct shells described by point clouds and to subsequently predict their static behaviour. These methods are highly appropriate for dealing with unstructured point clouds, as they do not impose any specific spatial or geometric requirement when implemented, depending only on the distance between the points. Details on the formulation, and a set of illustrative examples focusing on the reconstruction of cylindrical and double-curvature shells and their further analysis, are presented.
Error analysis in stereo vision for location measurement of 3D point
NASA Astrophysics Data System (ADS)
Li, Yunting; Zhang, Jun; Tian, Jinwen
2015-12-01
Location measurement of a 3D point in stereo vision is subject to different sources of uncertainty that propagate to the final result. Most current methods of error analysis are based on an ideal intersection model that calculates the uncertainty region of the point location by intersecting the two pixel fields of view, which may produce loose bounds. Besides, only a few sources of error, such as pixel error or camera position, are taken into account in the analysis. In this paper we present a straightforward and practical method to estimate the location error that takes most sources of error into account. We consolidate and simplify all input errors into five parameters by a rotation transformation. We then use the fast midpoint method to derive the mathematical relationships between the target point and these parameters. Thus, the expectation and covariance matrix of the 3D point location are obtained, which constitute the uncertainty region of the point location. Afterwards, we trace the propagation of the primitive input errors through the stereo system, covering the whole analysis chain from primitive input errors to localization error. Our method has the same level of computational complexity as the state-of-the-art method. Finally, extensive experiments are performed to verify the performance of our method.
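A compact sketch of the two ingredients named above, midpoint triangulation and first-order covariance propagation (a numerical Jacobian is used here in place of the paper's closed-form expressions):

    import numpy as np

    def midpoint_triangulate(c1, d1, c2, d2):
        # 3D point midway between the closest points of two viewing rays
        # (camera centers c, unit direction vectors d).
        a = np.array([[d1 @ d1, -(d1 @ d2)], [d1 @ d2, -(d2 @ d2)]])
        b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
        t1, t2 = np.linalg.solve(a, b)
        return 0.5 * ((c1 + t1 * d1) + (c2 + t2 * d2))

    def covariance_by_jacobian(f, x0, cov_x, h=1e-6):
        # First-order propagation Cov_y ~= J Cov_x J^T with a numerically
        # estimated Jacobian of the triangulation w.r.t. the input errors.
        y0 = f(x0)
        jac = np.zeros((len(y0), len(x0)))
        for i in range(len(x0)):
            dx = np.zeros_like(x0)
            dx[i] = h
            jac[:, i] = (f(x0 + dx) - y0) / h
        return jac @ cov_x @ jac.T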
Eubanks-Carter, Catherine; Gorman, Bernard S; Muran, J Christopher
2012-01-01
Analysis of change points in psychotherapy process could increase our understanding of mechanisms of change. In particular, naturalistic change point detection methods that identify turning points or breakpoints in time series data could enhance our ability to identify and study alliance ruptures and resolutions. This paper presents four categories of statistical methods for detecting change points in psychotherapy process: criterion-based methods, control chart methods, partitioning methods, and regression methods. Each method's utility for identifying shifts in the alliance is illustrated using a case example from the Beth Israel Psychotherapy Research program. Advantages and disadvantages of the various methods are discussed.
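As a concrete instance of the control-chart family mentioned above, a minimal CUSUM detector for session-by-session alliance scores (the slack k and threshold h are conventional defaults, not values from the paper):

    import numpy as np

    def cusum_change_points(series, k=0.5, h=4.0):
        # Two-sided CUSUM on standardized scores; k and h are in
        # standard-deviation units.
        z = (np.asarray(series, float) - np.mean(series)) / np.std(series)
        up = down = 0.0
        changes = []
        for i, v in enumerate(z):
            up = max(0.0, up + v - k)
            down = min(0.0, down + v + k)
            if up > h or down < -h:
                changes.append(i)   # flag a shift, then restart the chart
                up = down = 0.0
        return changes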
Floating-point system quantization errors in digital control systems
NASA Technical Reports Server (NTRS)
Phillips, C. L.; Vallely, D. P.
1978-01-01
This paper considers digital controllers (filters) operating in floating-point arithmetic in either open-loop or closed-loop systems. A quantization error analysis technique is developed, and is implemented by a digital computer program that is based on a digital simulation of the system. The program can be integrated into existing digital simulations of a system.
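In the same simulation-based spirit (a toy sketch, not the paper's program), the quantization error of a floating-point filter can be estimated by running the identical recursion at two precisions and differencing:

    import numpy as np

    def first_order_filter(x, a, dtype):
        # y[n] = a*y[n-1] + (1-a)*x[n], evaluated entirely in `dtype`.
        a = dtype(a)
        one = dtype(1)
        acc = dtype(0)
        y = np.empty(len(x), dtype=dtype)
        for n, xn in enumerate(x.astype(dtype)):
            acc = a * acc + (one - a) * xn
            y[n] = acc
        return y

    x = np.random.default_rng(2).standard_normal(10_000)
    err = (first_order_filter(x, 0.99, np.float32).astype(np.float64)
           - first_order_filter(x, 0.99, np.float64))
    print(err.std())  # quantization-noise level of the float32 filter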
Point source detection in infrared astronomical surveys
NASA Technical Reports Server (NTRS)
Pelzmann, R. F., Jr.
1977-01-01
Data processing techniques useful for infrared astronomy data analysis systems are reported. This investigation is restricted to consideration of data from space-based telescope systems operating as survey instruments. In this report the theoretical background for specific point-source detection schemes is completed, and the development of specific algorithms and software for the broad range of requirements is begun.
Data-Based Decision Making: The Impact of Data Variability, Training, and Context
ERIC Educational Resources Information Center
Vanselow, Nicholas R.; Thompson, Rachel; Karsina, Allen
2011-01-01
The current study examines agreement among individuals with varying expertise in behavior analysis about the length of baseline when data were presented point by point. Participants were asked to respond to baseline data and to indicate when to terminate the baseline phase. When only minimal information was provided about the data set, experts and…
Popova, A Yu; Trukhina, G M; Mikailova, O M
The article considers the quality control and safety system implemented at one of the largest flight-catering food production plants serving airline passengers and flight crews. The control system was based on Hazard Analysis and Critical Control Point (HACCP) principles and on the hygienic and anti-epidemic measures developed. The identification of hazard factors at the stages of the technological process is considered, and monitoring data for 6 critical control points over a five-year period are analyzed. The quality control and safety system reduces the risk of food contamination during the acceptance, preparation and supply of in-flight meals, and the efficiency of the implemented system was demonstrated. Further ways of harmonizing and implementing HACCP principles at the plant are determined.
Complementing Operating Room Teaching With Video-Based Coaching.
Hu, Yue-Yung; Mazer, Laura M; Yule, Steven J; Arriaga, Alexander F; Greenberg, Caprice C; Lipsitz, Stuart R; Gawande, Atul A; Smink, Douglas S
2017-04-01
Surgical expertise demands technical and nontechnical skills. Traditionally, surgical trainees acquired these skills in the operating room; however, operative time for residents has decreased with duty hour restrictions. As in other professions, video analysis may help maximize the learning experience. To develop and evaluate a postoperative video-based coaching intervention for residents. In this mixed methods analysis, 10 senior (postgraduate year 4 and 5) residents were videorecorded operating with an attending surgeon at an academic tertiary care hospital. Each video formed the basis of a 1-hour one-on-one coaching session conducted by the operative attending; although a coaching framework was provided, participants determined the specific content collaboratively. Teaching points were identified in the operating room and the video-based coaching sessions; iterative inductive coding, followed by thematic analysis, was performed. Teaching points made in the operating room were compared with those in the video-based coaching sessions with respect to initiator, content, and teaching technique, adjusting for time. Among 10 cases, surgeons made more teaching points per unit time while coaching than in the operating room (102.7 vs 63.0 per hour). Teaching in the video-based coaching sessions was more resident centered; attendings were more inquisitive about residents' learning needs (3.30 vs 0.28, P = .04), and residents took more initiative to direct their education (27% [198 of 729 teaching points] vs 17% [331 of 1977 teaching points], P < .001). Surgeons also more frequently validated residents' experiences (8.40 vs 1.81, P < .01), and they tended to ask more questions to promote critical thinking (9.30 vs 3.32, P = .07) and set more learning goals (2.90 vs 0.28, P = .11). More complex topics, including intraoperative decision making (mean, 9.70 vs 2.77 instances per hour, P = .03) and failure to progress (mean, 1.20 vs 0.13 instances per hour, P = .04), were addressed, and they were more thoroughly developed and explored. Excerpts of dialogue are presented to illustrate these findings. Video-based coaching is a novel and feasible modality for supplementing intraoperative learning. Objective evaluation demonstrates that video-based coaching may be particularly useful for teaching higher-level concepts, such as decision making, and for individualizing instruction and feedback to each resident.
Model Documentation of Base Case Data | Regional Energy Deployment System
Documentation of the base case data for the Regional Energy Deployment System (ReEDS) model from NREL's energy analysis program. The base case was developed simply as a point of departure for other analyses, and it derives many of its inputs from the Energy Information Administration's (EIA's) Annual Energy Outlook.
RADIOISOTOPES USED IN PHARMACY. 5. IONIZING RADIATION IN PHARMACEUTICAL ANALYSIS (in Danish)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kristensen, K.
1962-09-01
The use of radioisotope methods for analyzing drugs is reviewed. It is pointed out that heretofore most methods have been based on isotope dilution principles whereas in the future radioactivation analysis, especially with neutron sources, offers great possibilities. (BBB)
State Analysis: A Control Architecture View of Systems Engineering
NASA Technical Reports Server (NTRS)
Rasmussen, Robert D.
2005-01-01
A viewgraph presentation on the state analysis process is shown. The topics include: 1) Issues with growing complexity; 2) Limits of common practice; 3) Exploiting a control point of view; 4) A glimpse at the State Analysis process; 5) Synergy with model-based systems engineering; and 6) Bridging the systems to software gap.
Statistical analysis of 4 types of neck whiplash injuries based on classical meridian theory.
Chen, Yemeng; Zhao, Yan; Xue, Xiaolin; Li, Hui; Wu, Xiuyan; Zhang, Qunce; Zheng, Xin; Wang, Tianfang
2015-01-01
As one component of the Chinese medicine meridian system, the meridian sinew (Jingjin, tendino-musculo) is specially described as being for acupuncture treatment of the musculoskeletal system because of its dynamic attributes and tender point correlations. In recent decades, the therapeutic importance of the sinew meridian has become revalued in clinical application. Based on this theory, the authors have established therapeutic strategies of acupuncture treatment in Whiplash-Associated Disorders (WAD) by categorizing four types of neck symptom presentations. The advantage of this new system is to make it much easier for the clinician to find effective acupuncture points. This study attempts to prove the significance of the proposed therapeutic strategies by analyzing data collected from a clinical survey of various WAD using non-supervised statistical methods, such as correlation analysis, factor analysis, and cluster analysis. The clinical survey data have successfully verified discrete characteristics of four neck syndromes, based upon the range of motion (ROM) and tender point location findings. A summary of the relationships among the symptoms of the four neck syndromes has shown the correlation coefficient as having a statistical significance (P < 0.01 or P < 0.05), especially with regard to ROM. Furthermore, factor and cluster analyses resulted in a total of 11 categories of general symptoms, which implies syndrome factors are more related to the Liver, as originally described in classical theory. The hypothesis of meridian sinew syndromes in WAD is clearly supported by the statistical analysis of the clinical trials. This new discovery should be beneficial in improving therapeutic outcomes.
Duncan, Michael J; Eyre, Emma L J; Bryant, Elizabeth; Birch, Samantha L
2014-01-01
Evidence-based pedometer cut-points for health have not been sufficiently examined in the context of ethnicity. To (1) evaluate previously described steps/day cut-points in a sample of White and South Asian British primary school children and (2) use ROC analysis to generate alternative, ethnic-specific, steps/day cut-offs for children. Height, body mass and pedometer-determined physical activity were assessed in 763 British children (357 boys and 406 girls) from White (n = 593) and South Asian (n = 170) ethnic groups, aged 8-11 years. The Vincent and Pangrazi cut-points significantly predicted BMI in White (p = 0.006, adjusted R² = 0.08) and South Asian children (p = 0.039, adjusted R² = 0.078). The Tudor-Locke et al. cut-points significantly predicted BMI in White children (p = 0.0001, adjusted R² = 0.079) but not in South Asian children (p > 0.05). ROC analysis indicated significant alternative cut-points in White and South Asian boys and girls (all p = 0.04 or better; adjusted R² = 0.091 for White and 0.09 for South Asian children). The subsequent cut-points associated with healthy weight, translated to steps/day, were 13,625 for White boys, 13,135 for White girls, 10,897 for South Asian boys and 10,161 for South Asian girls. Previously published steps/day cut-points for healthy weight may not account for known ethnic variation in physical activity between White and South Asian children in the UK. Alternative, ethnic-specific, cut-points may be better placed to distinguish British children based on pedometer-determined physical activity.
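A common way to derive such cut-points from ROC analysis is Youden's J statistic (the abstract does not state which criterion was used, so this choice is an assumption):

    import numpy as np
    from sklearn.metrics import roc_curve

    def steps_per_day_cutoff(steps, healthy_weight):
        # healthy_weight: 1 for healthy-weight children, 0 otherwise.
        # Choose the steps/day threshold maximizing J = sens + spec - 1.
        fpr, tpr, thresholds = roc_curve(healthy_weight, steps)
        return thresholds[np.argmax(tpr - fpr)]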
The Feasibility of 3d Point Cloud Generation from Smartphones
NASA Astrophysics Data System (ADS)
Alsubaie, N.; El-Sheimy, N.
2016-06-01
This paper proposes a new technique for increasing the accuracy of directly geo-referenced image-based 3D point clouds generated from low-cost sensors in smartphones. The smartphone's motion sensors are used to directly acquire the Exterior Orientation Parameters (EOPs) of the captured images. These EOPs, along with the Interior Orientation Parameters (IOPs) of the camera/phone, are used to reconstruct the image-based 3D point cloud. However, because smartphone sensors suffer from poor GPS accuracy, accumulated drift and high signal noise, inaccurate 3D mapping solutions often result. Therefore, horizontal and vertical linear features, visible in each image, are extracted and used as constraints in the bundle adjustment procedure. These constraints correct the relative position and orientation of the 3D mapping solution. Once the enhanced EOPs are estimated, the semi-global matching algorithm (SGM) is used to generate the image-based dense 3D point cloud. Statistical analysis and assessment are implemented herein in order to demonstrate the feasibility of 3D point cloud generation from the consumer-grade sensors in smartphones.
Sharpe, J Danielle; Hopkins, Richard S; Cook, Robert L; Striley, Catherine W
2016-10-20
Traditional influenza surveillance relies on influenza-like illness (ILI) syndrome that is reported by health care providers. It primarily captures individuals who seek medical care and misses those who do not. Recently, Web-based data sources have been studied for application to public health surveillance, as there is a growing number of people who search, post, and tweet about their illnesses before seeking medical care. Existing research has shown some promise of using data from Google, Twitter, and Wikipedia to complement traditional surveillance for ILI. However, past studies have evaluated these Web-based sources individually or dually without comparing all 3 of them, and it would be beneficial to know which of the Web-based sources performs best in order to be considered to complement traditional methods. The objective of this study is to comparatively analyze Google, Twitter, and Wikipedia by examining which best corresponds with Centers for Disease Control and Prevention (CDC) ILI data. It was hypothesized that Wikipedia would best correspond with CDC ILI data, as previous research found it to be least influenced by high media coverage in comparison with Google and Twitter. Publicly available, deidentified data were collected from the CDC, Google Flu Trends, HealthTweets, and Wikipedia for the 2012-2015 influenza seasons. Bayesian change point analysis was used to detect seasonal changes, or change points, in each of the data sources. Change points in Google, Twitter, and Wikipedia that occurred during the exact week, 1 preceding week, or 1 week after the CDC's change points were compared with the CDC data as the gold standard. All analyses were conducted using the R package "bcp" version 4.0.0 in RStudio version 0.99.484 (RStudio Inc). In addition, sensitivity and positive predictive values (PPV) were calculated for Google, Twitter, and Wikipedia. During the 2012-2015 influenza seasons, a high sensitivity of 92% was found for Google, whereas the PPV for Google was 85%. A low sensitivity of 50% was calculated for Twitter; a low PPV of 43% was found for Twitter also. Wikipedia had the lowest sensitivity of 33% and lowest PPV of 40%. Of the 3 Web-based sources, Google had the best combination of sensitivity and PPV in detecting Bayesian change points in influenza-related data streams. Findings demonstrated that change points in Google, Twitter, and Wikipedia data occasionally aligned well with change points captured in CDC ILI data, yet these sources did not detect all changes in CDC data and should be further studied and developed.
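The week-tolerance comparison against the CDC gold standard reduces to a small matching routine (a sketch; the week indices in the example are illustrative):

    def change_point_agreement(detected, gold, tolerance=1):
        # A detected change point matches if it falls on the same week as a
        # CDC change point, one week before, or one week after.
        matched_gold = {g for g in gold
                        if any(abs(d - g) <= tolerance for d in detected)}
        matched_det = {d for d in detected
                       if any(abs(d - g) <= tolerance for g in gold)}
        sensitivity = len(matched_gold) / len(gold)
        ppv = len(matched_det) / len(detected)
        return sensitivity, ppv

    print(change_point_agreement(detected=[3, 20, 41], gold=[4, 22, 40, 52]))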
Simulation of stochastic wind action on transmission power lines
NASA Astrophysics Data System (ADS)
Wielgos, Piotr; Lipecki, Tomasz; Flaga, Andrzej
2018-01-01
The paper presents an FEM analysis of the wind action on overhead transmission power lines. The wind action is based on a stochastic simulation of the wind field at several points of the structure and on wind tunnel tests of the aerodynamic coefficients of a single conductor consisting of three wires. In the FEM calculations, a section of the transmission power line composed of three spans is considered. Non-linear analysis with the dead weight of the structure is performed first to obtain the deformed shape of the conductors. Next, time-dependent wind forces are applied to the respective points of the conductors and a non-linear dynamic analysis is carried out.
Object-oriented productivity metrics
NASA Technical Reports Server (NTRS)
Connell, John L.; Eller, Nancy
1992-01-01
Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.
Extracting valley-ridge lines from point-cloud-based 3D fingerprint models.
Pang, Xufang; Song, Zhan; Xie, Wuyuan
2013-01-01
3D fingerprinting is an emerging technology with the distinct advantage of touchless operation. More important, 3D fingerprint models contain more biometric information than traditional 2D fingerprint images. However, current approaches to fingerprint feature detection usually must transform the 3D models to a 2D space through unwrapping or other methods, which might introduce distortions. A new approach directly extracts valley-ridge features from point-cloud-based 3D fingerprint models. It first applies the moving least-squares method to fit a local paraboloid surface and represent the local point cloud area. It then computes the local surface's curvatures and curvature tensors to facilitate detection of the potential valley and ridge points. The approach projects those points to the most likely valley-ridge lines, using statistical means such as covariance analysis and cross correlation. To finally extract the valley-ridge lines, it grows the polylines that approximate the projected feature points and removes the perturbations between the sampled points. Experiments with different 3D fingerprint models demonstrate this approach's feasibility and performance.
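The local surface fit at the heart of this pipeline can be sketched as follows (ordinary least squares stands in for the paper's moving least-squares weighting, and the curvature is read off the fitted paraboloid assuming the neighborhood is roughly aligned with the fitted plane):

    import numpy as np

    def local_principal_curvatures(neighbors):
        # Fit z = ax^2 + bxy + cy^2 + dx + ey + f to a centered neighborhood
        # and return the eigenvalues of the Hessian at the origin; large
        # positive/negative values flag candidate valley/ridge points.
        p = neighbors - neighbors.mean(axis=0)
        x, y, z = p[:, 0], p[:, 1], p[:, 2]
        design = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
        a, b, c, d, e, f = np.linalg.lstsq(design, z, rcond=None)[0]
        hessian = np.array([[2 * a, b], [b, 2 * c]])
        return np.linalg.eigvalsh(hessian)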
NASA Astrophysics Data System (ADS)
Xiang, Changle; Liu, Feng; Liu, Hui; Han, Lijin; Zhang, Xun
2016-06-01
Unbalanced magnetic pull (UMP) plays a key role in the nonlinear dynamic behavior of permanent magnet synchronous motors (PMSMs) in electric vehicles. Based on the Jeffcott rotor model, the stiffness characteristics of the rotor system of the PMSM are analyzed and the nonlinear dynamic behaviors influenced by UMP are investigated. In the free vibration study, eigenvalue-based stability analysis for multiple equilibrium points is performed, which offers insight into system stiffness. Amplitude modulation effects are discovered, their mechanism is explained, and the period of the modulating signal is derived by phase analysis and the averaging method. The analysis indicates that the effects are caused by the interaction of the initial phases of the forward and backward whirling motions. In the forced vibration study, considering dynamic eccentricity, frequency characteristics revealing a softening type are obtained by the harmonic balance method, and the stability of the periodic solution is investigated by the Routh-Hurwitz criterion. The frequency characteristics analysis indicates that the response amplitude is limited to the range between the amplitudes of the two kinds of equilibrium points. In the vicinity of the continuum of equilibrium points, the system hardly provides resistance to bending, and hence external disturbances easily cause loss of stability. These results are useful for the design of PMSMs with high stability and low vibration and acoustic noise.
Plaza-Leiva, Victoria; Gomez-Ruiz, Jose Antonio; Mandow, Anthony; García-Cerezo, Alfonso
2017-01-01
Improving the effectiveness of spatial shape features classification from 3D lidar data is very relevant because it is largely used as a fundamental step towards higher level scene understanding challenges of autonomous vehicles and terrestrial robots. In this sense, computing neighborhood for points in dense scans becomes a costly process for both training and classification. This paper proposes a new general framework for implementing and comparing different supervised learning classifiers with a simple voxel-based neighborhood computation where points in each non-overlapping voxel in a regular grid are assigned to the same class by considering features within a support region defined by the voxel itself. The contribution provides offline training and online classification procedures as well as five alternative feature vector definitions based on principal component analysis for scatter, tubular and planar shapes. Moreover, the feasibility of this approach is evaluated by implementing a neural network (NN) method previously proposed by the authors as well as three other supervised learning classifiers found in scene processing methods: support vector machines (SVM), Gaussian processes (GP), and Gaussian mixture models (GMM). A comparative performance analysis is presented using real point clouds from both natural and urban environments and two different 3D rangefinders (a tilting Hokuyo UTM-30LX and a Riegl). Classification performance metrics and processing time measurements confirm the benefits of the NN classifier and the feasibility of voxel-based neighborhood. PMID:28294963
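The voxel-plus-PCA feature idea can be sketched compactly (eigenvalue-ratio descriptors for scatter, tubular and planar shapes; the paper's five specific feature vector definitions are not reproduced here):

    import numpy as np

    def voxel_pca_features(points, voxel=0.3):
        # Group points into non-overlapping voxels; for each voxel, the PCA
        # eigenvalues of its support region yield shape descriptors.
        keys = np.floor(points / voxel).astype(int)
        features = {}
        for key in map(tuple, np.unique(keys, axis=0)):
            pts = points[np.all(keys == key, axis=1)]
            if len(pts) < 4:
                continue
            evals = np.linalg.eigvalsh(np.cov(pts.T))[::-1]   # l1 >= l2 >= l3
            l1, l2, l3 = np.maximum(evals, 0.0) / max(evals.sum(), 1e-12)
            features[key] = {"linear": l1 - l2, "planar": l2 - l3, "scatter": l3}
        return features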
NASA Technical Reports Server (NTRS)
Woods-Vedeler, Jessica A.; Rombado, Gabriel
1997-01-01
The purpose of this paper is to provide final results of a pointing stability analysis for external payload attachment sites (PAS) on the International Space Station (ISS). As a specific example, the pointing stability requirement of the SAGE III atmospheric science instrument was examined. The instrument requires 10 arcsec stability over 2 second periods. SAGE III will be mounted on the ISS starboard side at the lower, outboard PAS. In this engineering analysis, an open-loop DAC-3 finite element model of the ISS was used by the Microgravity Group at Johnson Space Center to generate transient responses at the PAS to a limited number of disturbances. The model included dynamics up to 50 Hz. The disturbance models considered included operation of the solar array rotary joints, thermal radiator rotary joints, and control moment gyros. Responses were filtered to model the anticipated vibration attenuation effects of active control systems on the solar and thermal radiator rotary joints. A pointing stability analysis was conducted by double integrating the acceleration transients over 2 second periods. Results of the analysis are tabulated for ISS X, Y, and Z axis rotations. These results indicate that the largest pointing excursions occurred due to rapid slewing of the thermal radiator. Even without attenuation at the rotary joints, the resulting pointing error was limited to less than 1.6 arcsec; with vibration control at the joints, it was limited to a maximum of 0.5 arcsec over a 2 second period. Based on this current level of model definition, it was concluded that between 0 and 50 Hz the pointing stability requirement for SAGE III will not be exceeded by the disturbances evaluated in this study.
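The double-integration step maps directly to a short routine (a sketch; the sampling rate and sliding-window handling are assumptions):

    import numpy as np

    def worst_pointing_excursion(ang_accel, fs, window_s=2.0):
        # Double-integrate angular acceleration (rad/s^2, sampled at fs) and
        # report the worst peak-to-peak attitude excursion, in arcsec, over
        # any sliding window of length window_s.
        dt = 1.0 / fs
        rate = np.cumsum(ang_accel) * dt        # rad/s
        angle = np.cumsum(rate) * dt            # rad
        n = int(window_s * fs)
        worst = max(np.ptp(angle[i:i + n]) for i in range(len(angle) - n + 1))
        return np.degrees(worst) * 3600.0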
A multiple-point spatially weighted k-NN method for object-based classification
NASA Astrophysics Data System (ADS)
Tang, Yunwei; Jing, Linhai; Li, Hui; Atkinson, Peter M.
2016-10-01
Object-based classification, commonly referred to as object-based image analysis (OBIA), is now commonly regarded as able to produce more appealing classification maps, often of greater accuracy, than pixel-based classification and its application is now widespread. Therefore, improvement of OBIA using spatial techniques is of great interest. In this paper, multiple-point statistics (MPS) is proposed for object-based classification enhancement in the form of a new multiple-point k-nearest neighbour (k-NN) classification method (MPk-NN). The proposed method first utilises a training image derived from a pre-classified map to characterise the spatial correlation between multiple points of land cover classes. The MPS borrows spatial structures from other parts of the training image, and then incorporates this spatial information, in the form of multiple-point probabilities, into the k-NN classifier. Two satellite sensor images with a fine spatial resolution were selected to evaluate the new method. One is an IKONOS image of the Beijing urban area and the other is a WorldView-2 image of the Wolong mountainous area, in China. The images were object-based classified using the MPk-NN method and several alternatives, including the k-NN, the geostatistically weighted k-NN, the Bayesian method, the decision tree classifier (DTC), and the support vector machine classifier (SVM). It was demonstrated that the new spatial weighting based on MPS can achieve greater classification accuracy relative to the alternatives and it is, thus, recommended as appropriate for object-based classification.
NASA Astrophysics Data System (ADS)
Nishimura, Takahiro; Kimura, Hitoshi; Ogura, Yusuke; Tanida, Jun
2018-06-01
This paper presents an experimental assessment and analysis of super-resolution microscopy based on multiple-point spread function fitting of spectrally demultiplexed images, using a designed DNA structure as a test target. For this purpose, a DNA structure was designed to have binding sites at an interval smaller than the diffraction limit. The structure was labeled with several types of quantum dots (QDs) to acquire their spatial information as spectrally encoded images. The obtained images were analyzed with a point spread function multifitting algorithm to determine the QD locations, which indicate the binding site positions. The experimental results show that the labeled locations can be observed beyond the diffraction-limited resolution using three-colored fluorescence images obtained with a confocal fluorescence microscope. Numerical simulations show that labeling with eight types of QDs enables positions aligned at 27.2-nm pitches on the DNA structure to be resolved with high accuracy.
Su, Cheng; Zhou, Lei; Hu, Zheng; Weng, Winnie; Subramani, Jayanthi; Tadkod, Vineet; Hamilton, Kortney; Bautista, Ami; Wu, Yu; Chirmule, Narendra; Zhong, Zhandong Don
2015-10-01
Biotherapeutics can elicit immune responses, which can alter the exposure, safety, and efficacy of the therapeutics. A well-designed and robust bioanalytical method is critical for the detection and characterization of relevant anti-drug antibodies (ADA) and the success of an immunogenicity study. As a fundamental criterion in immunogenicity testing, assay cut points need to be statistically established with a risk-based approach to reduce subjectivity. This manuscript describes the development of a validated, web-based, multi-tier customized assay statistical tool (CAST) for assessing cut points of ADA assays. The tool provides an intuitive web interface that allows users to import experimental data generated from a standardized experimental design, select the assay factors, run the standardized analysis algorithms, and generate tables, figures, and listings (TFL). It allows bioanalytical scientists to perform complex statistical analysis at the click of a button to produce reliable assay parameters in support of immunogenicity studies. Copyright © 2015 Elsevier B.V. All rights reserved.
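The abstract does not disclose CAST's algorithms, but screening cut points for ADA assays are conventionally set as an upper percentile of drug-naive control responses after outlier exclusion. A minimal sketch under that common convention (the log-normal assumption and the 1.645 multiplier are the usual 95th-percentile choice, not necessarily the tool's method):

```python
import numpy as np

def screening_cut_point(responses, outlier_k=1.5):
    """Estimate a 95th-percentile screening cut point from log-transformed
    negative-control responses after Tukey outlier exclusion (a common
    convention in ADA assay validation; not necessarily CAST's algorithm)."""
    x = np.log(np.asarray(responses, dtype=float))
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    kept = x[(x >= q1 - outlier_k * iqr) & (x <= q3 + outlier_k * iqr)]
    # Parametric 95th percentile: mean + 1.645 * SD on the log scale.
    return float(np.exp(kept.mean() + 1.645 * kept.std(ddof=1)))

# Hypothetical drug-naive signals; the last value is an obvious outlier.
signals = [1.02, 0.95, 1.10, 0.99, 1.05, 0.97, 1.20, 1.01, 0.93, 3.5]
print(f"screening cut point: {screening_cut_point(signals):.3f}")
```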
Cryogenic Tank Structure Sizing With Structural Optimization Method
NASA Technical Reports Server (NTRS)
Wang, J. T.; Johnson, T. F.; Sleight, D. W.; Saether, E.
2001-01-01
Structural optimization methods in MSC/NASTRAN are used to size substructures and to reduce the weight of a composite sandwich cryogenic tank for future launch vehicles. Because the feasible design space of this problem is non-convex, many local minima are found. This non-convex problem is investigated in detail by conducting a series of analyses along a design line connecting two feasible designs. Strain constraint violations occur for some design points along the design line. Since MSC/NASTRAN uses gradient-based optimization procedures, it does not guarantee that the lowest weight design can be found. In this study, a simple procedure is introduced to create a new starting point based on design variable values from previous optimization analyses. Optimization analysis using this new starting point can produce a lower weight design. Detailed inputs for setting up the MSC/NASTRAN optimization analysis and final tank design results are presented in this paper. Approaches for obtaining further weight reductions are also discussed.
Lorenz, Matthias W.; Bickel, Horst; Bots, Michiel L.; Breteler, Monique M.B.; Catapano, Alberico L.; Desvarieux, Moise; Hedblad, Bo; Iglseder, Bernhard; Johnsen, Stein Harald; Juraska, Michal; Kiechl, Stefan; Mathiesen, Ellisiv B.; Norata, Giuseppe D.; Grigore, Liliana; Polak, Joseph; Poppert, Holger; Rosvall, Maria; Rundek, Tatjana; Sacco, Ralph L.; Sander, Dirk; Sitzer, Matthias; Steinmetz, Helmuth; Stensland, Eva; Willeit, Johann; Witteman, Jacqueline; Yanez, David; Thompson, Simon G.
2013-01-01
Carotid intima media thickness (IMT) progression is increasingly used as a surrogate for vascular risk. This use is supported by data from a few clinical trials investigating statins, but established criteria of surrogacy are only partially fulfilled. To provide a valid basis for the use of IMT progression as a study end point, we are performing a 3-step meta-analysis project based on individual participant data. Objectives of the 3 successive stages are to investigate (1) whether IMT progression prospectively predicts myocardial infarction, stroke, or death in population-based samples; (2) whether it does so in prevalent disease cohorts; and (3) whether interventions affecting IMT progression predict a therapeutic effect on clinical end points. Recruitment strategies, inclusion criteria, and estimates of the expected numbers of eligible studies are presented along with a detailed analysis plan. PMID:20435179
NASA Astrophysics Data System (ADS)
Xu, Jun; Dang, Chao; Kong, Fan
2017-10-01
This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.
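In outline, the maximum entropy density with fractional-moment constraints takes an exponential-family form, from which the failure probability follows by one-dimensional integration; a sketch of the relations implied by the abstract, with assumed notation:

```latex
% Maximum-entropy PDF of the performance function G, constrained by m
% fractional moments E[G^{\alpha_i}] = M_{\alpha_i}:
p_G(g) = \exp\!\Big(-\lambda_0 - \sum_{i=1}^{m} \lambda_i\, g^{\alpha_i}\Big),
\qquad
\int p_G(g)\,\mathrm{d}g = 1,
\qquad
\int g^{\alpha_i}\, p_G(g)\,\mathrm{d}g = M_{\alpha_i}.
% The failure probability then reduces to a one-dimensional integral:
P_f = \Pr\{G \le 0\} = \int_{-\infty}^{0} p_G(g)\,\mathrm{d}g.
```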
Key Microbiota Identification Using Functional Gene Analysis during Pepper (Piper nigrum L.) Peeling
Xu, Chuanbiao; Liu, Sixin; Li, Congfa
2016-01-01
Pepper pericarp microbiota plays an important role in the pepper peeling process for the production of white pepper. We collected pepper samples at different peeling time points from Hainan Province, China, and used a metagenomic approach to identify changes in the pericarp microbiota based on functional gene analysis. UniFrac distance-based principal coordinates analysis revealed significant changes in the pericarp microbiota structure during peeling, which were attributed to increases in bacteria from the genera Selenomonas and Prevotella. We identified 28 core operational taxonomic units at each time point, mainly belonging to Selenomonas, Prevotella, Megasphaera, Anaerovibrio, and Clostridium genera. The results were confirmed by quantitative polymerase chain reaction. At the functional level, we observed significant increases in microbial features related to acetyl xylan esterase and pectinesterase for pericarp degradation during peeling. These findings offer a new insight into biodegradation for pepper peeling and will promote the development of the white pepper industry. PMID:27768750
Global point signature for shape analysis of carpal bones
NASA Astrophysics Data System (ADS)
Chaudhari, Abhijit J.; Leahy, Richard M.; Wise, Barton L.; Lane, Nancy E.; Badawi, Ramsey D.; Joshi, Anand A.
2014-02-01
We present a method based on spectral theory for the shape analysis of carpal bones of the human wrist. We represent the cortical surface of the carpal bone in a coordinate system based on the eigensystem of the two-dimensional Helmholtz equation. We employ a metric—global point signature (GPS)—that exploits the scale and isometric invariance of eigenfunctions to quantify overall bone shape. We use a fast finite-element method to compute the GPS metric. We capitalize upon the properties of the GPS representation—such as stability, a standard Euclidean (ℓ2) metric definition, and invariance to scaling, translation and rotation—to perform shape analysis of the carpal bones of ten women and ten men from a publicly available database. We demonstrate the utility of the proposed GPS representation to provide a means for comparing shapes of the carpal bones across populations.
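For reference, the GPS embedding is usually written in terms of the eigenpairs of the surface Laplacian; the scaling below follows the common definition and is an assumption, since the abstract does not spell it out:

```latex
% Global point signature of a surface point p, assembled from eigenpairs
% (\lambda_i, \phi_i) of the surface Laplacian, \Delta \phi_i = \lambda_i \phi_i:
\mathrm{GPS}(p) = \left(
    \frac{\phi_1(p)}{\sqrt{\lambda_1}},\,
    \frac{\phi_2(p)}{\sqrt{\lambda_2}},\,
    \frac{\phi_3(p)}{\sqrt{\lambda_3}},\, \dots
\right),
% so that comparing shapes reduces to the Euclidean (\ell_2) distance between
% signatures, invariant to rigid motion and, after normalization, to scale.
```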
NASA Astrophysics Data System (ADS)
Kharkar, Prashant S.; Reith, Maarten E. A.; Dutta, Aloke K.
2008-01-01
Three-dimensional quantitative structure-activity relationship (3D QSAR) using comparative molecular field analysis (CoMFA) was performed on a series of substituted tetrahydropyran (THP) derivatives possessing serotonin (SERT) and norepinephrine (NET) transporter inhibitory activities. The study aimed to rationalize the potency of these inhibitors for SERT and NET as well as the observed selectivity for NET over SERT. The dataset consisted of 29 molecules, of which 23 were used as the training set for deriving CoMFA models of SERT and NET uptake inhibitory activities. Superimpositions were performed using atom-based fitting and 3-point pharmacophore-based alignment. Two charge calculation methods, Gasteiger-Hückel and semiempirical PM3, were tried. Both alignment methods were analyzed in terms of their predictive abilities and produced comparable results with high internal and external predictivities. The models obtained using the 3-point pharmacophore-based alignment outperformed the models with atom-based fitting in terms of relevant statistics and interpretability of the generated contour maps. Steric fields dominated electrostatic fields in terms of contribution. The selectivity analysis (NET over SERT), though it yielded models with good internal predictivity, showed very poor external test set predictions. The analysis was repeated with 24 molecules after systematically excluding so-called outliers (5 out of 29) from the model derivation process. The resulting CoMFA model using atom-based fitting exhibited good statistics and was able to explain most of the selectivity (NET over SERT)-discriminating factors. The presence of an -OH substituent on the THP ring was found to be one of the most important factors governing NET selectivity over SERT. Thus, a 4-point NET-selective pharmacophore was proposed, introducing this newly found H-bond donor/acceptor feature in addition to the initial 3-point pharmacophore.
NASA Astrophysics Data System (ADS)
Preusker, Frank; Scholten, Frank; Matz, Klaus-Dieter; Roatsch, Thomas; Willner, Konrad; Hviid, Stubbe; Knollenberg, Jörg; Kührt, Ekkehard; Sierks, Holger
2015-04-01
The European Space Agency's Rosetta spacecraft is equipped with the OSIRIS imaging system, which consists of a wide-angle and a narrow-angle camera (WAC and NAC). After the approach phase, Rosetta was inserted into a descent trajectory of comet 67P/Churyumov-Gerasimenko (C-G) in early August 2014. Until early September, OSIRIS acquired several hundred NAC images of C-G's surface at different scales (from ~5 m/pixel during approach to ~0.9 m/pixel during descent). In that one-month observation period, the surface was imaged several times within different mapping sequences. With the comet's rotation period of ~12.4 h and the low spacecraft velocity (< 1 m/s), the entire NAC dataset provides multiple NAC stereo coverage, adequate for stereo-photogrammetric (SPG) analysis towards the derivation of 3D surface models. We constrained the OSIRIS NAC images with our stereo requirements (15° < stereo angles < 45°, incidence angles < 85°, emission angles < 45°, differences in illumination < 10°, scale better than 5 m/pixel) and extracted about 220 NAC images that provide at least triple stereo image coverage for the entire illuminated surface in about 250 independent multi-stereo image combinations. For each image combination we determined tie points by multi-image matching in order to set up a 3D control network and a dense surface point cloud for the precise reconstruction of C-G's shape. The control point network defines the input for a stereo-photogrammetric least squares adjustment. Based on the statistical analysis of the adjustments, we first refined C-G's rotational state (pole orientation and rotational period) and its behavior over time. Based upon this description of the orientation of C-G's body-fixed reference frame, we derived corrections for the nominal navigation data (pointing and position) within a final stereo-photogrammetric block adjustment, where the mean 3D point accuracy of more than 100 million surface points has been improved from ~10 m to the sub-meter range. We finally applied point filtering and interpolation techniques to these surface 3D points and show the resulting SPG-based 3D surface model with a lateral sampling rate of about 2 m.
Brückner, Hans-Peter; Spindeldreier, Christian; Blume, Holger
2013-01-01
A common approach for high-accuracy sensor fusion based on 9D inertial measurement unit data is Kalman filtering. State-of-the-art floating-point filter algorithms differ in their computational complexity; nevertheless, real-time operation on a low-power microcontroller at high sampling rates is not possible. This work presents algorithmic modifications to reduce the computational demands of a two-step minimum order Kalman filter. Furthermore, the required bit-width of a fixed-point filter version is explored. For evaluation, real-world data captured using an Xsens MTx inertial sensor are used. Changes in computational latency and orientation estimation accuracy due to the proposed algorithmic modifications and fixed-point number representation are evaluated in detail on a variety of processing platforms, enabling on-board processing on wearable sensor platforms.
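The bit-width exploration can be illustrated with a toy Q-format model; the Q15 choice and helper names below are illustrative, not the format selected in the paper:

```python
def to_q(x, frac_bits=15):
    """Quantize a float to a signed fixed-point integer with frac_bits
    fractional bits (saturating Q-format, e.g. Q15 in 16-bit registers)."""
    scale = 1 << frac_bits
    v = int(round(x * scale))
    lo, hi = -(1 << 15), (1 << 15) - 1
    return max(lo, min(hi, v))

def q_mul(a, b, frac_bits=15):
    """Fixed-point multiply: widen, multiply, then shift back."""
    return (a * b) >> frac_bits

def from_q(v, frac_bits=15):
    """Convert a Q-format integer back to a float."""
    return v / (1 << frac_bits)

# Quantization error for a representative filter-gain product.
a, b = 0.7071, -0.25
err = a * b - from_q(q_mul(to_q(a), to_q(b)))
print(f"Q15 product error: {err:.2e}")
```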
Durable Hybrid Coatings. Annual Performance Report (2008)
2008-09-01
...points, ensuring stabilization of the reading before moving to the next point. Two different thermogravimetric analysis (TGA) methods were...aluminum alloy (Al 2024). Mg-rich primers based on a hybrid organic-inorganic binder derived from silica nanoparticles and...phenethyltrimethoxysilane gave excellent corrosion protection of Al 2024-T3. Work has continued on these coatings with particular emphasis on the silica nanoparticle...
Person Fit Analysis in Computerized Adaptive Testing Using Tests for a Change Point
ERIC Educational Resources Information Center
Sinharay, Sandip
2016-01-01
Meijer and van Krimpen-Stoop noted that the number of person-fit statistics (PFSs) that have been designed for computerized adaptive tests (CATs) is relatively modest. This article partially addresses that concern by suggesting three new PFSs for CATs. The statistics are based on tests for a change point and can be used to detect an abrupt change…
Latour, Ewa; Latour, Marek; Arlet, Jarosław; Adach, Zdzisław; Bohatyrewicz, Andrzej
2011-07-01
Analysis of pedobarographical data requires geometric identification of specific anatomical areas extracted from recorded plantar pressures. This approach has led to ambiguity in measurements that may underlie the inconsistency of conclusions reported in pedobarographical studies. The goal of this study was to design a new analysis method less susceptible to the projection accuracy of anthropometric points and distance estimation, based on rarely used spatio-temporal indices. Six pedobarographic records per person (three per foot) from a group of 60 children aged 11-12 years were obtained and analyzed. The basis of the analysis was a mutual relationship between two spatio-temporal indices created by excursion of the peak pressure point and the center-of-pressure point on the dynamic pedobarogram. Classification of weight-shift patterns was elaborated and performed, and their frequencies of occurrence were assessed. This new method allows an assessment of body weight shift through the plantar pressure surface based on distribution analysis of spatio-temporal indices not affected by the shape of this surface. Analysis of the distribution of the created index confirmed the existence of typical ways of weight shifting through the plantar surface of the foot during gait, as well as large variability of the intrasubject occurrence. This method may serve as the basis for interpretation of foot functional features and may extend the clinical usefulness of pedobarography. Copyright © 2011 Elsevier B.V. All rights reserved.
Fast Edge Detection and Segmentation of Terrestrial Laser Scans Through Normal Variation Analysis
NASA Astrophysics Data System (ADS)
Che, E.; Olsen, M. J.
2017-09-01
Terrestrial Laser Scanning (TLS) utilizes light detection and ranging (lidar) to effectively and efficiently acquire point cloud data for a wide variety of applications. Segmentation is a common procedure of post-processing to group the point cloud into a number of clusters to simplify the data for the sequential modelling and analysis needed for most applications. This paper presents a novel method to rapidly segment TLS data based on edge detection and region growing. First, by computing the projected incidence angles and performing the normal variation analysis, the silhouette edges and intersection edges are separated from the smooth surfaces. Then a modified region growing algorithm groups the points lying on the same smooth surface. The proposed method efficiently exploits the gridded scan pattern utilized during acquisition of TLS data from most sensors and takes advantage of parallel programming to process approximately 1 million points per second. Moreover, the proposed segmentation does not require estimation of the normal at each point, which limits the errors in normal estimation propagating to segmentation. Both an indoor and outdoor scene are used for an experiment to demonstrate and discuss the effectiveness and robustness of the proposed segmentation method.
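Because the scan grid makes neighbourhood queries simple array indexing, the region-growing stage can be sketched compactly. The toy version below (thresholds, names, and the 4-neighbour rule are illustrative assumptions) grows labels across grid points whose normals agree, skipping edge-flagged points:

```python
import numpy as np
from collections import deque

def segment_grid(normals, edge_mask, angle_thresh_deg=5.0):
    """Group grid points lying on the same smooth surface: BFS region
    growing over 4-neighbours whose normals differ by less than a
    threshold, skipping points flagged as edges."""
    h, w, _ = normals.shape
    labels = -np.ones((h, w), dtype=int)
    cos_t = np.cos(np.radians(angle_thresh_deg))
    next_label = 0
    for r in range(h):
        for c in range(w):
            if labels[r, c] != -1 or edge_mask[r, c]:
                continue
            labels[r, c] = next_label
            queue = deque([(r, c)])
            while queue:
                i, j = queue.popleft()
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    u, v = i + di, j + dj
                    if 0 <= u < h and 0 <= v < w and labels[u, v] == -1 \
                            and not edge_mask[u, v] \
                            and np.dot(normals[i, j], normals[u, v]) > cos_t:
                        labels[u, v] = next_label
                        queue.append((u, v))
            next_label += 1
    return labels

# Toy grid: two flat patches separated by an edge column.
n = np.zeros((4, 6, 3)); n[:, :3] = [0, 0, 1]; n[:, 3:] = [0, 1, 0]
edges = np.zeros((4, 6), dtype=bool); edges[:, 3] = True
print(segment_grid(n, edges))
```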
Dynamic Analysis of the Melanoma Model: From Cancer Persistence to Its Eradication
NASA Astrophysics Data System (ADS)
Starkov, Konstantin E.; Jimenez Beristain, Laura
In this paper, we study the global dynamics of the five-dimensional melanoma model developed by Kronik et al. This model describes interactions of tumor cells with cytotoxic T cells and the respective cytokines under cellular immunotherapy. We derive ultimate upper and lower bounds for the variables of this model, provide formulas for the equilibrium points, and present conditions for local asymptotic stability/hyperbolic instability. Next, we prove the existence of an attracting set. Based on these results, we obtain conditions for global asymptotic melanoma eradication via global stability analysis. Finally, we provide bounds for the locus of the melanoma persistence equilibrium point, study the case of melanoma persistence, and describe conditions under which we observe global attractivity to the unique melanoma persistence equilibrium point.
Probabilistic peak detection in CE-LIF for STR DNA typing.
Woldegebriel, Michael; van Asten, Arian; Kloosterman, Ate; Vivó-Truyols, Gabriel
2017-07-01
In this work, we present a novel probabilistic peak detection algorithm based on a Bayesian framework for forensic DNA analysis. The proposed method aims at an exhaustive use of raw electropherogram data from a laser-induced fluorescence multi-CE system. As the raw data are informative up to a single data point, the conventional threshold-based approaches discard relevant forensic information early in the data analysis pipeline. Our proposed method assigns a posterior probability reflecting the data point's relevance with respect to peak detection criteria. Peaks of low intensity generated from a truly existing allele can thus constitute evidential value instead of fully discarding them and contemplating a potential allele drop-out. This way of working utilizes the information available within each individual data point and thus avoids making early (binary) decisions on the data analysis that can lead to error propagation. The proposed method was tested and compared to the application of a set threshold as is current practice in forensic STR DNA profiling. The new method was found to yield a significant improvement in the number of alleles identified, regardless of the peak heights and deviation from Gaussian shape. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
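The per-point posterior idea can be illustrated with a two-hypothesis Bayes rule, baseline noise versus a raised signal; a toy sketch, far simpler than the authors' full model:

```python
import numpy as np
from scipy.stats import norm

def peak_posterior(y, sigma_noise, peak_height, prior_peak=0.01):
    """Posterior probability that each data point belongs to a peak,
    comparing a flat-baseline hypothesis against a raised-signal
    hypothesis (two-component toy model, not the full CE-LIF model)."""
    like_noise = norm.pdf(y, loc=0.0, scale=sigma_noise)
    like_peak = norm.pdf(y, loc=peak_height, scale=sigma_noise)
    num = prior_peak * like_peak
    return num / (num + (1 - prior_peak) * like_noise)

t = np.linspace(0, 1, 200)
y = 0.05 * np.random.default_rng(1).standard_normal(200)
y += 0.4 * np.exp(-0.5 * ((t - 0.5) / 0.01) ** 2)  # a low-intensity peak
post = peak_posterior(y, sigma_noise=0.05, peak_height=0.4)
print(f"max posterior near the peak: {post[90:110].max():.3f}")
```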
Contour matching for a fish recognition and migration-monitoring system
NASA Astrophysics Data System (ADS)
Lee, Dah-Jye; Schoenberger, Robert B.; Shiozawa, Dennis; Xu, Xiaoqian; Zhan, Pengcheng
2004-12-01
Fish migration is monitored year round to provide valuable information for the study of behavioral responses of fish to environmental variations. However, currently all monitoring is done by human observers. An automatic fish recognition and migration monitoring system is more efficient and can provide more accurate data. Such a system includes automatic fish image acquisition, contour extraction, fish categorization, and data storage. Shape is a very important characteristic, and both shape analysis and shape matching were studied for fish recognition. Previous work focused on finding critical landmark points on the fish shape using curvature function analysis. Fish recognition based on landmark points has shown satisfying results; however, the main difficulty of this approach is that landmark points sometimes cannot be located very accurately. Whole-shape matching is used for fish recognition in this paper. Several shape descriptors, such as Fourier descriptors, polygon approximation, and line segments, are tested. A power cepstrum technique has been developed to improve categorization speed using contours represented in tangent space with normalized length. Design and integration, including image acquisition, contour extraction, and fish categorization, are discussed in this paper. Fish categorization results based on shape analysis and shape matching are also included.
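Of the descriptors mentioned, Fourier descriptors are the simplest to sketch: treat the closed contour as a complex sequence, take its FFT, and normalize away position, scale, rotation, and starting point. A standard construction for illustration, not the paper's exact pipeline:

```python
import numpy as np

def fourier_descriptors(contour_xy, n_coeffs=16):
    """Translation-, scale- and rotation-invariant Fourier descriptors
    of a closed contour given as an (N, 2) array of boundary points."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]
    F = np.fft.fft(z)
    F[0] = 0.0       # drop DC term -> translation invariance
    mags = np.abs(F)  # magnitudes -> rotation/start-point invariance
    mags /= mags[1]   # divide by first harmonic -> scale invariance
    return mags[2:2 + n_coeffs]

# Toy contour: an ellipse sampled at 64 boundary points (its higher
# harmonics are ~0, since an ellipse is essentially one harmonic).
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
ellipse = np.column_stack([2.0 * np.cos(t), 1.0 * np.sin(t)])
print(fourier_descriptors(ellipse, n_coeffs=4))
```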
High‐resolution trench photomosaics from image‐based modeling: Workflow and error analysis
Reitman, Nadine G.; Bennett, Scott E. K.; Gold, Ryan D.; Briggs, Richard; Duross, Christopher
2015-01-01
Photomosaics are commonly used to construct maps of paleoseismic trench exposures, but the conventional process of manually using image-editing software is time consuming and produces undesirable artifacts and distortions. Herein, we document and evaluate the application of image-based modeling (IBM) for creating photomosaics and 3D models of paleoseismic trench exposures, illustrated with a case-study trench across the Wasatch fault in Alpine, Utah. Our results include a structure-from-motion workflow for the semiautomated creation of seamless, high-resolution photomosaics designed for rapid implementation in a field setting. Compared with conventional manual methods, the IBM photomosaic method provides a more accurate, continuous, and detailed record of paleoseismic trench exposures in approximately half the processing time and 15%–20% of the user input time. Our error analysis quantifies the effect of the number and spatial distribution of control points on model accuracy. For this case study, an ∼87 m² exposure of a benched trench photographed at viewing distances of 1.5–7 m yields a model with <2 cm root mean square error (RMSE) with as few as six control points. RMSE decreases as more control points are implemented, but the gains in accuracy are minimal beyond 12 control points. Spreading control points throughout the target area helps to minimize error. We propose that 3D digital models and corresponding photomosaics should be standard practice in paleoseismic exposure archiving. The error analysis serves as a guide for future investigations that seek balance between speed and accuracy during photomosaic and 3D model construction.
Automated search of control points in surface-based morphometry.
Canna, Antonietta; Russo, Andrea G; Ponticorvo, Sara; Manara, Renzo; Pepino, Alessandro; Sansone, Mario; Di Salle, Francesco; Esposito, Fabrizio
2018-04-16
Cortical surface-based morphometry is based on a semi-automated analysis of structural MRI images. In FreeSurfer, a widespread tool for surface-based analyses, a visual check of gray-white matter borders is followed by the manual placement of control points to drive the topological correction (editing) of segmented data. A novel algorithm combining radial sampling and machine learning is presented for the automated control point search (ACPS). Four data sets with 3 T MRI structural images were used for ACPS validation, including raw data acquired twice in 36 healthy subjects and both raw and FreeSurfer preprocessed data of 125 healthy subjects from public databases. The unedited data from a subgroup of subjects were submitted to manual control point search and editing. The ACPS algorithm was trained on manual control points and tested on new (unseen) unedited data. Cortical thickness (CT) and fractal dimensionality (FD) were estimated in three data sets by reconstructing surfaces from both unedited and edited data, and the effects of editing were compared between manual and automated editing and versus no editing. The ACPS-based editing improved the surface reconstructions similarly to manual editing. Compared to no editing, ACPS-based and manual editing significantly reduced CT and FD in consistent regions across different data sets. Despite the extra processing of control point driven reconstructions, CT and FD estimates were highly reproducible in almost all cortical regions, albeit some problematic regions (e.g. entorhinal cortex) may benefit from different editing. The use of control points improves the surface reconstruction and the ACPS algorithm can automate their search reducing the burden of manual editing. Copyright © 2018 Elsevier Inc. All rights reserved.
Chhabra, Anmol; Quinn, Andrea; Ries, Amanda
2018-01-01
Accurate history collection is integral to medication reconciliation. Studies support pharmacy involvement in the process, but assessment of global time spent is limited. The authors hypothesized that the location of a medication-focused interview would impact time spent. The objective was to compare time spent by pharmacists and nurses based on the location of a medication-focused interview. Time spent by the interviewing pharmacist, admitting nurse, and centralized pharmacist verifying admission orders was collected. Patient groups were based on whether the interview was conducted in the emergency department (ED) or on the medical floor. The primary end point was a composite of the 3 time points. Secondary end points were individual time components and the number and types of transcription discrepancies identified during medical floor interviews. Pharmacists and nurses spent an average of ten fewer minutes per ED patient versus a medical floor patient (P = .028). Secondary end points were not statistically significant. Transcription discrepancies were identified at a rate of 1 in 4 medications. Post hoc analysis revealed the time spent by pharmacists and nurses was 2.4 minutes shorter per medication when interviewed in the ED (P < .001). The primary outcome was statistically and clinically significant. Limitations included inability to blind and lack of cost-saving analysis. Pharmacist involvement in ED medication reconciliation leads to time savings during the admission process.
Zinc finger point mutations within the WT1 gene in Wilms tumor patients.
Little, M H; Prosser, J; Condie, A; Smith, P J; Van Heyningen, V; Hastie, N D
1992-01-01
A proposed Wilms tumor gene, WT1, which encodes a zinc finger protein, has previously been isolated from human chromosome 11p13. Chemical mismatch cleavage analysis was used to identify point mutations in the zinc finger region of this gene in a series of 32 Wilms tumors. Two exonic single base changes were detected. In zinc finger 3 of a bilateral Wilms tumor patient, a constitutional de novo C→T base change was found, changing an arginine to a stop codon. One tumor from this patient showed allele loss leading to 11p hemizygosity of the abnormal allele. In zinc finger 2 of a sporadic Wilms tumor patient, a C→T base change resulted in an arginine to cysteine amino acid change. To our knowledge, a WT1 gene missense mutation has not been detected previously in a Wilms tumor. By comparison with a recent NMR and x-ray crystallographic analysis of an analogous zinc finger gene, early growth response gene 1 (EGR1), this amino acid change in WT1 occurs at a residue predicted to be critical for DNA binding capacity and site specificity. The detection of one nonsense point mutation and one missense WT1 gene point mutation adds to the accumulating evidence implicating this gene in a proportion of Wilms tumor patients. PMID:1317572
Rindermann, Heiner; Thompson, James
2016-01-01
Immigration, immigration policies and education of immigrants alter competence levels. This study analysed their effects using PISA, TIMSS and PIRLS data (1995 to 2012, N=93 nations) for natives' and immigrants' competences, competence gaps and their population proportions. The mean gap is equivalent to 4.71 IQ points. There are large differences across countries in these gaps ranging from around +12 to -10 IQ points. Migrants' proportions grow roughly 4% per decade. The largest immigrant-based 'brain gains' are observed for Arabian oil-based economies, and the largest 'brain losses' for Central Europe. Regarding causes of native-immigrant gaps, language problems do not seem to explain them. However, English-speaking countries show an advantage. Acculturation within one generation and intermarriage usually reduce native-immigrant gaps (≅1 IQ point). National educational quality reduces gaps, especially school enrolment at a young age, the use of tests and school autonomy. A one standard deviation increase in school quality represents a closing of around 1 IQ point in the native-immigrant gap. A new Greenwich IQ estimation based on UK natives' cognitive ability mean is recommended. An analysis of the first adult OECD study PIAAC revealed that larger proportions of immigrants among adults reduce average competence levels and positive Flynn effects. The effects on economic development and suggestions for immigration and educational policy are discussed.
Students' Perceptions of Their ICT-Based College English Course in China: A Case Study
ERIC Educational Resources Information Center
Zinan, Wen; Sai, George Teoh Boon
2017-01-01
This study investigated foreign language students' perceptions about their Information and Communication Technology (ICT)-based College English Course (CEC) in China. The research used a five-point Likert-scale questionnaire based on Simsek (2008). A factor analysis confirmed the construct validity of the questionnaire and 6 factors were…
[Evaluation of view points in forest park based on landscape sensitivity].
Zhou, Rui; Li, Yue-hui; Hu, Yuan-man; Liu, Miao
2008-11-01
Based on topographical characteristics, five factors, including comparative slope, comparative distance, mutual visibility, vision probability, and striking degree, were chosen to assess the landscape sensitivity of major view points in Houshi National Forest Park. Spatial analysis in GIS was used to explore the theory and method of landscape sensitivity of view points. The results showed that the Park contained 23 view points in total, but none of them reached class I. Among the 23 points, 10 were of class II, accounting for 43.5% of the total; 8 were of class III, accounting for 34.8%; and 5 were of classes IV and V, accounting for 21.7%. Around the view points of class II, the landscape should be strictly protected to maintain its natural features; around the view points of class III, human-made landscape points should be developed according to the natural landscape character, and wide tourism roads and small-size buildings could be constructed, provided the style of the buildings is harmonious with the surrounding natural landscape; around the view points of classes IV and V, large-size multifunctional facilities and roads could be built to complement the natural landscape. Through this multi-perspective and quantitative evaluation of landscape sensitivity, the study enriched the theory of landscape visual assessment and landscape perception, and provided a scientific basis and direction for the planning and management of forest parks and other tourism areas.
Paper-based analytical devices for environmental analysis.
Meredith, Nathan A; Quinn, Casey; Cate, David M; Reilly, Thomas H; Volckens, John; Henry, Charles S
2016-03-21
The field of paper-based microfluidics has experienced rapid growth over the past decade. Microfluidic paper-based analytical devices (μPADs), originally developed for point-of-care medical diagnostics in resource-limited settings, are now being applied in new areas, such as environmental analysis. Low-cost paper sensors show great promise for on-site environmental analysis; ongoing research complements existing instrumental techniques by providing high spatial and temporal resolution for environmental monitoring. This review highlights recent applications of μPADs for environmental analysis along with technical advances that may enable μPADs to be more widely implemented in field testing.
NASA Astrophysics Data System (ADS)
Didenko, A. N.; Nosyrev, M. Yu.; Shevchenko, B. F.; Gilmanova, G. Z.
2017-11-01
The depth of the base of the magnetoactive layer and the geothermal gradient in the Sikhote Alin crust are estimated using a method that determines the Curie point depth of magnetoactive masses through spectral analysis of the anomalous magnetic field. A detailed map of the geothermal gradient is constructed for the first time for the Sikhote Alin and adjacent areas of the Central Asian belt. Analysis of this map shows that the zones with a higher geothermal gradient geographically fit the areas with a higher level of seismicity.
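Such spectral estimates are commonly obtained with the centroid method; the relations below follow that standard formulation, assumed here since the abstract does not give the exact equations:

```latex
% Radially averaged power spectrum P(k) of the anomalous magnetic field gives
% the top Z_t and centroid Z_0 of the magnetic layer from two spectral slopes:
\ln\big[P(k)^{1/2}\big] \approx A - |k|\,Z_t \quad (\text{high wavenumbers}),
\qquad
\ln\big[P(k)^{1/2}/|k|\big] \approx B - |k|\,Z_0 \quad (\text{low wavenumbers}).
% Curie point depth (base of the magnetoactive layer) and geothermal gradient,
% with Curie temperature T_C \approx 580\,^{\circ}\mathrm{C} for magnetite:
Z_b = 2Z_0 - Z_t, \qquad \frac{\mathrm{d}T}{\mathrm{d}z} \approx \frac{T_C}{Z_b}.
```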
Optical texture analysis for automatic cytology and histology: a Markovian approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pressman, N.J.
1976-10-12
Markovian analysis is a method to measure optical texture based on gray-level transition probabilities in digitized images. The experiments described in this dissertation investigate the classification performance of parameters generated by this method. Three types of data sets are used: images of (1) human blood leukocytes (nuclei of monocytes, neutrophils, and lymphocytes; Wright stain; (0.125 μm)²/picture point), (2) cervical exfoliative cells (nuclei of normal intermediate squamous cells and dysplastic and carcinoma in situ cells; azure-A/Feulgen stain; (0.125 μm)²/picture point), and (3) lymph-node tissue sections (6-μm thick sections from normal, acute lymphadenitis, and Hodgkin lymph nodes; hematoxylin and eosin stain; (0.625 μm)²/picture point). Each image consists of 128 × 128 picture points originally scanned with a 256 gray-level resolution. Each image class is defined by 75 images.
Clinically relevant advances in on-chip affinity-based electrophoresis and electrochromatography.
Hou, Chenlu; Herr, Amy E
2008-08-01
Clinical and point-of-care disease diagnostics promise to play an important role in personalized medicine, new approaches to global health, and health monitoring. Emerging instrument platforms based on lab-on-a-chip technology can confer performance advantages successfully exploited in electrophoresis and electrochromatography to affinity-based electrokinetic separations. This review surveys lab-on-a-chip diagnostic developments in affinity-based electrokinetic separations for quantitation of proteins, integration of preparatory functions needed for subsequent analysis of diverse biological samples, and initial forays into multiplexed analyses. The technologies detailed here underpin new clinical and point-of-care diagnostic strategies. The techniques and devices promise to advance translation of until now laboratory-based sample preparation and analytical assays to near-patient settings.
Floquet stability analysis of the longitudinal dynamics of two hovering model insects
Wu, Jiang Hao; Sun, Mao
2012-01-01
Because of the periodically varying aerodynamic and inertial forces of the flapping wings, a hovering or constant-speed flying insect is a cyclically forcing system, and, generally, the flight is not in a fixed-point equilibrium, but in a cyclic-motion equilibrium. Current stability theory of insect flight is based on the averaged model and treats the flight as a fixed-point equilibrium. In the present study, we treated the flight as a cyclic-motion equilibrium and used the Floquet theory to analyse the longitudinal stability of insect flight. Two hovering model insects were considered—a dronefly and a hawkmoth. The former had relatively high wingbeat frequency and small wing-mass to body-mass ratio, and hence very small amplitude of body oscillation; while the latter had relatively low wingbeat frequency and large wing-mass to body-mass ratio, and hence relatively large amplitude of body oscillation. For comparison, analysis using the averaged-model theory (fixed-point stability analysis) was also made. Results of both the cyclic-motion stability analysis and the fixed-point stability analysis were tested by numerical simulation using complete equations of motion coupled with the Navier–Stokes equations. The Floquet theory (cyclic-motion stability analysis) agreed well with the simulation for both the model dronefly and the model hawkmoth; but the averaged-model theory gave good results only for the dronefly. Thus, for an insect with relatively large body oscillation at wingbeat frequency, cyclic-motion stability analysis is required, and for their control analysis, the existing well-developed control theories for systems of fixed-point equilibrium are no longer applicable and new methods that take the cyclic variation of the flight dynamics into account are needed. PMID:22491980
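For reference, the Floquet criterion for a wingbeat-periodic system can be stated compactly; the notation below is assumed, not taken from the paper:

```latex
% Linearized dynamics about the cyclic-motion equilibrium, with wingbeat
% period T:
\dot{\mathbf{x}} = \mathbf{A}(t)\,\mathbf{x}, \qquad \mathbf{A}(t+T) = \mathbf{A}(t).
% Integrating the fundamental matrix \Phi over one period, \Phi(0) = I, gives
% the monodromy matrix M = \Phi(T); its eigenvalues \mu_i are the Floquet
% multipliers, and the periodic orbit is asymptotically stable when
|\mu_i| < 1 \quad \text{for all } i.
```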
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Bednarcyk, Brett A.; Hussain, Aquila; Katiyar, Vivek
2010-01-01
A unified framework is presented that enables coupled multiscale analysis of composite structures and associated graphical pre- and postprocessing within the Abaqus/CAE environment. The recently developed, free Finite Element Analysis--Micromechanics Analysis Code (FEAMAC) software couples NASA's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with Abaqus/Standard and Abaqus/Explicit to perform micromechanics-based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. The Graphical User Interfaces (FEAMAC-Pre and FEAMAC-Post), developed through collaboration between SIMULIA Erie and the NASA Glenn Research Center, enable users to employ a new FEAMAC module within Abaqus/CAE that provides access to the composite microscale. FEAMAC-Pre is used to define and store constituent material properties, set up and store composite repeating unit cells, and assign composite materials as sections, with all data being stored within the CAE database. Likewise, FEAMAC-Post enables multiscale field quantity visualization (contour plots, X-Y plots), with point-and-click access to the microscale (i.e., fiber and matrix fields).
Testability analysis on a hydraulic system in a certain equipment based on simulation model
NASA Astrophysics Data System (ADS)
Zhang, Rui; Cong, Hua; Liu, Yuanhong; Feng, Fuzhou
2018-03-01
Aiming at the problem of complicated structure and the shortage of fault statistics information in hydraulic systems, a multi-value testability analysis method based on a simulation model is proposed. Using a simulation model built in AMESim, the method injects simulated faults and records the variation of test parameters, such as pressure and flow rate, at each test point compared with those under normal conditions. A multi-value fault-test dependency matrix is thus established. Then the fault detection rate (FDR) and fault isolation rate (FIR) are calculated from the dependency matrix. Finally, the testability and fault diagnosis capability of the system are analyzed and evaluated; they reach only 54% (FDR) and 23% (FIR). To improve the testability performance of the system, the number and positions of the test points are then optimized. Results show the proposed test placement scheme can be used to address the difficulty, inefficiency, and high cost of system maintenance.
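Given the binary fault-test dependency matrix, FDR and FIR follow directly; a minimal sketch assuming equal fault likelihoods (weighted failure rates are more common in practice):

```python
import numpy as np

def fdr_fir(D):
    """Fault detection/isolation rates from a fault x test dependency
    matrix D (D[i, j] = 1 if test j responds to fault i), assuming
    equal fault likelihoods."""
    D = np.asarray(D, dtype=int)
    detected = D.any(axis=1)        # fault seen by at least one test
    fdr = detected.mean()
    # A detected fault is isolable if its test signature is unique.
    sigs = [tuple(row) for row in D]
    isolated = [bool(d) and sigs.count(s) == 1 for d, s in zip(detected, sigs)]
    fir = np.mean(isolated) / fdr if fdr else 0.0
    return fdr, fir

D = [[1, 0, 0],   # fault 1: unique signature -> isolable
     [0, 1, 0],   # fault 2: same signature as fault 3 -> detectable only
     [0, 1, 0],   # fault 3: same signature as fault 2 -> detectable only
     [0, 0, 0]]   # fault 4: undetected
fdr, fir = fdr_fir(D)
print(f"FDR = {fdr:.0%}, FIR = {fir:.0%}")
```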
Lee, Sangdae; Kim, Giyoung; Moon, Jihea
2013-01-01
This study was conducted to develop a simple, rapid, and accurate lateral flow immunoassay (LFIA) detection method for point-of-care diagnosis. The one-dot LFIA for aflatoxin B1 (AFB1) was based on a modified competitive binding format using competition between AFB1 and a colloidal gold-AFB1-BSA conjugate for antibody binding sites in the test zone. A Smartphone-based reading system was developed, consisting of a Samsung Galaxy S2 Smartphone, an LFIA reader, and a Smartphone application for image acquisition and data analysis. The detection limit of the one-dot LFIA for AFB1 is 5 μg/kg. This method provided semi-quantitative analysis of AFB1 samples in the range of 5 to 1,000 μg/kg. Using a combination of the one-dot LFIA and the Smartphone-based reading system, it is possible to conduct a faster and more accurate point-of-care diagnosis. PMID:23598499
Sample to answer visualization pipeline for low-cost point-of-care blood cell counting
NASA Astrophysics Data System (ADS)
Smith, Suzanne; Naidoo, Thegaran; Davies, Emlyn; Fourie, Louis; Nxumalo, Zandile; Swart, Hein; Marais, Philip; Land, Kevin; Roux, Pieter
2015-03-01
We present a visualization pipeline from sample to answer for point-of-care blood cell counting applications. Effective and low-cost point-of-care medical diagnostic tests provide developing countries and rural communities with accessible healthcare solutions [1], and can be particularly beneficial for blood cell count tests, which are often the starting point in the process of diagnosing a patient [2]. The initial focus of this work is on total white and red blood cell counts, using a microfluidic cartridge [3] for sample processing. Analysis of the processed samples has been implemented by means of two main optical visualization systems developed in-house: 1) a fluidic operation analysis system using high speed video data to determine volumes, mixing efficiency and flow rates, and 2) a microscopy analysis system to investigate homogeneity and concentration of blood cells. Fluidic parameters were derived from the optical flow [4] as well as color-based segmentation of the different fluids using a hue-saturation-value (HSV) color space. Cell count estimates were obtained using automated microscopy analysis and were compared to a widely accepted manual method for cell counting using a hemocytometer [5]. The results using the first iteration microfluidic device [3] showed that the most simple - and thus low-cost - approach for microfluidic component implementation was not adequate as compared to techniques based on manual cell counting principles. An improved microfluidic design has been developed to incorporate enhanced mixing and metering components, which together with this work provides the foundation on which to successfully implement automated, rapid and low-cost blood cell counting tests.
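The colour-based fluid segmentation can be sketched with HSV thresholding in OpenCV; the hue band below is an illustrative placeholder, not the actual reagent colour used in the cartridge:

```python
import cv2
import numpy as np

def fluid_mask(frame_bgr, hue_lo=100, hue_hi=130):
    """Segment a coloured fluid in a video frame by thresholding in HSV
    space (the hue band here is an illustrative blue), returning a binary
    mask and its filled area in pixels."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([hue_lo, 60, 60], dtype=np.uint8)
    upper = np.array([hue_hi, 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    # Morphological opening removes speckle before area/volume estimation.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    return mask, int(cv2.countNonZero(mask))

# Usage on a synthetic frame: a blue fluid slug on a white background.
frame = np.full((120, 160, 3), 255, dtype=np.uint8)
frame[40:80, 30:120] = (200, 80, 0)   # BGR blue-ish region
mask, area = fluid_mask(frame)
print(f"fluid area: {area} px")
```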
Spiral bacterial foraging optimization method: Algorithm, evaluation and convergence analysis
NASA Astrophysics Data System (ADS)
Kasaiezadeh, Alireza; Khajepour, Amir; Waslander, Steven L.
2014-04-01
A biologically-inspired algorithm called Spiral Bacterial Foraging Optimization (SBFO) is investigated in this article. SBFO, previously proposed by the same authors, is a multi-agent, gradient-based algorithm that minimizes both the main objective function (local cost) and the distance between each agent and a temporary central point (global cost). A random jump is included normal to the line connecting each agent to the central point, which produces a vortex around the temporary central point. This random jump also helps to cope with premature convergence, a common issue in swarm-based optimization methods. The most important advantages of this algorithm are as follows: first, the algorithm involves a stochastic type of search with deterministic convergence. Second, as gradient-based methods are employed, faster convergence is demonstrated over GA, DE, BFO, etc. Third, the algorithm can be implemented in a parallel fashion to decentralize large-scale computation. Fourth, the algorithm has a limited number of tunable parameters. Finally, SBFO has a strong certainty of convergence, which is rare in existing global optimization algorithms. A detailed convergence analysis of SBFO for continuously differentiable objective functions is also presented in this article.
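A toy 2-D version of the update described above (gradient descent on the local cost, attraction to the temporary central point, and a random jump normal to the connecting line) might look as follows; the coefficients and the choice of the swarm mean as the central point are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def sbfo_step(x, grad_f, center, rng, eta=0.05, gamma=0.1, jump=0.05):
    """One SBFO-style update in 2-D: descend the local-cost gradient,
    move toward the temporary central point, and add a random jump
    normal to the agent-center line (coefficients are illustrative)."""
    to_center = center - x
    d = to_center / (np.linalg.norm(to_center) + 1e-12)
    normal = np.array([-d[1], d[0]])   # perpendicular direction in 2-D
    return x - eta * grad_f(x) + gamma * to_center \
             + jump * rng.standard_normal() * normal

# Toy run on f(x) = ||x||^2, using the swarm mean as the central point.
rng = np.random.default_rng(3)
grad = lambda x: 2.0 * x
agents = rng.uniform(-2, 2, size=(5, 2))
for _ in range(200):
    center = agents.mean(axis=0)
    agents = np.array([sbfo_step(a, grad, center, rng) for a in agents])
print(np.round(np.abs(agents).max(), 3))   # agents hover near the optimum
```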
Traffic sign detection in MLS acquired point clouds for geometric and image-based semantic inventory
NASA Astrophysics Data System (ADS)
Soilán, Mario; Riveiro, Belén; Martínez-Sánchez, Joaquín; Arias, Pedro
2016-04-01
Nowadays, mobile laser scanning has become a valid technology for infrastructure inspection. This technology permits collecting accurate 3D point clouds of urban and road environments, and the geometric and semantic analysis of such data has become an active research topic in recent years. This paper focuses on the detection of vertical traffic signs in 3D point clouds acquired by a LYNX Mobile Mapper system, comprised of laser scanning and RGB cameras. Each traffic sign is automatically detected in the LiDAR point cloud, and its main geometric parameters can be automatically extracted, thereby aiding the inventory process. Furthermore, the 3D positions of traffic signs are reprojected onto the 2D images, which are spatially and temporally synced with the point cloud. Image analysis allows for recognizing the traffic sign semantics using machine learning approaches. The presented method was tested in road and urban scenarios in Galicia (Spain). The recall results for traffic sign detection are close to 98%, and existing false positives can be easily filtered after point cloud projection. Finally, the lack of a large, publicly available Spanish traffic sign database is pointed out.
Temporal Data Set Reduction Based on D-Optimality for Quantitative FLIM-FRET Imaging.
Omer, Travis; Intes, Xavier; Hahn, Juergen
2015-01-01
Fluorescence lifetime imaging (FLIM) when paired with Förster resonance energy transfer (FLIM-FRET) enables the monitoring of nanoscale interactions in living biological samples. FLIM-FRET model-based estimation methods allow the quantitative retrieval of parameters such as the quenched (interacting) and unquenched (non-interacting) fractional populations of the donor fluorophore and/or the distance of the interactions. The quantitative accuracy of such model-based approaches is dependent on multiple factors such as signal-to-noise ratio and number of temporal points acquired when sampling the fluorescence decays. For high-throughput or in vivo applications of FLIM-FRET, it is desirable to acquire a limited number of temporal points for fast acquisition times. Yet, it is critical to acquire temporal data sets with sufficient information content to allow for accurate FLIM-FRET parameter estimation. Herein, an optimal experimental design approach based upon sensitivity analysis is presented in order to identify the time points that provide the best quantitative estimates of the parameters for a determined number of temporal sampling points. More specifically, the D-optimality criterion is employed to identify, within a sparse temporal data set, the set of time points leading to optimal estimations of the quenched fractional population of the donor fluorophore. Overall, a reduced set of 10 time points (compared to a typical complete set of 90 time points) was identified to have minimal impact on parameter estimation accuracy (≈5%), with in silico and in vivo experiment validations. This reduction of the number of needed time points by almost an order of magnitude allows the use of FLIM-FRET for certain high-throughput applications which would be infeasible if the entire number of time sampling points were used.
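The D-optimal selection can be sketched as a greedy search over candidate time points using the sensitivity (Jacobian) matrix; a simplified illustration assuming the sensitivity matrix is already available (the bi-exponential sensitivities below are toy values):

```python
import numpy as np

def d_optimal_points(S, n_select):
    """Greedily pick rows (time points) of the sensitivity matrix S
    (n_times x n_params) that maximize det(S_sel^T S_sel), the
    D-optimality criterion (simplified sequential search)."""
    chosen, remaining = [], list(range(S.shape[0]))
    for _ in range(n_select):
        best, best_det = None, -np.inf
        for i in remaining:
            idx = chosen + [i]
            d = np.linalg.det(S[idx].T @ S[idx])
            if d > best_det:
                best, best_det = i, d
        chosen.append(best)
        remaining.remove(best)
    return sorted(chosen)

# Toy sensitivities of a bi-exponential decay w.r.t. two decay rates.
t = np.linspace(0.1, 10, 90)
S = np.column_stack([-t * np.exp(-0.5 * t), -t * np.exp(-2.0 * t)])
print(d_optimal_points(S, n_select=10))
```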
Analysis of an optimization-based atomistic-to-continuum coupling method for point defects
Olson, Derek; Shapeev, Alexander V.; Bochev, Pavel B.; ...
2015-11-16
Here, we formulate and analyze an optimization-based Atomistic-to-Continuum (AtC) coupling method for problems with point defects. Application of a potential-based atomistic model near the defect core enables accurate simulation of the defect. Away from the core, where site energies become nearly independent of the lattice position, the method switches to a more efficient continuum model. The two models are merged by minimizing the mismatch of their states on an overlap region, subject to the atomistic and continuum force balance equations acting independently in their domains. We prove that the optimization problem is well-posed and establish error estimates.
A Study of Impact Point Detecting Method Based on Seismic Signal
NASA Astrophysics Data System (ADS)
Huo, Pengju; Zhang, Yu; Xu, Lina; Huang, Yong
The projectile landing position has to be determined for projectile recovery and range measurement in targeting tests. In this paper, a global search method based on velocity variance is proposed. To verify the applicability of this method, a simulation analysis over an area of four million square meters was conducted using the same array structure as the commonly used linear positioning method, and MATLAB was used to compare and analyze the two methods. The simulation results show that the global search method based on velocity variance has high positioning accuracy and stability, and can meet the needs of impact point location.
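A minimal version of the velocity-variance search is sketched below: for each candidate grid point, the implied propagation speed to every sensor is computed from the arrival times, and the point with the most consistent (lowest-variance) speeds wins. The geometry, the assumption of a known common time origin, and the names are illustrative:

```python
import numpy as np

def locate_impact(sensors, t_arrival, grid_x, grid_y):
    """Global grid search for the impact point: at the true point the
    distance/arrival-time ratios (apparent seismic speeds) agree across
    sensors, so their variance is minimal. Arrival times are assumed
    measured from the (known) impact instant."""
    best, best_var = None, np.inf
    for x in grid_x:
        for y in grid_y:
            d = np.hypot(sensors[:, 0] - x, sensors[:, 1] - y)
            v = d / t_arrival          # apparent speed per sensor
            if v.var() < best_var:
                best, best_var = (x, y), v.var()
    return best

# Toy example: 4 sensors, impact at (120, 80), speed 300 m/s.
sensors = np.array([[0, 0], [200, 0], [0, 200], [200, 200]], float)
true_pt, v0 = np.array([120.0, 80.0]), 300.0
t = np.hypot(*(sensors - true_pt).T) / v0
grid = np.arange(0, 201, 5.0)
print(locate_impact(sensors, t, grid, grid))
```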
NASA Technical Reports Server (NTRS)
Hyde, G.
1976-01-01
The 13/18 GHz COMSAT Propagation Experiment (CPE) was performed to measure attenuation caused by hydrometeors along slant paths from transmitting terminals on the ground to the ATS-6 satellite. The effectiveness of site diversity in overcoming this impairment was also studied. Problems encountered in assembling a valid data base of rain induced attenuation data for statistical analysis are considered. The procedures used to obtain the various statistics are then outlined. The graphs and tables of statistical data for the 15 dual frequency (13 and 18 GHz) site diversity locations are discussed. Cumulative rain rate statistics for the Fayetteville and Boston sites based on point rainfall data collected are presented along with extrapolations of the attenuation and point rainfall data.
Control strategy of grid-connected photovoltaic generation system based on GMPPT method
NASA Astrophysics Data System (ADS)
Wang, Zhongfeng; Zhang, Xuyang; Hu, Bo; Liu, Jun; Li, Ligang; Gu, Yongqiang; Zhou, Bowen
2018-02-01
There are multiple local maximum power points when a photovoltaic (PV) array runs under partial shading conditions (PSC). However, the traditional maximum power point tracking (MPPT) algorithm can easily be trapped in local maximum power points (MPPs) and fail to find the global maximum power point (GMPP). To solve this problem, an improved global maximum power point tracking (GMPPT) method is proposed, combining the traditional MPPT method with a particle swarm optimization (PSO) algorithm. Under different operating conditions of the PV cells, different tracking algorithms are used: when the environment changes, the improved PSO algorithm performs the global search, and the variable-step incremental conductance (INC) method then achieves MPPT around the local optimum. Based on a simulation model of the grid-connected PV system built in Matlab/Simulink, comparative analysis of the tracking performance of the proposed control algorithm and the traditional MPPT method under uniform solar conditions and PSC validates the correctness, feasibility, and effectiveness of the proposed control strategy.
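The local stage can be illustrated with a variable-step INC update; the step-size law, sign convention, and names below are assumptions for illustration, not the paper's exact design:

```python
def inc_step(v, i, v_prev, i_prev, d_prev, k=0.01):
    """One variable-step incremental-conductance update of the converter
    duty cycle d: at the MPP, dI/dV = -I/V; the step size scales with the
    distance from that condition (simplified illustration; the mapping
    from voltage direction to duty cycle depends on the converter)."""
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        err = di                  # moving along constant voltage
    else:
        err = di / dv + i / v     # incremental + instantaneous conductance
    step = k * abs(err)           # variable step: large far from the MPP
    if err > 0:                   # left of the MPP -> raise operating voltage
        return d_prev - step
    elif err < 0:                 # right of the MPP -> lower operating voltage
        return d_prev + step
    return d_prev                 # at the MPP: hold duty cycle

# One update from a sampled operating point (illustrative numbers).
print(f"new duty cycle: {inc_step(30.2, 5.1, 30.0, 5.2, 0.45):.4f}")
```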
From Invention to Innovation: Risk Analysis to Integrate One Health Technology in the Dairy Farm
Lombardo, Andrea; Boselli, Carlo; Amatiste, Simonetta; Ninci, Simone; Frazzoli, Chiara; Dragone, Roberto; De Rossi, Alberto; Grasso, Gerardo; Mantovani, Alberto; Brajon, Giovanni
2017-01-01
Current Hazard Analysis Critical Control Points (HACCP) approaches mainly fit for food industry, while their application in primary food production is still rudimentary. The European food safety framework calls for science-based support to the primary producers’ mandate for legal, scientific, and ethical responsibility in food supply. The multidisciplinary and interdisciplinary project ALERT pivots on the development of the technological invention (BEST platform) and application of its measurable (bio)markers—as well as scientific advances in risk analysis—at strategic points of the milk chain for time and cost-effective early identification of unwanted and/or unexpected events of both microbiological and toxicological nature. Health-oriented innovation is complex and subject to multiple variables. Through field activities in a dairy farm in central Italy, we explored individual components of the dairy farm system to overcome concrete challenges for the application of translational science in real life and (veterinary) public health. Based on an HACCP-like approach in animal production, the farm characterization focused on points of particular attention (POPAs) and critical control points to draw a farm management decision tree under the One Health view (environment, animal health, food safety). The analysis was based on the integrated use of checklists (environment; agricultural and zootechnical practices; animal health and welfare) and laboratory analyses of well water, feed and silage, individual fecal samples, and bulk milk. The understanding of complex systems is a condition to accomplish true innovation through new technologies. BEST is a detection and monitoring system in support of production security, quality and safety: a grid of its (bio)markers can find direct application in critical points for early identification of potential hazards or anomalies. The HACCP-like self-monitoring in primary production is feasible, as well as the biomonitoring of live food producing animals as sentinel population for One Health. PMID:29218304
Chakraborty, Nalanda; Logan, Kenneth J
To examine the effects of measurement method and transcript availability on the accuracy, reliability, and efficiency of inexperienced raters' stuttering frequency measurements. 44 adults, all inexperienced at evaluating stuttered speech, underwent 20 min of preliminary training in stuttering measurement and then analyzed a series of sentences, with and without access to transcripts of sentence stimuli, using either a syllable-based analysis (SBA) or an utterance-based analysis (UBA). Participants' analyses were compared between groups and to a composite analysis from two experienced evaluators. Stuttering frequency scores from the SBA and UBA groups differed significantly from the experienced evaluators' scores; however, UBA scores were significantly closer to the experienced evaluators' scores and were completed significantly faster than the SBA scores. Transcript availability facilitated scoring accuracy and efficiency in both groups. The internal reliability of stuttering frequency scores was acceptable for the SBA and UBA groups; however, the SBA group demonstrated only modest point-by-point agreement with ratings from the experienced evaluators. Given its accuracy and efficiency advantages over syllable-based analysis, utterance-based fluency analysis appears to be an appropriate context for introducing stuttering frequency measurement to raters who have limited experience in stuttering measurement. To address accuracy gaps between experienced and inexperienced raters, however, use of either analysis must be supplemented with training activities that expose inexperienced raters to the decision-making processes used by experienced raters when identifying stuttered syllables. Copyright © 2018 Elsevier Inc. All rights reserved.
[The added value of information summaries supporting clinical decisions at the point-of-care].
Banzi, Rita; González-Lorenzo, Marien; Kwag, Koren Hyogene; Bonovas, Stefanos; Moja, Lorenzo
2016-11-01
Evidence-based healthcare requires the integration of the best research evidence with clinical expertise and patients' values. International publishers are developing evidence-based information services and resources designed to overcome the difficulties in retrieving, assessing and updating medical information, and to facilitate rapid access to valid clinical knowledge. Point-of-care information summaries are defined as web-based medical compendia that are specifically designed to deliver pre-digested, rapidly accessible, comprehensive, and periodically updated information to health care providers. Their validity must be assessed against marketing claims that they are evidence-based. We periodically evaluate the content development processes of several international point-of-care information summaries. The number of these products has increased along with their quality. The last analysis, done in 2014, identified 26 products and found that three of them (Best Practice, DynaMed and UpToDate) scored the highest across all evaluated dimensions (volume, quality of the editorial process and evidence-based methodology). Point-of-care information summaries, as stand-alone products or integrated with other systems, are gaining ground to support clinical decisions. The choice of one product over another depends both on the properties of the service and on the preferences of users. However, even the most innovative information system must rely on transparent and valid contents. Individuals and institutions should regularly assess the value of point-of-care summaries, as their quality changes rapidly over time.
Meta-Analysis of Scale Reliability Using Latent Variable Modeling
ERIC Educational Resources Information Center
Raykov, Tenko; Marcoulides, George A.
2013-01-01
A latent variable modeling approach is outlined that can be used for meta-analysis of reliability coefficients of multicomponent measuring instruments. Important limitations of efforts to combine composite reliability findings across multiple studies are initially pointed out. A reliability synthesis procedure is discussed that is based on…
Theoretical Analysis of Thermodynamic Measurements near a Liquid-Gas Critical Point
NASA Technical Reports Server (NTRS)
Barmatz, M.; Zhong, Fang; Hahn, Inseob
2003-01-01
Over the years, many ground-based studies have been performed near liquid-gas critical points to elucidate the expected divergences in thermodynamic quantities. The unambiguous interpretation of these studies very near the critical point is hindered by a gravity-induced density stratification. However, these ground-based measurements can give insight into the crossover behavior between the asymptotic critical region near the transition and the mean field region farther away. We have completed a detailed analysis of heat capacity, susceptibility and coexistence curve measurements near the He-3 liquid-gas critical point using the minimal-subtraction renormalization (MSR) scheme within the φ⁴ model. This MSR scheme, using only two adjustable parameters, provides a reasonable global fit to all of these experimental measurements in the gravity-free region out to a reduced temperature of |t| ≈ 2×10⁻². Recently this approach has also been applied to the earlier microgravity measurements of Haupt and Straub in SF₆ with surprising results. The conclusions drawn from the MSR analyses will be presented. Measurements in the gravity-affected region closer to the He-3 critical point have also been analyzed using the recent crossover parametric model (CPM) of the equation of state. The results of fitting heat capacity measurements to the CPM model along the He-3 critical isochore in the gravity-affected region will also be presented.
Wayfinding concept in University of Brawijaya
NASA Astrophysics Data System (ADS)
Firjatullah, H.; Kurniawan, E. B.; Purnamasari, W. D.
2017-06-01
Wayfinding is the activity of orienting oneself and moving from a starting point to a point of destination. In an educational setting, wayfinding helps direct a person to a destination, reducing disorientation and helping users remember routes through the role of space and wayfinding objects. Around 48% of new students entering the University of Brawijaya (UB) in 2015 reported having been lost when first entering the campus. This shows the need for a wayfinding concept for people unfamiliar with the surrounding area, such as freshmen. This study uses mental map analysis to determine the hierarchy of wayfinding objects based on respondents, and space syntax (visual graph analysis) to determine a hierarchy based on the configuration of space. Overlaying the two analyses shows that several objects have wayfinding potential on the UB campus. The resulting wayfinding concept applies different treatments to the selected objects according to their wayfinding classification, both maintaining the function of the object in space and developing the object's wayfinding potential.
An analysis of neural receptive field plasticity by point process adaptive filtering
Brown, Emery N.; Nguyen, David P.; Frank, Loren M.; Wilson, Matthew A.; Solo, Victor
2001-01-01
Neural receptive fields are plastic: with experience, neurons in many brain regions change their spiking responses to relevant stimuli. Analysis of receptive field plasticity from experimental measurements is crucial for understanding how neural systems adapt their representations of relevant biological information. Current analysis methods using histogram estimates of spike rate functions in nonoverlapping temporal windows do not track the evolution of receptive field plasticity on a fine time scale. Adaptive signal processing is an established engineering paradigm for estimating time-varying system parameters from experimental measurements. We present an adaptive filter algorithm for tracking neural receptive field plasticity based on point process models of spike train activity. We derive an instantaneous steepest descent algorithm by using as the criterion function the instantaneous log likelihood of a point process spike train model. We apply the point process adaptive filter algorithm in a study of spatial (place) receptive field properties of simulated and actual spike train data from rat CA1 hippocampal neurons. A stability analysis of the algorithm is sketched in the Appendix. The adaptive algorithm can update the place field parameter estimates on a millisecond time scale. It reliably tracked the migration, changes in scale, and changes in maximum firing rate characteristic of hippocampal place fields in a rat running on a linear track. Point process adaptive filtering offers an analytic method for studying the dynamics of neural receptive fields. PMID:11593043
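To make the update rule concrete, here is a minimal Python sketch of the instantaneous steepest-ascent step described above, applied to the centre of a one-dimensional Gaussian place field. It is not the authors' code: the field shape, learning rate and simulated track are illustrative assumptions; only the innovation-driven update (gradient of the log intensity times observed minus expected spikes) is the technique named in the abstract.

    import numpy as np

    rng = np.random.default_rng(0)
    dt = 0.001                                   # 1 ms bins, matching the millisecond update rate
    t = np.arange(0, 60.0, dt)
    x = 0.5 + 0.5 * np.sin(2 * np.pi * t / 10)   # simulated position on a linear track
    mu_true = 0.3 + 0.4 * t / t[-1]              # the place-field centre slowly migrates
    lam_true = 20 * np.exp(-(x - mu_true) ** 2 / (2 * 0.1 ** 2))
    spikes = rng.random(t.size) < lam_true * dt  # Bernoulli approximation of the point process

    mu, eps, sigma, alpha = 0.5, 0.05, 0.1, np.log(20)
    for k in range(t.size):
        lam = np.exp(alpha - (x[k] - mu) ** 2 / (2 * sigma ** 2))
        dloglam_dmu = (x[k] - mu) / sigma ** 2            # d/dmu of log lambda
        mu += eps * dloglam_dmu * (spikes[k] - lam * dt)  # ascent on the instantaneous log likelihood

    print("tracked centre:", round(mu, 3), "true final centre:", round(mu_true[-1], 3))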
NASA Astrophysics Data System (ADS)
Zhang, Yufeng; Long, Man; Luo, Sida; Bao, Yu; Shen, Hanxia
2015-12-01
Transit route choice models are a key technology for public transit systems planning and management. Traditional route choice models are mostly based on expected utility theory, which has an evident shortcoming: it cannot accurately portray travelers' subjective route choice behavior, because their risk preferences are not taken into consideration. Cumulative prospect theory (CPT) can describe travelers' decision-making under uncertain transit supply and the risk preferences of multiple traveler types. The method used to calibrate the reference point, a key parameter of a CPT-based transit route choice model, determines the precision of the model to a great extent. In this paper, a new method is put forward to obtain the value of the reference point, combining theoretical calculation with field investigation results. Comparing the proposed method with the traditional one shows that the new method improves the quality of the CPT-based model by more accurately simulating travelers' route choice behavior, based on a transit trip investigation from Nanjing City, China. The proposed method is of great significance to sound transit planning and management, and to some extent remedies the defect that the reference point was previously obtained solely through qualitative analysis.
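As an illustration of the ingredients such a model needs, the Python sketch below evaluates routes prospect-theoretically against a reference travel time. The parameter values are the classic Tversky-Kahneman estimates, not this paper's calibration, and the separable weighting used here simplifies the full rank-dependent cumulative form of CPT.

    import numpy as np

    def value(x, alpha=0.88, beta=0.88, lam=2.25):
        # value function: concave for gains, convex and loss-averse for losses
        x = np.asarray(x, float)
        v = np.empty_like(x)
        pos = x >= 0
        v[pos] = x[pos] ** alpha
        v[~pos] = -lam * (-x[~pos]) ** beta
        return v

    def weight(p, gamma=0.61):
        # inverse-S probability weighting
        p = np.asarray(p, float)
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

    def cpt_utility(times, probs, ref):
        # outcomes framed as travel time saved relative to the reference point
        gains = ref - np.asarray(times, float)
        return float(np.sum(weight(probs) * value(gains)))

    # two hypothetical routes with uncertain in-vehicle times (minutes), reference 30 min
    print(cpt_utility([25, 35], [0.7, 0.3], ref=30))   # route A
    print(cpt_utility([28, 31], [0.5, 0.5], ref=30))   # route B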
Manual for automatic generation of finite element models of spiral bevel gears in mesh
NASA Technical Reports Server (NTRS)
Bibel, G. D.; Reddy, S.; Kumar, A.
1994-01-01
The goal of this research is to develop computer programs that generate finite element models suitable for 3D contact analysis of face-milled spiral bevel gears in mesh. A pinion tooth and a gear tooth are created and put in mesh. Two programs, Points.f and Pat.f, perform the analysis. Points.f is based on the equation of meshing for spiral bevel gears. It uses machine tool settings to solve for an N x M mesh of points on the four surfaces: pinion concave and convex, and gear concave and convex. Points.f creates the file POINTS.OUT, an ASCII file containing the N x M points for each surface (N is the number of node points along the length of the tooth, and M is the number along the height). Pat.f reads POINTS.OUT and creates the file tl.out, a series of PATRAN input commands. In addition to the mesh density on the tooth face, user-specified variables include the number of finite elements through the thickness and the number of finite elements along the tooth full fillet. A full fillet is assumed to exist for both the pinion and the gear.
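A downstream consumer of POINTS.OUT might read the grids as in the Python sketch below. The manual does not specify the exact file layout, so the whitespace-separated x y z triples and the surface ordering assumed here are purely illustrative.

    import numpy as np

    def read_points_out(path, n, m, surfaces=4):
        """Read an N x M grid of (x, y, z) points for each tooth surface.

        Assumes plain whitespace-separated x y z triples, written surface
        after surface (pinion concave/convex, gear concave/convex); the
        real POINTS.OUT format may differ.
        """
        data = np.loadtxt(path).reshape(surfaces, n, m, 3)
        return data  # data[s, i, j] -> (x, y, z) of node (i, j) on surface s

    # grids = read_points_out("POINTS.OUT", n=20, m=10)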
Quantum coordinated multi-point communication based on entanglement swapping
NASA Astrophysics Data System (ADS)
Du, Gang; Shang, Tao; Liu, Jian-wei
2017-05-01
In a quantum network, adjacent nodes can communicate with each other point to point by using pre-shared Einstein-Podolsky-Rosen (EPR) pairs, and remote nodes can establish entanglement channels via quantum routing among intermediate nodes. However, with the rapid development of quantum networks, the demand for various message transmission modes among nodes inevitably emerges. To realize this goal and extend quantum networks, we propose a quantum coordinated multi-point communication scheme based on entanglement swapping. The scheme takes full advantage of EPR pairs between adjacent nodes and performs multi-party entanglement swapping to transmit messages. Considering the various demands of communication, all nodes work cooperatively to realize different message transmission modes, including one-to-many, many-to-one and one-to-some. Scheme analysis shows that the proposed scheme can flexibly organize a coordinated group and efficiently use EPR resources, while meeting basic security requirements under the conditions of coordinated communication.
NASA Astrophysics Data System (ADS)
Chinn, Pauline W. U.
2011-03-01
This response to Mitchell and Mueller's "A philosophical analysis of David Orr's theory of ecological literacy" comments on their critique of Orr's use of the phrase "ecological crisis" and what I perceive as their conflicting views of "crisis." I present my views on ecological crisis informed by standpoint theory and the definition of crisis as turning point. I connect the concept of turning point to tipping point as used in ecology to describe potentially irreversible changes in coupled social-ecological systems. I suggest that sustainable societies may provide models of adaptive learning in which monitoring of ecological phenomena is coupled to human behavior to mitigate threats to sustainability before a crisis/tipping point is reached. Finally, I discuss the Hawai`i State Department of Education's removal of its Indigenous science content standard Mālama I Ka `Āina, Sustainability and its continued use in community-based projects.
Smid, Dionne E; Franssen, Frits M E; Houben-Wilke, Sarah; Vanfleteren, Lowie E G W; Janssen, Daisy J A; Wouters, Emiel F M; Spruit, Martijn A
2017-01-01
Pulmonary rehabilitation enhances health status and mood status in patients with chronic obstructive pulmonary disease (COPD). The aim was to determine the responsiveness of the St. George's Respiratory Questionnaire (SGRQ), COPD Assessment Test (CAT), COPD Clinical Questionnaire (CCQ), and Hospital Anxiety and Depression Scale (HADS) to pulmonary rehabilitation in patients with COPD, and to estimate minimum clinically important differences (MCIDs) for the CAT, CCQ, and HADS. This was a prospective analysis. MCIDs were estimated in patients treated in pulmonary rehabilitation, using anchor-based (anchor: SGRQ) and distribution-based methods, and the newly estimated MCIDs were compared to known MCID estimates from a systematic literature search. A subsample of 419 individuals with COPD (55.4% male, mean age 64.3 ± 8.8 years, forced expiratory volume in the first second 37.3% ± 12.1% predicted) from the Chance study completed pulmonary rehabilitation. Health status was measured with the SGRQ, CAT, and CCQ before and after pulmonary rehabilitation; mood status was assessed using the HADS. SGRQ (-9.1 ± 14.0 points), CAT (-3.0 ± 6.8 points), CCQ (-0.6 ± 0.9 points), HADS-Anxiety (-1.7 ± 3.7 points), and HADS-Depression (-2.1 ± 3.7 points) scores improved significantly. New MCIDs were estimated for the CAT (range: -3.8 to -1.0 points), CCQ (range: -0.8 to -0.2 points), HADS-Anxiety (range: -2.0 to -1.1 points), and HADS-Depression (range: -1.8 to -1.4 points). The SGRQ, CAT, CCQ, and HADS are responsive to pulmonary rehabilitation in patients with COPD. We propose MCID estimates ranging between -3.0 and -2.0 points for the CAT, -0.5 and -0.3 points for the CCQ, -1.8 and -1.3 points for HADS-Anxiety, and -1.7 and -1.5 points for HADS-Depression. Copyright © 2016 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
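For orientation, the Python sketch below shows the two families of MCID estimators named above in generic form. The 0.5-SD and one-SEM rules, the assumed reliability, and the SGRQ responder threshold of -4 points are common conventions used here for illustration, not the study's exact computations.

    import numpy as np

    def distribution_mcid(baseline_scores, reliability=0.8):
        # two common distribution-based estimates: half a baseline SD, and one SEM
        sd = np.std(baseline_scores, ddof=1)
        return 0.5 * sd, sd * np.sqrt(1 - reliability)

    def anchor_mcid(target_change, anchor_change, anchor_threshold=-4.0):
        # mean change in the target score among patients who reached the
        # anchor's established MCID (here: SGRQ improvement of at least 4 points)
        target_change = np.asarray(target_change, float)
        responders = np.asarray(anchor_change, float) <= anchor_threshold
        return target_change[responders].mean()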
Method for cold stable biojet fuel
Seames, Wayne S.; Aulich, Ted
2015-12-08
Plant or animal oils are processed to produce a fuel that operates at very cold temperatures and is suitable as an aviation turbine fuel, a diesel fuel, a fuel blendstock, or any fuel having a low cloud point, pour point or freeze point. The process is based on the cracking of plant or animal oils or their associated esters, known as biodiesel, to generate lighter chemical compounds that have substantially lower cloud, pour, and/or freeze points than the original oil or biodiesel. Cracked oil is processed using separation steps together with analysis to collect fractions with desired low temperature properties by removing undesirable compounds that do not possess the desired temperature properties.
Mazaro, José Vitor Quinelli; Gennari Filho, Humberto; Vedovatto, Eduardo; Amoroso, Andressa Paschoal; Pellizzer, Eduardo Piza; Zavanelli, Adriana Cristina
2011-09-01
The purpose of this study was to compare the dental movement that occurs during the processing of maxillary complete dentures with 3 different base thicknesses, using 2 investment methods and microwave polymerization. A sample of 42 denture models was randomly divided into 6 groups (n = 7), with base thicknesses of 1.25, 2.50, and 3.75 mm and gypsum or silicone flask investment. Points were demarcated on the distal surface of the second molars and on the back of the gypsum cast at the alveolar ridge level to allow linear and angular measurement using AutoCAD software. The data were subjected to two-factor analysis of variance and Tukey and Fisher post hoc tests. Angular analysis of the methods and their interactions showed a statistically significant difference (P = 0.023) when the magnitudes of molar inclination were compared. Tooth movement was greater for thin-base prostheses (1.25 mm; -0.234) than for thick ones (3.75 mm; 0.2395), with antagonistic behavior. Prosthesis investment with silicone (0.053) showed greater vertical change compared with gypsum investment (0.032). There were differences among the points of analysis, demonstrating that the changes were not symmetric. All groups showed changes in the position of the artificial teeth after processing. The complete denture with a thin base (1.25 mm) and silicone investment showed the worst results, whereas the intermediate thickness (2.50 mm) was demonstrated to be ideal for the denture base.
Strandberg, Gunnar; Eriksson, Mats; Gustafsson, Mats G; Lipcsey, Miklós; Larsson, Anders
2012-11-01
Intraosseous access is an essential method in emergency medicine when other forms of vascular access are unavailable and there is an urgent need for fluid or drug therapy. A number of publications have discussed the suitability of intraosseous access for laboratory testing. We aimed to further evaluate this issue and to study the accuracy and precision of intraosseous measurements. Five healthy, anaesthetised pigs were instrumented with bilateral tibial intraosseous cannulae and an arterial catheter. Samples were collected hourly for 6 h and analysed for blood gases, acid-base status, haemoglobin and electrolytes using an i-STAT point-of-care analyser. There was no clinically relevant difference between results from the left and right intraosseous sites. The variability of the intraosseous sample values, measured as the coefficient of variation (CV), was at most 11%, and smaller than that of the arterial sample values for all variables except SO2. For most variables, there seems to be some degree of systematic difference between intraosseous and arterial results; however, the direction of this difference seems to be predictable. Based on our findings in this animal model, cartridge-based point-of-care instruments appear suitable for the analysis of intraosseous samples. The agreement between intraosseous and arterial analysis seems good enough for the method to be clinically useful. The precision, quantified in terms of CV, is at least as good for intraosseous as for arterial analysis. There is no clinically important difference between samples from the left and right tibia, indicating good reproducibility. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Applications of inertial-sensor high-inheritance instruments to DSN precision antenna pointing
NASA Technical Reports Server (NTRS)
Goddard, R. E.
1992-01-01
Laboratory test results of the initialization and tracking performance of an existing inertial-sensor-based instrument are given. The instrument, although not primarily designed for precision antenna pointing applications, demonstrated an on-average 10-hour tracking error of several millidegrees. The system-level instrument performance is shown by analysis to be sensor limited. Simulated instrument improvements show a tracking error of less than 1 mdeg, which would provide acceptable performance, i.e., low pointing loss, for the Deep Space Network 70-m antenna subnetwork, operating at Ka-band (1-cm wavelength).
Habitat classification modeling with incomplete data: Pushing the habitat envelope
Zarnetske, P.L.; Edwards, T.C.; Moisen, Gretchen G.
2007-01-01
Habitat classification models (HCMs) are invaluable tools for species conservation, land-use planning, reserve design, and metapopulation assessments, particularly at broad spatial scales. However, species occurrence data are often lacking and typically limited to presence points at broad scales. This lack of absence data precludes the use of many statistical techniques for HCMs. One option is to generate pseudo-absence points so that the many available statistical modeling tools can be used. Traditional techniques generate pseudo-absence points at random across broadly defined species ranges, often failing to include biological knowledge concerning the species-habitat relationship. We incorporated biological knowledge of the species-habitat relationship into pseudo-absence points by creating habitat envelopes that constrain the region from which points were randomly selected. We define a habitat envelope as an ecological representation of a species', or species feature's (e.g., nest), observed distribution (i.e., realized niche) based on a single attribute, or the spatial intersection of multiple attributes. We created HCMs for Northern Goshawk (Accipiter gentilis atricapillus) nest habitat during the breeding season across Utah forests with extant nest presence points and ecologically based pseudo-absence points using logistic regression. Predictor variables were derived from 30-m USDA Landfire and 250-m Forest Inventory and Analysis (FIA) map products. These habitat-envelope-based models were then compared to null envelope models, which use traditional practices for generating pseudo-absences. Models were assessed for fit and predictive capability using metrics such as kappa, threshold-independent receiver operating characteristic (ROC) plots, adjusted deviance (D²adj), and cross-validation, and were also assessed for ecological relevance. For all cases, habitat-envelope-based models outperformed null envelope models and were more ecologically relevant, suggesting that incorporating biological knowledge into pseudo-absence point generation is a powerful tool for species habitat assessments. Furthermore, given some a priori knowledge of the species-habitat relationship, ecologically based pseudo-absence points can be applied to any species, ecosystem, data resolution, and spatial extent. © 2007 by the Ecological Society of America.
Impact of Mobile Assisted Language Learning (MALL) on EFL: A Meta-Analysis
ERIC Educational Resources Information Center
Taj, Imtiaz Hassan; Sulan, Norrihan Binti; Sipra, Muhammad Aslam; Ahmad, Waqar
2016-01-01
Mobile Assisted Language Learning (MALL) has emerged as a potential tool in the instruction of English as a foreign language (EFL). A meta-analysis of 13 studies published between 2008 and 2015 was conducted. Four criteria for the selection of studies for analysis were based on the year of publication, quasi-experimental design, pretest and…
Investigation of the dew-point temperature scale maintained at the MIKES
NASA Astrophysics Data System (ADS)
Heinonen, Martti
1999-01-01
For the investigation of the dew-point temperature scale realized by the MIKES primary dew-point generator, a two-pressure generator and a dew-point indicator based on condensation in a cooled coil were constructed and tested. In addition, a chilled mirror hygrometer was validated by means of an uncertainty analysis. The comparison of these systems was focused on the dew-point temperature range from […] to […], but measurements were made up to […]. The generator systems were compared using a dew-point comparator based on two relative humidity sensors. According to the results of the comparisons, the differences between the measurement systems were less than […], while the expanded uncertainty of the MIKES generator was between […] and […]. The uncertainty of the other systems was from […] to […]. It was concluded that the dew-point temperature scale was not dependent on the realization method.
Yang, Qing; Fan, Liu-Yin; Huang, Shan-Sheng; Zhang, Wei; Cao, Cheng-Xi
2011-04-01
In this paper, we developed a novel method of acid-base titration, viz. electromigration acid-base titration (EABT), via a moving neutralization boundary (MNB). With HCl and NaOH as the model strong acid and base, respectively, we conducted experiments on the EABT via the moving neutralization boundary method for the first time. The experiments revealed that (i) the concentration of agarose gel, the voltage used and the content of background electrolyte (KCl) had an evident influence on the boundary movement; (ii) the movement length was a function of the running time under constant acid and base concentrations; and (iii) there was good linearity between the length and the natural logarithm of the HCl concentration under the optimized conditions, and this linearity could be used to detect the concentration of the acid. The experiments further showed that (i) the RSD values of intra-day and inter-day runs were less than 1.59 and 3.76%, respectively, indicating precision and stability similar to capillary electrophoresis or HPLC; (ii) indicators with different pKa values had no obvious effect on the EABT, in contrast to their strong influence on the judgment of the equivalence point in classic titration; and (iii) a constant equivalence-point titration always existed in the EABT, unlike in classic volumetric analysis. Additionally, the EABT could be put to good use for the determination of actual acid concentrations. The experimental results herein offer new general guidance for the development of classic volumetric analysis and element (e.g. nitrogen) content analysis in protein chemistry. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Structure Line Detection from LIDAR Point Clouds Using Topological Elevation Analysis
NASA Astrophysics Data System (ADS)
Lo, C. Y.; Chen, L. C.
2012-07-01
Airborne LIDAR point clouds, which provide abundant points on object surfaces, are essential to building modeling. In the last two decades, studies have developed approaches to identify structure lines under two main paradigms: data-driven and model-driven. These studies have shown that automatic modeling processes depend on certain considerations, such as the thresholds used, initial values, designed formulas, and predefined cues. With the development of laser scanning systems, scanning rates have increased and can provide point clouds with higher point density. Therefore, this study proposes topological elevation analysis (TEA) to detect structure lines instead of threshold-dependent concepts and predefined constraints. The analysis contains two parts: data pre-processing and structure line detection. To preserve the original elevation information, a pseudo-grid for generating digital surface models is produced in the first part. The highest point in each grid cell is set as the elevation value, and its original three-dimensional position is preserved. In the second part, using TEA, the structure lines are identified based on the topology of local elevation changes in two directions. Because structure lines have certain geometric properties, their locations show small relief in the radial direction and steep elevation changes in the circular direction. Following the proposed approach, TEA can be used to determine 3D line information without selecting thresholds. For validation, the TEA results are compared with those of a region growing approach. The results indicate that the proposed method can produce structure lines using dense point clouds.
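The pre-processing step can be sketched as follows. This is a minimal Python illustration, not the authors' implementation; the cell size and the (N, 3) point layout are assumptions. The key idea it shows is keeping the highest return per pseudo-grid cell together with its original 3D coordinates.

    import numpy as np

    def pseudo_grid(points, cell=0.5):
        """Keep the highest LIDAR return per grid cell, preserving its xyz.

        points: (N, 3) array of x, y, z coordinates. Returns a dict mapping
        grid index (ix, iy) to the original 3D point with the maximum
        elevation in that cell.
        """
        points = np.asarray(points, float)
        ix = np.floor(points[:, 0] / cell).astype(int)
        iy = np.floor(points[:, 1] / cell).astype(int)
        grid = {}
        for key, p in zip(zip(ix, iy), points):
            if key not in grid or p[2] > grid[key][2]:
                grid[key] = p
        return grid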
Analysis of Aerospike Plume Induced Base-Heating Environment
NASA Technical Reports Server (NTRS)
Wang, Ten-See
1998-01-01
Computational analysis is conducted to study the effect of an aerospike engine plume on the X-33 base-heating environment during ascent flight. To properly account for forebody and aftbody flowfield effects such as shocks, and to allow for potential plume-induced flow separation, the thermo-flowfield at trajectory points is computed. The computational methodology is based on a three-dimensional finite-difference, viscous, chemically reacting, pressure-based computational fluid dynamics formulation, and a three-dimensional, finite-volume, spectral-line-based weighted-sum-of-gray-gases radiation absorption model for computational heat transfer. The predicted convective and radiative base-heat fluxes are presented.
Mauk, Michael G.; Song, Jinzhao; Liu, Changchun; Bau, Haim H.
2018-01-01
Designs and applications of microfluidics-based devices for molecular diagnostics (Nucleic Acid Amplification Tests, NAATs) in infectious disease testing are reviewed, with emphasis on minimally instrumented, point-of-care (POC) tests for resource-limited settings. Microfluidic cartridges (‘chips’) that combine solid-phase nucleic acid extraction; isothermal enzymatic nucleic acid amplification; pre-stored, paraffin-encapsulated lyophilized reagents; and real-time or endpoint optical detection are described. These chips can be used with a companion module for separating plasma from blood through a combined sedimentation-filtration effect. Three reporter types (fluorescence, colorimetric dyes, and bioluminescence) and a new paradigm for end-point detection based on a diffusion-reaction column are compared. Multiplexing (parallel amplification and detection of multiple targets) is demonstrated. Low-cost detection and added functionality (data analysis, control, communication) can be realized using a cellphone platform with the chip. Some related and similar-purposed approaches by others are surveyed. PMID:29495424
Erdman, Laura K.; D’Acremont, Valérie; Hayford, Kyla; Kilowoko, Mary; Kyungu, Esther; Hongoa, Philipina; Alamo, Leonor; Streiner, David L.; Genton, Blaise; Kain, Kevin C.
2015-01-01
Background: Diagnosing pediatric pneumonia is challenging in low-resource settings. The World Health Organization (WHO) has defined primary end-point radiological pneumonia for use in epidemiological and vaccine studies. However, radiography requires expertise and is often inaccessible. We hypothesized that plasma biomarkers of inflammation and endothelial activation may be useful surrogates for end-point pneumonia, and may provide insight into its biological significance. Methods: We studied children with WHO-defined clinical pneumonia (n = 155) within a prospective cohort of 1,005 consecutive febrile children presenting to Tanzanian outpatient clinics. Based on x-ray findings, participants were categorized as primary end-point pneumonia (n = 30), other infiltrates (n = 31), or normal chest x-ray (n = 94). Plasma levels of 7 host response biomarkers at presentation were measured by ELISA. Associations between biomarker levels and radiological findings were assessed by Kruskal-Wallis test and multivariable logistic regression. Biomarker ability to predict radiological findings was evaluated using receiver operating characteristic curve analysis and Classification and Regression Tree analysis. Results: Compared to children with a normal x-ray, children with end-point pneumonia had significantly higher C-reactive protein, procalcitonin and Chitinase 3-like-1, while those with other infiltrates had elevated procalcitonin and von Willebrand Factor and decreased soluble Tie-2 and endoglin. Clinical variables were not predictive of radiological findings. Classification and Regression Tree analysis generated multi-marker models with improved performance over single markers for discriminating between groups. A model based on C-reactive protein and Chitinase 3-like-1 discriminated between end-point pneumonia and non-end-point pneumonia with 93.3% sensitivity (95% confidence interval 76.5–98.8), 80.8% specificity (72.6–87.1), positive likelihood ratio 4.9 (3.4–7.1), negative likelihood ratio 0.083 (0.022–0.32), and misclassification rate 0.20 (standard error 0.038). Conclusions: In Tanzanian children with WHO-defined clinical pneumonia, combinations of host biomarkers distinguished between end-point pneumonia, other infiltrates, and normal chest x-ray, whereas clinical variables did not. These findings generate pathophysiological hypotheses and may have potential research and clinical utility. PMID:26366571
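The reported likelihood ratios follow directly from the sensitivity and specificity. The small Python check below is illustrative only; it reproduces the C-reactive protein plus Chitinase 3-like-1 model figures within rounding.

    def likelihood_ratios(sensitivity, specificity):
        # LR+ = sens / (1 - spec); LR- = (1 - sens) / spec
        return sensitivity / (1 - specificity), (1 - sensitivity) / specificity

    lr_pos, lr_neg = likelihood_ratios(0.933, 0.808)
    print(round(lr_pos, 1), round(lr_neg, 3))   # ~4.9 and ~0.083, as reported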
[Procedural analysis of acid-base balance disorders: a case series of 4 patients].
Ma, Chunyuan; Wang, Guijie
2017-05-01
To establish a standardized process for acid-base balance analysis, and to analyze cases of acid-base balance disorder with the aid of an acid-base balance coordinate graph. Recent research progress on acid-base balance theory was reviewed systematically, and the important concepts, definitions, formulas, parameters, regularities and inferences in the analysis of acid-base balance were studied. The processes and steps of analyzing acid-base balance disorders were diagrammed, and the application of the acid-base balance coordinate graph was introduced through cases. A "four parameters-four steps" method was put forward to analyze acid-base balance disorders completely. The four parameters are pH, arterial partial pressure of carbon dioxide (PaCO2), HCO3- and the anion gap (AG). The four steps are: (1) according to pH, PaCO2 and HCO3-, the primary or main type of acid-base balance disorder is determined; (2) the primary or main type is used to choose the appropriate compensation formula and to determine the presence of a double mixed acid-base balance disorder; (3) for primary respiratory acidosis or respiratory alkalosis, the potential HCO3- is calculated and substituted for the measured HCO3- to determine whether a triple mixed acid-base disorder is present; (4) data judged by the above analysis to be simple increased-AG metabolic acidosis are analyzed further: the ratio ΔAG↑/ΔHCO3-↓ is calculated to determine whether normal-AG metabolic acidosis or metabolic alkalosis coexists. In clinical practice, PaCO2 (as the abscissa) and HCO3- (as the ordinate) were used to establish a rectangular coordinate system; the straight line through the origin (0, 0) and the point (40, 24) contains all points with pH equal to 7.40. The coordinate graph can be divided into seven areas by three straight lines [namely the pH = 7.40 isoline, the PaCO2 = 40 mmHg (1 mmHg = 0.133 kPa) line and the HCO3- = 24 mmol/L line]: main respiratory alkalosis, main metabolic alkalosis, respiratory + metabolic alkalosis, main respiratory acidosis, main metabolic acidosis, respiratory + metabolic acidosis and the normal area. The type of acid-base balance disorder is easily determined by locating the (PaCO2, HCO3-) or (PaCO2, potential HCO3-) point on the coordinate graph. The "four parameters-four steps" method is systematic and comprehensive, and with the acid-base balance coordinate graph it is simpler to determine the types of acid-base balance disorders. It is worthy of popularization.
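A minimal Python sketch of the first step of this screen is given below. It encodes only the primary-disorder logic of the coordinate-graph areas; the compensation formulas, potential HCO3- and AG analysis of steps 2 to 4 are omitted, and the 7.35/7.45 bounds are conventional thresholds assumed here for illustration.

    def primary_disorder(ph, paco2, hco3):
        """Step 1 of the 'four parameters-four steps' screen (sketch only)."""
        if ph < 7.35:
            if paco2 > 40 and hco3 < 24:
                return "mixed respiratory + metabolic acidosis"
            return "respiratory acidosis" if paco2 > 40 else "metabolic acidosis"
        if ph > 7.45:
            if paco2 < 40 and hco3 > 24:
                return "mixed respiratory + metabolic alkalosis"
            return "respiratory alkalosis" if paco2 < 40 else "metabolic alkalosis"
        return "pH in normal range (compensated or mixed disorder possible)"

    print(primary_disorder(7.28, 60, 26))   # -> respiratory acidosis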
Grogger, P; Sacher, C; Weber, S; Millesi, G; Seemann, R
2018-04-10
Deviations in measuring dentofacial components in a lateral X-ray represent a major hurdle in the subsequent treatment of dysgnathic patients. In a retrospective study, we investigated the most prevalent source of error in the following commonly used cephalometric measurements: the angles Sella-Nasion-Point A (SNA), Sella-Nasion-Point B (SNB) and Point A-Nasion-Point B (ANB); the Wits appraisal; the anteroposterior dysplasia indicator (APDI); and the overbite depth indicator (ODI). Preoperative lateral radiographic images of patients with dentofacial deformities were collected and the landmarks digitally traced by three independent raters. Cephalometric analysis was automatically performed based on 1116 tracings. Error analysis identified the x-coordinate of Point A as the prevalent source of error in all investigated measurements, except SNB, in which it is not incorporated. In SNB, the y-coordinate of Nasion predominated error variance. SNB showed lowest inter-rater variation. In addition, our observations confirmed previous studies showing that landmark identification variance follows characteristic error envelopes in the highest number of tracings analysed up to now. Variance orthogonal to defining planes was of relevance, while variance parallel to planes was not. Taking these findings into account, orthognathic surgeons as well as orthodontists would be able to perform cephalometry more accurately and accomplish better therapeutic results. Copyright © 2018 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Hopkins, Richard S; Cook, Robert L; Striley, Catherine W
2016-01-01
Background: Traditional influenza surveillance relies on influenza-like illness (ILI) syndrome that is reported by health care providers. It primarily captures individuals who seek medical care and misses those who do not. Recently, Web-based data sources have been studied for application to public health surveillance, as there is a growing number of people who search, post, and tweet about their illnesses before seeking medical care. Existing research has shown some promise of using data from Google, Twitter, and Wikipedia to complement traditional surveillance for ILI. However, past studies have evaluated these Web-based sources individually or dually without comparing all 3 of them, and it would be beneficial to know which of the Web-based sources performs best in order to be considered to complement traditional methods. Objective: The objective of this study is to comparatively analyze Google, Twitter, and Wikipedia by examining which best corresponds with Centers for Disease Control and Prevention (CDC) ILI data. It was hypothesized that Wikipedia will best correspond with CDC ILI data as previous research found it to be least influenced by high media coverage in comparison with Google and Twitter. Methods: Publicly available, deidentified data were collected from the CDC, Google Flu Trends, HealthTweets, and Wikipedia for the 2012-2015 influenza seasons. Bayesian change point analysis was used to detect seasonal changes, or change points, in each of the data sources. Change points in Google, Twitter, and Wikipedia that occurred during the exact week, 1 preceding week, or 1 week after the CDC's change points were compared with the CDC data as the gold standard. All analyses were conducted using the R package "bcp" version 4.0.0 in RStudio version 0.99.484 (RStudio Inc). In addition, sensitivity and positive predictive values (PPV) were calculated for Google, Twitter, and Wikipedia. Results: During the 2012-2015 influenza seasons, a high sensitivity of 92% was found for Google, whereas the PPV for Google was 85%. A low sensitivity of 50% was calculated for Twitter; a low PPV of 43% was found for Twitter also. Wikipedia had the lowest sensitivity of 33% and lowest PPV of 40%. Conclusions: Of the 3 Web-based sources, Google had the best combination of sensitivity and PPV in detecting Bayesian change points in influenza-related data streams. Findings demonstrated that change points in Google, Twitter, and Wikipedia data occasionally aligned well with change points captured in CDC ILI data, yet these sources did not detect all changes in CDC data and should be further studied and developed. PMID:27765731
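The matching rule described above (exact week, one week before, or one week after) translates into a short sensitivity/PPV computation. The Python sketch below uses hypothetical week indices, not the study data.

    def match_stats(detected_weeks, cdc_weeks, tolerance=1):
        """Sensitivity and PPV for detected change points vs CDC change points,
        counting a match when a detected week falls within +/- 1 week."""
        matched_cdc = [c for c in cdc_weeks
                       if any(abs(d - c) <= tolerance for d in detected_weeks)]
        matched_det = [d for d in detected_weeks
                       if any(abs(d - c) <= tolerance for c in cdc_weeks)]
        sensitivity = len(matched_cdc) / len(cdc_weeks)
        ppv = len(matched_det) / len(detected_weeks)
        return sensitivity, ppv

    # hypothetical ISO week indices of detected and gold-standard change points
    print(match_stats(detected_weeks=[2, 14, 40, 51], cdc_weeks=[3, 15, 30, 50]))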
Optimizing Web-Based Instruction: A Case Study Using Poultry Processing Unit Operations
ERIC Educational Resources Information Center
O' Bryan, Corliss A.; Crandall, Philip G.; Shores-Ellis, Katrina; Johnson, Donald M.; Ricke, Steven C.; Marcy, John
2009-01-01
Food companies and supporting industries need inexpensive, revisable training methods for large numbers of hourly employees due to continuing improvements in Hazard Analysis Critical Control Point (HACCP) programs, new processing equipment, and high employee turnover. HACCP-based food safety programs have demonstrated their value by reducing the…
Modeling spatio-temporal wildfire ignition point patterns
Amanda S. Hering; Cynthia L. Bell; Marc G. Genton
2009-01-01
We analyze and model the structure of spatio-temporal wildfire ignitions in the St. Johns River Water Management District in northeastern Florida. Previous studies, based on the K-function and an assumption of homogeneity, have shown that wildfire events occur in clusters. We revisit this analysis based on an inhomogeneous K-...
Ground settlement monitoring based on temporarily coherent points between two SAR acquisitions
Zhang, L.; Ding, X.; Lu, Z.
2011-01-01
An InSAR analysis approach for identifying and extracting the temporarily coherent points (TCP) that exist between two SAR acquisitions and for determining motions of the TCP is presented for applications such as ground settlement monitoring. TCP are identified based on the spatial characteristics of the range and azimuth offsets of coherent radar scatterers. A method for coregistering TCP based on the offsets of TCP is given to reduce the coregistration errors at TCP. An improved phase unwrapping method based on the minimum cost flow (MCF) algorithm and local Delaunay triangulation is also proposed for sparse TCP data. The proposed algorithms are validated using a test site in Hong Kong. The test results show that the algorithms work satisfactorily for various ground features.
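For the sparse-network step, a local Delaunay triangulation over TCP locations can be built as in the following Python sketch (using scipy; the coordinates are hypothetical, and this is a generic illustration rather than the authors' implementation). MCF-based unwrapping would then operate on the resulting arcs.

    import numpy as np
    from scipy.spatial import Delaunay

    def tcp_network(xy):
        """Build the Delaunay arc network over sparse TCP locations."""
        tri = Delaunay(xy)
        arcs = set()
        for simplex in tri.simplices:
            for i in range(3):
                a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
                arcs.add((a, b))  # each undirected edge once
        return sorted(arcs)

    xy = np.random.default_rng(1).random((50, 2)) * 1000.0  # sparse TCP coordinates (m)
    print(len(tcp_network(xy)), "arcs")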
Development of paper-based electrochemical sensors for water quality monitoring
NASA Astrophysics Data System (ADS)
Smith, Suzanne; Bezuidenhout, Petroné; Mbanjwa, Mesuli; Zheng, Haitao; Conning, Mariette; Palaniyandy, Nithyadharseni; Ozoemena, Kenneth; Land, Kevin
2016-02-01
We present a method for the development of paper-based electrochemical sensors for the detection of heavy metals in water samples. Contaminated water leads to serious health problems and environmental issues. Paper is ideally suited for point-of-care testing, as it is low cost, disposable, and multi-functional. Initial sensor designs were manufactured on paper substrates using combinations of inkjet printing and screen printing with silver and carbon inks. A bismuth onion-like carbon nanoparticle ink was manufactured and used as the active material for both commercial and paper-based sensors, which were compared using standard electrochemical analysis techniques. The results highlight the potential of paper-based sensors to be used effectively for rapid water quality monitoring at the point of need.
An analysis of the Petri net based model of the human body iron homeostasis process.
Sackmann, Andrea; Formanowicz, Dorota; Formanowicz, Piotr; Koch, Ina; Blazewicz, Jacek
2007-02-01
In this paper a Petri net based model of human body iron homeostasis is presented and analyzed. Body iron homeostasis is an important but not fully understood complex process. The modeling of the process is expressed in the language of Petri net theory. Applying this theory to the description of biological processes allows for very precise analysis of the resulting models. Here, such an analysis of the body iron homeostasis model from a mathematical point of view is given.
Ontological analysis of SNOMED CT.
Héja, Gergely; Surján, György; Varga, Péter
2008-10-27
SNOMED CT is the most comprehensive medical terminology. However, its use for intelligent services based on formal reasoning is questionable. The analysis of the structure of SNOMED CT presented here is based on the formal top-level ontology DOLCE. The analysis revealed several ontological and knowledge-engineering errors, the most important being errors in the hierarchy (mostly from an ontological point of view, but also regarding medical aspects) and the mixing of subsumption relations with other types (mostly 'part of'). The errors found impede formal reasoning. The paper presents a possible way to correct these problems.
a New Approach for Subway Tunnel Deformation Monitoring: High-Resolution Terrestrial Laser Scanning
NASA Astrophysics Data System (ADS)
Li, J.; Wan, Y.; Gao, X.
2012-07-01
With improvements in the accuracy and efficiency of laser scanning, high-resolution terrestrial laser scanning (TLS) can obtain precise, dense point clouds and can be applied to high-precision deformation monitoring of subway tunnels, high-speed railway bridges and other structures. In this paper, a new approach using a point-cloud segmentation method based on the vectors of neighboring points and a surface fitting method based on moving least squares was proposed and applied to subway tunnel deformation monitoring in Tianjin, combined with a new high-resolution terrestrial laser scanner (Riegl VZ-400). There were three main procedures. First, a point cloud consisting of several scans was registered by a linearized iterative least squares approach to improve the accuracy of registration, and several control points were acquired by total stations (TS) and then adjusted. Second, the registered point cloud was resampled and segmented based on the vectors of neighboring points to select suitable points. Third, the selected points were used to fit the subway tunnel surface with a moving least squares algorithm. A series of parallel sections obtained from a temporal series of fitted tunnel surfaces were then compared to analyze the deformation. Finally, the results of the approach in the z direction were compared with a fiber-optic displacement sensor approach, and the results in the x and y directions were compared with TS; the accuracy errors in the x, y and z directions were about 1.5 mm, 2 mm and 1 mm, respectively. Therefore, the new approach using high-resolution TLS can meet the demands of subway tunnel deformation monitoring.
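The neighbor-vector idea behind the segmentation step can be illustrated with per-point normal estimation by local PCA, as in the Python sketch below. The neighborhood size k and the (N, 3) data layout are assumptions; this is not the authors' implementation, only the standard technique their segmentation builds on.

    import numpy as np
    from scipy.spatial import cKDTree

    def estimate_normals(points, k=10):
        """Per-point normal vectors from PCA of the k nearest neighbours.

        A segmentation step can then group points whose normals agree,
        e.g. points lying on the smooth tunnel lining.
        """
        points = np.asarray(points, float)
        tree = cKDTree(points)
        normals = np.empty_like(points)
        for i, p in enumerate(points):
            _, idx = tree.query(p, k=k)
            nbrs = points[idx] - points[idx].mean(axis=0)
            # the singular vector of the smallest singular value is the normal
            _, _, vt = np.linalg.svd(nbrs, full_matrices=False)
            normals[i] = vt[-1]
        return normals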
The 3D Radiation Dose Analysis For Satellite
NASA Astrophysics Data System (ADS)
Cai, Zhenbo; Lin, Guocheng; Chen, Guozhen; Liu, Xia
2002-01-01
There are many kinds of energetic charged particles in the space environment surrounding the earth. These particles come from the Van Allen Belts, solar cosmic rays and galactic cosmic rays. They have different energies and fluxes, varying with time and space and correlating tightly with solar activity. These particles interact with electrical components and materials used on satellites, producing various space radiation effects, which damage the satellite to some extent or even affect its safety.

Energetic space particles inject into components and materials used on satellites and generate radiation dose by depositing part or all of their energy through ionization, which causes characteristic degradation or even failure. As a consequence, the analysis of and protection against radiation dose has received more attention during satellite design and manufacture. Satellite designers need to analyze accurately the space radiation dose received on orbit and use the results as the basis for radiation protection design and ground experiments.

The radiation dose behind a given spherical shielding thickness can be calculated using the models of the trapped protons and electrons in the Van Allen Belts (AP8 and AE8). This is the 1D radiation dose analysis for satellites. Obviously, the mass shielding from outside space to the computed point in all directions is regarded as a simple spherical shell. The actual structure of a satellite, however, is very complex. When energetic particles inject into a given piece of equipment inside the satellite from outside space, they travel across the satellite structure, other equipment, the shell of the given equipment, and so on, which depends greatly on the actual layout of the satellite. This complex radiation shielding has two characteristics. One is that the shielding masses for the computed point differ in different injection directions. The other is that for different computed points, the shielding conditions vary in all space directions. Therefore, it is very difficult to capture these differences with the 1D radiation analysis, and hence it is too simple to guide satellite radiation protection and ground experiments based only on 1D analysis results. To comprehend the radiation dose status of a satellite adequately, it is essential to perform 3D radiation analysis.

The 3D analysis starts from the satellite's 3D layout, built using computer software. From this 3D layout, the satellite model can be simplified appropriately. First, select the point to be analyzed in the simplified satellite model and extend many lines to outside space, which divides the 4π space into many corresponding small areas, each with a certain solid angle. Then the shielding masses through the satellite equipment and structures along each direction are calculated, resulting in the shielding mass distribution in all space directions based on the satellite layout. Finally, using the relationship between radiation dose and shielding thickness from the 1D analysis, the radiation dose in each area represented by each line is calculated. After the radiation dose and its space distribution for the point of interest are obtained, the 3D satellite radiation analysis is complete. The 3D radiation analysis based on the satellite 3D CAD layout has greater benefit for engineering applications than the 1D analysis based on the solid-sphere shielding model. With the 3D model, the analysis of the space environment and its effects is combined closely with actual satellite engineering.
The 3D radiation analysis not only provides valuable engineering data for satellite radiation design and protection, but also provides the possibility of applying new radiation protection approaches, which expands the technology horizon and broadens ways for technology development.
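The procedure described above amounts to classical solid-angle sectoring. The Python sketch below shows how the 1D dose-depth relationship and the per-ray shielding masses combine into a point dose; the exponential dose-depth curve and the equal-solid-angle ray set are invented purely for illustration.

    import numpy as np

    def sector_dose(dose_depth_curve, thickness_along_ray):
        """Solid-angle sectoring: the dose at an interior point is the sum over
        ray directions of the 1D dose-depth value for that ray's shielding
        mass, weighted by the sector's share of 4*pi steradians.

        dose_depth_curve: callable, dose rate vs areal density (g/cm^2),
                          i.e. the 1D spherical-shell result.
        thickness_along_ray: shielding mass traversed along each of n
                             equally weighted ray directions.
        """
        t = np.asarray(thickness_along_ray, float)
        return float(np.mean(dose_depth_curve(t)))  # each ray carries weight 1/n

    curve = lambda t: 1e3 * np.exp(-t / 2.0)         # illustrative attenuation (krad/yr vs g/cm^2)
    rays = np.array([0.5, 1.0, 3.0, 8.0, 1.5, 2.0])  # masses seen in 6 example sectors
    print(sector_dose(curve, rays), "krad/yr")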
Sensitive detection of point mutation by electrochemiluminescence and DNA ligase-based assay
NASA Astrophysics Data System (ADS)
Zhou, Huijuan; Wu, Baoyan
2008-12-01
The technology of single-base mutation detection plays an increasingly important role in the diagnosis and prognosis of genetic-based diseases. Here we report a new method for the analysis of point mutations in genomic DNA through the integration of an allele-specific oligonucleotide ligation assay (OLA) with a magnetic beads-based electrochemiluminescence (ECL) detection scheme. In this assay the tris(bipyridine)ruthenium (TBR) labeled probe and the biotinylated probe are designed to be perfectly complementary to the mutant target, so that a ligation between the two probes is generated by Taq DNA ligase in the presence of the mutant target. If there is an allele mismatch, the ligation does not take place. The ligation products are then captured onto streptavidin-coated paramagnetic beads and detected by measuring the ECL signal of the TBR label. Results showed that the new method achieved a detection limit down to 10 fmol and was successfully applied to the identification of point mutations in codon 273 of the TP53 gene in ASTC-α-1, PANC-1 and normal cell lines. In summary, this method provides a sensitive, cost-effective and easily operated approach for point mutation detection.
A Bayesian Framework for Human Body Pose Tracking from Depth Image Sequences
Zhu, Youding; Fujimura, Kikuo
2010-01-01
This paper addresses the problem of accurate and robust tracking of 3D human body pose from depth image sequences. Recovering the large number of degrees of freedom in human body movements from a depth image sequence is challenging due to the need to resolve the depth ambiguity caused by self-occlusions and the difficulty of recovering from tracking failure. Human body poses can be estimated through model fitting using dense correspondences between depth data and an articulated human model (the local optimization method). Although it usually achieves high accuracy due to dense correspondences, it may fail to recover from tracking failure. Alternatively, human pose may be reconstructed by detecting and tracking human body anatomical landmarks (key-points) based on low-level depth image analysis. While this key-point based method is robust and recovers from tracking failure, its pose estimation accuracy depends solely on the image-based localization accuracy of the key-points. To address these limitations, we present a flexible Bayesian framework for integrating pose estimation results obtained by methods based on key-points and local optimization. Experimental results are shown and a performance comparison is presented to demonstrate the effectiveness of the proposed approach. PMID:22399933
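A minimal stand-in for such integration is precision-weighted Gaussian fusion of the two estimators, sketched below in Python. The paper's actual framework is richer; the Gaussian-noise assumption and the variances here are invented for illustration.

    def fuse_estimates(x_local, var_local, x_keypoint, var_keypoint):
        """Precision-weighted fusion of two estimates of the same pose parameter.

        Under independent Gaussian noise, the accurate-but-fragile local
        optimizer and the robust key-point detector each contribute in
        proportion to their confidence (inverse variance).
        """
        w1, w2 = 1.0 / var_local, 1.0 / var_keypoint
        fused = (w1 * x_local + w2 * x_keypoint) / (w1 + w2)
        return fused, 1.0 / (w1 + w2)

    # a joint angle (rad): the local fit is precise; the key-point estimate keeps it honest
    print(fuse_estimates(0.42, 0.01, 0.55, 0.09))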
Ammenwerth, Elske; Mansmann, Ulrich; Iller, Carola; Eichstädter, Ronald
2003-01-01
The documentation of the nursing process is an important but often neglected part of clinical documentation. Paper-based systems have been introduced to support nursing process documentation. Frequently, however, problems such as low quality of documentation are reported. It is unclear whether computer-based documentation systems can reduce these problems and which factors influence their acceptance by users. We introduced a computer-based nursing documentation system on four wards of the University Hospitals of Heidelberg and systematically evaluated its preconditions and its effects in a pretest-posttest intervention study. For the analysis of user acceptance, we concentrated on subjective data drawn from questionnaires and interviews. A questionnaire was developed using items from published questionnaires and items that had to be developed for the special purpose of this study. The quantitative results point to two factors influencing the acceptance of a new computer-based documentation system: the previous acceptance of the nursing process and the previous amount of self-confidence when using computers. On one ward, the diverse acceptance scores heavily declined after the introduction of the nursing documentation system. Explorative qualitative analysis on this ward points to further success factors of computer-based nursing documentation systems. Our results can be used to assist the planning and introduction of computer-based nursing documentation systems. They demonstrate the importance of computer experience and acceptance of the nursing process on a ward but also point to other factors such as the fit between nursing workflow and the functionality of a nursing documentation system.
Benchmark Dose for Urinary Cadmium based on a Marker of Renal Dysfunction: A Meta-Analysis
Woo, Hae Dong; Chiu, Weihsueh A.; Jo, Seongil; Kim, Jeongseon
2015-01-01
Background: Low doses of cadmium can cause adverse health effects. The benchmark dose (BMD) and the one-sided 95% lower confidence limit of the BMD (BMDL) used to derive points of departure for urinary cadmium exposure have been estimated in several previous studies, but the methods used to derive the BMD, and the estimated BMDs, differ. Objectives: We aimed to find the factors that affect BMD calculation in the general population, and to estimate the summary BMD for urinary cadmium using reported BMDs. Methods: A meta-regression was performed, and the pooled BMD/BMDL was estimated using studies reporting a BMD and BMDL, weighted by sample size, that were calculated from individual data based on markers of renal dysfunction. Results: BMDs were highly heterogeneous across studies. Meta-regression analysis showed that a significant predictor of BMD was the cut-off point denoting an abnormal level. Using the 95th percentile as a cut-off, the BMD5/BMDL5 estimates for a 5% benchmark response (BMR) of β2-microglobulinuria (β2-MG) were 6.18/4.88 μg/g creatinine in conventional quantal analysis and 3.56/3.13 μg/g creatinine in the hybrid approach, and the BMD5/BMDL5 estimates for a 5% BMR of N-acetyl-β-d-glucosaminidase (NAG) were 10.31/7.61 μg/g creatinine in quantal analysis and 3.21/2.24 μg/g creatinine in the hybrid approach. However, the meta-regression showed that BMD and BMDL were significantly associated with the cut-off point, while the BMD calculation method did not significantly affect the results. The urinary cadmium BMDL5 of β2-MG was 1.9 μg/g creatinine in the lowest cut-off point group. Conclusion: The BMD was significantly associated with the cut-off point defining the abnormal level of renal dysfunction markers. PMID:25970611
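The pooling step weighted by sample size reduces to a weighted mean, as in this Python sketch with hypothetical study-level values.

    import numpy as np

    def pooled_bmd(bmds, sample_sizes):
        # sample-size-weighted mean, as used for the summary BMD/BMDL here
        bmds = np.asarray(bmds, float)
        w = np.asarray(sample_sizes, float)
        return float(np.sum(w * bmds) / np.sum(w))

    # hypothetical study-level BMD5 values (ug/g creatinine) and sample sizes
    print(pooled_bmd([5.2, 7.1, 6.4], [820, 1500, 410]))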
Gurnani, Navin; van Deurzen, Derek F P; van den Bekerom, Michel P J
2017-10-01
Nontraumatic full-thickness rotator cuff tears are commonly treated conservatively at first. If conservative treatment fails, rotator cuff repair is a viable subsequent option. The objective of the present meta-analysis was to evaluate shoulder-specific outcomes one year after arthroscopic or mini-open repair of nontraumatic rotator cuff tears. A literature search was conducted in PubMed and EMBASE for the period January 2000 to January 2017. All studies measuring clinical outcome at 12 months after repair of nontraumatic full-thickness rotator cuff tears were listed. We included 16 randomized controlled trials that met our inclusion criteria, with a total of 1,221 shoulders. At 12 months after rotator cuff repair, the mean Constant score had increased by 29.5 points; the mean American Shoulder and Elbow Surgeons score increased by 38.6 points; the mean Simple Shoulder Test score improved by 5.6 points; the mean University of California Los Angeles score improved by 13.0 points; and the mean Visual Analogue Scale score decreased by 4.1 points. Based on this meta-analysis, significant improvements in shoulder-specific indices are observed 12 months after nontraumatic arthroscopic or mini-open rotator cuff repair.
Multi-viewpoint clustering analysis
NASA Technical Reports Server (NTRS)
Mehrotra, Mala; Wild, Chris
1993-01-01
In this paper, we address the feasibility of partitioning rule-based systems into a number of meaningful units to enhance the comprehensibility, maintainability and reliability of expert systems software. Preliminary results have shown that no single structuring principle or abstraction hierarchy is sufficient to understand complex knowledge bases. We therefore propose the Multi View Point - Clustering Analysis (MVP-CA) methodology to provide multiple views of the same expert system. We present the results of using this approach to partition a deployed knowledge-based system that navigates the Space Shuttle's entry. We also discuss the impact of this approach on verification and validation of knowledge-based systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Youshan, E-mail: ysliu@mail.iggcas.ac.cn; Teng, Jiwen, E-mail: jwteng@mail.iggcas.ac.cn; Xu, Tao, E-mail: xutao@mail.iggcas.ac.cn
2017-05-01
The mass-lumped method avoids the cost of inverting the mass matrix and simultaneously maintains spatial accuracy by adopting additional interior integration points, known as cubature points. To date, such points are only known analytically in tensor domains, such as quadrilateral or hexahedral elements. Thus, the diagonal-mass-matrix spectral element method (SEM) in non-tensor domains always relies on numerically computed interpolation points or quadrature points. However, only the cubature points for degrees 1 to 6 are known, which is why we have developed a p-norm-based optimization algorithm to obtain higher-order cubature points. In this way, we obtain and tabulate new cubature points with all-positive integration weights for degrees 7 to 9. The dispersion analysis illustrates that the dispersion relation determined from the new optimized cubature points is comparable to that of the mass and stiffness matrices obtained by exact integration. At the same time, the Lebesgue constant for the new optimized cubature points indicates their surprisingly good interpolation properties. As a result, such points provide both good interpolation properties and integration accuracy. The Courant–Friedrichs–Lewy (CFL) numbers are tabulated for the conventional Fekete-based triangular spectral element method (TSEM), the TSEM with exact integration, and the optimized cubature-based TSEM (OTSEM). A complementary study demonstrates the spectral convergence of the OTSEM. A numerical example conducted on a half-space model demonstrates that the OTSEM improves the accuracy by approximately one order of magnitude compared to the conventional Fekete-based TSEM. In particular, the accuracy of the 7th-order OTSEM is even higher than that of the 14th-order Fekete-based TSEM. Furthermore, the OTSEM produces a result that can compete in accuracy with the quadrilateral SEM (QSEM). The high accuracy of the OTSEM is also tested with a non-flat topography model. In terms of computational efficiency, the OTSEM is more efficient than the Fekete-based TSEM, although it is slightly costlier than the QSEM when comparable numerical accuracy is required. Highlights: • Higher-order cubature points for degrees 7 to 9 are developed. • The effect of the quadrature rule on the mass and stiffness matrices is analyzed. • The cubature points always have positive integration weights. • The method is free from the inversion of a wide-bandwidth mass matrix. • The accuracy of the TSEM is improved by about one order of magnitude.
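To make the computational motivation concrete, here is a minimal sketch of why a lumped (diagonal) mass matrix pays off in explicit time stepping; the 1D stencil, loads, and initial condition are assumptions for illustration, not the authors' solver.

```python
import numpy as np

# Toy explicit central-difference step for M u'' + K u = f. With a lumped
# (diagonal) mass matrix, "inverting" M reduces to an elementwise division,
# which is the cost saving that cubature points are designed to preserve.
n = 5
K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # 1D stiffness stencil
m_lumped = np.ones(n)                    # diagonal mass entries (assumed)
f = np.zeros(n)                          # external load (assumed zero)
u_prev = np.zeros(n)
u = np.sin(np.linspace(0.0, np.pi, n))   # assumed initial displacement
dt = 0.1

u_next = 2.0 * u - u_prev + dt**2 * (f - K @ u) / m_lumped  # no linear solve
print(u_next)
```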
Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C
2012-04-01
The amount of residual adhesive after bracket debonding is frequently assessed in a qualitative manner using the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy-dispersive X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using the Friedman test followed by the Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded results similar to those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating results similar to those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield results similar to both the SEM analysis and elemental mapping.
A miniaturised image based fluorescence detection system for point-of-care-testing of cocaine abuse
NASA Astrophysics Data System (ADS)
Walczak, Rafał; Krüger, Jan; Moynihan, Shane
2015-08-01
In this paper, we describe a miniaturised image-based fluorescence detection system and demonstrate its viability as a highly sensitive tool for point-of-care analysis of drugs of abuse in human sweat, with a focus on monitoring individuals for drug use. Investigations of miniaturised, low-power optoelectronic configurations and methodologies for real-time image analysis were successfully carried out. The miniaturised fluorescence detection system was validated against a reference detection system under controlled laboratory conditions by analysing spiked sweat samples, first in a dipstick format and then on a strip with a sample pad. As a result of the validation studies, a 1 ng/mL limit of detection of cocaine in sweat and full agreement of test results with the reference detection system can be reported. These results open the way towards a detection system that integrates a hand-held fluorescence reader and a wearable skin patch, and which can collect sweat and analyse it in situ for the presence of cocaine at any time point over extended periods of wear.
Knowledge-Based Object Detection in Laser Scanning Point Clouds
NASA Astrophysics Data System (ADS)
Boochs, F.; Karmacharya, A.; Marbs, A.
2012-07-01
Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as on the set of modeling tools available. Such modeling algorithms are data-driven and concentrate on specific features of the objects that are accessible to numerical models. We present an approach that brings the human expert's knowledge about the scene, the objects inside it, their representation by the data, and the behavior of algorithms to the machine. This "understanding" enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand the possibilities and limitations of algorithms and to take these into account within the processing chain. This not only assists researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from advances in knowledge technologies within the Semantic Web framework, which have provided a strong base for applications based on knowledge management. In the article we present and describe the knowledge technologies used in our approach, such as the Web Ontology Language (OWL), used for formulating the knowledge base, and the Semantic Web Rule Language (SWRL) with 3D processing and topological built-ins, which aim to combine geometrical analysis of 3D point clouds with specialists' knowledge of the scene and algorithmic processing.
On stability of fixed points and chaos in fractional systems.
Edelman, Mark
2018-02-01
In this paper, we propose a method to calculate asymptotically period-two sinks and define the range of stability of fixed points for a variety of discrete fractional systems of order 0 < α < 2. The method is tested on various forms of fractional generalizations of the standard and logistic maps. Based on our analysis, we make a conjecture that chaos is impossible in the corresponding continuous fractional systems.
ERIC Educational Resources Information Center
Bohanon, Hank; Fenning, Pamela; Hicks, Kira; Weber, Stacey; Thier, Kimberly; Aikins, Brigit; Morrissey, Kelly; Briggs, Alissa; Bartucci, Gina; McArdle, Lauren; Hoeper, Lisa; Irvin, Larry
2012-01-01
The purpose of this case study was to expand the literature base regarding the application of high school schoolwide positive behavior support in an urban setting for practitioners and policymakers to address behavior issues. In addition, the study describes the use of the Change Point Test as a method for analyzing time series data that are…
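The abstract references the Change Point Test without detail; as a rough illustration of the general idea, the sketch below locates a single mean shift in a time series with a CUSUM-style statistic. This is a generic variant, not necessarily the specific test used in the study, and the weekly counts are invented.

```python
import numpy as np

def cusum_change_point(x):
    """Return the index maximizing the cumulative-sum deviation, a generic
    single change-point locator for a shift in the mean."""
    x = np.asarray(x, dtype=float)
    s = np.cumsum(x - x.mean())
    return int(np.argmax(np.abs(s)))

# Invented weekly office-referral counts with a drop midway through.
weekly = [12, 14, 13, 15, 11, 7, 6, 5, 6, 4]
print(cusum_change_point(weekly))  # -> 4, the last index before the shift
```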
STREAM PROCESSING ALGORITHMS FOR DYNAMIC 3D SCENE ANALYSIS
2018-02-15
[Only figure-caption fragments of this report survive: ground-truth creation based on marked building feature points in two views 50 frames apart, and epipolar-line assessments of point correspondences, both between pairs of views and between one camera and all other cameras in the dataset (BA4S).]
NASA Astrophysics Data System (ADS)
Khankari, Goutam; Karmakar, Sujit
2017-06-01
This paper proposes a comparative 4-E (Energy, Exergy, Environment, and Economic) performance analysis of a bottoming pure ammonia (NH3) based Organic Rankine Cycle (ORC) and an ammonia-water (NH3-H2O) based Kalina Cycle System 11 (KCS11) for additional power generation through condenser waste-heat recovery, integrated with a conventional 500 MWe subcritical coal-fired thermal power plant. A typical high-ash Indian coal is used for the analysis. The flow-sheet computer programme `Cycle Tempo' is used to simulate both cycles for thermodynamic performance analysis at different plant operating conditions. The thermodynamic analysis is done by varying the NH3 mass fraction in KCS11 and the turbine inlet pressure in both the ORC and KCS11. Results show that the optimum operating pressures of the ORC and of KCS11 with an NH3 mass fraction of 0.90 are about 15 bar and 11.70 bar, respectively; above an operating pressure of 14 bar the ORC-integrated plant performs better than the KCS11-integrated plant, with the reverse observed below this pressure. The energy and exergy efficiencies of the ORC are higher than those of KCS11 by about 0.903 and 16.605 percentage points, respectively, under similar saturation vapour temperature at turbine inlet for both cycles. Similarly, the plant energy and exergy efficiencies of the ORC-based combined cycle power plant are increased by 0.460 and 0.420 percentage points, respectively, over the KCS11-based combined cycle power plant. Moreover, the reduction of CO2 emission in the ORC-based combined cycle is about 3.23 t/hr, which is about 1.5 times higher than in the KCS11-based combined cycle power plant. Exergy destruction in the evaporator of the ORC decreases with increasing operating pressure due to the decreasing temperature difference between the heat-exchanging fluids. The exergy destruction rate in the evaporator of the ORC is higher than in KCS11 when the operating pressure of the ORC drops below 14 bar; this happens because of the variable boiling temperature of the NH3-H2O binary mixture in KCS11, which results in less irreversibility during heat transfer. The Levelized Cost of Electricity (LCoE) and the cost of implementation of the ORC-integrated power plant are about Rs. 1.767/- per kWh and Rs. 2.187/- per kg of fuel saved, respectively, whereas the LCoE of the KCS11-based combined power plant is slightly lower, estimated at about Rs. 1.734/- per kWh, with an implementation cost of about Rs. 0.332/- per kg of fuel saved. Although the energy and exergy efficiencies of the ORC are better than those of KCS11, considering the large investment required to develop the ORC-based combined cycle power plant, KCS11 is superior to the NH3-based ORC below an operating pressure of 14 bar.
NASA Technical Reports Server (NTRS)
Rogers, James L.; Feyock, Stefan; Sobieszczanski-Sobieski, Jaroslaw
1988-01-01
The purpose of this research effort is to investigate the benefits that might be derived from applying artificial intelligence tools in the area of conceptual design. The emphasis is therefore on the artificial intelligence aspects of conceptual design rather than on structural and optimization aspects. A prototype knowledge-based system, called STRUTEX, was developed to initially configure a structure to support point loads in two dimensions. This system combines numerical and symbolic processing by the computer with interactive problem solving aided by the vision of the user: it integrates a knowledge-base interface and inference engine, a database interface, and graphics, while keeping the knowledge base and database files separate. The system writes a file that can be input into a structural synthesis system, which combines structural analysis and optimization.
A Student's Construction of Transformations of Functions in a Multiple Representational Environment.
ERIC Educational Resources Information Center
Borba, Marcelo C.; Confrey, Jere
1996-01-01
Reports on a case study of a 16-year-old student working on transformations of functions in a computer-based, multirepresentational environment. Presents an analysis of the work during the transition from the use of visualization and analysis of discrete points to the use of algebraic symbolism. (AIM)
Rihn, Jeffrey A; Radcliff, Kristen; Norvell, Daniel C; Eastlack, Robert; Phillips, Frank M; Berland, Daniel; Sherry, Ned; Freedman, Mitchell; Vaccaro, Alexander R
2017-06-01
A systematic review and network meta-analysis. To determine current treatment options for chronic low back pain (LBP) as defined by randomized controlled trials (RCTs) and to compare the effectiveness of those treatments using a mixed-treatment comparison (MTC). It is important to provide an evidence-based assessment of the treatment options that exist for LBP. A systematic search of RCTs was conducted in MEDLINE and the Cochrane Collaboration Library from 1990 to 2014. From the selected studies, we extracted preoperative and postoperative Oswestry Disability Index (ODI) and visual analogue scale (VAS) back pain scores, additional surgeries, and complications. Standard and network meta-analytic techniques were used. Twelve RCTs were included in the analysis: 5 total disk replacement (TDR) versus fusion; 1 TDR versus exercise and cognitive behavioral therapy (CBT); 5 fusion versus exercise and CBT; and 1 fusion versus physical therapy (PT). On the basis of the MTC, with respect to ODI change scores, the pooled mean difference favoring fusion over exercise and CBT was 2.0 points (95% CI, -1.2 to 4.8). The pooled mean difference favoring TDR over exercise and CBT was 6.4 points (95% CI, 3.2-9.3). The pooled mean difference favoring fusion over PT was 8.8 points (95% CI, 4.1-13.6). The pooled mean difference favoring TDR over fusion was 4.4 points (95% CI, 2.37-6.63). For PT versus structured exercise with CBT, the pooled mean difference favoring exercise with CBT over PT was 6.8 points (95% CI, 1.5-12.8). For TDR versus PT, the pooled mean difference favoring TDR over PT was 13.2 points (95% CI, 8.0-18.4). Additional surgery rates were similar between treatment options. All 4 treatments provided some benefit to patients with chronic LBP. According to the MTC analysis, TDR may be the most effective treatment and PT the least effective treatment for chronic LBP. This review is based on a limited number of RCTs and does not support any one treatment modality for all patients.
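As a simplified illustration of how pooled mean differences like those above are produced, the sketch below performs inverse-variance pooling for a single pairwise comparison; the paper's mixed-treatment comparison fits a full network model, which this sketch does not reproduce, and the per-study numbers are assumptions.

```python
import numpy as np

# Hypothetical per-study mean ODI differences and standard errors (assumed).
md = np.array([5.1, 7.8, 6.0])
se = np.array([2.0, 1.5, 2.5])

# Fixed-effect inverse-variance pooling with a normal-approximation 95% CI.
w = 1.0 / se**2
pooled = np.sum(w * md) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled MD = {pooled:.1f} points, 95% CI ({lo:.1f}, {hi:.1f})")
```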
Voxel-Based Morphometry ALE meta-analysis of Bipolar Disorder
NASA Astrophysics Data System (ADS)
Magana, Omar; Laird, Robert
2012-03-01
A meta-analysis was performed to examine changes in gray matter (GM) in patients with bipolar disorder (BP). The meta-analysis was conducted in Talairach space using GingerALE to determine concordant voxels and assess their significance by permutation. For data acquisition, published experiments and similar research studies were uploaded onto the online Voxel-Based Morphometry (VBM) database. Coordinates of activation locations were then extracted from bipolar disorder related journals utilizing Sleuth. Once the coordinates of the selected experiments were imported into GingerALE, a Gaussian blur was applied to all foci to create the concentration points of GM in BP patients. The results included volume reductions and variations of GM between normal healthy controls and patients with bipolar disorder. Significantly more GM clusters were obtained in normal healthy controls than in BP patients in the right precentral gyrus, right anterior cingulate, and left inferior frontal gyrus. In future research, more published journals could be uploaded onto the database and another VBM meta-analysis could be performed including more activation coordinates or a variation of age groups.
Fan, Dong-Dong; Kuang, Yan-Hui; Dong, Li-Hua; Ye, Xiao; Chen, Liang-Mian; Zhang, Dong; Ma, Zhen-Shan; Wang, Jin-Yu; Zhu, Jing-Jing; Wang, Zhi-Min; Wang, De-Qin; Li, Chu-Yuan
2017-04-01
To optimize the purification process of Gynostemma pentaphyllum saponins (GPS) based on "adjoint marker" online control technology, with GPS as the testing index. UPLC-QTOF-MS technology was used for qualitative analysis. The "adjoint marker" online control results showed that the end point of sample loading was reached when the UV absorbance of the effluent equaled half that of the loading solution, and the absorbance was essentially stable at this end point. In the UPLC-QTOF-MS qualitative analysis, 16 saponins were identified from GPS, including 13 known gynostemma saponins and 3 new saponins. The optimized method proved to be simple, scientific, reasonable, and easy to apply for online determination and real-time recording, and it can be readily applied to mass production and production automation. The results of the qualitative analysis indicated that the "adjoint marker" online control technology can retain the main efficacy components of medicinal materials and provide analysis tools for process control and quality traceability. Copyright© by the Chinese Pharmaceutical Association.
Integrating SAS and GIS software to improve habitat-use estimates from radiotelemetry data
Kenow, K.P.; Wright, R.G.; Samuel, M.D.; Rasmussen, P.W.
2001-01-01
Radiotelemetry has been used commonly to remotely determine habitat use by a variety of wildlife species. However, habitat misclassification can occur because the true location of a radiomarked animal can only be estimated. Analytical methods that provide improved estimates of habitat use from radiotelemetry location data using a subsampling approach have been proposed previously. We developed software, based on these methods, to conduct improved habitat-use analyses. A Statistical Analysis System (SAS)-executable file generates a random subsample of points from the error distribution of an estimated animal location and formats the output into ARC/INFO-compatible coordinate and attribute files. An associated ARC/INFO Arc Macro Language (AML) creates a coverage of the random points, determines the habitat type at each random point from an existing habitat coverage, sums the number of subsample points by habitat type for each location, and outputs the results in ASCII format. The proportion and precision of habitat types used is calculated from the subsample of points generated for each radiotelemetry location. We illustrate the method and software by analysis of radiotelemetry data for a female wild turkey (Meleagris gallopavo).
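A minimal sketch of the subsampling idea follows, with an invented toy habitat map and an assumed bivariate-normal telemetry error model standing in for the SAS/ARC-INFO pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def habitat_at(x, y):
    """Toy habitat lookup standing in for the GIS habitat coverage."""
    return "wetland" if x**2 + y**2 < 1.0 else "forest"

est_xy = np.array([0.4, 0.3])   # estimated animal location (assumed)
cov = np.diag([0.5, 0.5])       # telemetry error covariance (assumed)

# Random subsample from the location error distribution, then tally habitats.
pts = rng.multivariate_normal(est_xy, cov, size=500)
types, counts = np.unique([habitat_at(x, y) for x, y in pts], return_counts=True)
for t, c in zip(types, counts):
    print(t, c / len(pts))      # estimated habitat-use proportions
```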
Percolation analysis for cosmic web with discrete points
NASA Astrophysics Data System (ADS)
Zhang, Jiajun; Cheng, Dalong; Chu, Ming-Chung
2018-01-01
Percolation analysis has long been used to quantify the connectivity of the cosmic web. Most previous work is based on density fields on grids; by smoothing into fields, we lose information about galaxy properties like shape or luminosity, and the lack of mathematical modeling also limits our understanding of the percolation analysis. To overcome these difficulties, we have studied percolation analysis based on discrete points. Using a friends-of-friends (FoF) algorithm, we generate the S(b) relation between the fractional mass of the largest connected group, S, and the FoF linking length, b. We propose a new model, the probability cloud cluster expansion theory, to relate the S(b) relation to correlation functions. We show that the S(b) relation reflects a combination of all orders of correlation functions. Using N-body simulation, we find that the S(b) relation is robust against redshift distortion and incompleteness in observation. From the Bolshoi simulation, with halo abundance matching (HAM), we have generated a mock galaxy catalog. Good matching of the projected two-point correlation function with observation is confirmed. However, comparing the mock catalog with the latest galaxy catalog from the Sloan Digital Sky Survey (SDSS) Data Release 12 (DR12), we have found significant differences in their S(b) relations. This indicates that the mock galaxy catalog cannot accurately retain correlation functions of order higher than the two-point correlation function, which reveals a limit of the HAM method. As a new measurement, the S(b) relation is applicable to a wide range of data types, fast to compute, robust against redshift distortion and incompleteness, and contains information from all orders of correlation functions.
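A minimal sketch of how an S(b) value can be computed for a discrete point set using FoF grouping (KD-tree neighbor pairs plus union-find); the uniform random points here are a stand-in for a galaxy or halo catalog.

```python
import numpy as np
from scipy.spatial import cKDTree

def largest_group_fraction(points, b):
    """Fraction S of points in the largest friends-of-friends group at
    linking length b, via union-find over all neighbor pairs within b."""
    parent = list(range(len(points)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    for i, j in cKDTree(points).query_pairs(r=b):
        parent[find(i)] = find(j)
    roots = [find(i) for i in range(len(points))]
    _, counts = np.unique(roots, return_counts=True)
    return counts.max() / len(points)

pts = np.random.default_rng(1).random((2000, 3))  # toy points in a unit box
for b in (0.02, 0.04, 0.06):
    print(b, largest_group_fraction(pts, b))      # S rises steeply near percolation
```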
Thermodynamic analysis of biofuels as fuels for high temperature fuel cells
NASA Astrophysics Data System (ADS)
Milewski, Jarosław; Bujalski, Wojciech; Lewandowski, Janusz
2011-11-01
Based on mathematical modeling and numerical simulations, the applicability of various biofuels to high temperature fuel cells is presented. Governing equations of high temperature fuel cell modeling are given. Adequate simulators of both the solid oxide fuel cell (SOFC) and the molten carbonate fuel cell (MCFC) have been developed and are described. The performance of these fuel cells with different biofuels is shown, and some characteristics are given and described. Advantages and disadvantages of various biofuels from the system performance point of view are pointed out. An analysis of various biofuels as potential fuels for SOFC and MCFC is presented, with results compared against both methane and hydrogen as reference fuels. The biofuels are characterized by both lower efficiency and lower fuel utilization factors compared with methane. The presented results are based on a 0D mathematical model at the design-point calculation; the governing equations of the model are also presented. A technical and financial analysis of high temperature fuel cells (SOFC and MCFC) is shown. High temperature fuel cells can be fed by biofuels such as biogas, bioethanol, and biomethanol. Operational costs and possible incomes of those installation types were estimated and analyzed, and a comparison against classic power generation units is shown. A basic indicator, net present value (NPV), was estimated for the projects and discussed.
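Since NPV is the headline financial indicator here, a minimal sketch of the standard NPV calculation follows; the discount rate and cash flows are invented, not the paper's figures.

```python
# Minimal NPV sketch with assumed cash flows; not the paper's financial model.
def npv(rate, cash_flows):
    """Discount each cash flow to present value; cash_flows[0] occurs at t=0
    (include the initial capital outlay as a negative flow there)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Year 0: capital outlay; years 1-5: net operating income (all values assumed).
flows = [-1_000_000, 260_000, 260_000, 260_000, 260_000, 260_000]
print(f"NPV at 8% discount rate: {npv(0.08, flows):,.0f}")
```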
NASA Astrophysics Data System (ADS)
Sun, Z.; Cao, Y. K.
2015-08-01
The paper focuses on the versatility of data processing workflows ranging from BIM-based survey to structural analysis and reverse modeling. In China nowadays, a large number of historic buildings are in need of restoration, reinforcement, and renovation, but architects are not prepared for the conversion from the booming AEC industry to architectural preservation. As surveyors working with architects in such projects, we have to develop efficient, low-cost digital survey workflows robust to various types of architecture, and to process the captured data for the architects. Although laser scanning yields high accuracy in architectural heritage documentation and the workflow is quite straightforward, its cost and portability hinder its use in projects where budget and efficiency are of prime concern. We therefore integrate Structure from Motion techniques with UAV and total station in data acquisition. The captured data are processed for various purposes, illustrated with three case studies: the first is an as-built BIM for a historic building based on point clouds registered to Ground Control Points; the second concerns structural analysis of a damaged bridge using Finite Element Analysis software; the last relates to parametric automated feature extraction from captured point clouds for reverse modeling and fabrication.
A Voxel-Based Approach for Imaging Voids in Three-Dimensional Point Clouds
NASA Astrophysics Data System (ADS)
Salvaggio, Katie N.
Geographically accurate scene models have enormous potential beyond that of just simple visualizations in regard to automated scene generation. In recent years, thanks to ever increasing computational efficiencies, there has been significant growth in both the computer vision and photogrammetry communities pertaining to automatic scene reconstruction from multiple-view imagery. The result of these algorithms is a three-dimensional (3D) point cloud which can be used to derive a final model using surface reconstruction techniques. However, the fidelity of these point clouds has not been well studied, and voids often exist within the point cloud. Voids exist in texturally difficult areas, as well as areas where multiple views were not obtained during collection, constant occlusion existed due to collection angles or overlapping scene geometry, or in regions that failed to triangulate accurately. It may be possible to fill in small voids in the scene using surface reconstruction or hole-filling techniques, but this is not the case with larger more complex voids, and attempting to reconstruct them using only the knowledge of the incomplete point cloud is neither accurate nor aesthetically pleasing. A method is presented for identifying voids in point clouds by using a voxel-based approach to partition the 3D space. By using collection geometry and information derived from the point cloud, it is possible to detect unsampled voxels such that voids can be identified. This analysis takes into account the location of the camera and the 3D points themselves to capitalize on the idea of free space, such that voxels that lie on the ray between the camera and point are devoid of obstruction, as a clear line of sight is a necessary requirement for reconstruction. Using this approach, voxels are classified into three categories: occupied (contains points from the point cloud), free (rays from the camera to the point passed through the voxel), and unsampled (does not contain points and no rays passed through the area). Voids in the voxel space are manifested as unsampled voxels. A similar line-of-sight analysis can then be used to pinpoint locations at aircraft altitude at which the voids in the point clouds could theoretically be imaged. This work is based on the assumption that inclusion of more images of the void areas in the 3D reconstruction process will reduce the number of voids in the point cloud that were a result of lack of coverage. Voids resulting from texturally difficult areas will not benefit from more imagery in the reconstruction process, and thus are identified and removed prior to the determination of future potential imaging locations.
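A minimal sketch of the occupied/free/unsampled voxel labeling described above, using dense sampling along each camera-to-point ray rather than an exact voxel traversal; the grid size, camera position, and points are assumptions.

```python
import numpy as np

# Toy labeling on a voxel grid over the unit cube: voxels traversed by a
# camera-to-point ray are free, endpoint voxels are occupied, the rest are
# unsampled (void candidates). A production system would use an exact
# traversal such as Amanatides-Woo instead of dense ray sampling.
N = 32
state = np.zeros((N, N, N), dtype=np.uint8)  # 0=unsampled, 1=free, 2=occupied

def voxel(p):
    return tuple(np.clip((p * N).astype(int), 0, N - 1))

camera = np.array([0.05, 0.05, 0.9])                 # assumed camera position
points = np.random.default_rng(2).random((500, 3))   # toy reconstructed points

for p in points:
    for t in np.linspace(0.0, 1.0, 2 * N):           # samples along the ray
        state[voxel(camera + t * (p - camera))] = 1
    state[voxel(p)] = 2                              # endpoint is occupied

print("unsampled (void candidate) voxels:", int((state == 0).sum()))
```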
NASA Technical Reports Server (NTRS)
Tapia, Moiez A.
1993-01-01
A comparative analysis of distinct multiplex and fault-tolerant configurations for a PLC-based safety system from a reliability point of view is presented. It considers simplex, duplex, and fault-tolerant triple-redundancy configurations. In the duplex configuration, the standby unit has a failure rate that is k times the failure rate of the active unit, with k varying from 0 to 1. For distinct values of MTTR and MTTF of the main unit, the MTBF and availability of these configurations are calculated. The effect on the configuration's MTBF of duplexing only the PLC module, or only the sensor and actuator modules, is also presented. The results are summarized, and the merits and demerits of the various configurations under distinct environments are discussed.
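For orientation, a sketch of the textbook steady-state availability formulas follows; the simple independent-redundancy approximation here is not the paper's model for standby units with the factor k, and the MTTF/MTTR values are invented.

```python
# Textbook steady-state availability sketch (assumed formulas; the paper's
# treatment of standby redundancy with reduced failure rate k is more detailed).
def availability(mttf, mttr):
    return mttf / (mttf + mttr)

a = availability(mttf=10_000.0, mttr=8.0)   # single (simplex) unit
a_duplex = 1 - (1 - a) ** 2                 # two independent active units
print(f"simplex: {a:.6f}, duplex (independent units): {a_duplex:.9f}")
```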
NASA Technical Reports Server (NTRS)
Nichols, J. D.; Gialdini, M.; Jaakkola, S.
1974-01-01
A quasi-operational study demonstrated that a timber inventory based on manual and automated analysis of ERTS-1 data, supporting aircraft data, and ground data could be made using multistage sampling techniques. The inventory proved to be a timely, cost-effective alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. The cost of the inventory procedure, at 1.1 cents/acre, compared favorably with the cost of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low-altitude photo plots indicated no significant differences in the overall classification accuracies.
Smartphone-Based Accurate Analysis of Retinal Vasculature towards Point-of-Care Diagnostics
Xu, Xiayu; Ding, Wenxiang; Wang, Xuemin; Cao, Ruofan; Zhang, Maiye; Lv, Peilin; Xu, Feng
2016-01-01
Retinal vasculature analysis is important for the early diagnostics of various eye and systemic diseases, making it a potentially useful biomarker, especially for resource-limited regions and countries. Here we developed a smartphone-based retinal image analysis system for point-of-care diagnostics that is able to load a fundus image, segment retinal vessels, analyze individual vessel width, and store or uplink results. The proposed system was not only evaluated on widely used public databases and compared with the state-of-the-art methods, but also validated on clinical images directly acquired with a smartphone. An Android app is also developed to facilitate on-site application of the proposed methods. Both visual assessment and quantitative assessment showed that the proposed methods achieved comparable results to the state-of-the-art methods that require high-standard workstations. The proposed system holds great potential for the early diagnostics of various diseases, such as diabetic retinopathy, for resource-limited regions and countries. PMID:27698369
Evaluation of the leap motion controller as a new contact-free pointing device.
Bachmann, Daniel; Weichert, Frank; Rinkenauer, Gerhard
2014-12-24
This paper presents a Fitts' law-based analysis of the user's performance in selection tasks with the Leap Motion Controller compared with a standard mouse device. The Leap Motion Controller (LMC) is a new contact-free input system for gesture-based human-computer interaction with declared sub-millimeter accuracy. Up to this point, there has hardly been any systematic evaluation of this new system available. With an error rate of 7.8% for the LMC and 2.8% for the mouse device, movement times twice as large as for a mouse device and high overall effort ratings, the Leap Motion Controller's performance as an input device for everyday generic computer pointing tasks is rather limited, at least with regard to the selection recognition provided by the LMC.
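As background for the Fitts'-law framing, the sketch below computes an index of difficulty and predicted movement time using the Shannon formulation; whether the paper used this exact variant is not stated in the abstract, and the coefficients are invented for illustration.

```python
import math

# Fitts' law sketch, Shannon formulation: MT = a + b * log2(D/W + 1).
# The coefficients a (intercept, s) and b (slope, s/bit) are assumptions.
def index_of_difficulty(distance, width):
    return math.log2(distance / width + 1)

def predicted_movement_time(a, b, distance, width):
    return a + b * index_of_difficulty(distance, width)

# Hypothetical device comparison for a 300-px target distance, 30-px width:
print(predicted_movement_time(0.2, 0.15, distance=300, width=30))  # mouse-like
print(predicted_movement_time(0.3, 0.30, distance=300, width=30))  # LMC-like
```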
A study of the response of nonlinear springs
NASA Technical Reports Server (NTRS)
Hyer, M. W.; Knott, T. W.; Johnson, E. R.
1991-01-01
The various phases of developing a methodology for studying the response of a spring-reinforced arch subjected to a point load are discussed. The arch is simply supported at its ends, with both the spring and the point load assumed to be at midspan. The spring is present to offset the typical snap-through behavior normally associated with arches, and to provide a structure that responds with constant resistance over a finite displacement. The phases discussed consist of the following: (1) development of the closed-form solution for the shallow arch case; (2) development of a finite difference analysis to study (shallow) arches; and (3) development of a finite element analysis for studying more general shallow and nonshallow arches. The two numerical analyses rely on a continuation scheme to move the solution past limit points and onto bifurcated paths, both characteristics being common to the arch problem. An eigenvalue method is used for the continuation scheme. The finite difference analysis is based on a mixed formulation (force and displacement variables) of the governing equations, which are in first-order form, making the finite difference implementation convenient. However, the mixed formulation is not well suited to the eigenvalue continuation scheme, which provided the motivation for the displacement-based finite element analysis. Both the finite difference and the finite element analyses are compared with the closed-form shallow arch solution. Agreement is excellent, except for the potential problems with the finite difference analysis and the continuation scheme. Agreement between the finite element analysis and another investigator's numerical analysis for deep arches is also good.
ERIC Educational Resources Information Center
Veas, Alejandro; Gilar, Raquel; Miñano, Pablo; Castejón, Juan Luis
2017-01-01
The present study, based on the construct comparability approach, performs a comparative analysis of general points average for seven courses, using exploratory factor analysis (EFA) and the Partial Credit model (PCM) with a sample of 1398 student subjects (M = 12.5, SD = 0.67) from 8 schools in the province of Alicante (Spain). EFA confirmed a…
Polydimethylsiloxane-based Self healing Composite and Coating Materials
2006-01-01
[Only front-matter and caption fragments of this report survive: abbreviations (TGA, thermogravimetric analysis; TDCB, tapered double cantilever beam; RH, relative humidity; DMDN-Sn, dimethyldineodacanoate tin; DBBE-Sn, di-n-butyl bis(2...)); TGA of microcapsules showing no weight change up to the boiling point (figure 2.17); a table of elemental analysis of the separated prepolymer phase and control samples; Table 2.4, size values of phase-separated PDMS droplets.]
Effects of point mutations on the thermostability of B. subtilis lipase: investigating nonadditivity
NASA Astrophysics Data System (ADS)
Singh, Bipin; Bulusu, Gopalakrishnan; Mitra, Abhijit
2016-10-01
Molecular-level understanding of mutational effects on the stability and activity of enzymes is challenging, particularly when several point mutations are incorporated during directed evolution experiments. In an earlier study, we suggested a lack of consistency in the effect of point mutations incorporated during the initial generations of directed evolution experiments towards conformational stabilization of B. subtilis lipase mutants of later generations. Here, we report that the cumulative point mutations incorporated in mutants 2M (with two point mutations) to 6M (with six point mutations) possibly do not retain their original stabilizing nature in the most thermostable 12M mutant (with 12 point mutations). We have carried out MD simulations using structures incorporating reversal of different sets of point mutations to assess their effect on the conformational stability and activity of 12M. Our analysis revealed that reversal of certain point mutations in 12M had little effect on its conformational stability, suggesting that these mutations were probably inconsequential for the thermostability of the 12M mutant. Interestingly, these mutations involved evolutionarily conserved residues. On the other hand, some of the other point mutations, incorporated in nonconserved regions, appeared to contribute significantly to the conformational stability and/or activity of 12M. Based on an analysis of the dynamics of in silico mutants generated using the consensus sequence, we identified experimentally verifiable residue positions to further increase the conformational stability and activity of the 12M mutant.
NASA Astrophysics Data System (ADS)
Yao, W.; Polewski, P.; Krzystek, P.
2017-09-01
In this paper, a labelling method for the semantic analysis of ultra-high point density MLS data (up to 4000 points/m²) in urban road corridors is developed, based on combining a conditional random field (CRF) for the context-based classification of 3D point clouds with shape priors. The CRF uses a Random Forest (RF) to generate the unary potentials of nodes and a variant of the contrast-sensitive Potts model for the pairwise potentials of node edges. The classification is founded on various geometric features derived from covariance matrices and a local accumulation map of spatial coordinates over local neighbourhoods. Meanwhile, in order to cope with the ultra-high point density, a plane-based region growing method combined with a rule-based classifier is applied to first fix semantic labels for man-made objects. Once such points, which usually account for the majority of the data, are pre-labeled, the CRF classifier can be solved by optimizing the discriminative probability for nodes within a subgraph structure that excludes the pre-labeled nodes. The process can be viewed as an evidence fusion step that infers a degree of belief for point labelling from different sources. The MLS data used for this study were acquired with a vehicle-borne Z+F phase-based laser scanner, which permits the generation of a point cloud with an ultra-high sampling rate and accuracy. The test sites are parts of Munich City, assumed to consist of seven object classes including impervious surfaces, trees, building roofs/facades, low vegetation, vehicles, and poles. The competitive classification performance can be explained by several factors: the above-ground height, for example, highlights the vertical dimension of houses, trees, and even cars; performance is also attributable to the decision-level fusion of the graph-based contextual classification approach with shape priors. The context-based classification mainly contributed to smoother labelling, by removing outliers, and to improvements in underrepresented object classes. In addition, the routine operation of context-based classification for such high-density MLS data becomes much more efficient, comparable to non-contextual classification schemes.
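A minimal sketch of a contrast-sensitive Potts pairwise potential of the kind the abstract names; the functional form and parameters here are a common textbook variant and should be read as assumptions rather than the paper's exact model.

```python
import numpy as np

# Contrast-sensitive Potts pairwise potential: agreeing labels cost nothing;
# disagreeing labels are penalized more strongly the more similar the node
# features are (w and beta are assumed tuning parameters).
def pairwise_potential(label_i, label_j, feat_i, feat_j, w=1.0, beta=0.5):
    if label_i == label_j:
        return 0.0
    contrast = np.sum((feat_i - feat_j) ** 2)
    return w * np.exp(-beta * contrast)

# Similar features -> strong smoothing penalty; dissimilar -> weak penalty.
print(pairwise_potential(0, 1, np.array([0.1, 0.2]), np.array([0.12, 0.21])))
print(pairwise_potential(0, 1, np.array([0.1, 0.2]), np.array([0.9, 0.8])))
```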
Thierry, Alain R
2016-01-01
Circulating cell-free DNA (cfDNA) is a valuable source of tumor material available from a simple blood sample, enabling noninvasive quantitative and qualitative analysis of the tumor genome. cfDNA is released by tumor cells and exhibits the genetic and epigenetic alterations of the tumor of origin, so cfDNA analysis constitutes a promising approach to a noninvasive tumor molecular test for cancer patients. Based upon basic research on the origin and structure of cfDNA, new information on cfDNA structure, and specific determination of cfDNA fragmentation and size, we revisited Q-PCR-based methods and recently developed an allele-specific Q-PCR-based method with a blocker, termed Intplex(®), which is the first multiplexed test for cfDNA. This refined Q-PCR technique derives from critical observations made on the specific structure and size of cfDNA. It enables the simultaneous determination of five parameters: the total cfDNA concentration, the presence of a previously known point mutation, the mutant (tumor-derived) cfDNA concentration (ctDNA), the proportion of mutant cfDNA, and the cfDNA fragmentation index. Intplex(®) has enabled the first clinical validation of ctDNA analysis in oncology by detecting KRAS and BRAF point mutations in mCRC patients, and has demonstrated that a blood test could replace tumor section analysis for the detection of KRAS and BRAF mutations. The Intplex(®) test can be adapted to all mutations, genes, or cancers and enables rapid, highly sensitive, cost-effective, and repeated analysis. As regards mutation determination on cfDNA, Intplex(®) is limited to the mutational status of known hotspot mutations; it is a "targeted" approach. However, it detects mutations quantitatively and dynamically, and could constitute an attractive noninvasive tool potentially allowing diagnosis, prognosis, theranostics, therapeutic monitoring, and follow-up of cancer patients, expanding the scope of personalized cancer medicine.
Pin routability and pin access analysis on standard cells for layout optimization
NASA Astrophysics Data System (ADS)
Chen, Jian; Wang, Jun; Zhu, ChengYu; Xu, Wei; Li, Shuai; Lin, Eason; Ou, Odie; Lai, Ya-Chieh; Qu, Shengrui
2018-03-01
At advanced process nodes, especially at sub-28nm technology, pin accessibility and routability of standard cells have become one of the most challenging design issues due to the limited routing tracks and the increased pin density. If this issue is not found and resolved during the cell design stage, pin access problems become very difficult to fix in the implementation stage and lead to low routing efficiency. In this paper, we introduce a holistic approach to pin accessibility scoring and routability analysis. For accessibility, a systematic calculator assigns a score to each pin by searching the available access points and considering the surrounding routing layers, basic design rules, and allowed via geometries. Based on the score, "bad" pins can be found and modified. For pin routability analysis, critical pin points (where placing a via would lead to failed via insertion) are identified, either to guide layout optimization or to be set as OBS shapes that block via insertion. By using this pin routability and pin access analysis flow, we are able to improve library quality and performance.
Detection of Subtle Cognitive Changes after mTBI Using a Novel Tablet-Based Task.
Fischer, Tara D; Red, Stuart D; Chuang, Alice Z; Jones, Elizabeth B; McCarthy, James J; Patel, Saumil S; Sereno, Anne B
2016-07-01
This study examined the potential for novel tablet-based tasks, modeled after eye tracking techniques, to detect subtle sensorimotor and cognitive deficits after mild traumatic brain injury (mTBI). Specifically, we examined whether performance on these tablet-based tasks (Pro-point and Anti-point) was able to correctly categorize concussed versus non-concussed participants, compared with performance on other standardized tests for concussion. Patients admitted to the emergency department with mTBI were tested on the Pro-point and Anti-point tasks, a current standard cognitive screening test (i.e., the Standard Assessment of Concussion [SAC]), and another eye movement-based tablet test, the King-Devick(®) (KD). Within hours after injury, mTBI patients showed significant slowing in response times, compared with both orthopedic and age-matched control groups, in the Pro-point task, demonstrating deficits in sensorimotor function. Mild TBI patients also showed significant slowing, compared with both control groups, on the Anti-point task, even when controlling for sensorimotor slowing, indicating deficits in cognitive function. Performance on the SAC test revealed similar deficits of cognitive function in the mTBI group, compared with the age-matched control group; however, the KD test showed no evidence of cognitive slowing in mTBI patients, compared with either control group. Further, measuring the sensitivity and specificity of these tasks to accurately predict mTBI with receiver operating characteristic analysis indicated that the Anti-point and Pro-point tasks reached excellent levels of accuracy and fared better than current standardized tools for assessment of concussion. Our findings suggest that these rapid tablet-based tasks are able to reliably detect and measure functional impairment in cognitive and sensorimotor control within hours after mTBI. These tasks may provide a more sensitive diagnostic measure for functional deficits that could prove key to earlier detection of concussion, evaluation of interventions, or even prediction of persistent symptoms.
Analysis of Multicomponent Adsorption Close to a Dew Point.
Shapiro; Stenby
1998-10-15
We develop the potential theory of multicomponent adsorption close to a dew point. The approach is based on an asymptotic adsorption equation (AAE) which is valid in a vicinity of the dew point. By this equation the thickness of the liquid film is expressed through thermodynamic characteristics of the bulk phase. The AAE makes it possible to study adsorption in the regions of both the normal and the retrograde condensation. A simple correlation of the Kelvin radius for capillary condensation and the thickness of the adsorbed film is established. Numerical testing shows good agreement between the AAE and the direct calculations, even if the mixture is not close to a dew point. Copyright 1998 Academic Press.
Parrado-Hernández, Emilio; Gómez-Sánchez, Eduardo; Dimitriadis, Yannis A
2003-09-01
An evaluation of distributed learning as a means to attenuate the category proliferation problem in Fuzzy ARTMAP based neural systems is carried out, from both qualitative and quantitative points of view. The study involves two original winner-take-all (WTA) architectures, Fuzzy ARTMAP and FasArt, and their distributed versions, dARTMAP and dFasArt. A qualitative analysis of the distributed learning properties of dARTMAP is made, focusing on the new elements introduced to endow Fuzzy ARTMAP with distributed learning. In addition, a quantitative study on a selected set of classification problems points out that problems have to present certain features in their output classes in order to noticeably reduce the number of recruited categories and achieve an acceptable classification accuracy. As part of this analysis, distributed learning was successfully adapted to a member of the Fuzzy ARTMAP family, FasArt, and similar procedures can be used to extend distributed learning capabilities to other Fuzzy ARTMAP based systems.
Quasi-Sun-Pointing of Spacecraft Using Radiation Pressure
NASA Technical Reports Server (NTRS)
Spilker, Thomas
2003-01-01
A report proposes a method of utilizing solar-radiation pressure to keep the axis of rotation of a small spin-stabilized spacecraft pointed approximately (typically, within an angle of 10 deg to 20 deg) toward the Sun. Axisymmetry is not required. Simple tilted planar vanes would be attached to the outer surface of the body, so that the resulting spacecraft would vaguely resemble a rotary fan, windmill, or propeller. The vanes would be painted black for absorption of Solar radiation. A theoretical analysis based on principles of geometric optics and mechanics has shown that torques produced by Solar-radiation pressure would cause the axis of rotation to precess toward Sun-pointing. The required vane size would be a function of the angular momentum of the spacecraft and the maximum acceptable angular deviation from Sun-pointing. The analysis also shows that the torques produced by the vanes would slowly despin the spacecraft -- an effect that could be counteracted by adding specularly reflecting "spin-up" vanes.
Alternative Methods for Estimating Plane Parameters Based on a Point Cloud
NASA Astrophysics Data System (ADS)
Stryczek, Roman
2017-12-01
Non-contact measurement techniques based on triangulation optical sensors are increasingly popular in measurements performed with industrial robots directly on production lines. The result of such measurements is often a cloud of measurement points characterized by considerable measurement noise, the presence of a number of points that do not belong to the reference model, and excessive errors that must be eliminated from the analysis. To obtain, from the points contained in the cloud, vector information that describes the reference model, the data acquired during a measurement must be subjected to appropriate processing operations. The present paper analyzes the suitability of the methods known as RANdom SAmple Consensus (RANSAC), the Monte Carlo Method (MCM), and Particle Swarm Optimization (PSO) for extracting the reference model. The effectiveness of the tested methods is illustrated by examples of measuring the height of an object and the angle of a plane, based on experiments carried out under workshop conditions.
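Of the three methods compared, RANSAC is the most algorithmically self-contained; a minimal plane-fitting sketch follows, with the tolerance, iteration count, and synthetic test cloud all chosen for illustration rather than taken from the paper.

```python
import numpy as np

def ransac_plane(points, n_iter=200, tol=0.01, seed=3):
    """Fit a plane (unit normal n, offset d, with n.p = d) to a noisy cloud
    by RANSAC: fit to 3 random points, keep the largest inlier set."""
    rng = np.random.default_rng(seed)
    best_count, best_model = 0, None
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:
            continue                       # degenerate (collinear) sample
        n /= norm
        d = n @ p0
        inliers = np.abs(points @ n - d) < tol
        if inliers.sum() > best_count:
            best_count, best_model = int(inliers.sum()), (n, d)
    return best_model, best_count

# Toy cloud: plane z = 0.2 plus noise, with 20% gross outliers mixed in.
rng = np.random.default_rng(4)
pts = np.column_stack([rng.random(500), rng.random(500),
                       0.2 + 0.002 * rng.standard_normal(500)])
pts[:100, 2] = rng.random(100)             # outliers
model, n_in = ransac_plane(pts)
print(model[0], model[1], n_in)            # normal ~ (0, 0, +-1), d ~ +-0.2
```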
NASA Technical Reports Server (NTRS)
Colwell, R. N. (Principal Investigator)
1984-01-01
The geometric quality of TM film and digital products is evaluated by making selective photomeasurements and by measuring the coordinates of known features on both the TM products and map products. These paired observations are related using a standard linear least squares regression approach. Using regression equations and coefficients developed from 225 (TM film product) and 20 (TM digital product) control points, the map coordinates of test points are predicted. Residual error vectors were computed, and analysis of variance (ANOVA) was performed on the east and north residuals using nine image segments (blocks) as treatments. Based on the root mean square errors of the 223 (TM film product) and 22 (TM digital product) test points, users of TM data can expect the planimetric accuracy of mapped points to be within 91 meters and within 117 meters for the film products, and within 12 meters and 14 meters for the digital products.
A Mobile Acoustic Subsurface Sensing (MASS) System for Rapid Roadway Assessment
Lu, Yifeng; Zhang, Yi; Cao, Yinghong; McDaniel, J. Gregory; Wang, Ming L.
2013-01-01
Surface waves are commonly used for vibration-based nondestructive testing of infrastructure. Spectral Analysis of Surface Waves (SASW) has been used to detect subsurface properties for geologic inspections. Recently, efforts were made to scale down these subsurface detection approaches to see how they perform on small-scale structures such as concrete slabs and pavements. Additional efforts have been made to replace the traditional surface-mounted transducers with non-contact acoustic transducers. Though some success has been achieved, most of these new approaches are inefficient because they require point-to-point measurements or off-line signal analysis. This article introduces the Mobile Acoustic Subsurface Sensing (MASS) system, an improved surface-wave-based implementation for measuring the subsurface profile of roadways. The compact MASS system is a 3-wheeled cart outfitted with an electromagnetic impact source, distance register, non-contact acoustic sensors and data acquisition/processing equipment. The key advantage of the MASS system is the capability to collect measurements continuously at walking speed in an automatic way. The fast-scan and real-time analysis advantages are based upon the non-contact acoustic sensing and fast air-coupled surface wave analysis program. This integration of hardware and software makes the MASS system an efficient mobile prototype for field testing. PMID:23698266
Approximation methods for combined thermal/structural design
NASA Technical Reports Server (NTRS)
Haftka, R. T.; Shore, C. P.
1979-01-01
Two approximation concepts for combined thermal/structural design are evaluated. The first concept is an approximate thermal analysis based on the first derivatives of structural temperatures with respect to design variables. Two commonly used first-order Taylor series expansions are examined. The direct and reciprocal expansions are special members of a general family of approximations, and for some conditions other members of that family of approximations are more accurate. Several examples are used to compare the accuracy of the different expansions. The second approximation concept is the use of critical time points for combined thermal and stress analyses of structures with transient loading conditions. Significant time savings are realized by identifying critical time points and performing the stress analysis for those points only. The design of an insulated panel which is exposed to transient heating conditions is discussed.
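The direct and reciprocal expansions differ only in the variable of linearization. The small sketch below (a generic response function, not the paper's thermal model) compares them against a response that is exactly linear in the reciprocal variable, where the reciprocal expansion reproduces the exact value:

```python
import numpy as np

# First-order Taylor approximations of a response r(x) about x0.
def direct(r, dr, x0, x):
    """Direct expansion: r(x0) + r'(x0) * (x - x0)."""
    return r(x0) + dr(x0) * (x - x0)

def reciprocal(r, dr, x0, x):
    """Reciprocal expansion, linear in y = 1/x:
    r ~ r(x0) - r'(x0) * x0^2 * (1/x - 1/x0)."""
    return r(x0) - dr(x0) * x0**2 * (1.0 / x - 1.0 / x0)

# Assumed response: stress inversely proportional to a sizing variable,
# for which the reciprocal expansion is exact.
r = lambda x: 100.0 / x
dr = lambda x: -100.0 / x**2
x0 = 1.0
for x in np.linspace(0.5, 2.0, 4):
    print(f"x={x:4.2f}  exact={r(x):7.2f}  "
          f"direct={direct(r, dr, x0, x):7.2f}  "
          f"reciprocal={reciprocal(r, dr, x0, x):7.2f}")
```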
Photogrammetric analysis of horizon panoramas: The Pathfinder landing site in Viking orbiter images
Oberst, J.; Jaumann, R.; Zeitler, W.; Hauber, E.; Kuschel, M.; Parker, T.; Golombek, M.; Malin, M.; Soderblom, L.
1999-01-01
Tiepoint measurements, block adjustment techniques, and sunrise/sunset pictures were used to obtain precise pointing data with respect to north for a set of 33 IMP horizon images. Azimuth angles for five prominent topographic features seen at the horizon were measured and correlated with locations of these features in Viking orbiter images. Based on this analysis, the Pathfinder line/sample coordinates in two raw Viking images were determined with approximate errors of 1 pixel, or 40 m. Identification of the Pathfinder location in orbit imagery yields geological context for surface studies of the landing site. Furthermore, the precise determination of coordinates in images together with the known planet-fixed coordinates of the lander make the Pathfinder landing site the most important anchor point in current control point networks of Mars. Copyright 1999 by the American Geophysical Union.
An integrated approach to assess heavy metal source apportionment in peri-urban agricultural soils.
Huang, Ying; Li, Tingqiang; Wu, Chengxian; He, Zhenli; Japenga, Jan; Deng, Meihua; Yang, Xiaoe
2015-12-15
Three techniques (Isotope Ratio Analysis, GIS mapping, and Multivariate Statistical Analysis) were integrated to assess heavy metal pollution and source apportionment in peri-urban agricultural soils. The soils in the study area were moderately polluted with cadmium (Cd) and mercury (Hg) and lightly polluted with lead (Pb) and chromium (Cr). GIS mapping suggested that Cd pollution originates from point sources, whereas Hg, Pb, and Cr could be traced back to both point and non-point sources. Principal component analysis (PCA) indicated that aluminum (Al), manganese (Mn), and nickel (Ni) were mainly inherited from natural sources, while Hg, Pb, and Cd were associated with two different kinds of anthropogenic sources. Cluster analysis (CA) further identified fertilizers, waste water, industrial solid wastes, road dust, and atmospheric deposition as potential sources. Based on isotope ratio analysis (IRA), organic fertilizers and road dust accounted for 74-100% and 0-24% of the total Hg input, while road dust and solid wastes contributed 0-80% and 19-100% of the Pb input. This study provides a reliable approach for heavy metal source apportionment in this particular peri-urban area, with a clear potential for future application in other regions. Copyright © 2015 Elsevier B.V. All rights reserved.
Statistical 3D shape analysis of gender differences in lateral ventricles
NASA Astrophysics Data System (ADS)
He, Qing; Karpman, Dmitriy; Duan, Ye
2010-03-01
This paper aims at analyzing gender differences in the 3D shapes of lateral ventricles, which will provide reference for the analysis of brain abnormalities related to neurological disorders. Previous studies mostly focused on volume analysis, and the main challenge in shape analysis is the required step of establishing shape correspondence among individual shapes. We developed a simple and efficient method based on anatomical landmarks. 14 females and 10 males with matched ages participated in this study. 3D ventricle models were segmented from MR images by a semiautomatic method. Six anatomically meaningful landmarks were identified by detecting the maximum curvature point in a small neighborhood of a manually clicked point on the 3D model. Thin-plate splines were used to transform a randomly selected template shape to each of the remaining shape instances, and point correspondence was established according to Euclidean distance and surface normals. All shapes were spatially aligned by Generalized Procrustes Analysis. The Hotelling T² two-sample metric was used to compare ventricle shapes between males and females, and False Discovery Rate estimation was used to correct for multiple comparisons. The results revealed significant differences in the anterior horn of the right ventricle.
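The per-point group comparison uses a standard two-sample Hotelling T² statistic, which can be sketched as below (synthetic coordinates, not the study's data; the F conversion assumes equal group covariances):

```python
import numpy as np
from scipy import stats

def hotelling_t2(X, Y):
    """Two-sample Hotelling T^2 test; returns (T^2, F statistic, p-value)."""
    n1, p = X.shape
    n2, _ = Y.shape
    d = X.mean(axis=0) - Y.mean(axis=0)
    # Pooled covariance of the two groups.
    S = ((n1 - 1) * np.cov(X, rowvar=False) +
         (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(S, d)
    f = t2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))
    pval = stats.f.sf(f, p, n1 + n2 - p - 1)
    return t2, f, pval

# E.g. the 3D coordinates of one corresponding surface point per subject.
rng = np.random.default_rng(0)
females = rng.normal(0.0, 1.0, (14, 3))
males = rng.normal(0.4, 1.0, (10, 3))
print(hotelling_t2(females, males))
```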
DOT National Transportation Integrated Search
1978-03-01
The role of suppliers to the auto industry in promoting innovation is explored. Thirty-two innovations are investigated, and information on their success/failure, area of impact, and key decision points is generated. Based on this data base, barriers...
Investigating Learning through Work: What the Literature Says. Support Document
ERIC Educational Resources Information Center
Chappell, Clive; Hawke, Geof
2008-01-01
This Support Document was produced by the authors based on their research for the report, "Investigating Learning through Work: The Development of the 'Provider Learning Environment Scale'" (ED503392). While couched in very different terms, the analysis presented in this report points to a substantial overlap in the conceptual bases that…
Performance Based Logistics... What’s Stopping Us
2016-03-01
performance-based life cycle product support, where outcomes are acquired through performance-based arrangements that deliver Warfighter requirements and...correlates to the acquisition life cycle framework: spend the time and effort to identify and lock in the PBL requirements; conduct an analysis to...PDASD[L&MR]) on PBL strategies. The study, Project Proof Point: A Study to Determine the Impact of Performance Based Logistics (PBL) on Life Cycle
Percolation analysis for cosmic web with discrete points
NASA Astrophysics Data System (ADS)
Zhang, Jiajun; Cheng, Dalong; Chu, Ming-Chung
2016-03-01
Percolation analysis has long been used to quantify the connectivity of the cosmic web. Unlike most previous works, which use density fields on grids, we have studied percolation analysis based on discrete points. Using a Friends-of-Friends (FoF) algorithm, we generate the S-b relation between the fractional mass of the largest connected group (S) and the FoF linking length (b). We propose a new model, the Probability Cloud Cluster Expansion Theory (PCCET), to relate the S-b relation with correlation functions. We show that the S-b relation reflects a combination of all orders of correlation functions. We have studied the S-b relation with simulations and find that it is robust against redshift distortion and incompleteness in observation. From the Bolshoi simulation, with Halo Abundance Matching (HAM), we have generated a mock galaxy catalogue. Good matching of the projected two-point correlation function with observation is confirmed. However, comparing the mock catalogue with the latest galaxy catalogue from SDSS DR12, we have found significant differences in their S-b relations. This indicates that the mock catalogue cannot accurately recover correlation functions of higher order than the two-point correlation function, which reveals a limit of the HAM method.
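The S-b relation can be sketched for any point set with a KD-tree friends-of-friends pass (illustrative only; equal point masses are assumed, and the linking length is expressed here in units of the mean separation):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

def s_of_b(points, linking_length):
    """Fraction of (equal-mass) points in the largest FoF group."""
    n = len(points)
    pairs = np.array(list(cKDTree(points).query_pairs(linking_length)))
    if len(pairs) == 0:
        return 1.0 / n
    adj = coo_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])),
                     shape=(n, n))
    _, labels = connected_components(adj, directed=False)
    return np.bincount(labels).max() / n

rng = np.random.default_rng(1)
pts = rng.uniform(0, 1, (2000, 3))            # Poisson points in a unit box
mean_sep = (1.0 / 2000) ** (1.0 / 3.0)
for b in (0.5, 0.8, 1.0, 1.2):                # b in units of mean separation
    print(f"b = {b:.1f}  S = {s_of_b(pts, b * mean_sep):.3f}")
```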
Space crew radiation exposure analysis system based on a commercial stand-alone CAD system
NASA Technical Reports Server (NTRS)
Appleby, Matthew H.; Golightly, Michael J.; Hardy, Alva C.
1992-01-01
Major improvements have recently been completed in the approach to spacecraft shielding analysis. A Computer-Aided Design (CAD)-based system has been developed for determining the shielding provided to any point within or external to the spacecraft. Shielding analysis is performed using a commercially available stand-alone CAD system and a customized ray-tracing subroutine contained within a standard engineering modeling software package. This improved shielding analysis technique has been used in several vehicle design projects such as a Mars transfer habitat, pressurized lunar rover, and the redesigned Space Station. Results of these analyses are provided to demonstrate the applicability and versatility of the system.
Fu, Yili; Gao, Wenpeng; Chen, Xiaoguang; Zhu, Minwei; Shen, Weigao; Wang, Shuguo
2010-01-01
The reference system based on the fourth ventricular landmarks (including the fastigial point and the ventricular floor plane) is used in medical image analysis of the brain stem. The objective of this study was to develop a rapid, robust, and accurate method for the automatic identification of this reference system on T1-weighted magnetic resonance images. The fully automated method developed in this study consisted of four stages: preprocessing of the data set, expectation-maximization algorithm-based extraction of the fourth ventricle in the region of interest, a coarse-to-fine strategy for identifying the fastigial point, and localization of the base point. The method was evaluated qualitatively on 27 BrainWeb data sets, and quantitatively on 18 Internet Brain Segmentation Repository data sets and 30 clinical scans. The results of qualitative evaluation indicated that the method was robust to rotation, landmark variation, noise, and inhomogeneity. The results of quantitative evaluation indicated that the method was able to identify the reference system with an accuracy of 0.7 +/- 0.2 mm for the fastigial point and 1.1 +/- 0.3 mm for the base point. It took <6 seconds for the method to identify the related landmarks on a personal computer with an Intel Core 2 6300 processor and 2 GB of random-access memory. The proposed method for the automatic identification of the reference system based on the fourth ventricular landmarks was shown to be rapid, robust, and accurate. The method has potential utility in image registration and computer-aided surgery.
A Corner-Point-Grid-Based Voxelization Method for Complex Geological Structure Model with Folds
NASA Astrophysics Data System (ADS)
Chen, Qiyu; Mariethoz, Gregoire; Liu, Gang
2017-04-01
3D voxelization is the foundation of geological property modeling, and is also an effective approach to realize the 3D visualization of the heterogeneous attributes in geological structures. The corner-point grid is a representative data model among all voxel models, and is a structured grid type that is widely applied at present. When subdividing a complex geological structure model with folds, we should fully consider its structural morphology and bedding features so that the generated voxels keep the original morphology. On that basis, the voxels can depict the detailed bedding features and the spatial heterogeneity of the internal attributes. To address the shortcomings of existing technologies, this work puts forward a corner-point-grid-based voxelization method for complex geological structure models with folds. We have realized the fast conversion from the 3D geological structure model to a fine voxel model according to the rule of isoclines in Ramsay's fold classification. In addition, the voxel model conforms to the spatial features of folds, pinch-outs and other complex geological structures, and the voxels of the laminae inside a fold accord with the result of geological sedimentation and tectonic movement. This provides a carrier and model foundation for subsequent attribute assignment as well as quantitative analysis and evaluation based on the spatial voxels. Ultimately, we use examples, together with a comparative analysis between the examples and Ramsay's description of isoclines, to discuss the effectiveness and advantages of the proposed method when voxelizing 3D geological structure models with folds based on corner-point grids.
NASA Astrophysics Data System (ADS)
Li, Na; Gong, Xingyu; Li, Hongan; Jia, Pengtao
2018-01-01
For faded relics, such as the Terracotta Army, the 2D-3D registration between an optical camera and a point cloud model is an important part of color texture reconstruction and further applications. This paper proposes a nonuniform multiview color texture mapping for the image sequence and the three-dimensional (3D) point cloud model collected by Handyscan3D. We first introduce nonuniform multiview calibration, including an explanation of its algorithm principle and an analysis of its advantages. We then establish transformation equations based on SIFT feature points for the multiview image sequence. At the same time, the selection of nonuniform multiview SIFT feature points is introduced in detail. Finally, the solving process of the collinearity equations based on multiview perspective projection is given in three steps with a flowchart. In the experiment, this method is applied to the color reconstruction of the kneeling figurine, Tangsancai lady, and general figurine. The results demonstrate that the proposed method provides effective support for the color reconstruction of faded cultural relics and is able to improve the accuracy of 2D-3D registration between the image sequence and the point cloud model.
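As an illustration of the collinearity step (a generic pinhole form, not the paper's multiview equations; all camera parameters below are assumed), 3D cloud points can be projected into a calibrated view as:

```python
import numpy as np

def project_collinear(points_w, R, t, f, cx, cy):
    """Project world points (N, 3) to pixel coordinates via collinearity.

    x = f * Xc / Zc + cx,  y = f * Yc / Zc + cy,
    where [Xc, Yc, Zc]^T = R @ Xw + t (world-to-camera rotation/translation)
    and f is the focal length in pixels.
    """
    cam = points_w @ R.T + t
    return f * cam[:, :2] / cam[:, 2:3] + np.array([cx, cy])

# Illustrative camera: identity rotation, 2 m in front of the cloud.
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])
pts = np.array([[0.1, 0.0, 0.0], [-0.1, 0.05, 0.1]])
print(project_collinear(pts, R, t, f=1500.0, cx=960.0, cy=540.0))
```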
EOS-AM precision pointing verification
NASA Technical Reports Server (NTRS)
Throckmorton, A.; Braknis, E.; Bolek, J.
1993-01-01
The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives, in a spacecraft with low-frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources and as the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multi-disciplinary analyses, hardware tests and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data, supported by a comprehensive database repository for validated program values.
NASA Astrophysics Data System (ADS)
Abbasi, R. U.; Abu-Zayyad, T.; Amann, J. F.; Archbold, G.; Atkins, R.; Bellido, J. A.; Belov, K.; Belz, J. W.; Ben-Zvi, S. Y.; Bergman, D. R.; Boyer, J. H.; Burt, G. W.; Cao, Z.; Clay, R. W.; Connolly, B. M.; Dawson, B. R.; Deng, W.; Farrar, G. R.; Fedorova, Y.; Findlay, J.; Finley, C. B.; Hanlon, W. F.; Hoffman, C. M.; Holzscheiter, M. H.; Hughes, G. A.; Hüntemeyer, P.; Jui, C. C. H.; Kim, K.; Kirn, M. A.; Knapp, B. C.; Loh, E. C.; Maestas, M. M.; Manago, N.; Mannel, E. J.; Marek, L. J.; Martens, K.; Matthews, J. A. J.; Matthews, J. N.; O'Neill, A.; Painter, C. A.; Perera, L.; Reil, K.; Riehle, R.; Roberts, M. D.; Sasaki, M.; Schnetzer, S. R.; Seman, M.; Simpson, K. M.; Sinnis, G.; Smith, J. D.; Snow, R.; Sokolsky, P.; Song, C.; Springer, R. W.; Stokes, B. T.; Thomas, J. R.; Thomas, S. B.; Thomson, G. B.; Tupa, D.; Westerhoff, S.; Wiencke, L. R.; Zech, A.
2005-04-01
We present the results of a search for cosmic-ray point sources at energies in excess of 4.0×10^19 eV in the combined data sets recorded by the Akeno Giant Air Shower Array and High Resolution Fly's Eye stereo experiments. The analysis is based on a maximum likelihood ratio test using the probability density function for each event rather than requiring an a priori choice of a fixed angular bin size. No statistically significant clustering of events consistent with a point source is found.
NASA Astrophysics Data System (ADS)
Ismail, Nurul Syuhada; Arifin, Norihan Md.; Bachok, Norfifah; Mahiddin, Norhasimah
2017-01-01
A numerical study is performed to evaluate the problem of stagnation-point flow towards a shrinking sheet with homogeneous-heterogeneous reaction effects. By using a non-similar transformation, the governing equations are reduced to ordinary differential equations. Results are then obtained numerically by the shooting method implemented in Maple. The numerical results show that dual solutions exist for velocity ratio parameter λ < 0. A stability analysis is then carried out with the bvp4c solver in MATLAB to determine which of the two solutions is stable.
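The shooting idea, and the way dual solutions emerge from different initial-slope brackets, can be illustrated on a classic textbook BVP (y'' = 1.5 y², y(0) = 4, y(1) = 1, which is not the paper's stagnation-flow system but likewise admits two solutions):

```python
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Shooting method: treat the unknown initial slope s = y'(0) as a root-
# finding variable, integrating the IVP and matching y(1) = 1.
def residual(s):
    sol = solve_ivp(lambda x, u: [u[1], 1.5 * u[0] ** 2], (0.0, 1.0),
                    [4.0, s], rtol=1e-10, atol=1e-10)
    return sol.y[0, -1] - 1.0          # miss distance at x = 1

# Two brackets, two solutions: the exact y'(0) = -8 (y = 4/(1+x)^2),
# and a second solution near y'(0) ~ -35.9.
for bracket in [(-10.0, -5.0), (-40.0, -30.0)]:
    s = brentq(residual, *bracket)
    print(f"y'(0) = {s:.4f}  (residual {residual(s):.2e})")
```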
Wavelets and molecular structure
NASA Astrophysics Data System (ADS)
Carson, Mike
1996-08-01
The wavelet method offers possibilities for display, editing, and topological comparison of proteins at a user-specified level of detail. Wavelets are a mathematical tool that first found application in signal processing. The multiresolution analysis of a signal via wavelets provides a hierarchical series of `best' lower-resolution approximations. B-spline ribbons model the protein fold, with one control point per residue. Wavelet analysis sets limits on the information required to define the winding of the backbone through space, suggesting a recognizable fold is generated from a number of points equal to one-quarter or fewer of the number of residues. Wavelets applied to surfaces and volumes show promise in structure-based drug design.
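The hierarchy of `best' approximations is easy to reproduce with a discrete wavelet transform. Below is a minimal sketch using the PyWavelets package (a generic 1D signal standing in for one backbone coordinate; the wavelet choice and decomposition depth are arbitrary):

```python
import numpy as np
import pywt

# Multiresolution approximation: keep the coarse approximation coefficients,
# then re-introduce detail levels one at a time (coarsest first).
rng = np.random.default_rng(0)
n = 256
signal = np.cumsum(rng.normal(0, 1, n))   # stand-in for a winding backbone

coeffs = pywt.wavedec(signal, "db4", level=4)   # [cA4, cD4, cD3, cD2, cD1]
for level in range(1, 5):
    kept = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    for i in range(1, level):
        kept[i] = coeffs[i]
    approx = pywt.waverec(kept, "db4")[:n]
    err = np.sqrt(np.mean((approx - signal) ** 2))
    print(f"detail levels kept: {level - 1}  rms error: {err:.3f}")
```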
Luo, Xiaoteng; Hsing, I-Ming
2009-10-01
Nucleic acid based analysis provides accurate differentiation among closely affiliated species and this species- and sequence-specific detection technique would be particularly useful for point-of-care (POC) testing for prevention and early detection of highly infectious and damaging diseases. Electrochemical (EC) detection and polymerase chain reaction (PCR) are two indispensable steps, in our view, in a nucleic acid based point-of-care testing device as the former, in comparison with the fluorescence counterpart, provides inherent advantages of detection sensitivity, device miniaturization and operation simplicity, and the latter offers an effective way to boost the amount of targets to a detectable quantity. In this mini-review, we will highlight some of the interesting investigations using the combined EC detection and PCR amplification approaches for end-point detection and real-time monitoring. The promise of current approaches and the direction for future investigations will be discussed. It would be our view that the synergistic effect of the combined EC-PCR steps in a portable device provides a promising detection technology platform that will be ready for point-of-care applications in the near future.
ERIC Educational Resources Information Center
Glazier, Samantha; Marano, Nadia; Eisen, Laura
2010-01-01
We describe how we use boiling-point trends of group IV-VII hydrides to introduce intermolecular forces in our first-year general chemistry classes. Starting with the idea that molecules in the liquid state are held together by some kind of force that must be overcome for boiling to take place, students use data analysis and critical reasoning to…
NASA Astrophysics Data System (ADS)
Saberi, Elaheh; Reza Hejazi, S.
2018-02-01
In the present paper, Lie point symmetries of the time-fractional generalized Hirota-Satsuma coupled KdV (HS-cKdV) system based on the Riemann-Liouville derivative are obtained. Using the derived Lie point symmetries, we obtain similarity reductions and conservation laws of the considered system. Finally, some analytic solutions are furnished by means of the invariant subspace method in the Caputo sense.
Dascălu, Cristina Gena; Antohe, Magda Ecaterina
2009-01-01
Based on eigenvalue and eigenvector analysis, principal component analysis has the purpose of identifying the subspace of principal components from a set of parameters, which are enough to characterize the whole set of parameters. Interpreting the data under analysis as a cloud of points, we find through geometrical transformations the directions along which the cloud's dispersion is maximal: the lines that pass through the cloud's center of weight and have a maximal density of points around them (found by defining an appropriate criterion function and minimizing it). This method can successfully be used to simplify the statistical analysis of questionnaires, because it helps us select from a set of items only the most relevant ones, which cover the variation of the whole data set. For instance, in the presented sample we started from a questionnaire with 28 items and, by applying principal component analysis, we identified 7 principal components, or main items, a fact that significantly simplifies the further statistical analysis of the data.
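A minimal sketch of this item-reduction use of PCA follows (synthetic questionnaire data with an assumed 7-trait latent structure; the 90% variance cut-off is an arbitrary choice, not the authors' criterion):

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for 150 respondents x 28 questionnaire items with a
# low-dimensional latent structure, mirroring the reduction described above.
rng = np.random.default_rng(0)
latent = rng.normal(size=(150, 7))                  # 7 underlying traits
loadings = rng.normal(size=(7, 28))
items = latent @ loadings + rng.normal(0, 0.5, (150, 28))

pca = PCA().fit(items)
explained = np.cumsum(pca.explained_variance_ratio_)
n_keep = int(np.searchsorted(explained, 0.90)) + 1  # components for 90% variance
print(f"{n_keep} components explain {explained[n_keep - 1]:.1%} of the variance")
scores = PCA(n_components=n_keep).fit_transform(items)
print("reduced data shape:", scores.shape)
```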
Yoon, Christina; Semitala, Fred C; Atuhumuza, Elly; Katende, Jane; Mwebe, Sandra; Asege, Lucy; Armstrong, Derek T; Andama, Alfred O; Dowdy, David W; Davis, J Luke; Huang, Laurence; Kamya, Moses; Cattamanchi, Adithya
2017-12-01
Symptom-based screening for tuberculosis is recommended for all people living with HIV. This recommendation results in unnecessary Xpert MTB/RIF testing in many individuals living in tuberculosis-endemic areas and thus poor implementation of intensified case finding and tuberculosis preventive therapy. Novel approaches to tuberculosis screening are needed to help achieve global targets for tuberculosis elimination. We assessed the performance of C-reactive protein (CRP) measured with a point-of-care assay as a screening tool for active pulmonary tuberculosis. For this prospective study, we enrolled adults (aged ≥18 years) living with HIV with CD4 cell count less than or equal to 350 cells per μL who were initiating antiretroviral therapy (ART) from two HIV/AIDS clinics in Uganda. CRP concentrations were measured at study entry with a point-of-care assay using whole blood obtained by fingerprick (concentration ≥10 mg/L defined as screen positive for tuberculosis). Sputum samples were collected for Xpert MTB/RIF testing and culture. We calculated the sensitivity and specificity of point-of-care CRP and WHO symptom-based screening in reference to culture results. We repeated the sensitivity analysis with Xpert MTB/RIF as the reference standard. Between July 8, 2013, and Dec 15, 2015, 1237 HIV-infected adults were enrolled and underwent point-of-care CRP testing. 60 (5%) patients with incomplete or contaminated cultures were excluded from the analysis. Of the remaining 1177 patients (median CD4 count 165 cells per μL [IQR 75-271]), 163 (14%) had culture-confirmed tuberculosis. Point-of-care CRP testing had 89% sensitivity (145 of 163, 95% CI 83-93) and 72% specificity (731 of 1014, 95% CI 69-75) for culture-confirmed tuberculosis. Compared with WHO symptom-based screening, point-of-care CRP testing had lower sensitivity (difference -7%, 95% CI -12 to -2; p=0·002) but substantially higher specificity (difference 58%, 95% CI 55 to 61; p<0·0001). When Xpert MTB/RIF results were used as the reference standard, sensitivity of point-of-care CRP and WHO symptom-based screening were similar (94% [79 of 84] vs 99% [83 of 84], respectively; difference -5%, 95% CI -12 to 2; p=0·10). The performance characteristics of CRP support its use as a tuberculosis screening test for people living with HIV with CD4 count less than or equal to 350 cells per μL who are initiating ART. HIV/AIDS programmes should consider point-of-care CRP-based tuberculosis screening to improve the efficiency of intensified case finding and increase uptake of tuberculosis preventive therapy. National Institutes of Health; President's Emergency Plan for AIDS Relief; University of California, San Francisco, Nina Ireland Program for Lung Health. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hahn, Markus; Barrois, Björn; Krüger, Lars; Wöhler, Christian; Sagerer, Gerhard; Kummert, Franz
2010-09-01
This study introduces an approach to model-based 3D pose estimation and instantaneous motion analysis of the human hand-forearm limb in the application context of safe human-robot interaction. 3D pose estimation is performed using two approaches: The Multiocular Contracting Curve Density (MOCCD) algorithm is a top-down technique based on pixel statistics around a contour model projected into the images from several cameras. The Iterative Closest Point (ICP) algorithm is a bottom-up approach which uses a motion-attributed 3D point cloud to estimate the object pose. Due to their orthogonal properties, a fusion of these algorithms is shown to be favorable. The fusion is performed by a weighted combination of the extracted pose parameters in an iterative manner. The analysis of object motion is based on the pose estimation result and the motion-attributed 3D points belonging to the hand-forearm limb using an extended constraint-line approach which does not rely on any temporal filtering. A further refinement is obtained using the Shape Flow algorithm, a temporal extension of the MOCCD approach, which estimates the temporal pose derivative based on the current and the two preceding images, corresponding to temporal filtering with a short response time of two or at most three frames. Combining the results of the two motion estimation stages provides information about the instantaneous motion properties of the object. Experimental investigations are performed on real-world image sequences displaying several test persons performing different working actions typically occurring in an industrial production scenario. In all example scenes, the background is cluttered, and the test persons wear various kinds of clothes. For evaluation, independently obtained ground truth data are used.
ERIC Educational Resources Information Center
Koiso, Hanae; Horiuchi, Yasuo; Tutiya, Syun; Ichikawa, Akira; Den, Yasuharu
1998-01-01
Investigates syntactic and prosodic features of speakers' speech at points where turn-taking and backchannels occur, focusing on an analysis of Japanese spontaneous dialogs. The study shows that in both turn-taking and backchannels, some instances of syntactic features make extremely strong contributions, and syntax has a stronger contribution…
Successful ageing: A study of the literature using citation network analysis.
Kusumastuti, Sasmita; Derks, Marloes G M; Tellier, Siri; Di Nucci, Ezio; Lund, Rikke; Mortensen, Erik Lykke; Westendorp, Rudi G J
2016-11-01
Ageing is accompanied by an increased risk of disease and a loss of functioning on several bodily and mental domains, and some argue that maintaining health and functioning is essential for a successful old age. Paradoxically, studies have shown that overall wellbeing follows a curvilinear pattern, reaching its lowest point at middle age and increasing thereafter up to very old age. To shed further light on this paradox, we reviewed the existing literature on how scholars define successful ageing and how they weigh the contribution of health and functioning to define success. We performed a novel, hypothesis-free and quantitative analysis of citation networks exploring the literature on successful ageing that exists in the Web of Science Core Collection Database using the CitNetExplorer software. Outcomes were visualized using timeline-based citation patterns. The clusters and sub-clusters of citation networks identified were starting points for in-depth qualitative analysis. Within the literature from 1902 through 2015, two distinct citation networks were identified. The first cluster had 1146 publications and 3946 citation links. It focused on successful ageing from the perspective of older persons themselves. Analysis of the various sub-clusters emphasized the importance of coping strategies, psycho-social engagement, and cultural differences. The second cluster had 609 publications and 1682 citation links and viewed successful ageing based on objective measurements as determined by researchers. Subsequent sub-clustering analysis pointed to different domains of functioning and various ways of assessment. In the current literature two mutually exclusive concepts of successful ageing are circulating that depend on whether the individual himself or an outsider judges the situation. These different points of view help to explain the disability paradox, as successful ageing lies in the eyes of the beholder. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Cheng, Jun; Zhang, Jun; Tian, Jinwen
2015-12-01
Based on a deep analysis of the LiveWire interactive boundary extraction algorithm, a new algorithm focusing on improving the speed of the LiveWire algorithm is proposed in this paper. Firstly, the Haar wavelet transform is carried out on the input image, and the boundary is extracted on the low-resolution image obtained by the wavelet transform. Secondly, the LiveWire shortest path is calculated with a direction search over the control point set, utilizing the spatial relationship between the two control points the user provides in real time. Thirdly, the search order of the points adjacent to the starting node is set in advance, and an ordinary queue instead of a priority queue is taken as the storage pool of the points when optimizing their shortest path values, thus reducing the complexity of the algorithm from O(n²) to O(n). Finally, a region iterative backward projection method based on neighborhood pixel polling is used to convert the dual-pixel boundary of the reconstructed image to a single-pixel boundary after the inverse Haar wavelet transform. The algorithm proposed in this paper combines the advantage of the Haar wavelet transform, which offers fast image decomposition and reconstruction and is consistent with the texture features of the image, with that of the optimal path search based on a direction search over the control point set, which reduces the time complexity of the original algorithm. The algorithm thus improves the speed of interactive boundary extraction while reflecting the boundary information of the image more comprehensively. All the methods mentioned above play a large role in improving the execution efficiency and the robustness of the algorithm.
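For context, the shortest-path core of classic LiveWire is Dijkstra's algorithm over a pixel cost grid, as sketched below (a generic priority-queue version; the paper's ordinary-queue O(n) variant and direction search are not reproduced here):

```python
import heapq
import numpy as np

def livewire_path(cost, start, goal):
    """Minimal Dijkstra over a 2D cost grid (8-connected), as in classic
    LiveWire; returns the minimum-cost pixel path from start to goal."""
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = 0.0
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue                      # stale queue entry
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr or dc) and 0 <= nr < h and 0 <= nc < w:
                    nd = d + cost[nr, nc]  # cost of entering the pixel
                    if nd < dist[nr, nc]:
                        dist[nr, nc] = nd
                        prev[(nr, nc)] = (r, c)
                        heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1]

grid = np.ones((64, 64))
grid[30, 10:50] = 0.01                    # a cheap "edge" the path follows
print(len(livewire_path(grid, (30, 10), (30, 49))))
```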
NASA Technical Reports Server (NTRS)
Tilmann, S. E.; Enslin, W. R.; Hill-Rowley, R.
1977-01-01
A computer-based information system is described that is designed to assist in the integration of commonly available spatial data for regional planning and resource analysis. The Resource Analysis Program (RAP) provides a variety of analytical and mapping phases for single-factor or multi-factor analyses. The unique analytical and graphic capabilities of RAP are demonstrated with a study conducted in Windsor Township, Eaton County, Michigan. Soil, land cover/use, topographic and geological maps were used as a data base to develop an eleven-map portfolio. The major themes of the portfolio are land cover/use, non-point water pollution, waste disposal, and ground water recharge.
Park, Gwansik; Forman, Jason; Kim, Taewung; Panzer, Matthew B; Crandall, Jeff R
2018-02-28
The goal of this study was to explore a framework for developing injury risk functions (IRFs) in a bottom-up approach based on responses of parametrically variable finite element (FE) models representing exemplar populations. First, a parametric femur modeling tool was developed and validated using a subject-specific (SS)-FE modeling approach. Second, principal component analysis and regression were used to identify parametric geometric descriptors of the human femur and the distribution of those factors for 3 target occupant sizes (5th, 50th, and 95th percentile males). Third, distributions of material parameters of cortical bone were obtained from the literature for 3 target occupant ages (25, 50, and 75 years) using regression analysis. A Monte Carlo method was then implemented to generate populations of FE models of the femur for target occupants, using the parametric femur modeling tool. Simulations were conducted with each of these models under 3-point dynamic bending. Finally, model-based IRFs were developed using logistic regression analysis, based on the moment at fracture observed in the FE simulations. In total, 100 femur FE models incorporating the variation in the population of interest were generated, and 500,000 moments at fracture were observed (applying 5,000 ultimate strain values to each of the 100 synthesized femur FE models) for each set of target occupant characteristics. Using the framework proposed in this study, model-based IRFs were developed for 3 target male occupant sizes (5th, 50th, and 95th percentiles) and ages (25, 50, and 75 years). The model-based IRF was located in the 95% confidence interval of the test-based IRF for the range of 15 to 70% injury risk. The 95% confidence interval of the developed IRF was almost in line with the mean curve due to the large number of data points. The framework proposed in this study would be beneficial for developing IRFs in a bottom-up manner, with the range of variability informed by population-based FE model responses. Specifically, this method mitigates the uncertainties of applying empirical scaling and may improve IRF fidelity when a limited number of experimental specimens are available.
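The final step, fitting a logistic IRF to binary fracture outcomes as a function of applied moment, can be sketched as follows (synthetic moments and capacities standing in for the study's FE output; units assumed to be Nm):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Fit a logistic injury risk function P(fracture | moment) to synthetic data.
rng = np.random.default_rng(0)
moments = rng.uniform(100, 500, 2000)        # applied moment, Nm
capacity = rng.normal(300, 60, 2000)         # per-sample fracture capacity
fracture = (moments > capacity).astype(int)  # binary outcome

irf = LogisticRegression().fit(moments.reshape(-1, 1), fracture)
for m in (200, 300, 400):
    p = irf.predict_proba([[m]])[0, 1]
    print(f"P(fracture | {m} Nm) = {p:.2f}")
```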
Saddle point localization of molecular wavefunctions.
Mellau, Georg Ch; Kyuberis, Alexandra A; Polyansky, Oleg L; Zobov, Nikolai; Field, Robert W
2016-09-15
The quantum mechanical description of isomerization is based on bound eigenstates of the molecular potential energy surface. For the near-minimum regions there is a textbook-based relationship between the potential and eigenenergies. Here we show how the saddle point region that connects the two minima is encoded in the eigenstates of the model quartic potential and in the energy levels of the [H, C, N] potential energy surface. We model the spacing of the eigenenergies with the energy dependent classical oscillation frequency decreasing to zero at the saddle point. The eigenstates with the smallest spacing are localized at the saddle point. The analysis of the HCN ↔ HNC isomerization states shows that the eigenstates with small energy spacing relative to the effective (v1, v3, ℓ) bending potentials are highly localized in the bending coordinate at the transition state. These spectroscopically detectable states represent a chemical marker of the transition state in the eigenenergy spectrum. The method developed here provides a basis for modeling characteristic patterns in the eigenenergy spectrum of bound states.
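The level-spacing compression near the saddle energy is easy to reproduce numerically for a model double well. The sketch below (a finite-difference diagonalization with ħ = m = 1 and an assumed barrier height of 100, not the paper's [H, C, N] surface) uses V(x) = 100(x² − 1)², whose spacings compress for eigenstates near the barrier top V(0) = 100:

```python
import numpy as np

# Finite-difference eigenvalues of a model quartic double well (hbar = m = 1).
# The spacing of consecutive levels compresses near the barrier top, where
# the classical oscillation frequency goes to zero.
n, L = 1200, 3.0
x = np.linspace(-L, L, n)
dx = x[1] - x[0]
V = 100.0 * (x**2 - 1.0) ** 2
H = (np.diag(1.0 / dx**2 + V)                       # kinetic diagonal + V
     + np.diag(np.full(n - 1, -0.5 / dx**2), 1)     # kinetic off-diagonals
     + np.diag(np.full(n - 1, -0.5 / dx**2), -1))
E = np.linalg.eigvalsh(H)[:14]
for lo, hi in zip(E, E[1:]):
    print(f"E = {lo:8.3f}   spacing to next = {hi - lo:7.3f}")
```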
An automated smartphone-based diagnostic assay for point-of-care semen analysis
Kanakasabapathy, Manoj Kumar; Sadasivam, Magesh; Singh, Anupriya; Preston, Collin; Thirumalaraju, Prudhvi; Venkataraman, Maanasa; Bormann, Charles L.; Draz, Mohamed Shehata; Petrozza, John C.; Shafiee, Hadi
2017-01-01
Male infertility affects up to 12% of the world’s male population and is linked to various environmental and medical conditions. Manual microscope-based testing and computer-assisted semen analysis (CASA) are the current standard methods to diagnose male infertility; however, these methods are labor-intensive, expensive, and laboratory-based. Cultural and socially dominated stigma against male infertility testing hinders a large number of men from getting tested for infertility, especially in resource-limited African countries. We describe the development and clinical testing of an automated smartphone-based semen analyzer designed for quantitative measurement of sperm concentration and motility for point-of-care male infertility screening. Using a total of 350 clinical semen specimens at a fertility clinic, we have shown that our assay can analyze an unwashed, unprocessed liquefied semen sample with <5-s mean processing time and provide the user a semen quality evaluation based on the World Health Organization (WHO) guidelines with ~98% accuracy. The work suggests that the integration of microfluidics, optical sensing accessories, and advances in consumer electronics, particularly smartphone capabilities, can make remote semen quality testing accessible to people in both developed and developing countries who have access to smartphones. PMID:28330865
Performance Analysis and Electronics Packaging of the Optical Communications Demonstrator
NASA Technical Reports Server (NTRS)
Jeganathan, M.; Monacos, S.
1998-01-01
The Optical Communications Demonstrator (OCD), under development at the Jet Propulsion Laboratory (JPL), is a laboratory-based lasercomm terminal designed to validate several key technologies, primarily precision beam pointing, high bandwidth tracking, and beacon acquisition.
A fast image matching algorithm based on key points
NASA Astrophysics Data System (ADS)
Wang, Huilin; Wang, Ying; An, Ru; Yan, Peng
2014-05-01
Image matching is a very important technique in image processing. It has been widely used for object recognition and tracking, image retrieval, three-dimensional vision, change detection, aircraft position estimation, and multi-image registration. Based on the requirements placed on a matching algorithm for craft navigation, such as speed, accuracy and adaptability, a fast key point image matching method is investigated and developed. The main research tasks include: (1) Developing an improved fast key point detection approach using a self-adapting threshold for Features from Accelerated Segment Test (FAST). A method of calculating the self-adapting threshold was introduced for images with different contrast. The Hessian matrix was adopted to eliminate insecure edge points in order to obtain key points with higher stability. This approach to detecting key points requires a small amount of computation and offers high positioning accuracy and strong anti-noise ability; (2) Utilizing PCA-SIFT to describe the key points. A 128-dimensional vector is formed based on the SIFT method for each extracted key point. A low-dimensional feature space was established from the eigenvectors of all the key points, and each descriptor was projected onto this space to form a low-dimensional vector, so that the key points were re-described by dimension-reduced vectors. After reduction by PCA, the descriptor shrinks from the original 128 dimensions to 20. This reduces the dimensionality of the approximate nearest neighbor search, thereby increasing overall speed; (3) Taking the distance ratio between the nearest neighbour and the second nearest neighbour as the measurement criterion for initial matching, from which the original matched point pairs are obtained. Based on an analysis of the common methods for eliminating false matches (e.g., RANSAC (random sample consensus) and Hough transform clustering), a heuristic local geometric restriction strategy is adopted to further discard false matched point pairs; and (4) Introducing an affine transformation model to correct the coordinate difference between the real-time image and the reference image, resulting in the matching of the two images. SPOT5 remote sensing images captured on different dates and airborne images captured with different flight attitudes were used to test the performance of the method in terms of matching accuracy, operation time and ability to handle rotation. Results show the effectiveness of the approach.
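Step (3), the nearest/second-nearest distance-ratio test, is the standard Lowe criterion. A generic OpenCV sketch follows (assuming OpenCV ≥ 4.4 and two hypothetical image files; the paper's adaptive FAST threshold, PCA-SIFT reduction, and geometric filtering are not reproduced):

```python
import cv2

# Detect FAST corners, describe them with SIFT, and keep matches that pass
# the nearest/second-nearest distance-ratio test.
img1 = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)   # hypothetical files
img2 = cv2.imread("realtime.png", cv2.IMREAD_GRAYSCALE)

fast = cv2.FastFeatureDetector_create(threshold=25)        # fixed threshold here
sift = cv2.SIFT_create()
kp1, des1 = sift.compute(img1, fast.detect(img1))
kp2, des2 = sift.compute(img2, fast.detect(img2))

matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
print(f"{len(good)} matches kept after the ratio test")
```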
Scan Line Based Road Marking Extraction from Mobile LiDAR Point Clouds.
Yan, Li; Liu, Hua; Tan, Junxiang; Li, Zan; Xie, Hong; Chen, Changjun
2016-06-17
Mobile Mapping Technology (MMT) is one of the most important 3D spatial data acquisition technologies. The state-of-the-art mobile mapping systems, equipped with laser scanners and named Mobile LiDAR Scanning (MLS) systems, have been widely used in a variety of areas, especially in road mapping and road inventory. With the commercialization of Advanced Driving Assistance Systems (ADASs) and self-driving technology, there will be a great demand for lane-level detailed 3D maps, and MLS is the most promising technology to generate such lane-level detailed 3D maps. Road markings and road edges are necessary information in creating such lane-level detailed 3D maps. This paper proposes a scan line based method to extract road markings from mobile LiDAR point clouds in three steps: (1) preprocessing; (2) road point extraction; (3) road marking extraction and refinement. In the preprocessing step, isolated LiDAR points in the air are removed from the LiDAR point clouds and the point clouds are organized into scan lines. In the road point extraction step, seed road points are first extracted by the Height Difference (HD) between trajectory data and the road surface, then the full set of road points is extracted from the point clouds by moving least squares line fitting. In the road marking extraction and refinement step, the intensity values of road points in a scan line are first smoothed by a dynamic window median filter to suppress intensity noise, then road markings are extracted by the Edge Detection and Edge Constraint (EDEC) method, and Fake Road Marking Points (FRMPs) are eliminated from the detected road markings by segment- and dimensionality-feature-based refinement. The performance of the proposed method is evaluated on three data samples, and the experimental results indicate that road points are well extracted from MLS data and road markings are well extracted from road points by the applied method. A quantitative study shows that the proposed method achieves an average completeness, correctness, and F-measure of 0.96, 0.93, and 0.94, respectively. The time complexity analysis shows that the scan line based road marking extraction method proposed in this paper provides a promising alternative for offline road marking extraction from MLS data.
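The intensity-smoothing and edge-detection step can be sketched for a single scan line as follows (a fixed-window median filter and fixed thresholds standing in for the paper's dynamic window and EDEC method; all values synthetic):

```python
import numpy as np
from scipy.ndimage import median_filter

def marking_mask(intensity, window=9, edge_thresh=8.0):
    """Flag candidate road-marking spans on one scan line: median-smooth the
    intensity profile, then keep spans between rising and falling edges."""
    smooth = median_filter(intensity.astype(float), size=window)
    grad = np.diff(smooth)
    mask = np.zeros(len(intensity), dtype=bool)
    for r in np.where(grad > edge_thresh)[0]:            # rising edges
        falling = np.where(grad[r:] < -edge_thresh)[0]   # next falling edge
        if len(falling):
            mask[r + 1:r + falling[0] + 1] = True
    return mask

rng = np.random.default_rng(0)
line = 20.0 + rng.normal(0, 2, 200)    # asphalt-level intensity with noise
line[80:100] += 40                     # a bright painted marking
span = np.where(marking_mask(line))[0]
print("marking span:", span.min(), "-", span.max())
```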
NASA Astrophysics Data System (ADS)
Schröder, Jörg; Viebahn, Nils; Wriggers, Peter; Auricchio, Ferdinando; Steeger, Karl
2017-09-01
In this work we investigate different mixed finite element formulations for the detection of critical loads for the possible occurrence of bifurcation and limit points. In detail, three- and two-field formulations for incompressible and quasi-incompressible materials are analyzed. In order to apply various penalty functions for the volume dilatation in displacement/pressure mixed elements, we propose a new consistent scheme capturing the nonlinearities of the penalty constraints. It is shown that for all mixed formulations which can be reduced to a generalized displacement scheme, a straightforward stability analysis is possible. However, problems based on the classical saddle-point structure require a different analysis, based on the change of the signature of the underlying matrix system. The basis of these investigations is the work of Auricchio et al. (Comput Methods Appl Mech Eng 194:1075-1092, 2005; Comput Mech 52:1153-1167, 2013).
3DNOW: Image-Based 3d Reconstruction and Modeling via Web
NASA Astrophysics Data System (ADS)
Tefera, Y.; Poiesi, F.; Morabito, D.; Remondino, F.; Nocerino, E.; Chippendale, P.
2018-05-01
This paper presents a web-based 3D imaging pipeline, namely 3Dnow, that can be used by anyone without the need of installing additional software other than a browser. By uploading a set of images through the web interface, 3Dnow can generate sparse and dense point clouds as well as mesh models. 3D reconstructed models can be downloaded with standard formats or previewed directly on the web browser through an embedded visualisation interface. In addition to reconstructing objects, 3Dnow offers the possibility to evaluate and georeference point clouds. Reconstruction statistics, such as minimum, maximum and average intersection angles, point redundancy and density can also be accessed. The paper describes all features available in the web service and provides an analysis of the computational performance using servers with different GPU configurations.
Autoregressive-model-based missing value estimation for DNA microarray time series data.
Choong, Miew Keen; Charbit, Maurice; Yan, Hong
2009-01-01
Missing value estimation is important in DNA microarray data analysis. A number of algorithms have been developed to solve this problem, but they have several limitations. Most existing algorithms are not able to deal with the situation where a particular time point (column) of the data is missing entirely. In this paper, we present an autoregressive-model-based missing value estimation method (ARLSimpute) that takes into account the dynamic property of microarray temporal data and the local similarity structures in the data. ARLSimpute is especially effective for the situation where a particular time point contains many missing values or where the entire time point is missing. Experiment results suggest that our proposed algorithm is an accurate missing value estimator in comparison with other imputation methods on simulated as well as real microarray time series datasets.
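The autoregressive idea can be sketched with a plain least-squares AR(p) fit and a one-step prediction for a missing time point (a generic stand-in, not the ARLSimpute algorithm itself, which also exploits local similarity across genes):

```python
import numpy as np

def ar_fit(series, p):
    """Least-squares AR(p) fit:
    y[t] ~ a[0]*y[t-1] + ... + a[p-1]*y[t-p]."""
    X = np.column_stack([series[p - k - 1:len(series) - k - 1]
                         for k in range(p)])
    y = series[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a

# Impute a missing time point as the AR prediction from preceding points.
rng = np.random.default_rng(0)
t = np.arange(60)
profile = np.sin(t / 4.0) + rng.normal(0, 0.1, 60)   # one gene's time series
a = ar_fit(profile[:40], p=3)                        # fit on the observed part
missing_t = 40
pred = a @ profile[missing_t - 1:missing_t - 4:-1]   # lags y[39], y[38], y[37]
print(f"imputed y[{missing_t}] = {pred:.3f}, actual = {profile[missing_t]:.3f}")
```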
Finite element Compton tomography
NASA Astrophysics Data System (ADS)
Jannson, Tomasz; Amouzou, Pauline; Menon, Naresh; Gertsenshteyn, Michael
2007-09-01
In this paper a new approach to 3D Compton imaging is presented, based on a kind of finite element (FE) analysis. A window for X-ray incoherent scattering (or Compton scattering) attenuation coefficients is identified for breast cancer diagnosis, for hard X-ray photon energy of 100-300 keV. The point-by-point power/energy budget is computed, based on a 2D array of X-ray pencil beams, scanned vertically. The acceptable medical doses are also computed. The proposed finite element tomography (FET) can be an alternative to X-ray mammography, tomography, and tomosynthesis. In experiments, 100 keV (on average) X-ray photons are applied, and a new type of pencil beam collimation, based on a Lobster-Eye Lens (LEL), is proposed.
Point pattern analysis of FIA data
Chris Woodall
2002-01-01
Point pattern analysis is a branch of spatial statistics that quantifies the spatial distribution of points in two-dimensional space. Point pattern analysis was conducted on stand stem-maps from FIA fixed-radius plots to explore point pattern analysis techniques and to determine the ability of pattern descriptions to describe stand attributes. Results indicate that the...
Managing distance and covariate information with point-based clustering.
Whigham, Peter A; de Graaf, Brandon; Srivastava, Rashmi; Glue, Paul
2016-09-01
Geographic perspectives of disease and the human condition often involve point-based observations and questions of clustering or dispersion within a spatial context. These problems involve a finite set of point observations and are constrained by a larger, but finite, set of locations where the observations could occur. Developing a rigorous method for pattern analysis in this context requires handling spatial covariates, a method for constrained finite spatial clustering, and addressing bias in geographic distance measures. An approach based on Ripley's K and applied to the problem of clustering of deliberate self-harm (DSH) is presented. Point-based Monte-Carlo simulation of Ripley's K, accounting for socio-economic deprivation and sources of distance measurement bias, was developed to estimate clustering of DSH at a range of spatial scales. A rotated Minkowski L1 distance metric allowed variation in physical distance and clustering to be assessed. Self-harm data were derived from an audit of 2 years' emergency hospital presentations (n = 136) in a New Zealand town (population ~50,000). The study area was defined by residential (housing) land parcels representing a finite set of possible point addresses. Area-based deprivation was spatially correlated. Accounting for deprivation and distance bias showed evidence for clustering of DSH at spatial scales up to 500 m with a one-sided 95 % CI, suggesting that social contagion may be present for this urban cohort. Many problems involve finite locations in geographic space that require estimates of distance-based clustering at many scales. A Monte-Carlo approach to Ripley's K, incorporating covariates and models for distance bias, is crucial when assessing health-related clustering. The case study showed that social network structure defined at the neighbourhood level may account for aspects of neighbourhood clustering of DSH. Accounting for covariate measures that exhibit spatial clustering, such as deprivation, is crucial when assessing point-based clustering.
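A minimal Monte-Carlo form of Ripley's K is sketched below (uniform points in a unit square as the null, no edge correction, and none of the study's covariates, address constraints, or Minkowski metric):

```python
import numpy as np

def ripley_k(points, r, area=1.0):
    """Ripley's K at radius r for points in a region of the given area
    (no edge correction, for brevity)."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    pairs = np.sum((d < r) & (d > 0))
    return area * pairs / (n * (n - 1))

rng = np.random.default_rng(0)
obs = rng.uniform(0, 1, (136, 2))          # stand-in for case locations
r = 0.05
k_obs = ripley_k(obs, r)
# Simulation envelope under complete spatial randomness (CSR).
k_sim = np.array([ripley_k(rng.uniform(0, 1, (136, 2)), r)
                  for _ in range(199)])
print(f"K(r) = {k_obs:.4f}, CSR 95% envelope "
      f"[{np.quantile(k_sim, 0.025):.4f}, {np.quantile(k_sim, 0.975):.4f}]")
```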
Curtis, Jacqueline W
2017-01-01
Census tracts are often used to investigate area-based correlates of a variety of health outcomes. This approach has been shown to be valuable in understanding the ways that health is shaped by place and to design appropriate interventions that account for community-level processes. Following this line of inquiry, it is common in the study of pedestrian injuries to aggregate the point level locations of these injuries to the census tracts in which they occur. Such aggregation enables investigation of the relationships between a range of socioeconomic variables and areas of notably high or low incidence. This study reports on the spatial distribution of child pedestrian injuries in a mid-sized U.S. city over a three-year period. Utilizing a combination of geospatial approaches, Near Analysis, Kernel Density Estimation, and Local Moran's I, enables identification, visualization, and quantification of close proximity between incidents and tract boundaries. Specifically, results reveal that nearly half of the 100 incidents occur within roads that are also census tract boundaries. Results also uncover incidents that occur on tract boundaries, not merely near them. This geographic pattern raises the question of the utility of associating area-based census data from any one tract to the injuries occurring in these border zones. Furthermore, using a standard spatial join technique in a Geographic Information System (GIS), these points located on the border are counted as falling into census tracts on both sides of the boundary, which introduces uncertainty in any subsequent analysis. Therefore, two additional approaches of aggregating points to polygons were tested in this study. Results differ with each approach, but without any alert of such differences to the GIS user. This finding raises a fundamental concern about techniques through which points are aggregated to polygons in any study using point level incidents and their surrounding census tract socioeconomic data to understand health and place. This study concludes with a suggested protocol to test for this source of uncertainty in analysis and an approach that may remove it.
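The border ambiguity is easy to demonstrate with a standard geometry library such as Shapely (hypothetical coordinates): a point lying exactly on a shared tract edge intersects both polygons but is strictly within neither, so the join predicate silently decides its tract assignment.

```python
from shapely.geometry import Point, Polygon

# Two census "tracts" sharing a boundary road, and an injury located
# exactly on that shared edge (hypothetical coordinates).
tract_a = Polygon([(0, 0), (1, 0), (1, 1), (0, 1)])
tract_b = Polygon([(1, 0), (2, 0), (2, 1), (1, 1)])
injury = Point(1, 0.5)

for name, tract in [("A", tract_a), ("B", tract_b)]:
    print(name, "within:", injury.within(tract),
          "intersects:", tract.intersects(injury))
# A within: False intersects: True
# B within: False intersects: True -> an intersects-based join counts the
# point twice, while a strictly-within join drops it; either choice
# silently shapes the subsequent analysis.
```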
Morphological configuration of the cranial base among children aged 8 to 12 years.
Cossio, Lina; López, Jorge; Rueda, Zulma Vanessa; Botero-Mariaca, Paola
2016-06-14
Cranial base is used as a reference structure to determine the skeletal type in cephalometric analysis. The purpose was to assess cranial base length on lateral cephalic radiographs of children between 8 and 12 years and compare these measurements with baseline studies in order to evaluate the relationship between the length and the cranial base angle, articular angle, gonial angle and skeletal type. A cross-sectional study was conducted in 149 children aged 8-12 years, originally from the Aburrá Valley, who had lateral cephalic radiographs and consented to participate in this study. The variables studied included: age, sex, sella-nasion, sella-nasion-articular, sella-nasion-basion, articular-gonion-menton, gonion-menton, sella-nasion-point B, sella-nasion-point A and point A-nasion-point B. These variables were digitally measured with the i-dixel 2 digital software. One-way ANOVA was used to determine mean values and mean value differences. The values obtained were compared with previous studies. A p value <0.05 was considered significant. Cranial base lengths were smaller in each age and sex group, with differences exceeding 10 mm per measurement, compared both with the study by Riolo (Michigan) and the study carried out in Damasco (Antioquia). No relation was found between the skeletal type and the anterior cranial base length, the sella angle or the cranial base angle. Also, no relation was found between the gonial angle and the sella angle or the cranial base angle. The cranial base varies from one population to another and, accordingly, compared to other studies it is shorter for the assessed sample.
Tin-silver-bismuth solders for electronics assembly
Vianco, Paul T.; Rejent, Jerome A.
1995-01-01
A lead-free solder alloy for electronic assemblies composed of a eutectic alloy of tin and silver with a bismuth addition, x, of 0
Tin-silver-bismuth solders for electronics assembly
Vianco, P.T.; Rejent, J.A.
1995-08-08
A lead-free solder alloy is disclosed for electronic assemblies composed of a eutectic alloy of tin and silver with a bismuth addition, x, of 0
Development of spatial scaling technique of forest health sample point information
NASA Astrophysics Data System (ADS)
Lee, J.; Ryu, J.; Choi, Y. Y.; Chung, H. I.; Kim, S. H.; Jeon, S. W.
2017-12-01
Most forest health assessments are limited to monitoring sampling sites. The monitoring of forest health in Britain was carried out mainly on five species (Norway spruce, Sitka spruce, Scots pine, oak, beech), with a database constructed using the Oracle database program. The Forest Health Assessment in GreatBay in the United States was conducted to identify the characteristics of the ecosystem populations of each area, based on the evaluation of forest health by tree species, diameter at breast height, crown, and density in the summer and fall of 200. In the case of Korea, the first evaluation report on forest health vitality placed 1,000 sample points in the forests using a systematic arrangement at regular 4 km × 4 km intervals based on a sample point, and surveyed 29 items in four categories: tree health, vegetation, soil, and atmosphere. As mentioned above, existing research has been conducted by monitoring survey sample points, and it is difficult to collect information that supports customized policies for regional survey sites. In the case of special forests such as urban forests and major forests, policy and management appropriate to the forest characteristics are needed; it is therefore necessary to expand the survey sites for diagnosis and evaluation of customized forest health. For this reason, we constructed a spatial scaling method through spatial interpolation according to the characteristics of each of the 29 indices in the diagnosis and evaluation section of the first forest health vitality report. PCA and correlation analysis are conducted to construct significant indicators, weights are then selected for each index, and forest health is evaluated through statistical grading.
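For intuition, here is a minimal inverse-distance-weighting (IDW) sketch of the kind of spatial interpolation the abstract invokes for scaling sample-point indices onto unsampled locations; the sample coordinates, scores, and power parameter are illustrative assumptions, not values from the report:

```python
# Minimal IDW sketch for scaling scattered sample-point indicator values
# onto unsampled locations; data and power parameter p are hypothetical.
import numpy as np

def idw(xy_samples, values, xy_targets, p=2.0, eps=1e-12):
    """Interpolate values at xy_targets from scattered xy_samples."""
    d = np.linalg.norm(xy_targets[:, None, :] - xy_samples[None, :, :], axis=2)
    w = 1.0 / (d ** p + eps)          # inverse-distance weights
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Five hypothetical grid sample points (km) with a health score each
samples = np.array([[0, 0], [4, 0], [0, 4], [4, 4], [2, 2]], dtype=float)
scores = np.array([1.0, 2.0, 3.0, 4.0, 2.5])

# Estimate the score at two unsampled locations
targets = np.array([[1.0, 1.0], [3.0, 3.0]])
print(idw(samples, scores, targets))
```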
NASA Technical Reports Server (NTRS)
Hill, Geoffrey A.; Olson, Erik D.
2004-01-01
Due to the growing problem of noise in today's air transportation system, needs have arisen to incorporate noise considerations into the conceptual design of revolutionary aircraft. Through the use of response surfaces, complex noise models may be converted into polynomial equations for rapid and simplified evaluation. This conversion allows many of the commonly used response surface-based trade space exploration methods to be applied to noise analysis. The methodology is demonstrated using a noise model of a notional 300-passenger Blended-Wing-Body (BWB) transport. Response surfaces are created relating source noise levels of the BWB vehicle to its corresponding FAR-36 certification noise levels, and the resulting trade space is explored. Methods demonstrated include single-point analysis, parametric study, an optimization technique for inverse analysis, sensitivity studies, and probabilistic analysis. Extended applications of response surface-based methods in noise analysis are also discussed.
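A minimal sketch of the core idea: replacing an expensive noise model with a fitted polynomial response surface for rapid trade-space evaluation. The quadratic form, sample counts, and the stand-in "noise model" below are assumptions for illustration, not the paper's actual model:

```python
# Hedged sketch: fit a quadratic response surface to samples of an
# "expensive" model by least squares, then evaluate it cheaply.
import numpy as np

rng = np.random.default_rng(0)

def noise_model(x):                      # placeholder for the expensive code
    x1, x2 = x[..., 0], x[..., 1]
    return 80 + 3 * x1 - 2 * x2 + 0.5 * x1 * x2 + 0.1 * x1**2

X = rng.uniform(-5, 5, size=(50, 2))     # sampled design points
y = noise_model(X)

def quad_terms(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

coef, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)

x_new = np.array([[2.0, -1.0]])
print(quad_terms(x_new) @ coef, noise_model(x_new))  # surrogate vs. model
```

Once fitted, the polynomial can be swept, optimized, or sampled probabilistically at negligible cost, which is what enables the trade-space methods listed in the abstract.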
ERIC Educational Resources Information Center
Ross, Bertram; And Others
1991-01-01
An investigation of students' understandings of acids and bases using concept maps, multiple-choice tests, and clinical interviews is described. The methodology and resulting analysis are illustrated with two abbreviated case studies selected from the study. Discussion of concept mapping points to how it starkly represents gaps in the understanding…
USDA-ARS?s Scientific Manuscript database
The growing incidence of chronic wounds in the world population has prompted increased interest in chronic wound dressings with protease-modulating activity and protease point of care sensors to treat and enable monitoring of elevated protease-based wound pathology. However, the overall design featu...
The Inequivalence of an Online and Classroom Based General Psychology Course
ERIC Educational Resources Information Center
Edmonds, Christopher L.
2006-01-01
One-hundred seventy-five students enrolled in either a traditional classroom lecture section of General Psychology or in an online section of the same course were compared on exam performance. When covariates of high school grade point average and SAT composite scores were entered into the analysis, students enrolled in the classroom based lecture…
Information Theoretic Studies and Assessment of Space Object Identification
2014-03-24
localization are contained in Ref. [5]. 1.7.1 A Bayesian MPE Based Analysis of 2D Point-Source-Pair Superresolution. In a second recently submitted paper [6], a...related problem of the optical superresolution (OSR) of a pair of equal-brightness point sources separated spatially by a distance (or angle) smaller...1403.4897 [physics.optics] (19 March 2014). 6. S. Prasad, “Asymptotics of Bayesian error probability and 2D pair superresolution,” submitted to Opt. Express
China's graduate students need better education in scientific writing and publishing.
Zhang, Chun-Jie; Zhu, Yuan
2016-05-01
Taking as a starting point an analysis of the current status of scientific writing and article publication among China's graduate students, we point out the need for expanded education on these aspects for such new authors. Then, based on our experience of being both an advisor and a journal editor, we give advice on how to conduct such education effectively and what roles should be played by the college, the advisor, and the journal, respectively.
Data Collection with Vehicular-Based Systems - Pole Mountain, WY
2012-09-01
demonstration site (Billings et al., 2010) and a second demonstration at Camp Butner (Pasion et al., 2012). To date, testing of these approaches has...determine dig-list order using the DigZilla tool (Pasion et al., 2012). The DigZilla analysis used all three polarizabilities. An initial stop-dig point...last TOI recovered on the very last dig. The production team used the automated method of Pasion et al. (2012) for determining the stop-dig point
Derivative based sensitivity analysis of gamma index
Sarkar, Biplab; Pradhan, Anirudh; Ganesh, T.
2015-01-01
Originally developed as a tool for patient-specific quality assurance in advanced treatment delivery methods to compare measured and calculated dose distributions, the gamma index (γ) concept was later extended to compare any two dose distributions. It takes into account both the dose difference (DD) and distance-to-agreement (DTA) measurements in the comparison. Its strength lies in its capability to give a quantitative value for the analysis, unlike other methods. For every point on the reference curve, if there is at least one point in the evaluated curve that satisfies the pass criteria (e.g., δDD = 1%, δDTA = 1 mm), the point is included in the quantitative score as “pass.” Gamma analysis does not account for the gradient of the evaluated curve: it looks at only the minimum gamma value, and if it is <1, then the point passes, no matter what the gradient of the evaluated curve is. In this work, an attempt has been made to present a derivative-based method for the identification of dose gradient. A mathematically derived reference profile (RP) representing the penumbral region of a 6 MV 10 cm × 10 cm field was generated from an error function. A general test profile (GTP) was created from this RP by introducing a 1 mm distance error and a 1% dose error at each point. This was considered the first of the two evaluated curves. By its nature, this curve is smooth and satisfies the pass criteria at every point. The second evaluated profile was generated as a sawtooth test profile (STTP), which again satisfies the pass criteria at every point on the RP. However, being a sawtooth curve, it is not smooth and is obviously poorer when compared with the smooth profile. Considering the smooth GTP as an acceptable profile once it passed the gamma pass criteria (1% DD and 1 mm DTA) against the RP, the first- and second-order derivatives of the DDs (δD’, δD”) between these two curves were derived and used as the boundary values for evaluating the STTP against the RP. Even though the STTP passed the simple gamma pass criteria, it was found to fail at many locations when the derivatives were used as the boundary values. The proposed derivative-based method can identify a noisy curve and can prove to be a useful tool for improving the sensitivity of the gamma index. PMID:26865761
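For reference, a minimal 1D implementation of the standard gamma computation the abstract builds on; the profiles are synthetic stand-ins and the global dose normalization is an assumption:

```python
# Minimal 1D gamma-index sketch (1% dose difference, 1 mm DTA), following
# the standard definition gamma = min over evaluated points of
# sqrt((distance/DTA)^2 + (dose difference/DD)^2); pass if gamma <= 1.
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=1.0, dd_pct=1.0):
    """Return the gamma value at each reference point."""
    dd = dd_pct / 100.0 * d_ref.max()        # dose criterion, absolute units
    dist = (x_ref[:, None] - x_eval[None, :]) / dta_mm
    dose = (d_ref[:, None] - d_eval[None, :]) / dd
    return np.sqrt(dist**2 + dose**2).min(axis=1)

x = np.linspace(-20, 20, 401)                # mm
ref = 50 * (1 - np.tanh(x / 3))              # synthetic penumbra-like profile
shifted = 50 * (1 - np.tanh((x - 0.5) / 3))  # 0.5 mm distance error
print((gamma_1d(x, ref, x, shifted) <= 1).mean())  # fraction of points passing
```

Note that a smooth shifted profile passes everywhere, exactly the insensitivity to evaluated-curve gradient that motivates the paper's derivative-based boundary values.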
Approximate analytical solutions in the analysis of thin elastic plates
NASA Astrophysics Data System (ADS)
Goloskokov, Dmitriy P.; Matrosov, Alexander V.
2018-05-01
Two approaches to the construction of approximate analytical solutions for bending of a rectangular thin plate are presented: the superposition method based on the method of initial functions (MIF) and the one built using the Green's function in the form of orthogonal series. Comparison of two approaches is carried out by analyzing a square plate clamped along its contour. Behavior of the moment and the shear force in the neighborhood of the corner points is discussed. It is shown that both solutions give identical results at all points of the plate except for the neighborhoods of the corner points. There are differences in the values of bending moments and generalized shearing forces in the neighborhoods of the corner points.
Jin, Xin; Liu, Li; Chen, Yanqin; Dai, Qionghai
2017-05-01
This paper derives a mathematical point spread function (PSF) and a depth-invariant focal sweep point spread function (FSPSF) for plenoptic camera 2.0. The derivation of the PSF is based on the Fresnel diffraction equation and image-formation analysis of a self-built imaging system, which is divided into two sub-systems to reflect the relay imaging properties of plenoptic camera 2.0. The variations in the PSF caused by changes in object depth and sensor position are analyzed. A mathematical model of the FSPSF is further derived and verified to be depth-invariant. Experiments on real imaging systems demonstrate the consistency between the proposed PSF and the actual imaging results.
Kovalska, M P; Bürki, E; Schoetzau, A; Orguel, S F; Orguel, S; Grieshaber, M C
2011-04-01
The distinction of real progression from test variability in visual field (VF) series may be based on clinical judgment, on trend analysis based on follow-up of test parameters over time, or on identification of a significant change related to the mean of baseline exams (event analysis). The aim of this study was to compare a new population-based method (Octopus field analysis, OFA) with classic regression analyses and clinical judgment for detecting glaucomatous VF changes. 240 VF series of 240 patients with at least 9 consecutive examinations available were included in this study. They were independently classified by two experienced investigators. The results of this classification served as the reference for comparison with the following statistical tests: (a) t-test global, (b) r-test global, (c) regression analysis of 10 VF clusters and (d) point-wise linear regression analysis. 32.5 % of the VF series were classified as progressive by the investigators. The sensitivity and specificity were 89.7 % and 92.0 % for the r-test, and 73.1 % and 93.8 % for the t-test, respectively. In the point-wise linear regression analysis, the specificity was comparable (89.5 % versus 92 %), but the sensitivity was clearly lower than that of the r-test (22.4 % versus 89.7 %) at a significance level of p = 0.01. Regression analysis of the 10 VF clusters showed a markedly higher sensitivity for the r-test (37.7 %) than for the t-test (14.1 %) at a similar specificity (88.3 % versus 93.8 %) for a significant trend (p = 0.005). With regard to the cluster distribution, the paracentral clusters and the superior nasal hemifield progressed most frequently. The population-based regression analysis seems to be superior to the trend analysis in detecting VF progression in glaucoma, and may eliminate the drawbacks of the event analysis. Further, it may assist the clinician in the evaluation of VF series and may allow better visualization of the correlation between function and structure owing to the VF clusters. © Georg Thieme Verlag KG Stuttgart · New York.
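A sketch of the point-wise linear-regression arm of such a comparison, on simulated VF data; the sensitivities, slopes, and noise levels are invented, and only the p = 0.01 threshold mirrors the study:

```python
# Point-wise linear regression for a VF series: for each test location,
# regress sensitivity (dB) on exam number and flag significant negative
# slopes as progressing. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_exams, n_points = 9, 54
series = 30 + rng.normal(0, 1.5, size=(n_exams, n_points))
series[:, :5] -= 0.8 * np.arange(n_exams)[:, None]   # 5 truly progressing points

t = np.arange(n_exams)
progressing = []
for j in range(n_points):
    res = stats.linregress(t, series[:, j])
    if res.slope < 0 and res.pvalue < 0.01:
        progressing.append(j)
print(progressing)   # indices of locations flagged as progressing
```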
An interdisciplinary analysis of ERTS data for Colorado mountain environments using ADP Techniques
NASA Technical Reports Server (NTRS)
Hoffer, R. M. (Principal Investigator)
1972-01-01
Author-identified significant preliminary results from the Ouachita portion of the Texoma frame of data indicate many potentials in the analysis and interpretation of ERTS data. It is believed that one of the more significant aspects of this analysis sequence has been the investigation of a technique to relate ERTS analysis and surface observation analysis. At present, a sequence involving (1) preliminary analysis based solely upon the spectral characteristics of the data, followed by (2) a surface observation mission to obtain visual information and oblique photography of particular points of interest in the test site area, appears to provide an extremely efficient technique for obtaining particularly meaningful surface observation data. Following such a procedure permits concentration on particular points of interest in the entire ERTS frame and thereby makes the surface observation data obtained particularly significant and meaningful. The analysis of the Texoma frame has also been significant in demonstrating a fast turn-around analysis capability. Additionally, the analysis has shown the potential accuracy and degree of complexity of features that can be identified and mapped using ERTS data.
Observations Regarding Use of Advanced CFD Analysis, Sensitivity Analysis, and Design Codes in MDO
NASA Technical Reports Server (NTRS)
Newman, Perry A.; Hou, Gene J. W.; Taylor, Arthur C., III
1996-01-01
Observations regarding the use of advanced computational fluid dynamics (CFD) analysis, sensitivity analysis (SA), and design codes in gradient-based multidisciplinary design optimization (MDO) reflect our perception of the interactions required of CFD and our experience in recent aerodynamic design optimization studies using CFD. Sample results from these latter studies are summarized for conventional optimization (analysis - SA codes) and simultaneous analysis and design optimization (design code) using both Euler and Navier-Stokes flow approximations. The amount of computational resources required for aerodynamic design using CFD via analysis - SA codes is greater than that required for design codes. Thus, an MDO formulation that utilizes the more efficient design codes where possible is desired. However, in the aerovehicle MDO problem, the various disciplines that are involved have different design points in the flight envelope; therefore, CFD analysis - SA codes are required at the aerodynamic 'off design' points. The suggested MDO formulation is a hybrid multilevel optimization procedure that consists of both multipoint CFD analysis - SA codes and multipoint CFD design codes that perform suboptimizations.
NASA Astrophysics Data System (ADS)
Nassiri, Isar; Lombardo, Rosario; Lauria, Mario; Morine, Melissa J.; Moyseos, Petros; Varma, Vijayalakshmi; Nolen, Greg T.; Knox, Bridgett; Sloper, Daniel; Kaput, Jim; Priami, Corrado
2016-07-01
The investigation of the complex processes involved in cellular differentiation must be based on unbiased, high-throughput data processing methods to identify relevant biological pathways. A number of bioinformatics tools are available that can generate lists of pathways ranked by statistical significance (i.e. by p-value), while ideally the pathways would be scored functionally relative to each other or to other interacting parts of the system or process. We describe a new computational method (Network Activity Score Finder - NASFinder) to identify tissue-specific, omics-determined sub-networks and the connections with their upstream regulator receptors, to obtain a systems view of the differentiation of human adipocytes. Adipogenesis of human SBGS pre-adipocyte cells in vitro was monitored with a transcriptomic data set comprising six time points (0, 6, 48, 96, 192, 384 hours). To elucidate the mechanisms of adipogenesis, NASFinder was used to perform time-point analysis by comparing each time point against the control (0 h) and time-lapse analysis by comparing each time point with the previous one. NASFinder identified the coordinated activity of seemingly unrelated processes between each comparison, providing the first systems view of adipogenesis in culture. NASFinder has been implemented in a web-based, freely available resource with novel, easy-to-read visualization of omics data sets and network modules.
A required course in the development, implementation, and evaluation of clinical pharmacy services.
Skomo, Monica L; Kamal, Khalid M; Berdine, Hildegarde J
2008-10-15
To develop, implement, and assess a required pharmacy practice course to prepare pharmacy students to develop, implement, and evaluate clinical pharmacy services using a business plan model. Course content centered around the process of business planning and pharmacoeconomic evaluations. Selected business planning topics included literature evaluation, mission statement development, market evaluation, policy and procedure development, and marketing strategy. Selected pharmacoeconomic topics included cost-minimization analysis, cost-benefit analysis, cost-effectiveness analysis, cost-utility analysis, and health-related quality of life (HRQoL). Assessment methods included objective examinations, student participation, performance on a group project, and peer evaluation. One hundred fifty-three students were enrolled in the course. The mean scores on the objective examinations (100 points per examination) ranged from 82 to 85 points, with 25%-35% of students in the class scoring over 90, and 40%-50% of students scoring from 80 to 89. The mean scores on the group project (200 points) and classroom participation (50 points) were 183.5 and 46.1, respectively. The mean score on the peer evaluation was 30.8, with scores ranging from 27.5 to 31.7. The course provided pharmacy students with the framework necessary to develop and implement evidence-based disease management programs and to assure efficient, cost-effective utilization of pertinent resources in the provision of patient care.
A Required Course in the Development, Implementation, and Evaluation of Clinical Pharmacy Services
Kamal, Khalid M.; Berdine, Hildegarde J.
2008-01-01
Objective To develop, implement, and assess a required pharmacy practice course to prepare pharmacy students to develop, implement, and evaluate clinical pharmacy services using a business plan model. Design Course content centered around the process of business planning and pharmacoeconomic evaluations. Selected business planning topics included literature evaluation, mission statement development, market evaluation, policy and procedure development, and marketing strategy. Selected pharmacoeconomic topics included cost-minimization analysis, cost-benefit analysis, cost-effectiveness analysis, cost-utility analysis, and health-related quality of life (HRQoL). Assessment methods included objective examinations, student participation, performance on a group project, and peer evaluation. Assessment One hundred fifty-three students were enrolled in the course. The mean scores on the objective examinations (100 points per examination) ranged from 82 to 85 points, with 25%-35% of students in the class scoring over 90, and 40%-50% of students scoring from 80 to 89. The mean scores on the group project (200 points) and classroom participation (50 points) were 183.5 and 46.1, respectively. The mean score on the peer evaluation was 30.8, with scores ranging from 27.5 to 31.7. Conclusion The course provided pharmacy students with the framework necessary to develop and implement evidence-based disease management programs and to assure efficient, cost-effective utilization of pertinent resources in the provision of patient care. PMID:19214263
Spatial Point Pattern Analysis of Neurons Using Ripley's K-Function in 3D
Jafari-Mamaghani, Mehrdad; Andersson, Mikael; Krieger, Patrik
2010-01-01
The aim of this paper is to apply a non-parametric statistical tool, Ripley's K-function, to analyze the 3-dimensional distribution of pyramidal neurons. Ripley's K-function is a widely used tool in spatial point pattern analysis. There are several approaches in 2D domains in which this function is executed and analyzed. Drawing consistent inferences on the underlying 3D point pattern distributions in various applications is of great importance, as the acquisition of 3D biological data now poses less of a challenge due to technological progress. As of now, most applications of Ripley's K-function in 3D domains do not address the phenomenon of edge correction, which is discussed thoroughly in this paper. The main goal is to extend the theoretical and practical utilization of Ripley's K-function and corresponding tests based on bootstrap resampling from 2D to 3D domains. PMID:20577588
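A naive 3D Ripley's K estimator without edge correction, precisely the baseline whose shortcomings the paper addresses, can be sketched as follows (uniform points in a unit cube as a stand-in for neuron coordinates):

```python
# Naive 3D Ripley's K: K(r) = V/(n(n-1)) * number of ordered pairs within r.
# No edge correction, so K is biased low near the domain boundary.
import numpy as np

def ripley_k_3d(points, radii, volume):
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)               # exclude self-pairs
    return np.array([volume * (d <= r).sum() / (n * (n - 1)) for r in radii])

rng = np.random.default_rng(0)
pts = rng.uniform(0, 1, size=(200, 3))        # CSR points in a unit cube
radii = np.linspace(0.05, 0.25, 5)
# Under complete spatial randomness, K(r) ~ (4/3) * pi * r^3
print(ripley_k_3d(pts, radii, volume=1.0))
print(4 / 3 * np.pi * radii**3)
```

Comparing the two printed arrays shows the estimator falling short of the theoretical CSR value as r grows, which is the edge effect the paper's corrections target.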
NASA Technical Reports Server (NTRS)
Hanley, G. M.
1981-01-01
This volume summarizes the basic requirements used as a guide to systems analysis, and is a basis for the selection of candidate Satellite Power Systems (SPS) point designs. Initially, these collected data reflected the level of definition resulting from the evaluation of a broad spectrum of SPS concepts. As the various concepts matured, these requirements were updated to reflect the requirements identified for the projected satellite system/subsystem point designs. Included is an updated version of earlier Rockwell concepts using klystrons as the specific microwave power amplification approach, as well as a more in-depth definition, analysis and preliminary point design on two concepts based on the use of advanced solid state technology to accomplish the task of high power amplification of the 2.45 GHz transmitted power beam to the Earth receiver. Finally, a preliminary definition of a concept using magnetrons as the microwave power amplifiers is presented.
Wardley, C Sonia; Applegate, E Brooks; Almaleki, A Deyab; Van Rhee, James A
2016-03-01
A 6-year longitudinal study was conducted to compare the perceived stress experienced during a 2-year master's physician assistant program by 5 cohorts of students enrolled in either problem-based learning (PBL) or lecture-based learning (LBL) curricular tracks. The association of perceived stress with academic achievement was also assessed. Students rated their stress levels on visual analog scales in relation to family obligations, financial concerns, schoolwork, and relocation and overall on 6 occasions throughout the program. A mixed model analysis of variance examined the students' perceived level of stress by curriculum and over time. Regression analysis further examined school work-related stress after controlling for other stressors and possible lag effect of stress from the previous time point. Students reported that overall stress increased throughout the didactic year followed by a decline in the clinical year with statistically significant curricular (PBL versus LBL) and time differences. PBL students also reported significantly more stress resulting from school work than LBL students at some time points. Moreover, when the other measured stressors and possible lag effects were controlled, significant differences between PBL and LBL students' perceived stress related to school work persisted at the 8- and 12-month measurement points. Increased stress in both curricula was associated with higher achievement in overall and individual organ system examination scores. Physician assistant programs that embrace a PBL pedagogy to prepare students to think clinically may need to provide students with additional support through the didactic curriculum.
40 CFR 63.526 - Monitoring requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... vent. (D) Design analysis based on accepted chemical engineering principles, measurable process.... (i) For the purpose of determining de minimis status for emission points, engineering assessment may... operating conditions expected to yield the highest flow rate and concentration. Engineering assessment...
40 CFR 63.526 - Monitoring requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... vent. (D) Design analysis based on accepted chemical engineering principles, measurable process.... (i) For the purpose of determining de minimis status for emission points, engineering assessment may... operating conditions expected to yield the highest flow rate and concentration. Engineering assessment...
40 CFR 63.526 - Monitoring requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... vent. (D) Design analysis based on accepted chemical engineering principles, measurable process.... (i) For the purpose of determining de minimis status for emission points, engineering assessment may... operating conditions expected to yield the highest flow rate and concentration. Engineering assessment...
Gyöngyösi, Mariann; Wojakowski, Wojciech; Lemarchand, Patricia; Lunde, Ketil; Tendera, Michal; Bartunek, Jozef; Marban, Eduardo; Assmus, Birgit; Henry, Timothy D; Traverse, Jay H; Moyé, Lemuel A; Sürder, Daniel; Corti, Roberto; Huikuri, Heikki; Miettinen, Johanna; Wöhrle, Jochen; Obradovic, Slobodan; Roncalli, Jérome; Malliaras, Konstantinos; Pokushalov, Evgeny; Romanov, Alexander; Kastrup, Jens; Bergmann, Martin W; Atsma, Douwe E; Diederichsen, Axel; Edes, Istvan; Benedek, Imre; Benedek, Theodora; Pejkov, Hristo; Nyolczas, Noemi; Pavo, Noemi; Bergler-Klein, Jutta; Pavo, Imre J; Sylven, Christer; Berti, Sergio; Navarese, Eliano P; Maurer, Gerald
2015-04-10
The meta-Analysis of Cell-based CaRdiac study is the first prospectively declared collaborative multinational database, including individual data of patients with ischemic heart disease treated with cell therapy. We analyzed the safety and efficacy of intracoronary cell therapy after acute myocardial infarction (AMI), including individual patient data from 12 randomized trials (ASTAMI, Aalst, BOOST, BONAMI, CADUCEUS, FINCELL, REGENT, REPAIR-AMI, SCAMI, SWISS-AMI, TIME, LATE-TIME; n=1252). The primary end point was freedom from combined major adverse cardiac and cerebrovascular events (including all-cause death, AMI recurrence, stroke, and target vessel revascularization). The secondary end point was freedom from hard clinical end points (death, AMI recurrence, or stroke), assessed with random-effects meta-analyses and Cox regressions for interactions. Secondary efficacy end points included changes in end-diastolic volume, end-systolic volume, and ejection fraction, analyzed with random-effects meta-analyses and ANCOVA. We reported weighted mean differences between cell therapy and control groups. No effect of cell therapy on major adverse cardiac and cerebrovascular events (14.0% versus 16.3%; hazard ratio, 0.86; 95% confidence interval, 0.63-1.18) or death (1.4% versus 2.1%) or death/AMI recurrence/stroke (2.9% versus 4.7%) was identified in comparison with controls. No changes in ejection fraction (mean difference: 0.96%; 95% confidence interval, -0.2 to 2.1), end-diastolic volume, or end-systolic volume were observed compared with controls. These results were not influenced by anterior AMI location, reduced baseline ejection fraction, or the use of MRI for assessing left ventricular parameters. This meta-analysis of individual patient data from randomized trials in patients with recent AMI revealed that intracoronary cell therapy provided no benefit, in terms of clinical events or changes in left ventricular function. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01098591. © 2015 American Heart Association, Inc.
Development of Design Analysis Methods for C/SiC Composite Structures
NASA Technical Reports Server (NTRS)
Sullivan, Roy M.; Mital, Subodh K.; Murthy, Pappu L. N.; Palko, Joseph L.; Cueno, Jacques C.; Koenig, John R.
2006-01-01
The stress-strain behavior at room temperature and at 1100 C (2000 F) was measured for two carbon-fiber-reinforced silicon carbide (C/SiC) composite materials: a two-dimensional plain-weave quasi-isotropic laminate and a three-dimensional angle-interlock woven composite. Micromechanics-based material models were developed for predicting the response properties of these two materials. The micromechanics based material models were calibrated by correlating the predicted material property values with the measured values. Four-point beam bending sub-element specimens were fabricated with these two fiber architectures and four-point bending tests were performed at room temperature and at 1100 C. Displacements and strains were measured at various locations along the beam and recorded as a function of load magnitude. The calibrated material models were used in concert with a nonlinear finite element solution to simulate the structural response of these two materials in the four-point beam bending tests. The structural response predicted by the nonlinear analysis method compares favorably with the measured response for both materials and for both test temperatures. Results show that the material models scale up fairly well from coupon to subcomponent level.
On determining the most appropriate test cut-off value: the case of tests with continuous results
Habibzadeh, Parham; Yadollahie, Mahboobeh
2016-01-01
There are several criteria for determining the most appropriate cut-off value in a diagnostic test with continuous results. Most methods to determine the test cut-off value are based on receiver operating characteristic (ROC) analysis. The most common criteria are the point on the ROC curve where the sensitivity and specificity of the test are equal; the point on the curve with minimum distance from the upper-left corner of the unit square; and the point where the Youden's index is maximum. There are also methods based mainly on Bayesian decision analysis. Herein, we show that a proposed method that maximizes the weighted number needed to misdiagnose, an index of diagnostic test effectiveness we previously proposed, is the most appropriate technique compared to the aforementioned ones. For determination of the cut-off value, we need to know the pretest probability of the disease of interest as well as the costs incurred by misdiagnosis. This means that even for a given diagnostic test, the cut-off value is not universal and should be determined for each region and each disease condition. PMID:27812299
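The three classic criteria mentioned above are easy to compute from an ROC curve. The following sketch uses simulated scores and scikit-learn, which are assumptions of convenience rather than the paper's tooling:

```python
# Three classic ROC-based cut-off rules on simulated test scores:
# sensitivity = specificity, closest-to-(0,1) corner, and maximum Youden J.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
y = np.r_[np.zeros(500), np.ones(500)]                  # 0 = healthy, 1 = diseased
scores = np.r_[rng.normal(0, 1, 500), rng.normal(1.5, 1, 500)]

fpr, tpr, thr = roc_curve(y, scores)
youden = thr[np.argmax(tpr - fpr)]                      # max J = Se + Sp - 1
corner = thr[np.argmin(fpr**2 + (1 - tpr) ** 2)]        # closest to (0, 1)
equal = thr[np.argmin(np.abs(tpr - (1 - fpr)))]         # Se approx. = Sp
print(youden, corner, equal)
```

The three rules generally disagree, which is the paper's point: the choice should be driven by pretest probability and misdiagnosis costs, not by geometry alone.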
NASA Technical Reports Server (NTRS)
Poliner, Jeffrey; Fletcher, Lauren; Klute, Glenn K.
1994-01-01
Video-based motion analysis systems are widely employed to study human movement, using computers to capture, store, process, and analyze video data. These data can be collected in any environment where cameras can be located. One of the NASA facilities where human performance research is conducted is the Weightless Environment Training Facility (WETF), a pool of water which simulates zero-gravity with neutral buoyancy. Underwater video collection in the WETF poses some unique problems. This project evaluates the error caused by the lens distortion of the WETF cameras. A grid of points of known dimensions was constructed and videotaped using a video vault underwater system. Recorded images were played back on a VCR and a personal computer grabbed and stored the images on disk. These images were then digitized to give calculated coordinates for the grid points. Errors were calculated as the distance from the known coordinates of the points to the calculated coordinates. It was demonstrated that errors from lens distortion could be as high as 8 percent. By avoiding the outermost regions of a wide-angle lens, the error can be kept smaller.
Sparse electrocardiogram signals recovery based on solving a row echelon-like form of system.
Cai, Pingmei; Wang, Guinan; Yu, Shiwei; Zhang, Hongjuan; Ding, Shuxue; Wu, Zikai
2016-02-01
The study of biology and medicine in a noise environment is an evolving direction in biological data analysis. Among these studies, analysis of electrocardiogram (ECG) signals in a noise environment is a challenging direction in personalized medicine. Due to its periodic characteristic, the ECG signal can be roughly regarded as a sparse biomedical signal. This study proposes a two-stage recovery algorithm for sparse biomedical signals in the time domain. In the first stage, the concentration subspaces are found in advance. Then by exploiting these subspaces, the mixing matrix is estimated accurately. In the second stage, based on the number of active sources at each time point, the time points are divided into different layers. Next, by constructing some transformation matrices, these time points form a row echelon-like system. After that, the sources at each layer can be solved explicitly by corresponding matrix operations. It is worth noting that all these operations are conducted under a weak sparsity condition, namely that the number of active sources is less than the number of observations. Experimental results show that the proposed method has a better performance for the sparse ECG signal recovery problem.
ERIC Educational Resources Information Center
Hofmann, Fabian
2016-01-01
Social phenomenological analysis is presented as a research method to study gallery talks or guided tours in art museums. The research method is based on the philosophical considerations of Edmund Husserl and sociological/social science concepts put forward by Max Weber and Alfred Schuetz. Its starting point is the everyday lifeworld; the…
Distance-based microfluidic quantitative detection methods for point-of-care testing.
Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James
2016-04-07
Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.
A new similarity index for nonlinear signal analysis based on local extrema patterns
NASA Astrophysics Data System (ADS)
Niknazar, Hamid; Motie Nasrabadi, Ali; Shamsollahi, Mohammad Bagher
2018-02-01
Common similarity measures of time-domain signals such as cross-correlation and Symbolic Aggregate approximation (SAX) are not appropriate for nonlinear signal analysis. This is because of the high sensitivity of nonlinear systems to initial points. Therefore, a similarity measure for nonlinear signal analysis must be invariant to initial points and quantify the similarity by considering the main dynamics of signals. The statistical behavior of local extrema (SBLE) method was previously proposed to address this problem. The SBLE similarity index uses quantized amplitudes of local extrema to quantify the dynamical similarity of signals by considering patterns of sequential local extrema. By adding time information of local extrema and fuzzifying the quantized values, this work proposes a new similarity index for nonlinear and long-term signal analysis that extends the SBLE method. These new features provide more information about the signals and reduce noise sensitivity through fuzzification. A number of practical tests were performed to demonstrate the ability of the method in nonlinear signal clustering and classification on synthetic data. In addition, epileptic seizure detection based on electroencephalography (EEG) signal processing was performed using the proposed similarity index, demonstrating the method's potential as a real-world application tool.
NASA Technical Reports Server (NTRS)
Stieler, B.
1971-01-01
An inertial navigation system is described and analyzed based on two two-degree-of-freedom Schuler-gyropendulums and one two-degree-of-freedom azimuth gyro. The three sensors, each base motion isolated about its two input axes, are mounted on a common base, strapped down to the vehicle. The up and down pointing spin vectors of the two properly tuned gyropendulums track the vertical and indicate physically their velocity with respect to inertial space. The spin vector of the azimuth gyro is pointing northerly parallel to the earth axis. The system can be made self-aligning on a stationary base. If external measurements for the north direction and the vertical are available, initial disturbance torques can be measured and easily biased out. The error analysis shows that the system is practicable with today's technology.
Semantic focusing allows fully automated single-layer slide scanning of cervical cytology slides.
Lahrmann, Bernd; Valous, Nektarios A; Eisenmann, Urs; Wentzensen, Nicolas; Grabe, Niels
2013-01-01
Liquid-based cytology (LBC) in conjunction with Whole-Slide Imaging (WSI) enables the objective, sensitive, and quantitative evaluation of biomarkers in cytology. However, the complex three-dimensional distribution of cells on LBC slides requires manual focusing, long scanning times, and multi-layer scanning. Here, we present a solution that overcomes these limitations in two steps: first, we make sure that focus points are set only on cells; second, we check the total slide focus quality. From a first analysis we detected that superficial dust can be separated from the cell layer (the thin layer of cells on the glass slide) itself. We then analyzed 2,295 individual focus points from 51 LBC slides stained for p16 and Ki67. Using the number of edges in a focus point image, specific color values and size-inclusion filters, focus points detecting cells could be distinguished from focus points on artifacts (accuracy 98.6%). Sharpness, as the total focus quality of a virtual LBC slide, is computed from 5 sharpness features. We trained a multi-parameter SVM classifier on 1,600 images. On an independent validation set of 3,232 cell images we achieved an accuracy of 94.8% for classifying images as focused. Our results show that single-layer scanning of LBC slides is possible and how it can be achieved. We assembled focus point analysis and sharpness classification into a fully automatic, iterative workflow, free of user intervention, which performs repetitive slide scanning as necessary. On 400 LBC slides we achieved a scanning time of 13.9±10.1 min with 29.1±15.5 focus points. In summary, the integration of semantic focus information into whole-slide imaging allows automatic high-quality imaging of LBC slides and subsequent biomarker analysis.
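As a stand-in for the paper's five sharpness features, a single widely used focus measure (variance of the discrete Laplacian) already illustrates how focus quality can be scored numerically; the images below are synthetic, and this is not the paper's exact feature set:

```python
# Variance-of-Laplacian focus score: blurring removes high-frequency
# content, so sharper images score higher.
import numpy as np

def laplacian_variance(img):
    """Higher variance of the discrete Laplacian => sharper image."""
    lap = (-4 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
# simple 2x2 box blur as a stand-in for a defocused acquisition
blurred = (sharp[:-1, :-1] + sharp[1:, :-1] + sharp[:-1, 1:] + sharp[1:, 1:]) / 4
print(laplacian_variance(sharp) > laplacian_variance(blurred))  # True
```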
Lee, Sang-Hee; Lee, Minho; Kim, Hee-Jin
2014-10-01
We aimed to elucidate the tortuous course of the perioral artery with the aid of image processing, and to suggest accurate reference points for minimally invasive surgery. We used 59 hemifaces from 19 Korean and 20 Thai cadavers. A perioral line was defined to connect the point at which the facial artery emerged on the mandibular margin with the ramification point of the lateral nasal artery and the inferior alar branch. The course of the perioral artery was reproduced as a graph based on the perioral line and analysed by adding the image of the artery using MATLAB. The course of the artery could be classified into two types according to the course of the alar branch: oblique and vertical. Two distinct inflection points appeared in the course of the artery along the perioral line, at the ramification points of the alar branch and the inferior labial artery, respectively, and the course of the artery across the face can be predicted based on the following references: the perioral line, the ramification point of the alar branch (5∼10 mm medial to the perioral line at the level of the lower third of the upper lip) and the inferior labial artery (5∼10 mm medial to the perioral line at the level of the middle of the lower lip). Copyright © 2014 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Zaylaa, Amira; Charara, Jamal; Girault, Jean-Marc
2015-08-01
The analysis of biomedical signals demonstrating complexity through recurrence plots is challenging. Quantification of recurrences is often biased by sojourn points that hide dynamic transitions. To overcome this problem, time series have previously been embedded at high dimensions. However, no one has quantified the elimination of sojourn points and the rate of detection, nor has the enhancement of transition detection been investigated. This paper reports our on-going efforts to improve the detection of dynamic transitions from logistic maps and fetal hearts by reducing sojourn points. Three signal-based recurrence plots were developed, i.e. embedded with specific settings, derivative-based, and m-time pattern. Determinism, cross-determinism and the percentage of reduced sojourn points were computed to detect transitions. For logistic maps, an increase of 50% and 34.3% in sensitivity of detection over alternatives was achieved by m-time pattern and embedded recurrence plots with specific settings, respectively, with a 100% specificity. For fetal heart rates, embedded recurrence plots with specific settings provided the best performance, followed by the derivative-based recurrence plot, then the unembedded recurrence plot using the determinism parameter. The relative errors between healthy and distressed fetuses were 153%, 95% and 91%. More than 50% of sojourn points were eliminated, allowing better detection of heart transitions triggered by gaseous exchange factors. This could be significant in improving the diagnosis of fetal state. Copyright © 2014 Elsevier Ltd. All rights reserved.
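For orientation, a minimal unembedded recurrence plot and determinism (DET) computation, the baseline that the paper's three signal-based variants improve on, might look like this; the logistic-map data, threshold, and minimum line length are assumptions:

```python
# Unembedded recurrence plot R[i,j] = 1 if |x_i - x_j| <= eps, and a simple
# determinism (DET) score: the fraction of recurrence points lying on
# diagonal lines of length >= lmin.
import numpy as np

def recurrence_matrix(x, eps):
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

def determinism(R, lmin=2):
    n = len(R)
    diag_points = 0
    for k in range(-(n - 1), n):
        line = np.diagonal(R, offset=k)
        run = 0
        for v in np.append(line, 0):      # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_points += run
                run = 0
    return diag_points / max(R.sum(), 1)

# logistic map in the chaotic regime as a test signal
x = np.empty(500); x[0] = 0.4
for i in range(499):
    x[i + 1] = 4.0 * x[i] * (1 - x[i])
R = recurrence_matrix(x, eps=0.1)
np.fill_diagonal(R, 0)                    # drop the trivial line of identity
print(determinism(R))
```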
Thombare, Ram
2013-01-01
PURPOSE The purpose of this study was to determine the most appropriate point on the tragus to be used as a reference point when marking the ala-tragus line while establishing the occlusal plane. MATERIALS AND METHODS Data were collected from two groups of subjects, 1) dentulous and 2) edentulous, each with a sample size of 30 and equal gender distribution (15 males, 15 females). Downs analysis was used for the base value. Lateral cephalographs were taken for all selected subjects. Three points were marked on the tragus, superior (S), middle (M), and inferior (I), and were joined with the ala (A) of the nose to form ala-tragus lines. The angle formed by each line (SA plane, MA plane, IA plane) with the Frankfort Horizontal (FH) plane was measured using a custom-made device and a modified protractor in all dentulous and edentulous subjects. In addition, in dentulous subjects the angle between the Frankfort Horizontal plane and the natural occlusal plane was measured. The measurements obtained were subjected to the following statistical tests: descriptive analysis, Student's unpaired t-test, and Pearson's correlation coefficient. RESULTS The results demonstrated a mean angle COO (cant of occlusal plane) of 9.76°. The inferior point on the tragus gave mean angular values of IFH (the angle between the IA plane, formed by joining the inferior point I on the tragus and the ala A of the nose, and the FH plane) of 10.40° and 10.56° in dentulous and edentulous subjects respectively, which were the closest values to the angle COO and were comparable with the angle COO value in Downs analysis. The angulation of the ala-tragus line marked from the inferior point with the occlusal plane in dentulous subjects gave the smallest value, 2.46°, which showed that this ala-tragus line was nearly parallel to the occlusal plane. CONCLUSION The inferior point marked on the tragus is the most appropriate point for marking the ala-tragus line. PMID:23508068
NASA Astrophysics Data System (ADS)
Liu, P.
2013-12-01
Quantitative analysis of the risk for reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts depict the inflows via scenarios, capturing not only the marginal distributions but also their persistence. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed that uses the forecast horizon point to divide the future time into two stages, the forecast lead-time and the unpredicted time. The risk within the forecast lead-time is computed by counting the number of failing forecast scenarios, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk, defined as the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. Bayesian inference, via Markov Chain Monte Carlo, is used to account for the parameter uncertainty. Two reservoir operation schemes, the actually operated scheme and a scenario-optimization scheme, are evaluated for flood risk and hydropower profit. With the 2010 flood, it is found that improving the hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operating risk, and most risk comes from the forecast lead-time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts while keeping their bias small for reservoir operational purposes.
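A toy sketch of the lead-time stage of the two-stage idea: simulate pool levels under each ensemble inflow scenario and report the fraction exceeding a critical level. All parameters (release, surface area, thresholds) and the gamma-distributed inflows are invented for illustration:

```python
# Lead-time flood risk as the share of ensemble scenarios whose simulated
# pool level exceeds a critical value; dynamics are a toy mass balance.
import numpy as np

rng = np.random.default_rng(0)
n_scen, horizon = 1000, 24                 # scenarios, hourly lead-time steps
inflows = rng.gamma(3.0, 500.0, size=(n_scen, horizon))   # m^3/s, synthetic

release, area = 1200.0, 1e7               # constant outflow (m^3/s), area (m^2)
z0, z_crit, dt = 160.0, 162.5, 3600.0     # initial / critical level (m), step (s)

# water-level trajectories from a storage balance dz = (Q_in - Q_out) * dt / A
z = z0 + np.cumsum((inflows - release) * dt / area, axis=1)

risk = (z.max(axis=1) > z_crit).mean()    # fraction of scenarios exceeding z_crit
print(f"lead-time flood risk: {risk:.3f}")
```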
A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions
Pan, Guang; Ye, Pengcheng; Yang, Zhidong
2014-01-01
Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling methods. In this paper, a new sequential optimization sampling method is proposed. Based on the new sampling method, metamodels can be constructed repeatedly through the addition of sampling points, namely, the extrema points of the metamodels and the minimum points of a density function; repeating this procedure yields progressively more accurate metamodels. The validity and effectiveness of the proposed sampling method are examined through typical numerical examples. PMID:25133206
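A hedged sketch of sequential sampling with an RBF metamodel using SciPy; for brevity it infills only at the surrogate's minimum on a candidate grid, omitting the paper's density-function points, and the test function is a stand-in:

```python
# Sequential infill with an RBF metamodel: fit, locate the surrogate's
# minimum on a candidate set, sample there, refit.
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive(x):                     # stand-in for a costly simulation
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 0] ** 2

X = np.linspace(-2, 2, 5)[:, None]    # initial sample plan
y = expensive(X)
cand = np.linspace(-2, 2, 401)[:, None]

for _ in range(10):                   # sequential infill iterations
    model = RBFInterpolator(X, y)
    x_new = cand[np.argmin(model(cand))][None, :]
    if np.any(np.all(np.isclose(X, x_new), axis=1)):
        break                         # candidate already sampled; stop
    X = np.vstack([X, x_new])
    y = np.append(y, expensive(x_new))

print(X[np.argmin(y)], y.min())       # best point found
```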
Zheng, Guanglou; Fang, Gengfa; Shankaran, Rajan; Orgun, Mehmet A; Zhou, Jie; Qiao, Li; Saleem, Kashif
2017-05-01
Generating random binary sequences (BSes) is a fundamental requirement in cryptography. A BS is a sequence of N bits, and each bit has a value of 0 or 1. For securing sensors within wireless body area networks (WBANs), electrocardiogram (ECG)-based BS generation methods have been widely investigated, in which interpulse intervals (IPIs) from each heartbeat cycle are processed to produce BSes. Using these IPI-based methods to generate a 128-bit BS in real time normally takes around half a minute. In order to improve the time efficiency of such methods, this paper presents an ECG multiple fiducial-points based binary sequence generation (MFBSG) algorithm. The technique of discrete wavelet transforms is employed to detect the arrival times of these fiducial points, such as the P, Q, R, S, and T peaks. Time intervals between them, including RR, RQ, RS, RP, and RT intervals, are then calculated from these arrival times and used as ECG features to generate random BSes with low latency. According to our analysis of real ECG data, these ECG feature values exhibit the property of randomness and, thus, can be utilized to generate random BSes. Compared with schemes that rely solely on IPIs to generate BSes, the MFBSG algorithm uses five feature values from one heartbeat cycle, and can be up to five times faster than the solely IPI-based methods, achieving the design goal of low latency. According to our analysis, the complexity of the algorithm is comparable to that of fast Fourier transforms. These randomly generated ECG BSes can be used as security keys for encryption or authentication in a WBAN system.
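A simplified illustration of the interval-to-bits idea (not the exact MFBSG pipeline): quantize several per-beat fiducial intervals and keep their low-order bits, so each beat contributes five bits rather than one. The interval values and jitter below are simulated:

```python
# Simplified interval-to-bits sketch: five fiducial intervals per beat
# (RR, RQ, RS, RP, RT, in ms) quantized to 1 ms, keeping the least
# significant bit of each, which carries most beat-to-beat randomness.
import numpy as np

rng = np.random.default_rng(0)
n_beats = 26
base = np.array([800.0, 40.0, 40.0, 160.0, 300.0])       # hypothetical means
intervals = base + rng.normal(0, 10, size=(n_beats, 5))  # simulated jitter

def bits_from_intervals(iv, lsb=1):
    q = np.abs(np.round(iv)).astype(int)     # 1 ms quantization
    return [(v >> k) & 1 for v in q.ravel() for k in range(lsb)]

bs = bits_from_intervals(intervals)          # 5 features/beat -> 5 bits/beat
print(len(bs), "".join(map(str, bs[:32])))
```

With five bits per beat, 26 beats already yield a 128-bit sequence, versus roughly 128 beats for a one-bit-per-IPI scheme, which is the source of the claimed speed-up.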
Point Analysis in Java applied to histological images of the perforant pathway: a user's account.
Scorcioni, Ruggero; Wright, Susan N; Patrick Card, J; Ascoli, Giorgio A; Barrionuevo, Germán
2008-01-01
The freeware Java tool Point Analysis in Java (PAJ), created to perform 3D point analysis, was tested in an independent laboratory setting. The input data consisted of images of the hippocampal perforant pathway from serial immunocytochemical localizations of the rat brain in multiple views at different resolutions. The low magnification set (x2 objective) comprised the entire perforant pathway, while the high magnification set (x100 objective) allowed the identification of individual fibers. A preliminary stereological study revealed a striking linear relationship between the fiber count at high magnification and the optical density at low magnification. PAJ enabled fast analysis for down-sampled data sets and a friendly interface with automated plot drawings. Noted strengths included the multi-platform support as well as the free availability of the source code, conducive to a broad user base and maximum flexibility for ad hoc requirements. PAJ has great potential to extend its usability by (a) improving its graphical user interface, (b) increasing its input size limit, (c) improving response time for large data sets, and (d) potentially being integrated with other Java graphical tools such as ImageJ.
Inference from clustering with application to gene-expression microarrays.
Dougherty, Edward R; Barrera, Junior; Brun, Marcel; Kim, Seungchan; Cesar, Roberto M; Chen, Yidong; Bittner, Michael; Trent, Jeffrey M
2002-01-01
There are many algorithms to cluster sample data points based on nearness or a similarity measure. Often the implication is that points in different clusters come from different underlying classes, whereas those in the same cluster come from the same class. Stochastically, the underlying classes represent different random processes. The inference is that clusters represent a partition of the sample points according to which process they belong. This paper discusses a model-based clustering toolbox that evaluates cluster accuracy. Each random process is modeled as its mean plus independent noise, sample points are generated, the points are clustered, and the clustering error is the number of points clustered incorrectly according to the generating random processes. Various clustering algorithms are evaluated based on process variance and the key issue of the rate at which algorithmic performance improves with increasing numbers of experimental replications. The model means can be selected by hand to test the separability of expected types of biological expression patterns. Alternatively, the model can be seeded by real data to test the expected precision of that output or the extent of improvement in precision that replication could provide. In the latter case, a clustering algorithm is used to form clusters, and the model is seeded with the means and variances of these clusters. Other algorithms are then tested relative to the seeding algorithm. Results are averaged over various seeds. Output includes error tables and graphs, confusion matrices, principal-component plots, and validation measures. Five algorithms are studied in detail: K-means, fuzzy C-means, self-organizing maps, hierarchical Euclidean-distance-based and correlation-based clustering. The toolbox is applied to gene-expression clustering based on cDNA microarrays using real data. Expression profile graphics are generated and error analysis is displayed within the context of these profile graphics. A large amount of generated output is available over the web.
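The evaluation loop described, generating points as model means plus noise, clustering them, and counting points assigned contrary to the generating process, can be sketched compactly (Gaussian noise and K-means as stand-ins for the toolbox's options; means and variances are invented):

```python
# Model-based clustering evaluation: simulate labeled Gaussian clusters,
# cluster with K-means, and compute the error under the best label matching.
import numpy as np
from itertools import permutations
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
means = np.array([[0, 0], [3, 3], [0, 4]], dtype=float)
labels = np.repeat(np.arange(3), 50)               # generating process ids
X = means[labels] + rng.normal(0, 0.8, size=(150, 2))

pred = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# cluster ids are arbitrary, so score against the best label permutation
err = min((pred != np.array(p)[labels]).mean() for p in permutations(range(3)))
print(f"clustering error: {err:.3f}")
```

Averaging this error over many seeds and replication counts is what lets the toolbox compare algorithms and quantify how quickly replication improves precision.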
Full On-Device Stay Points Detection in Smartphones for Location-Based Mobile Applications.
Pérez-Torres, Rafael; Torres-Huitzil, César; Galeana-Zapién, Hiram
2016-10-13
The tracking of frequently visited places, also known as stay points, is a critical feature in location-aware mobile applications as a way to adapt the information and services provided to smartphone users according to their moving patterns. Location-based applications usually employ the GPS receiver along with Wi-Fi hot-spots and cellular cell-tower mechanisms for estimating user location. Typically, fine-grained GPS location data are collected by the smartphone and transferred to dedicated servers for trajectory analysis and stay-point detection. Such a Mobile Cloud Computing approach has been successfully employed to extend a smartphone's battery lifetime by offloading computation costs, assuming that on-device stay-point detection is prohibitive. In this article, we propose and validate the feasibility of an alternative event-driven mechanism for stay-point detection that is executed fully on-device and provides higher energy savings by avoiding communication costs. Our solution is encapsulated in a sensing middleware for Android smartphones, where a stream of GPS location updates is collected in the background, supporting duty-cycling schemes, and incrementally analyzed following an event-driven paradigm for stay-point detection. To evaluate the performance of the proposed middleware, real-world experiments were conducted under different stress levels, validating its power efficiency when compared against a Mobile Cloud Computing oriented solution.
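A minimal stay-point detector in the spirit of the classic distance/time-threshold algorithms (not this middleware's exact logic); the thresholds, planar coordinates, and synthetic track are assumptions:

```python
# Stay-point detection: emit a stay point when the user lingers within
# d_max metres of an anchor fix for at least t_min seconds.
import numpy as np

def stay_points(xy, t, d_max=200.0, t_min=20 * 60):
    out, i, n = [], 0, len(xy)
    while i < n:
        j = i + 1
        while j < n and np.linalg.norm(xy[j] - xy[i]) <= d_max:
            j += 1
        if t[j - 1] - t[i] >= t_min:
            out.append(xy[i:j].mean(axis=0))    # centroid of the dwell
        i = j
    return np.array(out)

# synthetic track: ~30 min dwelling near the origin, then moving away
t = np.arange(0, 3600, 60.0)                    # one fix per minute
xy = np.vstack([np.random.default_rng(0).normal(0, 30, (30, 2)),
                np.linspace([0, 0], [5000, 5000], 30)])
print(stay_points(xy, t))                       # one stay point near (0, 0)
```

Because each new fix only extends or closes the current dwell window, the check is incremental and cheap, which is what makes fully on-device, event-driven operation plausible.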
NASA Astrophysics Data System (ADS)
Zou, Xiaoliang; Zhao, Guihua; Li, Jonathan; Yang, Yuanxi; Fang, Yong
2016-06-01
With the rapid development of sensor technology, high spatial resolution imagery and airborne Lidar point clouds can be captured nowadays, which makes classification, extraction, evaluation and analysis of a broad range of object features available. High resolution imagery, Lidar datasets and parcel maps can be widely used for classification as information carriers. Therefore, refinement of object classification is made possible for urban land cover. The paper presents an approach to object based image analysis (OBIA) combining high spatial resolution imagery and airborne Lidar point clouds. The advanced workflow for urban land cover is designed with four components. Firstly, the colour-infrared TrueOrtho photo and laser point clouds were pre-processed to derive the parcel map of water bodies and the nDSM respectively. Secondly, image objects are created via multi-resolution image segmentation integrating the scale parameter and the colour and shape properties with a compactness criterion; the image can thus be subdivided into separate object regions. Thirdly, image object classification is performed on the basis of the segmentation and a rule set in the form of a knowledge decision tree. These image objects are classified into six classes: water bodies, low vegetation/grass, tree, low building, high building and road. Finally, in order to assess the validity of the classification results for the six classes, accuracy assessment is performed by comparing randomly distributed reference points of the TrueOrtho imagery with the classification results, forming the confusion matrix and calculating the overall accuracy and Kappa coefficient. The study area focuses on the test site Vaihingen/Enz, and a patch of test datasets comes from the benchmark of the ISPRS WG III/4 test project. The classification results show high overall accuracy for most types of urban land cover. Overall accuracy is 89.5% and the Kappa coefficient equals 0.865. The OBIA approach provides an effective and convenient way to combine high resolution imagery and Lidar ancillary data for classification of urban land cover.
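The accuracy-assessment step reduces to a confusion matrix plus two summary statistics; a self-contained sketch with simulated reference points (the class count and error rate are invented):

```python
# Confusion matrix from reference vs. classified labels, then overall
# accuracy and Cohen's kappa (chance-corrected agreement).
import numpy as np

def confusion(ref, pred, k):
    m = np.zeros((k, k), dtype=int)
    for r, p in zip(ref, pred):
        m[r, p] += 1
    return m

def overall_and_kappa(m):
    n = m.sum()
    po = np.trace(m) / n                       # overall accuracy
    pe = (m.sum(0) * m.sum(1)).sum() / n**2    # expected chance agreement
    return po, (po - pe) / (1 - pe)            # Cohen's kappa

rng = np.random.default_rng(0)
ref = rng.integers(0, 6, 500)                  # 6 urban land-cover classes
pred = np.where(rng.random(500) < 0.9, ref, rng.integers(0, 6, 500))
print(overall_and_kappa(confusion(ref, pred, 6)))
```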
40 CFR 63.526 - Monitoring requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... (D) Design analysis based on accepted chemical engineering principles, measurable process parameters... purpose of determining de minimis status for emission points, engineering assessment may be used to... expected to yield the highest flow rate and concentration. Engineering assessment includes, but is not...
40 CFR 63.526 - Monitoring requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
.... (D) Design analysis based on accepted chemical engineering principles, measurable process parameters... purpose of determining de minimis status for emission points, engineering assessment may be used to... expected to yield the highest flow rate and concentration. Engineering assessment includes, but is not...
7 CFR 210.13 - Facilities management.
Code of Federal Regulations, 2013 CFR
2013-01-01
... authority with a food safety program based on traditional hazard analysis and critical control point (HACCP... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NATIONAL SCHOOL LUNCH PROGRAM Requirements for School Food Authority...
7 CFR 210.13 - Facilities management.
Code of Federal Regulations, 2011 CFR
2011-01-01
... authority with a food safety program based on traditional hazard analysis and critical control point (HACCP... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NATIONAL SCHOOL LUNCH PROGRAM Requirements for School Food Authority...
7 CFR 210.13 - Facilities management.
Code of Federal Regulations, 2012 CFR
2012-01-01
... authority with a food safety program based on traditional hazard analysis and critical control point (HACCP... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NATIONAL SCHOOL LUNCH PROGRAM Requirements for School Food Authority...
7 CFR 210.13 - Facilities management.
Code of Federal Regulations, 2014 CFR
2014-01-01
... authority with a food safety program based on traditional hazard analysis and critical control point (HACCP... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NATIONAL SCHOOL LUNCH PROGRAM Requirements for School Food Authority...
Choi, Youngshim; Hur, Cheol-Goo; Park, Taesun
2013-01-01
The pathophysiological mechanisms underlying the development of obesity and metabolic diseases are not well understood. To gain more insight into the genetic mediators associated with the onset and progression of diet-induced obesity and metabolic diseases, we studied the molecular changes in response to a high-fat diet (HFD) by using a mode-of-action by network identification (MNI) analysis. Oligo DNA microarray analysis was performed on visceral and subcutaneous adipose tissues and muscles of male C57BL/6N mice fed a normal diet or HFD for 2, 4, 8, and 12 weeks. Each of these datasets was queried against the MNI algorithm, and the lists of the top 5 highest-ranked genes and gene ontology (GO)-annotated pathways that were significantly overrepresented among the 100 highest-ranked genes at each time point in the 3 different tissues of mice fed the HFD were considered in the present study. The 40 highest-ranked genes identified by MNI analysis at each time point in the different tissues of mice with diet-induced obesity were subjected to clustering based on their temporal patterns. On the basis of these results, we investigated the sequential induction of distinct olfactory receptors and the stimulation of cancer-related genes during the development of obesity in both adipose tissues and muscles. The top 5 genes recognized using the MNI analysis at each time point and the gene clusters identified based on their temporal patterns in the peripheral tissues of mice provided novel and often surprising insights into the potential genetic mediators of obesity progression.
Weissman-Miller, Deborah
2013-11-02
Point estimation is particularly important in predicting weight loss in individuals or small groups. In this analysis, a new health response function is based on a model of human response over time to estimate long-term health outcomes from a change point in short-term linear regression. This estimation capability is addressed for small groups and single-subject designs in pilot studies for clinical trials, and in medical and therapeutic clinical practice. The estimations are based on a change point given by parameters derived from short-term participant data in ordinary least squares (OLS) regression. The development of the change point in initial OLS data and the point estimations are given in a new semiparametric ratio estimator (SPRE) model. The new response function is taken as a ratio of two-parameter Weibull distributions times a prior outcome value that steps estimated outcomes forward in time, where the shape and scale parameters are estimated at the change point. The Weibull distributions used in this ratio are derived from a Kelvin model in mechanics, taken here to represent human beings. A distinct feature of the SPRE model in this article is that initial treatment response for a small group or a single subject is reflected in long-term response to treatment. The model is applied to weight loss in obesity in a secondary analysis of data from a classic weight loss study, selected due to the dramatic increase in obesity in the United States over the past 20 years. A very small relative error between estimated and test data is shown for obesity treatment with the weight loss medication phentermine or placebo for the test dataset. An application of SPRE in clinical medicine or occupational therapy is to estimate long-term weight loss for a single subject or a small group near the beginning of treatment.
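The abstract describes the estimator only verbally; the sketch below is one plausible reading of the step-forward recursion (prior outcome times a ratio of Weibull terms), using the Weibull density and made-up parameters. The actual SPRE change-point estimation and functional form follow the article, not this code.

```python
import numpy as np

def weibull_pdf(t, shape, scale):
    """Two-parameter Weibull density."""
    z = t / scale
    return (shape / scale) * z ** (shape - 1) * np.exp(-(z ** shape))

def spre_step_forward(y_cp, t_cp, horizon, shape, scale):
    """Step outcomes forward from the change point: y[t+1] = y[t] * W(t+1)/W(t).
    shape and scale would be estimated at the change point; here they are guesses."""
    y, out = y_cp, []
    for t in range(t_cp, t_cp + horizon):
        y = y * weibull_pdf(t + 1, shape, scale) / weibull_pdf(t, shape, scale)
        out.append(y)
    return np.array(out)

# e.g. weight of 95 kg at a change point in week 4, projected 8 weeks ahead
print(spre_step_forward(y_cp=95.0, t_cp=4, horizon=8, shape=1.2, scale=30.0))
```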
Artificial equilibrium points in binary asteroid systems with continuous low-thrust
NASA Astrophysics Data System (ADS)
Bu, Shichao; Li, Shuang; Yang, Hongwei
2017-08-01
The positions and dynamical characteristics of artificial equilibrium points (AEPs) in the vicinity of a binary asteroid under continuous low-thrust are studied. The restricted ellipsoid-ellipsoid model is employed for the binary asteroid system, and the positions of the AEPs are obtained from it. It is found that the set of points L1 or L2 forms an ellipsoidal shape, while the set of points L3 forms a shape like a "banana". The effect of the continuous low-thrust on the feasible region of motion is analyzed via zero-velocity curves: with low thrust, previously unreachable regions can become reachable. The linearized equations of motion are derived for stability analysis, and the stability conditions are derived from the characteristic equation of the linearized equations. The stable regions of the AEPs are investigated by a parametric analysis, and the effect of the mass ratio and ellipsoid parameters on the stable regions is discussed. The results show that the influence of the mass ratio on the stable regions is more significant than that of the ellipsoid parameters.
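The paper derives analytic stability conditions from the characteristic equation; as a generic numerical stand-in, one can simply inspect the eigenvalues of the linearized system matrix, as sketched below. The matrix entries are placeholders, not the ellipsoid-ellipsoid model of the paper.

```python
import numpy as np

def linearly_stable(A, tol=1e-12):
    """An equilibrium of dx/dt = A x is linearly (marginally) stable when
    no eigenvalue of A has a positive real part."""
    return bool(np.all(np.linalg.eigvals(A).real <= tol))

# Placeholder 4x4 state matrix of linearized planar motion about an AEP.
A = np.array([[0.0,  0.0,  1.0, 0.0],
              [0.0,  0.0,  0.0, 1.0],
              [3.2,  0.0,  0.0, 2.0],
              [0.0, -1.1, -2.0, 0.0]])
print(linearly_stable(A))
```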
Access to Mars from Earth-Moon Libration Point Orbits: Manifold and Direct Options
NASA Technical Reports Server (NTRS)
Kakoi, Masaki; Howell, Kathleen C.; Folta, David
2014-01-01
This investigation is focused specifically on transfers from Earth-Moon L(sub 1)/L(sub 2) libration point orbits to Mars. Initially, the analysis is based in the circular restricted three-body problem to utilize the framework of the invariant manifolds. Various departure scenarios are compared, including arcs that leverage manifolds associated with the Sun-Earth L(sub 2) orbits as well as non-manifold trajectories. For the manifold options, ballistic transfers from Earth-Moon L(sub 2) libration point orbits to Sun-Earth L(sub 1)/L(sub 2) halo orbits are first computed. This autonomous procedure applies to both departure and arrival between the Earth-Moon and Sun-Earth systems. Departure times in the lunar cycle, amplitudes and types of libration point orbits, manifold selection, and the orientation/location of the surface of section all contribute to produce a variety of options. As the destination planet, the ephemeris position for Mars is employed throughout the analysis. The complete transfer is transitioned to the ephemeris model after the initial design phase. Results for multiple departure/arrival scenarios are compared.
Random Walk Quantum Clustering Algorithm Based on Space
NASA Astrophysics Data System (ADS)
Xiao, Shufen; Dong, Yumin; Ma, Hongyang
2018-01-01
In the random quantum walk, a quantum simulation of the classical walk, data points interact when selecting the appropriate walk strategy, taking advantage of quantum-entanglement features; thus, the results obtained with the quantum walk differ from those of the classical walk. A new quantum walk clustering algorithm based on space is proposed by applying the quantum walk to clustering analysis. In this algorithm, data points are viewed as walking participants, and similar data points are clustered using the walk function in the pay-off matrix according to a certain rule. The walk process is simplified by implementing a space-combining rule. The proposed algorithm is validated by a simulation test and is shown to be superior to existing clustering algorithms, namely Kmeans, PCA + Kmeans, and LDA-Km. The effects of some of the parameters of the proposed algorithm on its performance are also analyzed and discussed, and specific suggestions are provided.
Critical points of the O(n) loop model on the martini and the 3-12 lattices
NASA Astrophysics Data System (ADS)
Ding, Chengxiang; Fu, Zhe; Guo, Wenan
2012-06-01
We derive the critical line of the O(n) loop model on the martini lattice as a function of the loop weight n, based on the critical points on the honeycomb lattice conjectured by Nienhuis [Phys. Rev. Lett. 49, 1062 (1982)]. In the limit n→0 we prove the connective constant μ=1.7505645579⋯ of self-avoiding walks on the martini lattice. A finite-size scaling analysis based on transfer-matrix calculations is also performed, and the numerical results coincide with the theoretical predictions to very high accuracy. Using similar numerical methods, we also study the O(n) loop model on the 3-12 lattice, obtaining similarly precise agreement with the critical points given by Batchelor [J. Stat. Phys. 92, 1203 (1998)].
Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework
NASA Astrophysics Data System (ADS)
Wang, C.; Hu, F.; Sha, D.; Han, X.
2017-10-01
Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering, and other fields. However, it is challenging to efficiently store, query, and analyze high-resolution 3D LiDAR data due to their volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, taking advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experimental results show that the proposed framework can efficiently manage and process big LiDAR data.
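The paper does not publish its job code; as a hedged illustration of the MapReduce pattern it describes, the Hadoop-Streaming-style sketch below bins points by map tile in the mapper and aggregates per tile in the reducer. The 100 m tile size and CSV input layout are assumptions, and a real job would hand each tile's points to a PCL routine instead of counting them.

```python
#!/usr/bin/env python
# mapper.py -- emit (tile_key, point) so points in the same tile reach the
# same reducer. Input lines are assumed to be "x,y,z" coordinates.
import sys

TILE = 100.0  # assumed tile edge in metres
for line in sys.stdin:
    try:
        x, y, z = map(float, line.strip().split(',')[:3])
    except ValueError:
        continue  # skip malformed records
    print(f"{int(x // TILE)}_{int(y // TILE)}\t{x},{y},{z}")
```

```python
#!/usr/bin/env python
# reducer.py -- input arrives sorted by tile key; count points per tile
# (a real job would run a PCL filter over each tile's points instead).
import sys

cur, count = None, 0
for line in sys.stdin:
    key, _ = line.split('\t', 1)
    if key != cur:
        if cur is not None:
            print(f"{cur}\t{count}")
        cur, count = key, 0
    count += 1
if cur is not None:
    print(f"{cur}\t{count}")
```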
Research on fully distributed optical fiber sensing security system localization algorithm
NASA Astrophysics Data System (ADS)
Wu, Xu; Hou, Jiacheng; Liu, Kun; Liu, Tiegen
2013-12-01
A new fully distributed optical fiber sensing and location technology based on Mach-Zehnder interferometers is studied. For this security system, a new climbing-point locating algorithm based on the short-time average zero-crossing rate is presented. By calculating the zero-crossing rates of multiple grouped data separately, it not only exploits the advantages of frequency-analysis methods to determine the most effective data group more accurately, but also meets the requirements of a real-time monitoring system. Supplemented with a short-time energy calculation of the grouped signals, the most effective data group can be quickly picked out. Finally, the accurate location of the climbing point is achieved through a cross-correlation localization algorithm. The experimental results show that the proposed algorithm can accurately locate the climbing point while effectively filtering out outside interference noise from non-climbing behavior.
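As a rough illustration of the two signal-processing steps named in the abstract, the sketch below computes a short-time average zero-crossing rate and a cross-correlation time-delay estimate. The framing parameters, the sample rate fs, and the effective propagation speed v are assumptions, and the paper's data-grouping scheme is not reproduced.

```python
import numpy as np

def short_time_zcr(x, frame=256, hop=128):
    """Short-time average zero-crossing rate of a 1-D sensor signal."""
    windows = np.lib.stride_tricks.sliding_window_view(x, frame)[::hop]
    return (np.abs(np.diff(np.sign(windows), axis=1)) > 0).mean(axis=1)

def locate_by_xcorr(a, b, fs=1e6, v=2e8):
    """Estimate the disturbance position from the lag that maximizes the
    cross-correlation of the two interferometer outputs."""
    lag = np.argmax(np.correlate(a, b, mode='full')) - (len(b) - 1)
    return (lag / fs) * v / 2.0   # metres from the reference point

a = np.random.randn(4096)
b = np.roll(a, 37) + 0.1 * np.random.randn(4096)   # delayed, noisy copy
print(short_time_zcr(a)[:4], locate_by_xcorr(a, b))
```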
NASA Astrophysics Data System (ADS)
Revunova, Svetlana; Vlasenko, Vyacheslav; Bukreev, Anatoly
2017-10-01
The article proposes models of innovative activity development driven by the formation of “points of innovation-driven growth”. The models are based on an analysis of the current state and dynamics of innovative development of construction enterprises in the transport sector and take into account a number of essential organizational and economic changes in management. The authors substantiate implementing such development models as an organizational innovation with a communication genesis. The use of the communication approach to the formation of “points of innovation-driven growth” allowed the authors to apply the mathematical tools of graph theory to activate the innovative activity of the transport industry in the region. As a result, the authors propose models that allow constructing an optimal mechanism for the formation of “points of innovation-driven growth”.
Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro
2015-07-28
In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys of buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected with generating finite element models of these complex structures by constructing a finely discretized geometry in a reduced amount of time, ready to be used in structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
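The core mesh-generation step, turning a cloud into voxel (brick) elements, can be illustrated in a few lines of numpy; the 0.25 m voxel size is an arbitrary assumption, and the procedure's section-stacking logic is not reproduced here.

```python
import numpy as np

def voxelize(points, voxel=0.25):
    """Snap a point cloud (N x 3 array) to a regular grid; each occupied
    voxel becomes one hexahedral (brick) element of the FE mesh."""
    ijk = np.floor((points - points.min(axis=0)) / voxel).astype(int)
    return np.unique(ijk, axis=0)          # one (i, j, k) row per element

cloud = np.random.rand(20000, 3) * [10.0, 8.0, 6.0]   # stand-in for a survey
print(voxelize(cloud).shape[0], "voxel elements")
```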
Schmidt, T B; Schilling, M W; Behrends, J M; Battula, V; Jackson, V; Sekhon, R K; Lawrence, T E
2010-01-01
Consumer research was conducted to evaluate the acceptability of choice and select steaks from the Longissimus lumborum that were cooked to varying degrees of doneness, using demographic information, cluster analysis, and descriptive analysis. On average, using data from approximately 155 panelists, no differences (P>0.05) existed in consumer acceptability between select and choice steaks, and all treatment means ranged between like slightly and like moderately (6-7) on the hedonic scale. Individual consumers were highly variable in their perception of acceptability, and consumers were grouped into clusters (eight for select and seven for choice) based on their preference and liking of steaks. The largest consumer groups liked steaks from all treatments, but other groups preferred (P<0.05) steaks that were cooked to particular end-point temperatures. Results revealed that consumers could be grouped together according to preference, liking, and descriptive sensory attributes (juiciness, tenderness, bloody, metallic, and roasted) to further understand consumer perception of steaks cooked to different end-point temperatures.
Structural Analysis of Women’s Heptathlon
Gassmann, Freya; Fröhlich, Michael; Emrich, Eike
2016-01-01
The heptathlon comprises the results of seven single disciplines, and its scoring presumes an equal influence from each discipline on the overall result, depending on the measured performance. Data analysis was based on the recorded individual performances of the 10 winning heptathletes in the World Athletics Championships from 1987 to 2013 and the Olympic Games from 1988 to 2012. In addition to descriptive analysis methods, correlations, bivariate and multivariate linear regressions, and panel data regressions were used. The transformation of the performances from seconds, centimeters, and meters into points showed that the individual disciplines do not affect the overall competition result equally. The currently valid conversion formula for the run, jump, and throw disciplines favors the sprint and jump disciplines but penalizes athletes in the 800 m run, javelin throw, and shot put. Furthermore, 21% to 48% of the variance of the sum of points can be attributed to the performances in the long jump, 200 m sprint, 100 m hurdles, and high jump. To balance the effects of the single disciplines in the heptathlon, the formula used to calculate points should be reevaluated. PMID:29910260
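For context, the IAAF scoring tables convert each performance to points with a power law: track events score a·(b − T)^c for a time T in seconds, jumps a·(M − b)^c for a mark M in centimetres, and throws a·(D − b)^c for a distance D in metres. The sketch below uses the commonly published heptathlon coefficients, reproduced here from memory for illustration (verify against the official tables before relying on them); the exponent c is what makes the conversion non-linear across disciplines.

```python
# Commonly published women's heptathlon coefficients (a, b, c, event kind);
# quoted for illustration only.
COEFF = {
    '100mH':     (9.23076, 26.7, 1.835, 'track'),
    'high_jump': (1.84523, 75.0, 1.348, 'field'),   # mark in cm
    'shot_put':  (56.0211, 1.50, 1.05,  'field'),   # distance in m
    '200m':      (4.99087, 42.5, 1.81,  'track'),
    'long_jump': (0.188807, 210.0, 1.41, 'field'),  # mark in cm
    'javelin':   (15.9803, 3.80, 1.04,  'field'),   # distance in m
    '800m':      (0.11193, 254.0, 1.88, 'track'),
}

def points(event, perf):
    """IAAF combined-events points; times below/above the benchmark score 0."""
    a, b, c, kind = COEFF[event]
    x = (b - perf) if kind == 'track' else (perf - b)
    return int(a * x ** c) if x > 0 else 0

print(points('200m', 23.5))   # e.g. a 23.5 s 200 m is roughly 1000 points
```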
Statistical dependency in visual scanning
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.; Stark, Lawrence
1986-01-01
A method to identify statistical dependencies in the positions of eye fixations is developed and applied to eye movement data from subjects who viewed dynamic displays of air traffic and judged future relative position of aircraft. Analysis of approximately 23,000 fixations on points of interest on the display identified statistical dependencies in scanning that were independent of the physical placement of the points of interest. Identification of these dependencies is inconsistent with random-sampling-based theories used to model visual search and information seeking.
NASA Technical Reports Server (NTRS)
Page, Lance; Shen, C. N.
1991-01-01
This paper describes skyline-based terrain matching, a new method for locating the vantage point of laser range-finding measurements on a global map previously prepared by satellite or aerial mapping. Skylines can be extracted from the range-finding measurements and modelled from the global map, and are represented in parametric, cylindrical form with azimuth angle as the independent variable. The three translational parameters of the vantage point are determined with a three-dimensional matching of these two sets of skylines.
Two-Point Microrheology of Phase-Separated Domains in Lipid Bilayers
Hormel, Tristan T.; Reyer, Matthew A.; Parthasarathy, Raghuveer
2015-01-01
Though the importance of membrane fluidity for cellular function has been well established for decades, methods for measuring lipid bilayer viscosity remain challenging to devise and implement. Recently, approaches based on characterizing the Brownian dynamics of individual tracers such as colloidal particles or lipid domains have provided insights into bilayer viscosity. For fluids in general, however, methods based on single-particle trajectories provide a limited view of hydrodynamic response. The technique of two-point microrheology, in which correlations between the Brownian dynamics of pairs of tracers report on the properties of the intervening medium, characterizes viscosity at length-scales that are larger than that of individual tracers and has less sensitivity to tracer-induced distortions, but has never been applied to lipid membranes. We present, to our knowledge, the first two-point microrheological study of lipid bilayers, examining the correlated motion of domains in phase-separated lipid vesicles and comparing one- and two-point results. We measure two-point correlation functions in excellent agreement with the forms predicted by two-dimensional hydrodynamic models, analysis of which reveals a viscosity intermediate between those of the two lipid phases, indicative of global fluid properties rather than the viscosity of the local neighborhood of the tracer. PMID:26287625
NASA Astrophysics Data System (ADS)
Chu, Zhongyi; Di, Jingnan; Cui, Jing
2017-10-01
Space debris occupies valuable orbital resources and poses an unavoidable and urgent problem, especially large debris, because of its high risk and the possible crippling effects of a collision; it has attracted much attention in recent years. A tethered system used in an active debris-removal scenario is a promising method to de-orbit large debris in a safe manner. In a tethered system, the flexibility of the tether can induce tangling, which is dangerous and should be avoided. In particular, attachment-point bias due to capture error can significantly affect the motion of the debris relative to the tether and increase the tangling risk. Hence, in this paper, the effect of attachment-point bias on the tethered system is studied using a dynamic model established via a Newtonian approach. Next, a safety metric for avoiding a tangle when the tether is tensioned with attachment-point bias is designed to analyse the tangling risk of the tethered system. Finally, several numerical cases are established and simulated to validate the effects of attachment-point bias on a space tethered system.
NASA Astrophysics Data System (ADS)
Guy, N.; Seyedi, D. M.; Hild, F.
2018-06-01
The work presented herein aims at characterizing and modeling fracturing (i.e., initiation and propagation of cracks) in a clay-rich rock. The analysis is based on two experimental campaigns. The first one relies on a probabilistic analysis of crack initiation considering Brazilian and three-point flexural tests. The second one involves digital image correlation to characterize crack propagation. A nonlocal damage model based on stress regularization is used for the simulations. Two thresholds both based on regularized stress fields are considered. They are determined from the experimental campaigns performed on Lower Watrous rock. The results obtained with the proposed approach are favorably compared with the experimental results.
Murayama, Kodai; Ishikawa, Daitaro; Genkawa, Takuma; Sugino, Hiroyuki; Komiyama, Makoto; Ozaki, Yukihiro
2015-03-03
In the present study we developed a new version (ND-NIRs) of a polychromator-type near-infrared (NIR) spectrometer with a high-resolution photodiode array detector, building on our earlier instrument (D-NIRs). The new version has four 5 W halogen lamps compared with three in the older version, as well as a condenser lens with a shorter focal length. The increase in the number of lamps and the shortening of the focal length of the condenser lens achieve a high signal-to-noise ratio and high-speed NIR imaging measurement. Using the ND-NIRs we carried out in-line monitoring of pharmaceutical blending and determined the end point of the blending process. Moreover, to determine a more accurate end point, an NIR image of the blending sample was acquired by means of a portable NIR imaging device based on the ND-NIRs. The imaging results demonstrate that a mixing time of 8 min is sufficient for homogeneous mixing. In this way the present study demonstrates that the ND-NIRs and the imaging system based on it hold considerable promise for process analysis.
NASA Astrophysics Data System (ADS)
Wright, D. J.; Raad, M.; Hoel, E.; Park, M.; Mollenkopf, A.; Trujillo, R.
2016-12-01
Introduced is a new approach for processing spatiotemporal big data by leveraging distributed analytics and storage. A suite of temporally-aware analysis tools summarizes data nearby or within variable windows, aggregates points (e.g., for various sensor observations or vessel positions), reconstructs time-enabled points into tracks (e.g., for mapping and visualizing storm tracks), joins features (e.g., to find associations between features based on attributes, spatial relationships, temporal relationships or all three simultaneously), calculates point densities, finds hot spots (e.g., in species distributions), and creates space-time slices and cubes (e.g., in microweather applications with temperature, humidity, and pressure, or within human mobility studies). These "feature geo analytics" tools run in both batch and streaming spatial analysis mode as distributed computations across a cluster of servers on typical "big" data sets, where static data exist in traditional geospatial formats (e.g., shapefile) locally on a disk or file share, attached as static spatiotemporal big data stores, or streamed in near-real-time. In other words, the approach registers large datasets or data stores with ArcGIS Server, then distributes analysis across a cluster of machines for parallel processing. Several brief use cases will be highlighted based on a 16-node server cluster at 14 Gb RAM per node, allowing, for example, the buffering of over 8 million points or thousands of polygons in 1 minute. The approach is "hybrid" in that ArcGIS Server integrates open-source big data frameworks such as Apache Hadoop and Apache Spark on the cluster in order to run the analytics. In addition, the user may devise and connect custom open-source interfaces and tools developed in Python or Python Notebooks; the common denominator being the familiar REST API.
ERIC Educational Resources Information Center
Stinson, Wendy Bounds; Carr, Deborah; Nettles, Mary Frances; Johnson, James T.
2011-01-01
Purpose/Objectives: The objectives of this study were to assess the extent to which school nutrition (SN) programs have implemented food safety programs based on Hazard Analysis and Critical Control Point (HACCP) principles, as well as factors, barriers, and practices related to implementation of these programs. Methods: An online survey was…
An Assessment of the Quality of Life in the European Union Based on the Social Indicators Approach
ERIC Educational Resources Information Center
Grasso, Marco; Canova, Luciano
2008-01-01
This article carries out a multidimensional analysis of welfare based on the social indicators approach aimed at assessing the quality of life in the 25 member countries of the European Union. It begins with description of the social indicators approach and provides some specifications on its most controversial points. It then specifies the…
Jothi, R; Mohanty, Sraban Kumar; Ojha, Aparajita
2016-04-01
Gene expression data clustering is an important biological process in DNA microarray analysis. Although there have been many clustering algorithms for gene expression analysis, finding a suitable and effective clustering algorithm is always challenging due to the heterogeneous nature of gene profiles. Minimum Spanning Tree (MST) based clustering algorithms have been successfully employed to detect clusters of varying shapes and sizes. This paper proposes a novel clustering algorithm using eigenanalysis on a Minimum Spanning Tree based neighborhood graph (E-MST). As the MST of a set of points reflects the similarity of the points with their neighborhood, the proposed algorithm employs a similarity graph obtained from k′ rounds of MST (the k′-MST neighborhood graph). By studying the spectral properties of the similarity matrix obtained from the k′-MST graph, the proposed algorithm achieves improved clustering results. We demonstrate the efficacy of the proposed algorithm on 12 gene expression datasets. Experimental results show that the proposed algorithm performs better than standard clustering algorithms.
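The E-MST spectral step is specific to the paper, but the underlying MST clustering idea can be sketched in a few lines of scipy: build the MST of the distance graph, cut the k-1 heaviest edges, and read clusters off the connected components. This is the classic variant, shown for orientation only.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
from scipy.spatial.distance import pdist, squareform

def mst_clusters(X, k):
    """Classic MST clustering (not the E-MST spectral variant itself):
    cut the k-1 longest MST edges and label the resulting components."""
    assert k >= 2
    mst = minimum_spanning_tree(squareform(pdist(X))).toarray()
    threshold = np.sort(mst[mst > 0])[-(k - 1)]
    mst[mst >= threshold] = 0            # drop the k-1 longest MST edges
    _, labels = connected_components(mst != 0, directed=False)
    return labels

X = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 8.0,
               np.random.randn(20, 2) - 8.0])
print(np.bincount(mst_clusters(X, k=3)))   # roughly 20 points per cluster
```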
Egger, Jan; Kappus, Christoph; Freisleben, Bernd; Nimsky, Christopher
2012-08-01
In this contribution, a medical software system for volumetric analysis of different cerebral pathologies in magnetic resonance imaging (MRI) data is presented. The software system is based on a semi-automatic segmentation algorithm and helps to overcome the time-consuming process of volume determination during monitoring of a patient. After imaging, the parameter settings, including a seed point, are set up in the system and an automatic segmentation is performed by a novel graph-based approach. Manually reviewing the result leads to reseeding, adding seed points, or an automatic surface mesh generation. The mesh is saved for monitoring the patient and for comparisons with follow-up scans. Based on the mesh, the system performs a voxelization and volume calculation, which supports diagnosis and therefore further treatment decisions. The overall system has been tested with different cerebral pathologies (glioblastoma multiforme, pituitary adenomas, and cerebral aneurysms) and evaluated against manual expert segmentations using the Dice Similarity Coefficient (DSC). Additionally, intra-physician segmentations have been performed to provide a quality measure for the presented system.
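Both evaluation quantities mentioned here are easy to state precisely; the sketch below computes the Dice Similarity Coefficient of two binary masks and the volume implied by a voxelized segmentation (the isotropic 1 mm voxel spacing is an assumed example).

```python
import numpy as np

def dice(a, b):
    """Dice Similarity Coefficient: 2|A∩B| / (|A| + |B|) for binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

auto   = np.zeros((64, 64, 64), bool); auto[20:40, 20:40, 20:40] = True
expert = np.zeros((64, 64, 64), bool); expert[22:42, 20:40, 20:40] = True

voxel_mm3 = 1.0  # assumed 1 mm isotropic spacing
print(f"DSC = {dice(auto, expert):.3f}, "
      f"volume = {auto.sum() * voxel_mm3:.0f} mm^3")
```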
Varzakas, Theodoros H; Arvanitoyannis, Ioannis S
2007-01-01
The Failure Mode and Effect Analysis (FMEA) model has been applied to the risk assessment of corn curl manufacturing. A tentative application of FMEA to the snacks industry was attempted in an effort to exclude the presence of GMOs from the final product, which is of crucial importance from both the ethical and the legislative (Regulations EC 1829/2003; EC 1830/2003; Directive EC 18/2001) points of view. Preliminary Hazard Analysis and Fault Tree Analysis were used to analyze and predict the occurring failure modes in a food chain system (a corn curls processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical Control Points were identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). Finally, Pareto diagrams were employed towards the optimization of the GMO detection potential of FMEA.
Miniature near-infrared spectrometer for point-of-use chemical analysis
NASA Astrophysics Data System (ADS)
Friedrich, Donald M.; Hulse, Charles A.; von Gunten, Marc; Williamson, Eric P.; Pederson, Christopher G.; O'Brien, Nada A.
2014-03-01
Point-of-use chemical analysis holds tremendous promise for a number of industries, including agriculture, recycling, pharmaceuticals, and homeland security. Near-infrared (NIR) spectroscopy is an excellent candidate for these applications, requiring minimal sample preparation for real-time decision-making. We detail the development of a golf-ball-sized NIR spectrometer developed specifically for this purpose. The instrument is based upon a thin-film dispersive element that is very stable over time and temperature, with less than 2 nm change expected over the operating temperature range and lifetime of the instrument. This filter is coupled with an uncooled InGaAs detector array in a small, rugged, environmentally stable optical bench ideally suited to unpredictable environments. The resulting instrument weighs less than 60 grams, includes onboard illumination and collection optics for diffuse-reflectance applications in the 900-1700 nm wavelength range, and is USB-powered. It can be driven in the field by a laptop, tablet, or even a smartphone. The software design includes the potential for both on-board and cloud-based storage, analysis, and decision-making. The key attributes of the instrument and the underlying design tradeoffs are discussed, focusing on miniaturization, ruggedization, power consumption, and cost. The optical performance of the instrument, as well as its fitness for purpose, is detailed. Finally, we show that our manufacturing process has enabled us to build instruments with excellent unit-to-unit reproducibility, a key enabler for instrument-independent chemical analysis models, which are a requirement for mass point-of-use deployment.
NASA Astrophysics Data System (ADS)
Najafi, M. N.
2018-04-01
The couplings of the $c=-2$, $c=\frac{1}{2}$, and $c=0$ conformal field theories are numerically considered in this paper. As prototypes of the couplings $(c_1=-2)\oplus(c_2=0)$ and $(c_1=-2)\oplus(c_2=\frac{1}{2})$, we consider the Bak–Tang–Wiesenfeld (BTW) model on 2D square critical site-percolation and on Ising-correlated percolation lattices, respectively. Geometrical techniques are used to characterize the presumable conformal symmetry of the resulting systems. Based on the numerical analysis of the diffusivity parameter $\kappa$ in Schramm–Loewner evolution (SLE) theory, and on a conformal loop ensemble (CLE) analysis, we propose that the algebra of the central charges of the coupled models is closed. The diffusivity parameter in each case is obtained by calculating the fractal dimension of loops (and the corresponding exponent of the mean-square root distance), the direct SLE mapping method, the left-passage probability, and a winding-angle analysis. More precisely, we numerically show that the coupling $(c_1=-2)\oplus(c_2=\frac{1}{2})$ flows to the 2D self-avoiding walk (SAW) fixed point corresponding to the $c=0$ conformal field theory, whereas the coupling $(c_1=-2)\oplus(c_2=0)$ flows to the 2D critical Ising fixed point corresponding to the $c=\frac{1}{2}$ conformal field theory.
Geolocation and Pointing Accuracy Analysis for the WindSat Sensor
NASA Technical Reports Server (NTRS)
Meissner, Thomas; Wentz, Frank J.; Purdy, William E.; Gaiser, Peter W.; Poe, Gene; Uliana, Enzo A.
2006-01-01
Geolocation and pointing accuracy analyses of the WindSat flight data are presented. The two topics were intertwined in the flight data analysis and are addressed together. WindSat has no unusual geolocation requirements relative to other sensors, but its beam pointing knowledge accuracy is especially critical to support accurate polarimetric radiometry. Pointing accuracy was improved and verified using geolocation analysis in conjunction with scan bias analysis. Two methods were needed to properly identify and differentiate between data time-tagging and pointing knowledge errors. Matchups comparing coastlines indicated in imagery data with their known geographic locations were used to identify geolocation errors. These coastline matchups showed possible pointing errors with ambiguities as to the true source of the errors. Scan bias analysis of U, the third Stokes parameter, and of vertical and horizontal polarizations provided measurements of pointing offsets, resolving the ambiguities in the coastline matchup analysis. Several geolocation and pointing bias sources were incrementally eliminated, resulting in pointing knowledge and geolocation accuracy that met all design requirements.
Lungu, Claudiu N; Diudea, Mircea V
2018-01-01
Lipid II, a peptidoglycan precursor in bacterial cell-wall synthesis, has both hydrophilic and lipophilic properties. The molecule translocates across the bacterial membrane to deliver and incorporate disaccharide-pentapeptide "building blocks" into the peptidoglycan wall, and it is a validated antibiotic target. A receptor binding pocket may be occupied by a ligand in various plausible conformations, among which only a few are energetically related to biological activity in the physiological efficiency domain. This paper reports the mapping of the conformational space of Lipid II in its interaction with Teixobactin and other Lipid II ligands. To study the complex between Lipid II and its ligands computationally, a docking study was first carried out, with the docking site retrieved from the literature. After docking, 5 ligand conformations and a further 5 complexes (denoted 00 to 04) for each molecule were taken into account, and conformational studies were performed for each structure. Statistical analysis, conformational analysis, and molecular dynamics based clustering were used to predict the potency of these compounds, and a score for potency prediction was developed. Classifying ligands according to Lipid II conformational energy, a conformation of Teixobactin proved to be the most energetically favorable, followed by Oritavancin, Dalbavancin, Telavancin, Teicoplanin, and Vancomycin, respectively; scoring the molecules according to cluster band and PCA produced the same result. Classifying the molecules according to standard deviations showed Dalbavancin as the most favorable, followed by Teicoplanin, Telavancin, Teixobactin, Oritavancin, and Vancomycin, respectively. The total score for the energetic efficiency of complex formation shows Teixobactin to have the best conformation (15 points), followed by Dalbavancin (14 points), Oritavancin (12 points), Telavancin (10 points), Teicoplanin (9 points), and Vancomycin (3 points). Statistical analysis of conformations can thus be used to predict the efficiency of ligand-target interaction and, consequently, to gain insight into ligand potency and postulate favorable conformations of the ligand and binding site. This study shows that Teixobactin binds Lipid II more efficiently than Vancomycin, a result confirmed by experimental data reported in the literature.
Brovarets', Ol'ha O; Hovorun, Dmytro M
2014-01-01
The ground-state tautomerization of the G·C Watson-Crick base pair by the double proton transfer (DPT) was comprehensively studied in vacuo and in the continuum with a low dielectric constant (ϵ = 4), corresponding to a hydrophobic interface of protein-nucleic acid interactions, using DFT and MP2 levels of quantum-mechanical (QM) theory and quantum theory "Atoms in molecules" (QTAIM). Based on the sweeps of the electron-topological, geometric, polar, and energetic parameters, which describe the course of the G·C ↔ G*·C* tautomerization (mutagenic tautomers of the G and C bases are marked with an asterisk) through the DPT along the intrinsic reaction coordinate (IRC), it was proved that it is, strictly speaking, a concerted asynchronous process both at the DFT and MP2 levels of theory, in which protons move with a small time gap in vacuum, while this time delay noticeably increases in the continuum with ϵ = 4. It was demonstrated using the conductor-like polarizable continuum model (CPCM) that the continuum with ϵ = 4 does not qualitatively affect the course of the tautomerization reaction. The DPT in the G·C Watson-Crick base pair occurs without any intermediates both in vacuum and in the continuum with ϵ = 4 at the DFT/MP2 levels of theory. The nine key points along the IRC of the G·C base pair tautomerization, which could be considered as electron-topological "fingerprints" of a concerted asynchronous process of the tautomerization via the DPT, have been identified and fully characterized. These key points have been used to define the reactant, transition state, and product regions of the DPT reaction in the G·C base pair. Analysis of the energetic characteristics of the H-bonds allows us to arrive at a definite conclusion that the middle N1H⋯N3/N3H⋯N1 and the lower N2H⋯O2/N2H⋯O2 parallel H-bonds in the G·C/G*·C* base pairs, respectively, are anticooperative, that is, the strengthening of the middle H-bond is accompanied by the weakening of the lower H-bond. At that point, the upper N4H⋯O6 and O6H⋯N4 H-bonds in the G·C and G*·C* base pairs, respectively, remain constant at the changes of the middle and the lower H-bonds at the beginning and at the ending of the G·C ↔ G*·C* tautomerization. Aiming to answer the question posed in the title of the article, we established that the G*·C* Löwdin's base pair satisfies all the requirements necessary to cause point mutations in DNA except its lifetime, which is much less than the period of time required for the replication machinery to forcibly dissociate a base pair into the monomers (several ns) during DNA replication. So, from the physicochemical point of view, the G*·C* Löwdin's base pair cannot be considered as a source of point mutations arising during DNA replication.
NASA Technical Reports Server (NTRS)
Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)
2015-01-01
Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
NASA Technical Reports Server (NTRS)
Bebis, George
2013-01-01
Hand-based biometric analysis systems and techniques provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
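To make the descriptor step concrete, the sketch below computes rotation-invariant Zernike moment magnitudes for one hypothetical segmented region using the mahotas library and matches them against a stored template by Euclidean distance; the mask, radius, and degree are illustrative choices, not the patented pipeline's parameters.

```python
import numpy as np
import mahotas

# Hypothetical binary mask of one segmented region (e.g. a finger);
# a real system would get this from the hand-segmentation stage.
segment = np.zeros((128, 128))
segment[32:96, 40:88] = 1.0

# Magnitudes of Zernike moments are invariant to rotation of the region.
descriptors = mahotas.features.zernike_moments(segment, radius=48, degree=8)

enrolled = descriptors.copy()           # stand-in for an enrollment template
score = np.linalg.norm(descriptors - enrolled)
print(f"distance to template: {score:.4f}")   # 0.0 for the identical region
```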
Fischer, Claudia; Voss, Andreas
2014-01-01
Hypertensive pregnancy disorders affect 6 to 8 percent of all pregnancies and can cause severe complications for the mother and the fetus. The aim of this study was to develop a new method suitable for a three-dimensional coupling analysis. Therefore, the three-dimensional segmented Poincaré plot analysis (SPPA3) is introduced, which extends Poincaré analysis to a cubic box model representation. The box representing the three-dimensional phase space is subdivided (following the SPPA method) into 12×12×12 equal cubelets according to the predefined ranges of the signals, and the single probabilities of points occurring in a specific cubelet relative to the total number of points are calculated. From 10 healthy non-pregnant women, 66 healthy pregnant women, and 56 hypertensive pregnant women suffering from chronic hypertension, gestational hypertension, or preeclampsia, 30 minutes of beat-to-beat intervals (BBI), noninvasive blood pressure, and respiration (RESP) were continuously recorded, and the couplings between the different signals were analyzed. The suitability of SPPA3 for screening could be confirmed by multivariate discriminant analysis differentiating between all pregnant women and those with preeclampsia (the index BBI3_SBP9_RESP6/BBI8_SBP11_RESP4 leads to an area under the ROC curve of AUC=91.2%). In conclusion, SPPA3 could be a useful method for enhanced risk stratification in pregnant women.
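As a rough sketch of the cubelet statistics described above, the code below partitions three simultaneously recorded signals into a 12×12×12 grid and returns each cubelet's occupation probability; using each signal's observed min/max as the "predefined range" is an assumption made here for illustration.

```python
import numpy as np

def sppa3(x, y, z, bins=12):
    """Three-dimensional segmented Poincaré plot analysis, sketched: split the
    phase space spanned by three signals into bins^3 equal cubelets and return
    each cubelet's occupation probability."""
    edges = [np.linspace(s.min(), s.max(), bins + 1) for s in (x, y, z)]
    h, _ = np.histogramdd(np.column_stack([x, y, z]), bins=edges)
    return h / h.sum()

# e.g. beat-to-beat intervals vs. systolic pressure vs. respiration
p = sppa3(np.random.randn(1800), np.random.randn(1800), np.random.randn(1800))
print(p.shape, p.sum())   # (12, 12, 12) single-cubelet probabilities, sum 1.0
```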
Wikipedias: Collaborative web-based encyclopedias as complex networks
NASA Astrophysics Data System (ADS)
Zlatić, V.; Božičević, M.; Štefančić, H.; Domazet, M.
2006-07-01
Wikipedia is a popular web-based encyclopedia edited freely and collaboratively by its users. In this paper we present an analysis of Wikipedias in several languages as complex networks. The hyperlinks pointing from one Wikipedia article to another are treated as directed links while the articles represent the nodes of the network. We show that many network characteristics are common to different language versions of Wikipedia, such as their degree distributions, growth, topology, reciprocity, clustering, assortativity, path lengths, and triad significance profiles. These regularities, found in the ensemble of Wikipedias in different languages and of different sizes, point to the existence of a unique growth process. We also compare Wikipedias to other previously studied networks.
Wikipedias: collaborative web-based encyclopedias as complex networks.
Zlatić, V; Bozicević, M; Stefancić, H; Domazet, M
2006-07-01
Wikipedia is a popular web-based encyclopedia edited freely and collaboratively by its users. In this paper we present an analysis of Wikipedias in several languages as complex networks. The hyperlinks pointing from one Wikipedia article to another are treated as directed links while the articles represent the nodes of the network. We show that many network characteristics are common to different language versions of Wikipedia, such as their degree distributions, growth, topology, reciprocity, clustering, assortativity, path lengths, and triad significance profiles. These regularities, found in the ensemble of Wikipedias in different languages and of different sizes, point to the existence of a unique growth process. We also compare Wikipedias to other previously studied networks.
Sadeghi, Tabandeh; Seyed Bagheri, Seyed Hamid
2017-01-01
Clinical evaluation is very important in the educational system of nursing. One of the most common methods of clinical evaluation is evaluation by the teacher, but the challenges that students face with this evaluation method have not been examined. Thus, this study aimed to explore the experiences and views of nursing students about the challenges of teacher-based clinical evaluation. This was a descriptive qualitative study with a qualitative content analysis approach. Data were gathered through semi-structured focus group sessions with undergraduate nursing students who were in their 8th semester at Rafsanjan University of Medical Sciences. Data were analyzed using Graneheim and Lundman's proposed method; data collection and analysis were concurrent. According to the findings, "factitious evaluation" was the main theme of the study, consisting of three categories: "personal preferences," "unfairness," and "shirking responsibility." These categories are explained using quotes derived from the data. According to the results of this study, teacher-based clinical evaluation can lead to factitious evaluation. Thus, changing this approach of evaluation toward modern methods of evaluation is suggested. The findings can help nursing instructors better understand the nursing students' point of view toward this evaluation approach and, as a result, plan for changing it.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, Spencer; Rodrigues, George, E-mail: george.rodrigues@lhsc.on.ca; Department of Epidemiology/Biostatistics, University of Western Ontario, London
2013-01-01
Purpose: To perform a rigorous technological assessment and statistical validation of a software technology for anatomic delineations of the prostate on MRI datasets. Methods and Materials: A 3-phase validation strategy was used. Phase I consisted of anatomic atlas building using 100 prostate cancer MRI data sets to provide training data sets for the segmentation algorithms. In phase II, 2 experts contoured 15 new MRI prostate cancer cases using 3 approaches (manual, N points, and region of interest). In phase III, 5 new physicians with variable MRI prostate contouring experience segmented the same 15 phase II datasets using 3 approaches: manual, N points with no editing, and full autosegmentation with user editing allowed. Statistical analyses for time and accuracy (using Dice similarity coefficient) endpoints used traditional descriptive statistics, analysis of variance, analysis of covariance, and pooled Student t test. Results: In phase I, average (SD) total and per slice contouring time for the 2 physicians was 228 (75), 17 (3.5), 209 (65), and 15 seconds (3.9), respectively. In phase II, statistically significant differences in physician contouring time were observed based on physician, type of contouring, and case sequence. The N points strategy resulted in superior segmentation accuracy when initial autosegmented contours were compared with final contours. In phase III, statistically significant differences in contouring time were observed based on physician, type of contouring, and case sequence again. The average relative timesaving for N points and autosegmentation were 49% and 27%, respectively, compared with manual contouring. The N points and autosegmentation strategies resulted in average Dice values of 0.89 and 0.88, respectively. Pre- and postedited autosegmented contours demonstrated a higher average Dice similarity coefficient of 0.94. Conclusion: The software provided robust contours with minimal editing required. Observed time savings were seen for all physicians irrespective of experience level and baseline manual contouring speed.
Pitcher, Alex; Emberson, Jonathan; Lacro, Ronald V.; Sleeper, Lynn A.; Stylianou, Mario; Mahony, Lynn; Pearson, Gail D.; Groenink, Maarten; Mulder, Barbara J.; Zwinderman, Aeilko H.; De Backer, Julie; De Paepe, Anne M.; Arbustini, Eloisa; Erdem, Guliz; Jin, Xu Yu; Flather, Marcus D.; Mullen, Michael J.; Child, Anne H.; Forteza, Alberto; Evangelista, Arturo; Chiu, Hsin-Hui; Wu, Mei-Hwan; Sandor, George; Bhatt, Ami B.; Creager, Mark A.; Devereux, Richard B.; Loeys, Bart; Forfar, J. Colin; Neubauer, Stefan; Watkins, Hugh; Boileau, Catherine; Jondeau, Guillaume; Dietz, Harry C.; Baigent, Colin
2015-01-01
Rationale A number of randomized trials are underway, which will address the effects of angiotensin receptor blockers (ARBs) on aortic root enlargement and a range of other end points in patients with Marfan syndrome. If individual participant data from these trials were to be combined, a meta-analysis of the resulting data, totaling approximately 2,300 patients, would allow estimation across a number of trials of the treatment effects both of ARB therapy and of β-blockade. Such an analysis would also allow estimation of treatment effects in particular subgroups of patients on a range of end points of interest and would allow a more powerful estimate of the effects of these treatments on a composite end point of several clinical outcomes than would be available from any individual trial. Design A prospective, collaborative meta-analysis based on individual patient data from all randomized trials in Marfan syndrome of (i) ARBs versus placebo (or open-label control) and (ii) ARBs versus β-blockers will be performed. A prospective study design, in which the principal hypotheses, trial eligibility criteria, analyses, and methods are specified in advance of the unblinding of the component trials, will help to limit bias owing to data-dependent emphasis on the results of particular trials. The use of individual patient data will allow for analysis of the effects of ARBs in particular patient subgroups and for time-to-event analysis for clinical outcomes. The meta-analysis protocol summarized in this report was written on behalf of the Marfan Treatment Trialists' Collaboration and finalized in late 2012, without foreknowledge of the results of any component trial, and will be made available online (http://www.ctsu.ox.ac.uk/research/meta-trials). PMID:25965707
Analysis of separation test for automatic brake adjuster based on linear radon transformation
NASA Astrophysics Data System (ADS)
Luo, Zai; Jiang, Wensong; Guo, Bin; Fan, Weijun; Lu, Yi
2015-01-01
The linear Radon transformation is applied to extract inflection points for an online test system under noisy conditions. The linear Radon transformation has a strong anti-noise and anti-interference ability, obtained by fitting the online test curve in several parts, which makes it easy to handle consecutive inflection points. We applied the linear Radon transformation to the separation test system to solve for the separating clearance of an automatic brake adjuster. The experimental results show that the feature-point extraction error of the gradient-maximum optimal method is approximately ±0.100, while that of the linear Radon transformation method can reach ±0.010, a lower error than the former. In addition, the linear Radon transformation is robust.
Digital analyzer for point processes based on first-in-first-out memories
NASA Astrophysics Data System (ADS)
Basano, Lorenzo; Ottonello, Pasquale; Schiavi, Enore
1992-06-01
We present an entirely new version of a multipurpose instrument designed for the statistical analysis of point processes, especially those characterized by high bunching. A long sequence of pulses can be recorded in the RAM bank of a personal computer via a suitably designed front end which employs a pair of first-in-first-out (FIFO) memories; these allow one to build an analyzer that, besides being simpler from the electronic point of view, is capable of sustaining much higher intensity fluctuations of the point process. The overflow risk of the device is evaluated by treating the FIFO pair as a queueing system. The apparatus was tested using both a deterministic signal and a sequence of photoelectrons obtained from laser light scattered by random surfaces.
Asher, Anthony L; Kerezoudis, Panagiotis; Mummaneni, Praveen V; Bisson, Erica F; Glassman, Steven D; Foley, Kevin T; Slotkin, Jonathan; Potts, Eric A; Shaffrey, Mark E; Shaffrey, Christopher I; Coric, Domagoj; Knightly, John J; Park, Paul; Fu, Kai-Ming; Devin, Clinton J; Archer, Kristin R; Chotai, Silky; Chan, Andrew K; Virk, Michael S; Bydon, Mohamad
2018-01-01
OBJECTIVE Patient-reported outcomes (PROs) play a pivotal role in defining the value of surgical interventions for spinal disease. The concept of minimum clinically important difference (MCID) is considered the new standard for determining the effectiveness of a given treatment and describing patient satisfaction in response to that treatment. The purpose of this study was to determine the MCID associated with surgical treatment for degenerative lumbar spondylolisthesis. METHODS The authors queried the Quality Outcomes Database registry from July 2014 through December 2015 for patients who underwent posterior lumbar surgery for grade I degenerative spondylolisthesis. Recorded PROs included scores on the Oswestry Disability Index (ODI), EQ-5D, and numeric rating scale (NRS) for leg pain (NRS-LP) and back pain (NRS-BP). Anchor-based (using the North American Spine Society satisfaction scale) and distribution-based (half a standard deviation, small Cohen's effect size, standard error of measurement, and minimum detectable change [MDC]) methods were used to calculate the MCID for each PRO. RESULTS A total of 441 patients (80 who underwent laminectomies alone and 361 who underwent fusion procedures) from 11 participating sites were included in the analysis. The changes in functional outcome scores between baseline and the 1-year postoperative evaluation were as follows: 23.5 ± 17.4 points for ODI, 0.24 ± 0.23 for EQ-5D, 4.1 ± 3.5 for NRS-LP, and 3.7 ± 3.2 for NRS-BP. The different calculation methods generated a range of MCID values for each PRO: 3.3-26.5 points for ODI, 0.04-0.3 points for EQ-5D, 0.6-4.5 points for NRS-LP, and 0.5-4.2 points for NRS-BP. The MDC approach appeared to be the most appropriate for calculating MCID because it provided a threshold greater than the measurement error and was closest to the average change difference between the satisfied and not-satisfied patients. On subgroup analysis, the MCID thresholds for laminectomy-alone patients were comparable to those for the patients who underwent arthrodesis as well as for the entire cohort. CONCLUSIONS The MCID for PROs was highly variable depending on the calculation technique. The MDC seems to be a statistically and clinically sound method for defining the appropriate MCID value for patients with grade I degenerative lumbar spondylolisthesis. Based on this method, the MCID values are 14.3 points for ODI, 0.2 points for EQ-5D, 1.7 points for NRS-LP, and 1.6 points for NRS-BP.
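The distribution-based thresholds reported here follow standard formulas; a minimal sketch is given below, assuming a test-retest reliability value that the paper would supply (the anchor-based NASS calculation additionally needs satisfaction labels and is omitted).

```python
import numpy as np

def distribution_mcid(baseline, followup, reliability=0.9):
    """Distribution-based MCID candidates for one PRO score.
    reliability (test-retest r) is an assumed value, not from the paper."""
    sd = baseline.std(ddof=1)
    half_sd = 0.5 * sd                          # half a standard deviation
    es_small = 0.2 * sd                         # small Cohen effect size
    sem = sd * np.sqrt(1.0 - reliability)       # standard error of measurement
    mdc = 1.96 * sem * np.sqrt(2.0)             # minimum detectable change (95%)
    return half_sd, es_small, sem, mdc

# Illustrative ODI-like scores for a small synthetic cohort
base = np.random.normal(45, 15, size=200).clip(0, 100)
post = (base - np.random.normal(23, 17, size=200)).clip(0, 100)
print([round(v, 1) for v in distribution_mcid(base, post)])
```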
Applications of 3D-EDGE Detection for ALS Point Cloud
NASA Astrophysics Data System (ADS)
Ni, H.; Lin, X. G.; Zhang, J. X.
2017-09-01
Edge detection has been one of the major issues in the field of remote sensing and photogrammetry. With the rapid development of laser scanning sensor technology, dense point clouds have become increasingly common. Precise 3D-edges can be detected from these point clouds, and a great number of edge and feature line extraction methods have been proposed. Among these methods, an easy-to-use 3D-edge detection method, AGPN (Analyzing Geometric Properties of Neighborhoods), has been proposed. The AGPN method detects edges based on the analysis of the geometric properties of a query point's neighbourhood. It detects two kinds of 3D-edges, boundary elements and fold edges, and it has many applications. This paper presents three applications of AGPN, i.e., 3D line segment extraction, ground point filtering, and ground breakline extraction. Experiments show that the AGPN method gives a straightforward solution to these applications.
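A rough sketch of neighbourhood-geometry edge detection in the spirit of AGPN follows; the two detection rules and both thresholds are assumptions standing in for the actual AGPN criteria, not a reproduction of them.

```python
import numpy as np
from scipy.spatial import cKDTree

def detect_edges(pts, k=20, fold_thresh=0.05, boundary_thresh=0.6):
    """Flag fold-edge and boundary candidates from the geometry of each
    query point's k-neighbourhood.  The two rules and both thresholds are
    assumptions standing in for the actual AGPN criteria."""
    tree = cKDTree(pts)
    _, idx = tree.query(pts, k=k + 1)          # +1: the point itself
    fold, boundary = [], []
    for i, nb in enumerate(idx):
        nbrs = pts[nb[1:]]
        centred = nbrs - nbrs.mean(axis=0)
        lam = np.sort(np.linalg.eigvalsh(centred.T @ centred / k))
        variation = lam[0] / max(lam.sum(), 1e-12)   # 0 on a perfect plane
        offset = np.linalg.norm(nbrs.mean(axis=0) - pts[i])
        spacing = np.linalg.norm(nbrs - pts[i], axis=1).mean()
        if variation > fold_thresh:
            fold.append(i)                     # neighbourhood is non-planar
        elif offset > boundary_thresh * spacing:
            boundary.append(i)                 # neighbours lie to one side
    return np.array(fold), np.array(boundary)

# Demo: a flat 30 x 30 grid has no folds but a ring of boundary points.
g = np.linspace(0.0, 1.0, 30)
xx, yy = np.meshgrid(g, g)
plane = np.column_stack([xx.ravel(), yy.ravel(), np.zeros(xx.size)])
fold, boundary = detect_edges(plane)
print(len(fold), len(boundary))
```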
Experimental and numerical analysis of metal leaching from fly ash-amended highway bases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetin, Bora; Aydilek, Ahmet H., E-mail: aydilek@umd.edu; Li, Lin
2012-05-15
Highlights: • This study evaluates the leaching potential of fly ash-lime mixed soils. • This objective is met with experimental and numerical analyses. • Zn leaching decreases with increasing fly ash content, while Ba, B, and Cu increase. • A decrease in lime content promoted leaching of Ba, B, and Cu, while Zn increased. • Numerical analysis predicted lower field metal concentrations. - Abstract: A study was conducted to evaluate the leaching potential of unpaved road materials (URM) mixed with lime-activated high-carbon fly ashes and to evaluate groundwater impacts of barium, boron, copper, and zinc leaching. This objective was met by a combination of batch water leach tests, column leach tests, and computer modeling. The laboratory tests were conducted on soil alone, fly ash alone, and URM-fly ash-lime kiln dust mixtures. The results indicated that an increase in fly ash and lime content has significant effects on the leaching behavior of heavy metals from URM-fly ash mixtures. An increase in fly ash content and a decrease in lime content promoted leaching of Ba, B and Cu, whereas Zn leaching was primarily affected by the fly ash content. Numerically predicted field metal concentrations were significantly lower than the peak metal concentrations obtained in laboratory column leach tests, and field concentrations decreased with time and distance due to dispersion in the soil vadose zone.
Position and volume estimation of atmospheric nuclear detonations from video reconstruction
NASA Astrophysics Data System (ADS)
Schmitt, Daniel T.
Recent work in digitizing films of foundational atmospheric nuclear detonations from the 1950s provides an opportunity to perform deeper analysis on these historical tests. This work leverages multi-view geometry and computer vision techniques to provide an automated means of performing three-dimensional analysis of the blasts at several points in time. Accomplishing this requires careful alignment of the films in time, detection of features in the images, matching of features, and multi-view reconstruction. Sub-explosion features can be detected with a 67% hit rate and 22% false alarm rate. Hotspot features can be detected with a 71.95% hit rate, 86.03% precision and a 0.015% false positive rate. Detected hotspots are matched across 57-109 degree viewpoints with 76.63% average correct matching by defining their location relative to the center of the explosion, rotating them to the alternative viewpoint, and matching them collectively. When 3D reconstruction is applied to the hotspot matching, it completes an automated process that has been used to create 168 3D point clouds averaging 31.6 points per reconstruction, with each point accurate to 0.62 meters overall (0.35, 0.24, and 0.34 meters in the x-, y- and z-directions, respectively). As a demonstration of using the point clouds for analysis, volumes are estimated and shown to be consistent with radius-based models and, in some cases, to improve on the level of uncertainty in the yield calculation.
Evaluating Alternatives for Drinking Water at Deployed Locations
2006-03-01
Tucker and Sands, 1999; Beering, 2002). 1986: Plutonium was found in the New York City drinking water system. Though the concentrations were...based approach called Hazard Analysis and Critical Control Point (HACCP). This approach holds that avoidance is practical and effective where other
Multi-scale trends analysis of landscape stressors in an urbanizing coastal watershed
Anthropogenic land-based stressors within a watershed can deliver major impacts to downstream and adjacent coastal waterways, affecting water quality and estuarine habitats. Our research focused on a subset of non-point sources of watershed stressors, specifically human population...
NASA Astrophysics Data System (ADS)
Langhammer, Jakub; Lendzioch, Theodora; Mirijovsky, Jakub
2016-04-01
Granulometric analysis is a traditional and important method for describing sedimentary material, with various applications in sedimentology, hydrology and geomorphology. However, conventional granulometric field survey methods are time consuming, laborious, costly and invasive to the surface being sampled, which can be a limiting factor for their applicability in protected areas. Optical granulometry has recently emerged as an image analysis technique enabling non-invasive survey; it employs semi-automated identification of clasts from calibrated digital imagery taken on site with a conventional high-resolution digital camera and a calibration frame. The image processing allows detection and measurement of mixed-size natural grains, their sorting and their quantitative analysis using standard granulometric approaches. Despite known limitations, the technique now represents a reliable tool that significantly eases and speeds up field surveys in fluvial geomorphology. Nevertheless, such surveys remain limited in their spatial coverage of the sites and in their applicability to multitemporal research. In our study, we present a novel approach based on the fusion of two image analysis techniques - optical granulometry and UAV-based photogrammetry - that bridges the gap between the need for high-resolution structural information for granulometric analysis and the need for spatially accurate, seamless data coverage. We have developed and tested a workflow that uses a UAV imaging platform to deliver seamless, high-resolution and spatially accurate imagery of the study site, from which the granulometric properties of the sedimentary material can be derived. We have set up a workflow modeling chain providing (i) the optimum flight parameters for UAV imagery, balancing the two key divergent requirements of imagery resolution and seamless spatial coverage, (ii) the workflow for processing the UAV-acquired imagery by means of optical granulometry, and (iii) the workflow for analyzing the spatial distribution and temporal changes of granulometric properties across a point bar. The proposed technique was tested in a case study of an active point bar of a mid-latitude mountain stream in the Sumava Mountains, Czech Republic, exposed to repeated flooding. UAV photogrammetry was used to acquire very high resolution imagery to build high-precision digital terrain models and an orthoimage. The orthoimage was then analyzed using the digital optical granulometric tool BaseGrain. This approach allowed us (i) to analyze the spatial distribution of grain size in seamless transects over an active point bar and (ii) to assess the multitemporal changes in the granulometric properties of the point bar material resulting from flooding. The tested framework proves the applicability of the proposed method for granulometric analysis with accuracy comparable to field optical granulometry. The seamless nature of the data enables study of the spatial distribution of granulometric properties across the study sites as well as analysis of multitemporal changes based on repeated imaging.
Hermanrud, Christina; Ryner, Malin; Luft, Thomas; Jensen, Poul Erik; Ingenhoven, Kathleen; Rat, Dorothea; Deisenhammer, Florian; Sørensen, Per Soelberg; Pallardy, Marc; Sikkema, Dan; Bertotti, Elisa; Kramer, Daniel; Creeke, Paul; Fogdell-Hahn, Anna
2016-03-01
Neutralizing anti-drug antibodies (NAbs) against therapeutic interferon beta (IFNβ) in people with multiple sclerosis (MS) are measured with cell-based bioassays. The aim of this study was to redevelop and validate two luciferase reporter-gene bioassays, LUC and iLite, using a cut-point approach to identify NAb positive samples. Such an approach is favored by the pharmaceutical industry and governmental regulatory agencies as it has a clear statistical basis and overcomes the limitations of the current assays based on the Kawade principle. The work was conducted following the latest assay guidelines. The assays were re-developed and validated as part of the "Anti-Biopharmaceutical Immunization: Prediction and analysis of clinical relevance to minimize the risk" (ABIRISK) consortium and involved a joint collaboration between four academic laboratories and two pharmaceutical companies. The LUC assay was validated at Innsbruck Medical University (LUCIMU) and at Rigshospitalet (LUCRH) Copenhagen, and the iLite assay at Karolinska Institutet, Stockholm. For both assays, the optimal serum sample concentration in relation to sensitivity and recovery was 2.5% (v/v) in assay media. A Shapiro-Wilk test indicated a normal distribution for the majority of runs, allowing a parametric approach for cut-point calculation to be used, where NAb positive samples could be identified with 95% confidence. An analysis of means and variances indicated that a floating cut-point should be used for all assays. The assays demonstrated acceptable sensitivity for being cell-based assays, with a confirmed limit of detection in neat serum of 1519 ng/mL for LUCIMU, 814 ng/mL for LUCRH, and 320 ng/mL for iLite. Use of the validated cut-point assay, in comparison with the previously used Kawade method, identified 14% more NAb positive samples. In conclusion, implementation of the cut-point design resulted in increased sensitivity to detect NAbs. However, the clinical significance of these low positive titers needs to be further evaluated. Copyright © 2016 Elsevier B.V. All rights reserved.
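A minimal sketch of the parametric screening cut-point computation described above, assuming signals from drug-naive negative sera; the Shapiro-Wilk gate mirrors the normality check in the text, and the 1.645 multiplier gives the 95% one-sided confidence level. The mock data are invented.

```python
import numpy as np
from scipy import stats

def parametric_cut_point(neg_signals, alpha=0.05):
    """Screening cut-point from drug-naive negative sera: mean + 1.645*SD
    targets a 5% false-positive rate.  A floating cut-point applies the
    same offset to each run's negative-control mean instead."""
    _, p = stats.shapiro(neg_signals)        # normality check, as in the text
    if p < 0.05:
        raise ValueError("non-normal data: use a nonparametric percentile")
    return np.mean(neg_signals) + stats.norm.ppf(1 - alpha) * np.std(neg_signals, ddof=1)

# Mock normalized assay responses from 120 negative donors (assumed data).
rng = np.random.default_rng(2)
print(parametric_cut_point(rng.normal(1.0, 0.15, size=120)))
```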
D'Agostino, Emily M; Patel, Hersila H; Hansen, Eric; Mathew, M Sunil; Nardi, Maria; Messiah, Sarah E
2018-03-01
The WHO calls for affordable population-based prevention strategies for reducing the global burden of cardiovascular disease (CVD) on morbidity and mortality; however, effective, sustainable and accessible community-based approaches for CVD prevention in at-risk youth have yet to be identified. We examined the effects of implementing a daily park-based afterschool fitness programme on youth CVD risk profiles over 5 years and across area poverty subgroups. The study included 2264 youth (mean age 9.4 years, 54% male, 50% Hispanic, 47% non-Hispanic black, 70% high/very high area poverty) in Miami, Florida, USA. We used three-level repeated measures mixed models to determine the longitudinal effects of programme participation on modifiable CVD outcomes (2010-2016). Duration of programme participation was significantly associated with CVD risk profile improvements, including body mass index (BMI) z-score, diastolic/systolic blood pressure, skinfold thicknesses, waist-hip ratio, sit-ups, push-ups, Progressive Aerobic Cardiovascular Endurance Run (PACER) score, 400 m run time, probability of developing systolic/diastolic hypertension and overweight/obesity in high/very high poverty neighbourhoods (P<0.001). Diastolic blood pressure decreased 3.4 percentile points (95% CI -5.85 to -0.85), 8.1 percentile points (95% CI -11.98 to -4.26), 6.1 percentile points (95% CI -11.49 to -0.66), 7.6 percentile points (95% CI -15.33 to -0.15) and 11.4 percentile points (95% CI -25.32 to 2.61) for 1-5 years, respectively, in high/very high poverty areas. In contrast, significant improvements were found only for PACER score and waist-hip ratio in low/mid poverty areas. This analysis presents compelling evidence demonstrating that park-based afterschool programmes can successfully maintain or improve at-risk youth CVD profiles over multiple years. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Likelihood ratio meta-analysis: New motivation and approach for an old method.
Dormuth, Colin R; Filion, Kristian B; Platt, Robert W
2016-03-01
A 95% confidence interval (CI) in an updated meta-analysis may not have the expected 95% coverage. If a meta-analysis is simply updated with additional data, then the resulting 95% CI will be wrong because it will not have accounted for the fact that the earlier meta-analysis failed or succeeded in excluding the null. This situation can be avoided by using the likelihood ratio (LR) as a measure of evidence that does not depend on type I error. We show how an LR-based approach, first advanced by Goodman, can be used in a meta-analysis to pool data from separate studies to quantitatively assess where the total evidence points. The method works by estimating the log-likelihood ratio (LogLR) function from each study. Those functions are then summed to obtain a combined function, which is then used to retrieve the total effect estimate and a corresponding 'intrinsic' confidence interval. Using as illustrations the CAPRIE trial of clopidogrel versus aspirin in the prevention of ischemic events, and our own meta-analysis of higher potency statins and the risk of acute kidney injury, we show that the LR-based method yields the same point estimate as the traditional analysis, but with an intrinsic confidence interval that is appropriately wider than the traditional 95% CI. The LR-based method can be used to conduct both fixed-effect and random-effects meta-analyses, it can be applied to old and new meta-analyses alike, and results can be presented in a format that is familiar to a meta-analytic audience. Copyright © 2016 Elsevier Inc. All rights reserved.
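A minimal sketch of the pooling step under a normal approximation to each study's likelihood; the per-study estimates, standard errors, and the support-interval criterion k = 8 are illustrative assumptions, not values from the paper.

```python
import numpy as np

def combined_loglr(theta, estimates, ses):
    """Sum of per-study normal log-likelihood ratios of theta vs. the null 0."""
    est, se = np.asarray(estimates), np.asarray(ses)
    return np.sum((est**2 - (est - theta) ** 2) / (2.0 * se**2))

# Invented log-odds-ratio estimates and standard errors for three studies.
est, se = [0.20, 0.35, 0.10], [0.10, 0.15, 0.08]

grid = np.linspace(-0.5, 1.0, 2001)
loglr = np.array([combined_loglr(t, est, se) for t in grid])
mle = grid[loglr.argmax()]                        # inverse-variance weighted mean
support = grid[loglr >= loglr.max() - np.log(8)]  # 1/8 support interval (assumed k)
print(f"pooled estimate {mle:.3f}, "
      f"support interval [{support[0]:.3f}, {support[-1]:.3f}]")
```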
An automated smartphone-based diagnostic assay for point-of-care semen analysis.
Kanakasabapathy, Manoj Kumar; Sadasivam, Magesh; Singh, Anupriya; Preston, Collin; Thirumalaraju, Prudhvi; Venkataraman, Maanasa; Bormann, Charles L; Draz, Mohamed Shehata; Petrozza, John C; Shafiee, Hadi
2017-03-22
Male infertility affects up to 12% of the world's male population and is linked to various environmental and medical conditions. Manual microscope-based testing and computer-assisted semen analysis (CASA) are the current standard methods to diagnose male infertility; however, these methods are labor-intensive, expensive, and laboratory-based. Cultural and socially dominated stigma against male infertility testing hinders a large number of men from getting tested for infertility, especially in resource-limited African countries. We describe the development and clinical testing of an automated smartphone-based semen analyzer designed for quantitative measurement of sperm concentration and motility for point-of-care male infertility screening. Using a total of 350 clinical semen specimens at a fertility clinic, we have shown that our assay can analyze an unwashed, unprocessed liquefied semen sample with <5-s mean processing time and provide the user a semen quality evaluation based on the World Health Organization (WHO) guidelines with ~98% accuracy. The work suggests that the integration of microfluidics, optical sensing accessories, and advances in consumer electronics, particularly smartphone capabilities, can make remote semen quality testing accessible to people in both developed and developing countries who have access to smartphones. Copyright © 2017, American Association for the Advancement of Science.
Rojo-Manaute, Jose Manuel; Capa-Grasa, Alberto; Del Cerro-Gutiérrez, Miguel; Martínez, Manuel Villanueva; Chana-Rodríguez, Francisco; Martín, Javier Vaquero
2012-03-01
Trigger digit surgery can be performed by an open approach using classic open surgery, by a wide-awake approach, or by sonographically guided first annular pulley release in day surgery and office-based ambulatory settings. Our goal was to perform a turnover and economic analysis of 3 surgical models. Two studies were conducted. The first was a turnover analysis of 57 patients allocated 4:4:1 into the surgical models: sonographically guided-office-based, classic open-day surgery, and wide-awake-office-based. Regression analysis of the turnover time was monitored to assess stability (R(2) < .26). Second, on the basis of turnover times and hospital tariff revenues, we calculated the total costs, income to cost ratio, opportunity cost, true cost, true net income (primary variable), break-even points for sonographically guided fixed costs, and a 1-way analysis for identifying thresholds among alternatives. Thirteen sonographically guided-office-based patients were withdrawn because of a learning curve influence. The wide-awake (n = 6) and classic (n = 26) models were compared to the last 25% of the sonographically guided group (n = 12), which showed significantly shorter mean turnover times, income to cost ratios 2.52 and 10.9 times larger, and true costs 75.48 and 20.92 times lower, respectively. A true net income break-even point occurred after 19.78 sonographically guided-office-based procedures. Sensitivity analysis showed a threshold between wide-awake and last-25% sonographically guided true costs if the last-25% sonographically guided turnover times reached 65.23 and 27.81 minutes, respectively. This trial comparing surgical models was underpowered and is inconclusive on turnover times; however, the sonographically guided-office-based approach showed shorter turnover times and better economic results, with a quick recoup of the costs of sonographically assisted surgery.
Spacecraft Maneuvering at the Sun/Earth-Moon L2 Libration Point
NASA Astrophysics Data System (ADS)
Shahid, Kamran
Spacecraft formation flying in the vicinity of the Sun/Earth-Moon libration points offers many promising possibilities for space exploration. The concept of formation flying involves the distribution of the functionality of a single spacecraft among several smaller, cooperative spacecraft. The libration points are locations relative to two large orbiting bodies where a third body of relatively small mass can remain stationary relative to the two larger bodies. The most significant perturbation experienced by a spacecraft at the libration point is the effect of solar radiation pressure. This thesis presents the development of nonlinear control techniques for maneuvering control at the Sun-Earth/Moon L2 libration point. A new thruster-based formation control technique is presented. We also consider a leader/follower formation architecture, and examine the station-keeping control of the leader spacecraft and the formation control of the follower spacecraft using solar radiation pressure. Reference trajectories of the leader spacecraft, halo and Lissajous orbits, are determined using a numerical technique in order to take into account all major gravitational perturbations. The nonlinear controllers are developed based on Lyapunov analysis, including non-adaptive and adaptive designs. Thruster-based and solar radiation pressure-based control laws for spacecraft maneuvering at the Sun-Earth/Moon libration point are developed. Higher-order sliding mode control is utilized to address the non-affine structure of the solar sail control inputs. The reduced-input solar radiation pressure problem is properly addressed as an underactuated control problem. The development of adaptive control for solar sail-equipped spacecraft is an innovation and represents an advancement in solar sailing control technology. Controller performance is evaluated in a high-fidelity ephemeris model to reflect a realistic simulated space environment. The numerical results demonstrate the effectiveness of the proposed control techniques for spacecraft maneuvering using solar radiation pressure at the L2 libration point. Station-keeping accuracies of 50 m and formation maintenance accuracies of less than 1 m are possible using solar radiation pressure at a sub-L2 libration point. The benefits of these control techniques include increasing libration point mission lifetimes and doubling payload mass fractions as compared to conventional propulsion methods.
NASA Technical Reports Server (NTRS)
Rogers, James L.; Sobieszczanski-Sobieski, Jaroslaw
1989-01-01
Only recently have engineers begun making use of Artificial Intelligence (AI) tools in the area of conceptual design. To continue filling this void in the design process, a prototype knowledge-based system, called STRUTEX has been developed to initially configure a structure to support point loads in two dimensions. This prototype was developed for testing the application of AI tools to conceptual design as opposed to being a testbed for new methods for improving structural analysis and optimization. This system combines numerical and symbolic processing by the computer with interactive problem solving aided by the vision of the user. How the system is constructed to interact with the user is described. Of special interest is the information flow between the knowledge base and the data base under control of the algorithmic main program. Examples of computed and refined structures are presented during the explanation of the system.
Patwary, Nurmohammed; Preza, Chrysanthe
2015-01-01
A depth-variant (DV) image restoration algorithm for wide-field fluorescence microscopy, using an orthonormal basis decomposition of DV point-spread functions (PSFs), is investigated in this study. The efficient PSF representation is based on a previously developed principal component analysis (PCA), which is computationally intensive. We present an approach developed to reduce the number of DV PSFs required for the PCA computation, thereby making the PCA-based approach computationally tractable for thick samples. Restoration results from both synthetic and experimental images show consistency and demonstrate that the proposed algorithm efficiently addresses depth-induced aberration using a small number of principal components. Comparison of the PCA-based algorithm with a previously developed strata-based DV restoration algorithm demonstrates that the proposed method improves performance by 50% in terms of accuracy and simultaneously reduces the processing time by 64% using comparable computational resources. PMID:26504634
Access point selection game with mobile users using correlated equilibrium.
Sohn, Insoo
2015-01-01
One of the most important issues in wireless local area network (WLAN) systems with multiple access points (APs) is the AP selection problem. Game theory is a mathematical tool used to analyze the interactions in multiplayer systems and has been applied to various problems in wireless networks. Correlated equilibrium (CE) is one of the powerful game theory solution concepts, which is more general than the Nash equilibrium for analyzing the interactions in multiplayer mixed strategy games. A game-theoretic formulation of the AP selection problem with mobile users is presented using a novel scheme based on a regret-based learning procedure. Through convergence analysis, we show that the joint actions based on the proposed algorithm achieve CE. Simulation results illustrate that the proposed algorithm is effective in a realistic WLAN environment with user mobility and achieves maximum system throughput based on the game-theoretic formulation.
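A minimal sketch of a regret-based learning procedure of the Hart and Mas-Colell type, whose empirical play converges to the set of correlated equilibria; the AP capacities, the equal-share utility model, and the normalizer are assumptions, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(3)
n_users, n_aps, T = 20, 3, 3000
capacity = np.array([54.0, 54.0, 11.0])     # assumed AP capacities (Mbps)

def utilities(choice):
    # Each user gets an equal share of the capacity of the AP it selected.
    load = np.bincount(choice, minlength=n_aps)
    return capacity[choice] / np.maximum(load[choice], 1)

regret = np.zeros((n_users, n_aps, n_aps))  # cumulative regret a -> b
choice = rng.integers(n_aps, size=n_users)

for t in range(1, T + 1):
    u = utilities(choice)
    new_choice = choice.copy()
    for i in range(n_users):
        a = choice[i]
        for b in range(n_aps):              # what if user i had picked b?
            alt = choice.copy(); alt[i] = b
            regret[i, a, b] += utilities(alt)[i] - u[i]
        pos = np.maximum(regret[i, a] / t, 0.0)
        pos[a] = 0.0
        mu = 2.0 * n_aps * max(pos.max(), 1.0)   # normalizer, large enough
        probs = pos / mu
        probs[a] = 1.0 - probs.sum()        # stay with the remaining mass
        new_choice[i] = rng.choice(n_aps, p=probs)
    choice = new_choice

print("final AP loads:", np.bincount(choice, minlength=n_aps))
```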
Bennett, Michael I; Bagnall, Anne-Marie; José Closs, S
2009-06-01
This review aimed to quantify the benefit of patient-based educational interventions in the management of cancer pain. We undertook a systematic review and meta-analysis of experimentally randomised and non-randomised controlled clinical trials identified from six databases from inception to November 2007. Two reviewers independently selected trials comparing intervention (formal instruction on cancer pain and analgesia on an individual basis using any medium) to usual care or other control in adults with cancer pain. Methodological quality was assessed, and data extraction was undertaken by one reviewer with a second reviewer checking for accuracy. We used a random-effects model to combine the effect estimates from studies. Main outcome measures were effects on knowledge of and attitudes towards cancer pain and analgesia, and pain intensity. Twenty-one trials (19 randomised) totalling 3501 patients met the inclusion criteria, and 15 were included in the meta-analysis. Compared to usual care or control, educational interventions improved knowledge and attitudes by half a point on a 0-5 rating scale (weighted mean difference 0.52, 95% confidence interval 0.04-1.0), reduced average pain intensity by over one point on a 0-10 rating scale (WMD -1.1, -1.8 to -0.41) and reduced worst pain intensity by just under one point (WMD -0.78, -1.21 to -0.35). We found equivocal evidence for the effect of education on self-efficacy, but no significant benefit on medication adherence or on reducing interference with daily activities. Patient-based educational interventions can result in modest but significant benefits in the management of cancer pain, and are probably underused alongside more traditional analgesic approaches.
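For reference, a minimal sketch of the random-effects pooling behind such weighted mean differences (DerSimonian-Laird); the per-trial effects and standard errors are invented for illustration.

```python
import numpy as np

def dersimonian_laird(effects, ses):
    """Random-effects pooled estimate and 95% CI (DerSimonian-Laird)."""
    y, v = np.asarray(effects, float), np.asarray(ses, float) ** 2
    w = 1.0 / v
    fixed = np.sum(w * y) / w.sum()
    q = np.sum(w * (y - fixed) ** 2)                       # Cochran's Q
    c = w.sum() - np.sum(w**2) / w.sum()
    tau2 = max(0.0, (q - (len(y) - 1)) / c)                # between-study var
    w_star = 1.0 / (v + tau2)
    est = np.sum(w_star * y) / w_star.sum()
    se = np.sqrt(1.0 / w_star.sum())
    return est, (est - 1.96 * se, est + 1.96 * se)

# Invented per-trial mean differences in pain intensity and their SEs.
print(dersimonian_laird([-1.3, -0.9, -1.6, -0.4], [0.4, 0.5, 0.6, 0.35]))
```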
Comparing topography-based verbal behavior with stimulus selection-based verbal behavior
Sundberg, Carl T.; Sundberg, Mark L.
1990-01-01
Michael (1985) distinguished between two types of verbal behavior: topography-based and stimulus selection-based verbal behavior. The current research was designed to empirically examine these two types of verbal behavior while addressing the frequently debated question, Which augmentative communication system should be used with the nonverbal developmentally disabled person? Four mentally retarded adults served as subjects. Each subject was taught to tact an object by either pointing to its corresponding symbol (selection-based verbal behavior), or making the corresponding sign (topography-based verbal behavior). They were then taught an intraverbal relation, and were tested for the emergence of stimulus equivalence relations. The results showed that signed responses were acquired more readily than pointing responses as measured by the acquisition of tacts and intraverbals, and the formation of equivalence classes. These results support Michael's (1985) analysis, and have important implications for the design of language intervention programs for the developmentally disabled. PMID:22477602
Continuously Deformation Monitoring of Subway Tunnel Based on Terrestrial Point Clouds
NASA Astrophysics Data System (ADS)
Kang, Z.; Tuo, L.; Zlatanova, S.
2012-07-01
The deformation monitoring of subway tunnels is of extraordinary necessity. Therefore, a method for deformation monitoring based on terrestrial point clouds is proposed in this paper. First, the traditional adjacent-station registration is replaced by section-controlled registration, so that common control points can be used by each station and error accumulation within a section is thus avoided. Afterwards, the central axis of the subway tunnel is determined through the RANSAC (Random Sample Consensus) algorithm and curve fitting. Although laser point clouds have very high resolution, the points are still discrete; the vertical section is therefore computed via quadric fitting of the vicinity of interest, rather than by fitting the whole tunnel model, and is determined by the intersection line rotated about the central axis of the tunnel within a vertical plane. The extraction of the vertical section is then optimized using RANSAC in order to filter out noise. Based on the extracted vertical sections, the volume of tunnel deformation is estimated by comparing vertical sections extracted at the same position from different epochs of point clouds. Furthermore, the continuously extracted vertical sections are used to evaluate the convergent tendency of the tunnel. The proposed algorithms are verified using real datasets in terms of accuracy and computational efficiency. The fitting accuracy analysis shows that the maximum deviation between interpolated and real points is 1.5 mm and the minimum is 0.1 mm; the convergent tendency of the tunnel was detected by comparing adjacent fitting radii, with a maximum error of 6 mm and a minimum of 1 mm. The computation cost of vertical section extraction is within 3 seconds per section, which demonstrates high efficiency.
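A minimal sketch of the RANSAC step on a single cross-section, using a circle as a stand-in for the fitted section curve; the inlier tolerance and the mock data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def circle_from_3pts(p):
    """Circle (centre, radius) through three 2D points, from the linear
    system given by equating squared distances to the centre."""
    (x1, y1), (x2, y2), (x3, y3) = p
    A = np.array([[x2 - x1, y2 - y1], [x3 - x1, y3 - y1]])
    b = 0.5 * np.array([x2**2 - x1**2 + y2**2 - y1**2,
                        x3**2 - x1**2 + y3**2 - y1**2])
    c = np.linalg.solve(A, b)
    return c, np.hypot(*(p[0] - c))

def ransac_circle(pts, tol=0.005, iters=500):
    """Keep the circle with the largest consensus set; points within
    'tol' metres of the model count as inliers (tol is an assumption)."""
    best, best_count = None, 0
    for _ in range(iters):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        try:
            c, r = circle_from_3pts(sample)
        except np.linalg.LinAlgError:
            continue                              # collinear sample
        count = np.sum(np.abs(np.hypot(*(pts - c).T) - r) < tol)
        if count > best_count:
            best, best_count = (c, r), count
    return best

# Mock section: a 2.78 m circle with 2 mm noise plus 10% gross outliers.
theta = rng.uniform(0.0, 2.0 * np.pi, 500)
pts = np.column_stack([2.78 * np.cos(theta), 2.78 * np.sin(theta)])
pts += rng.normal(0.0, 0.002, pts.shape)
pts[:50] += rng.normal(0.0, 0.3, (50, 2))         # cables, installations
c, r = ransac_circle(pts)
print(f"centre {c}, radius {r:.3f} m")
```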
NASA Astrophysics Data System (ADS)
Harmening, Corinna; Neuner, Hans
2016-09-01
Due to the establishment of terrestrial laser scanners, analysis strategies in engineering geodesy are changing from pointwise approaches to areal ones. These areal analysis strategies are commonly built on the modelling of the acquired point clouds. Freeform curves and surfaces like B-spline curves/surfaces are one possible approach to obtain space-continuous information. A variety of parameters determines a B-spline's appearance; its complexity is mostly determined by the number of control points. Usually, this number is chosen quite arbitrarily by intuitive trial-and-error procedures. In this paper, the Akaike Information Criterion and the Bayesian Information Criterion are investigated with regard to a justified and reproducible choice of the optimal number of control points of B-spline curves. Additionally, we develop a method based on the structural risk minimization of statistical learning theory. Unlike the Akaike and Bayesian Information Criteria, this method does not use the number of parameters as the complexity measure of the approximating functions but their Vapnik-Chervonenkis dimension. Furthermore, it is also valid for non-linear models. Thus, the three methods differ in the target function to be minimized and consequently in their definition of optimality. The present paper will be continued by a second paper dealing with the choice of the optimal number of control points of B-spline surfaces.
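A minimal sketch of scoring candidate numbers of control points with AIC and BIC for a least-squares cubic B-spline fit; equally spaced interior knots and the mock profile are assumptions (the VC-dimension-based method is not reproduced here).

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

rng = np.random.default_rng(5)
x = np.linspace(0.0, 1.0, 500)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(x.size)  # mock profile

def information_criteria(n_ctrl, k=3):
    """AIC/BIC of a least-squares cubic B-spline with n_ctrl control
    points, using the Gaussian log-likelihood of the residuals."""
    n_knots = n_ctrl - k - 1                      # interior knots for this size
    t = np.linspace(0.0, 1.0, n_knots + 2)[1:-1]  # equally spaced (assumed)
    spline = LSQUnivariateSpline(x, y, t, k=k)
    rss, n = float(spline.get_residual()), x.size
    return (n * np.log(rss / n) + 2 * n_ctrl,            # AIC
            n * np.log(rss / n) + n_ctrl * np.log(n))    # BIC

for m in range(5, 30, 4):
    aic, bic = information_criteria(m)
    print(f"{m:3d} control points: AIC={aic:9.1f}  BIC={bic:9.1f}")
```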
Crossfit analysis: a novel method to characterize the dynamics of induced plant responses.
Jansen, Jeroen J; van Dam, Nicole M; Hoefsloot, Huub C J; Smilde, Age K
2009-12-16
Many plant species show induced responses that protect them against exogenous attacks. These responses involve the production of many different bioactive compounds. Plant species belonging to the Brassicaceae family produce defensive glucosinolates, which may greatly influence their favorable nutritional properties for humans. Each responding compound may have its own dynamic profile and metabolic relationships with other compounds. The chemical background of the induced response is therefore highly complex and may therefore not reveal all the properties of the response in any single model. This study therefore aims to describe the dynamics of the glucosinolate response, measured at three time points after induction in a feral Brassica, by a three-faceted approach, based on Principal Component Analysis. First the large-scale aspects of the response are described in a 'global model' and then each time-point in the experiment is individually described in 'local models' that focus on phenomena that occur at specific moments in time. Although each local model describes the variation among the plants at one time-point as well as possible, the response dynamics are lost. Therefore a novel method called the 'Crossfit' is described that links the local models of different time-points to each other. Each element of the described analysis approach reveals different aspects of the response. The crossfit shows that smaller dynamic changes may occur in the response that are overlooked by global models, as illustrated by the analysis of a metabolic profiling dataset of the same samples.
Dynamical analysis of a fractional SIR model with birth and death on heterogeneous complex networks
NASA Astrophysics Data System (ADS)
Huo, Jingjing; Zhao, Hongyong
2016-04-01
In this paper, a fractional SIR model with birth and death rates on heterogeneous complex networks is proposed. Firstly, we obtain a threshold value R0, based on the existence of the endemic equilibrium point E∗, which completely determines the dynamics of the model. Secondly, by using Lyapunov functions and Kirchhoff's matrix tree theorem, the global asymptotic stability of the disease-free equilibrium point E0 and the endemic equilibrium point E∗ of the model is investigated. That is, when R0 < 1, the disease-free equilibrium point E0 is globally asymptotically stable and the disease always dies out; when R0 > 1, the disease-free equilibrium point E0 becomes unstable and there exists a unique endemic equilibrium point E∗, which is globally asymptotically stable, and the disease is uniformly persistent. Finally, the effects of various immunization schemes are studied and compared. Numerical simulations are given to demonstrate the main results.
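For orientation, a sketch of an integer-order degree-block mean-field SIR with births and deaths; the threshold form R0 = β⟨k²⟩/((μ+γ)⟨k⟩) and all rates are standard mean-field assumptions, not necessarily the paper's fractional-order formulation, which replaces d/dt with a Caputo derivative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Degree distribution P(k) ~ k^(-2.5) and illustrative rates (assumptions).
k = np.arange(1, 51, dtype=float)
pk = k**-2.5 / np.sum(k**-2.5)
beta, gamma, mu = 0.05, 0.2, 0.02        # infection / recovery / birth-death

R0 = beta * np.sum(k**2 * pk) / ((mu + gamma) * np.sum(k * pk))
print(f"R0 = {R0:.2f} -> {'endemic' if R0 > 1 else 'disease-free'}")

def rhs(t, y):
    s, i = y[:k.size], y[k.size:]
    theta = np.sum(k * pk * i) / np.sum(k * pk)  # prob. a random link is infected
    ds = mu - beta * k * s * theta - mu * s
    di = beta * k * s * theta - (gamma + mu) * i
    return np.concatenate([ds, di])

y0 = np.concatenate([np.full(k.size, 0.99), np.full(k.size, 0.01)])
sol = solve_ivp(rhs, (0.0, 400.0), y0, rtol=1e-8)
print(f"final mean prevalence: {np.sum(pk * sol.y[k.size:, -1]):.4f}")
```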
Ou, Huang-Tz; Chen, Yen-Ting; Liu, Ya-Ming; Wu, Jin-Shang
2016-06-01
To assess the cost-effectiveness of metformin-based dual therapies associated with cardiovascular disease (CVD) risk in a Chinese population with type 2 diabetes. We utilized Taiwan's National Health Insurance Research Database (NHIRD) 1997-2011, which is derived from the claims of National Health Insurance, a mandatory-enrollment single-payer system that covers over 99% of Taiwan's population. Four metformin-based dual therapy cohorts were used, namely a reference group of metformin plus sulfonylureas (Metformin-SU) and metformin plus acarbose, metformin plus thiazolidinediones (Metformin-TZD), and metformin plus glinides (Metformin-glinides). Using propensity scores, each subject in a comparison cohort was 1:1 matched to a referent. The effectiveness outcome was CVD risk. Only direct medical costs were included. The Markov chain model was applied to project lifetime outcomes, discounted at 3% per annum. The bootstrapping technique was performed to assess uncertainty in analysis. Metformin-glinides was most cost-effective in the base-case analysis; Metformin-glinides saved $194 USD for one percentage point of reduction in CVD risk, as compared to Metformin-SU. However, for the elderly or those with severe diabetic complications, Metformin-TZD, especially pioglitazone, was more suitable; as compared to Metformin-SU, Metformin-TZD saved $840.1 USD per percentage point of reduction in CVD risk. Among TZDs, Metformin-pioglitazone saved $1831.5 USD per percentage point of associated CVD risk reduction, as compared to Metformin-rosiglitazone. When CVD is considered an important clinical outcome, Metformin-pioglitazone is cost-effective, in particular for the elderly and those with severe diabetic complications. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Fa, L. N.
2017-11-01
As important subjects of environmental interests, enterprises ought to undertake corresponding responsibility for pollution control and environmental protection. In our country, however, enterprise environmental responsibility is seriously lacking. Based on law-and-economics analysis, this article concludes, through game analysis and cost-benefit analysis, that the prisoner's dilemma in the environmental-interest game between enterprises is the inherent cause of this serious defect in enterprise environmental responsibility. Meanwhile, from a cost-benefit point of view, the externality of environmentally illegal acts results in an imbalance between costs and benefits, leaving enterprises without the motivation to actively control pollution and protect the environment.
Orientational analysis of planar fibre systems observed as a Poisson shot-noise process.
Kärkkäinen, Salme; Lantuéjoul, Christian
2007-10-01
We consider two-dimensional fibrous materials observed as a digital greyscale image. The problem addressed is to estimate the orientation distribution of unobservable thin fibres from a greyscale image modelled by a planar Poisson shot-noise process. The classical stereological approach is not straightforward, because the point intensities of thin fibres along sampling lines may not be observable. For such cases, Kärkkäinen et al. (2001) suggested the use of scaled variograms determined from grey values along sampling lines in several directions. Their method is based on the assumption that the proportion between the scaled variograms and point intensities in all directions of sampling lines is constant. This assumption is proved to be valid asymptotically for Boolean models and dead leaves models, under some regularity conditions. In this work, we derive the scaled variogram and its approximations for a planar Poisson shot-noise process using the modified Bessel function. In the case of reasonably high resolution of the observed image, the scaled variogram has an approximate functional relation to the point intensity, and in the case of high resolution the relation is proportional. As the obtained relations are approximate, they are tested on simulations. The existing orientation analysis method based on the proportional relation is further tested on images with different resolutions. The new result, the asymptotic proportionality between the scaled variograms and the point intensities for a Poisson shot-noise process, completes the earlier results for the Boolean models and for the dead leaves models.
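A minimal sketch of the empirical variogram of grey values along sampling lines, the raw ingredient of the scaled-variogram method; the lag scaling shown and the mock image are assumptions (the paper's exact normalization is not reproduced).

```python
import numpy as np

def line_variogram(img, lags, direction="horizontal"):
    """Empirical variogram of grey values along sampling lines:
    gamma(h) = 0.5 * mean[(Z(x+h) - Z(x))^2]."""
    z = np.asarray(img, float)
    if direction == "vertical":
        z = z.T                              # sample along columns instead
    return np.array([0.5 * np.mean((z[:, h:] - z[:, :-h]) ** 2) for h in lags])

# Mock greyscale image standing in for a shot-noise realization.
rng = np.random.default_rng(6)
img = rng.poisson(5.0, size=(256, 256))
lags = np.arange(1, 16)
print(line_variogram(img, lags) / lags)      # lag-scaled values (assumed scaling)
```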
Automatic detection of lung vessel bifurcation in thoracic CT images
NASA Astrophysics Data System (ADS)
Maduskar, Pragnya; Vikal, Siddharth; Devarakota, Pandu
2011-03-01
Computer-aided diagnosis (CAD) systems for the detection of lung nodules have been an active topic of research in recent years. It is desirable that a CAD system generate very few false positives (FPs) while maintaining high sensitivity. This work aims to reduce the number of false positives occurring at vessel bifurcation points. FPs occur quite frequently at vessel branching points, whose shape can appear locally spherical due to the intrinsic geometry of intersecting tubular vessel structures combined with partial volume effects and the soft-tissue attenuation of the surrounding parenchyma. We propose a model-based technique for detection of vessel branching points using skeletonization, followed by branch-point analysis. First we perform vessel structure enhancement using a multi-scale Hessian filter to accurately segment tubular structures of various sizes, followed by thresholding to get a binary vessel structure segmentation [6]. A modified Reeb graph [7] is applied next to extract the critical points of the structure, and these are joined by a nearest-neighbor criterion to obtain a complete skeletal model of the vessel structure. Finally, the skeletal model is traversed to identify branch points and extract metrics including individual branch length, number of branches, and the angles between branches. Results on 80 sub-volumes consisting of 60 actual vessel branchings and 20 solitary solid nodules show that the algorithm correctly identified vessel branching points for 57 sub-volumes (95% sensitivity) and misclassified 2 nodules as vessel branches. Thus, this technique has potential for the explicit identification of vessel branching points in general vessel analysis, and could be useful for false positive reduction in a lung CAD system.
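A 2D stand-in for the branch-point step: skeletonize a binary vessel mask and flag skeleton pixels with three or more skeleton neighbours. The paper's pipeline is 3D and Reeb-graph-based; this sketch only illustrates the idea on toy data.

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def branch_points(binary_vessels):
    """Skeletonize a binary vessel mask and flag skeleton pixels that have
    three or more skeleton neighbours, i.e. candidate bifurcations."""
    skel = skeletonize(binary_vessels).astype(np.uint8)
    kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
    neighbours = convolve(skel, kernel, mode="constant")
    return np.argwhere((skel == 1) & (neighbours >= 3))

# Toy Y-shaped vessel: a vertical trunk splitting into two diagonal branches.
mask = np.zeros((80, 80), dtype=bool)
mask[10:40, 31:34] = True
for i in range(20):
    mask[40 + i, 31 - i:34 - i] = True
    mask[40 + i, 31 + i:34 + i] = True
print(branch_points(mask))                   # coordinates near the split
```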
Determination of Steering Wheel Angles during CAR Alignment by Image Analysis Methods
NASA Astrophysics Data System (ADS)
Mueller, M.; Voegtle, T.
2016-06-01
Optical systems for automatic visual inspection are of increasing importance in the field of automation in the industrial domain. A new application is the determination of steering wheel angles during the wheel track setting of the final inspection in car manufacturing. The camera has to be positioned outside the car to avoid interrupting the process, and therefore oblique images of the steering wheel must be acquired. Three different approaches of computer vision are considered in this paper, i.e. a 2D shape-based matching (by means of a plane-to-plane rectification of the oblique images and detection of a shape model with a particular rotation), a 3D shape-based matching approach (by means of a series of different perspectives of the spatial shape of the steering wheel derived from a CAD design model) and a point-to-point matching (by means of the extraction of significant elements (e.g. multifunctional buttons) of a steering wheel and a pairwise connection of these points to straight lines). The HALCON system (HALCON, 2016) was used for all software developments and necessary adaptations. As a reference, a mechanical balance with an accuracy of 0.1° was used. The quality assessment was based on two different approaches, a laboratory test and a test during the production process. In the laboratory a standard deviation of ±0.035° (2D shape-based matching), ±0.12° (3D approach) and ±0.029° (point-to-point matching) could be obtained. The field test of 291 measurements (27 cars with varying poses and angles of the steering wheel) results in a detection rate of 100% and ±0.48° (2D matching) and ±0.24° (point-to-point matching). Both methods also fulfil the requirement of real-time processing (three measurements per second).
Finite Element Analysis of Doorframe Structure of Single Oblique Pole Type in Container Crane
NASA Astrophysics Data System (ADS)
Cheng, X. F.; Wu, F. Q.; Tang, G.; Hu, X.
2017-07-01
Compared with the composite type, the single oblique pole type has several advantages, such as a simpler structure, savings in steel, and a high safe overhead clearance. The finite element model of the single oblique pole type is established node by node in ANSYS, and more details are considered when the model is simplified, such as the cross-sections of the girder and boom, the torque in the girder and boom caused by the machinery house and trolley, and the density adjusted according to the manner of simplification. The stress and deformation of ten observation points are compared and analyzed when the trolley is in nine dangerous positions. Based on the results of the analysis, six dangerous points are selected to provide a reference for the detection and evaluation of container cranes.
An analytic data analysis method for oscillatory slug tests.
Chen, Chia-Shyun
2006-01-01
An analytical data analysis method is developed for slug tests in partially penetrating wells in confined or unconfined aquifers of high hydraulic conductivity. As adapted from the van der Kamp method, the determination of the hydraulic conductivity is based on the occurrence times and the displacements of the extreme points measured from the oscillatory data and their theoretical counterparts available in the literature. This method is applied to two sets of slug test response data presented by Butler et al.: one set shows slow damping with seven discernible extreme points, and the other shows rapid damping with three extreme points. The estimates of the hydraulic conductivity obtained by the analytic method are in good agreement with those determined by an available curve-matching technique.
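A minimal sketch of extracting the occurrence times and displacements of the extreme points from an oscillatory record, the inputs to a van der Kamp-type analysis; the synthetic response and the damping check are illustrative, and the conductivity formula itself is not reproduced.

```python
import numpy as np
from scipy.signal import find_peaks

def oscillation_extremes(t, w):
    """Occurrence times and displacements of the extreme points of an
    oscillatory slug-test record (maxima and minima, in time order)."""
    hi, _ = find_peaks(w)
    lo, _ = find_peaks(-w)
    idx = np.sort(np.concatenate([hi, lo]))
    return t[idx], w[idx]

# Synthetic damped response: w(t) = 0.5 * exp(-0.12 t) * cos(1.4 t).
t = np.linspace(0.0, 30.0, 3000)
w = 0.5 * np.exp(-0.12 * t) * np.cos(1.4 * t)
times, disp = oscillation_extremes(t, w)
# Log of |extremes| decays linearly in time; the slope recovers the damping.
print(np.polyfit(times, np.log(np.abs(disp)), 1)[0])   # close to -0.12
```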
Uematsu, Mikio; Kurosawa, Masahiko
2005-01-01
A generalised and convenient skyshine dose analysis method has been developed based on a forward-adjoint folding technique. In the method, the air penetration data were prepared by performing an adjoint DOT3.5 calculation with a cylindrical air-over-ground geometry having an adjoint point source (the importance of unit flux to the dose rate at the detection point) in the centre. The accuracy of the present method was verified by comparison with a DOT3.5 forward calculation. The adjoint flux data can be used as generalised radiation skyshine data for all sorts of nuclear facilities. Moreover, the present method supplies plenty of energy- and angular-dependent contribution flux data, which will be useful for the detailed shielding design of facilities.
A new method for mapping multidimensional data to lower dimensions
NASA Technical Reports Server (NTRS)
Gowda, K. C.
1983-01-01
A multispectral mapping method is proposed which is based on the new concept of BEND (Bidimensional Effective Normalised Difference). The method, which involves taking one sample point at a time and finding the interrelationships between its features, is found to be very economical in terms of storage and processing time. It has good dimensionality reduction and clustering properties, and is highly suitable for computer analysis of large amounts of data. The transformed values obtained by this procedure are suitable either for a planar 2-space mapping of geological sample points or for making grayscale and color images of geo-terrains. A few examples are given to demonstrate the efficacy of the proposed procedure.
Ziegler, Ildikó; Borbély-Jakab, Judit; Sugó, Lilla; Kovács, Réka J
2017-01-01
In this case study, the principles of quality risk management were applied to review sampling points and monitoring frequencies in the hormonal tableting unit of a formulation development pilot plant. The cleanroom area contains premises with different functions. Therefore, a general risk evaluation method based on the Hazard Analysis and Critical Control Points (HACCP) method was established to evaluate these premises (i.e., the production area itself and the ancillary clean areas) from the point of view of microbial load and state, in order to determine whether the existing monitoring program met current advanced monitoring practice. LAY ABSTRACT: In pharmaceutical production, cleanrooms are needed for the manufacturing of final dosage forms of drugs-intended for human or veterinary use-in order to protect the patient's weakened body from further infections. Cleanrooms are premises with a controlled level of contamination that is specified by the number of particles per cubic meter at a specified particle size or by the number of microorganisms (i.e. microbial count) per surface area. To ensure a low microbial count over time, microorganisms are detected and counted by environmental monitoring methods on a regular basis. It is reasonable to identify the most easily contaminated places by risk analysis, to make sure the obtained results really represent the state of the whole room. This paper presents a risk analysis method for the optimization of environmental monitoring and verification of the suitability of the method. © PDA, Inc. 2017.
Freiermuth, O; Todorov, A; Bolli, M; Heberer, M
2003-01-01
Scientific journals currently face challenges including cost pressures caused by economic constraints, increasing rivalry among competitors, the limited market potential of non-English-speaking journals, increasing medical specialization with resulting market fragmentation, and internet-based competition. We therefore analyzed the strategic opportunities of the journal Swiss Surgery on the basis of customer surveys and a market analysis. Swiss surgeons expressed their interest in the continuation of the journal but also indicated their support for changes in its concept and for an increased use of electronic media. An international market analysis points out the difficulties of national, non-English-speaking journals in gaining impact points and in attracting authors and readers of scientific medical articles. Therefore, a journal such as Swiss Surgery should identify and use publication niches. The demand for a concept addressing surgical training, including continuous postgraduate education, was confirmed by the customers of Swiss Surgery. A corresponding offer does not presently exist in the area and could become the new focus of the journal. This change of concept may have a number of consequences: A journal focusing on surgical training and education should use the results of readers' surveys rather than impact point assignment to evaluate quality. The journal should increasingly use electronic services including databases, pictures, videos and closed user groups to supplement the print version. In the short term, however, the printed version should be continued and not be substituted by the electronic version, in order to maintain the established brand "Swiss Surgery".
Joint surface modeling with thin-plate splines.
Boyd, S K; Ronsky, J L; Lichti, D D; Salkauskas, K; Chapman, M A; Salkauskas, D
1999-10-01
Mathematical joint surface models based on experimentally determined data points can be used to investigate joint characteristics such as curvature, congruency, cartilage thickness, joint contact areas, as well as to provide geometric information well suited for finite element analysis. Commonly, surface modeling methods are based on B-splines, which involve tensor products. These methods have had success; however, they are limited due to the complex organizational aspect of working with surface patches, and modeling unordered, scattered experimental data points. An alternative method for mathematical joint surface modeling is presented based on the thin-plate spline (TPS). It has the advantage that it does not involve surface patches, and can model scattered data points without experimental data preparation. An analytical surface was developed and modeled with the TPS to quantify its interpolating and smoothing characteristics. Some limitations of the TPS include discontinuity of curvature at exactly the experimental surface data points, and numerical problems dealing with data sets in excess of 2000 points. However, suggestions for overcoming these limitations are presented. Testing the TPS with real experimental data, the patellofemoral joint of a cat was measured with multistation digital photogrammetry and modeled using the TPS to determine cartilage thicknesses and surface curvature. The cartilage thickness distribution ranged between 100 to 550 microns on the patella, and 100 to 300 microns on the femur. It was found that the TPS was an effective tool for modeling joint surfaces because no preparation of the experimental data points was necessary, and the resulting unique function representing the entire surface does not involve surface patches. A detailed algorithm is presented for implementation of the TPS.
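A minimal sketch of fitting scattered surface points with a thin-plate spline, using SciPy's RBF interpolator with the thin-plate kernel; the mock surface and the smoothing value are assumptions, not the paper's data or implementation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Scattered (x, y) -> z surface points; no gridding or patch organization
# is required, which is the TPS advantage noted above.  Mock data.
rng = np.random.default_rng(7)
xy = rng.uniform(-1.0, 1.0, size=(400, 2))
z = 0.3 * xy[:, 0] ** 2 + 0.1 * np.sin(3.0 * xy[:, 1])

# smoothing=0 interpolates exactly; a positive value trades fidelity for
# smoothness (both the kernel choice and the value here are assumptions).
tps = RBFInterpolator(xy, z, kernel="thin_plate_spline", smoothing=1e-6)

grid = np.stack(np.meshgrid(np.linspace(-1, 1, 50),
                            np.linspace(-1, 1, 50)), axis=-1).reshape(-1, 2)
surface = tps(grid)       # one global function evaluated anywhere
print(surface.shape)      # (2500,)
```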
Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia
2017-07-28
Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model in analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for the best covariate to split on from that of the best split point search for the selected covariate. In this study, we compare the random survival forest model to the conditional inference model (CIF) using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under-five years of age in Uganda and it consists of categorical covariates with most of them having more than two levels (many split-points). The second dataset is based on the survival of patients with extremely drug resistant tuberculosis (XDR TB) which consists of mainly categorical covariates with two levels (few split-points). The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data that consists of covariates with many split-points based on the values of the bootstrap cross-validated estimates for integrated Brier scores. However, conditional inference forests perform comparably similar to random survival forests models in analysing time-to-event data consisting of covariates with fewer split-points. Although survival forests are promising methods in analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of covariates of the dataset in question.
Lang, Paul Z; Thulasi, Praneetha; Khandelwal, Sumitra S; Hafezi, Farhad; Randleman, J Bradley
2018-05-02
To evaluate the correlation between anterior axial curvature difference maps following corneal cross-linking (CXL) for progressive keratoconus obtained from Scheimpflug-based tomography and Placido-based topography. DESIGN: Between-devices reliability analysis of randomized clinical trial data. METHODS: Corneal imaging was collected at a single-center institution pre-operatively and at 3, 6, and 12 months post-operatively using Scheimpflug-based tomography (Pentacam, Oculus Inc., Lynnwood, WA) and scanning-slit, Placido-based topography (Orbscan II, Bausch & Lomb, Rochester, NY) in patients with progressive keratoconus receiving standard protocol CXL (3 mW/cm² for 30 minutes). Regularization index (RI), absolute maximum keratometry (K Max), and change in K Max (ΔK Max) were compared between the two devices at each time point. RESULTS: 51 eyes from 36 patients were evaluated at all time points. K Max values were significantly different at all time points [56.01±5.3D Scheimpflug vs. 55.04±5.1D scanning-slit pre-operatively (p=0.003); 54.58±5.3D Scheimpflug vs. 53.12±4.9D scanning-slit at 12 months (p<0.0001)] but strongly correlated between devices (r=0.90-0.93) at all time points. The devices were not significantly different at any time point for either ΔK Max or RI but were poorly correlated at all time points (r=0.41-0.53 for ΔK Max, r=0.29-0.48 for RI). At 12 months, the 95% LOA was 7.51D for absolute K Max, 8.61D for ΔK Max, and 19.86D for RI. Measurements using Scheimpflug and scanning-slit Placido-based technology are correlated but not interchangeable. Both devices appear reasonable for separately monitoring the cornea's response to CXL; however, caution should be used when comparing results obtained with one measuring technology to the other. Copyright © 2018 Elsevier Inc. All rights reserved.
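A minimal sketch of the agreement statistics used above, Pearson correlation plus Bland-Altman bias and 95% limits of agreement; the K Max readings are invented for illustration.

```python
import numpy as np
from scipy import stats

def agreement(a, b):
    """Pearson r plus Bland-Altman bias and 95% limits of agreement
    (mean difference +/- 1.96 SD) between two devices."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    d = a - b
    r, _ = stats.pearsonr(a, b)
    bias, sd = d.mean(), d.std(ddof=1)
    return r, bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Invented K Max readings (dioptres) for a handful of eyes.
pentacam = [56.0, 54.2, 58.9, 52.4, 61.0]
orbscan = [55.1, 53.0, 57.5, 51.8, 59.2]
print(agreement(pentacam, orbscan))
```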
Conversion of Deletions during Recombination in Pneumococcal Transformation
Lefevre, J. C.; Mostachfi, P.; Gasc, A. M.; Guillot, E.; Pasta, F.; Sicard, M.
1989-01-01
Genetic analysis of 16 deletions obtained in the amiA locus of pneumococcus is described. When present on donor DNA, all deletions drastically increased the frequency of wild-type recombinants in two-point crosses. This effect was maximal for deletions longer than 200 bases; it was reduced for heterologies shorter than 76 bases and did not exist for very short deletions. In three-point crosses in which the deletion was localized between two point mutations, we demonstrated that this excess of wild-type recombinants was the result of genetic conversion. This conversion extended over several scores of bases outside the deletion. Conversion takes place during the heteroduplex stage of recombination; therefore, in pneumococcal transformation, long heterologies participate in this heteroduplex configuration. As this conversion did not require an active DNA polymerase A gene, it is proposed that the mechanism of conversion is not DNA repair synthesis but involves breakage and ligation between DNA molecules. Conversion of deletions did not require the Hex system of mismatch correction. It also differs from localized conversion. It appears to be a process that evolved to correct errors of replication that lead to long heterologies and are not eliminated by other systems. PMID:2599365
The classification of frontal sinus pneumatization patterns by CT-based volumetry.
Yüksel Aslier, Nesibe Gül; Karabay, Nuri; Zeybek, Gülşah; Keskinoğlu, Pembe; Kiray, Amaç; Sütay, Semih; Ecevit, Mustafa Cenk
2016-10-01
We aimed to define the classification of frontal sinus pneumatization patterns according to three-dimensional volume measurements. Datasets of 148 sides of 74 dry skulls were generated by computerized tomography-based volumetry to measure frontal sinus volumes. The cutoff points for frontal sinus hypoplasia and hyperplasia were tested by ROC curve analysis, and the validity of the diagnostic points was measured. The overall frequencies were 4.1, 14.2, 37.2 and 44.5 % for frontal sinus aplasia, hypoplasia, medium size and hyperplasia, respectively. Aplasia was bilateral in all three affected skulls. Hypoplasia was seen in 76 % of cases on the right side, and hyperplasia in 56 % on the left side. The cutoff points for diagnosing frontal sinus hypoplasia and hyperplasia were 1131.25 mm³ (95.2 % sensitivity, 100 % specificity) and 3328.50 mm³ (88 % sensitivity, 86 % specificity), respectively. The findings of the present study, which define frontal sinus pneumatization patterns by CT-based volumetry, show that the two sides of the frontal sinuses are asymmetric and that a three-dimensional classification should be developed by CT-based volumetry, because two-dimensional evaluations lack depth measurement.
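The abstract does not state which criterion fixed the cutoffs; the sketch below uses Youden's J on an ROC curve, a common choice, with hypothetical inputs.

```python
import numpy as np
from sklearn.metrics import roc_curve

def volume_cutoff(volumes_mm3, is_hypoplastic):
    """Cutoff maximising Youden's J = sensitivity + specificity - 1.
    Volumes are negated so that smaller sinuses score as more hypoplastic."""
    fpr, tpr, thresholds = roc_curve(is_hypoplastic, -np.asarray(volumes_mm3))
    best = np.argmax(tpr - fpr)
    return -thresholds[best], tpr[best], 1.0 - fpr[best]  # cutoff, sens., spec.
```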
NASA Astrophysics Data System (ADS)
Deng, S.; Katoh, M.; Takenaka, Y.; Cheung, K.; Ishii, A.; Fujii, N.; Gao, T.
2017-10-01
This study attempted to classify three coniferous and ten broadleaved tree species by combining airborne laser scanning (ALS) data and multispectral images. The study area, located in Nagano, central Japan, is within the broadleaved forests of the Afan Woodland area. A total of 235 trees were surveyed in 2016, and we recorded the species, DBH, and tree height. The geographical position of each tree was collected using a Global Navigation Satellite System (GNSS) device. Tree crowns were manually detected using GNSS position data, field photographs, true-color orthoimages with three bands (red-green-blue, RGB), 3D point clouds, and a canopy height model derived from ALS data. A total of 69 features, comprising 27 image-based and 42 point-based features, were then extracted from the RGB images and the ALS data to classify tree species. Finally, the detected tree crowns were classified into two classes at the first level (coniferous and broadleaved trees), four classes at the second level (Pinus densiflora, Larix kaempferi, Cryptomeria japonica, and broadleaved trees), and 13 classes at the third level (three coniferous and ten broadleaved species), using the 27 image-based features, the 42 point-based features, all 69 features, and the best combination of features identified using a neighborhood component analysis algorithm, respectively. The overall classification accuracies reached 90 % at the first and second levels but were less than 60 % at the third level. The classifications using the best combinations of features had higher accuracies than those using the image-based features, the point-based features, or the combination of all 69 features.
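A generic sketch of the feature-set comparison, assuming matrices `X_img` (27 image-based features), `X_pts` (42 point-based features) and labels `y`; the classifier and cross-validation shown are stand-ins, since the abstract does not name the paper's classifier, and the NCA feature-selection step is omitted.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def compare_feature_sets(X_img, X_pts, y):
    """Cross-validated accuracy for image-based, point-based and combined
    features at one classification level (all inputs are placeholders)."""
    for name, X in [("image", X_img), ("point", X_pts),
                    ("all", np.hstack([X_img, X_pts]))]:
        acc = cross_val_score(
            RandomForestClassifier(n_estimators=200, random_state=0),
            X, y, cv=5).mean()
        print(f"{name}-feature accuracy: {acc:.2f}")
```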
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basu, Banasri; Bandyopadhyay, Pratul; Majumdar, Priyadarshi
We have studied quantum phase transitions induced by a quench in different one-dimensional spin systems. Our analysis is based on the dynamical mechanism which envisages nonadiabaticity in the vicinity of the critical point. This causes spin fluctuations which lead to random fluctuation of the Berry phase factor acquired by a spin state when the ground state of the system evolves in a closed path. The two-point correlation of this phase factor is associated with the probability of the formation of defects. In this framework, we have estimated the density of defects produced in several one-dimensional spin chains. At the critical region, the entanglement entropy of a block of L spins with the rest of the system is also estimated and is found to increase logarithmically with L. The dependence on the quench time puts a constraint on the block size L. It is also pointed out that the Lipkin-Meshkov-Glick model in point-splitting regularized form appears as a combination of the XXX model and the Ising model with a magnetic field along the negative z axis. This unveils the underlying conformal symmetry at criticality, which is lost in the sharp point limit. Our analysis shows that the density of defects as well as the scaling behavior of the entanglement entropy follows a universal behavior in all these systems.
Concept for a fast analysis method of the energy dissipation at mechanical joints
NASA Astrophysics Data System (ADS)
Wolf, Alexander; Brosius, Alexander
2017-10-01
When designing hybrid parts and structures, one major challenge is the design, production and quality assessment of the joining points. While the polymeric composites themselves have excellent material properties, the necessary joints are often the weak link in assembled structures. This paper presents a method of measuring and analysing the energy dissipation at mechanical joining points of hybrid parts. A simplified model is applied based on the characteristic response to different excitation frequencies and amplitudes. The dissipation from damage results from relative movements between the joining partners and damaged fibres within the composite, whereas the visco-elastic material behaviour causes the intrinsic dissipation. The ambition is to transfer these research findings to the characterisation of mechanical joints in order to quickly assess the general quality of a joint with this non-destructive testing method. The inherent challenge in realising this method is the correct interpretation of the measured energy dissipation and its attribution to either a bad joining point or intrinsic material properties. In this paper the authors present the concept for energy dissipation measurements at different joining points. By inverse analysis, a simplified, fast semi-analytical model will be developed that allows a quick basic quality assessment of a given joining point.
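To make the measurement idea concrete: for cyclic excitation, the energy dissipated per cycle is the area enclosed by the force-displacement hysteresis loop. The sketch below computes that area with the shoelace formula, assuming one uniformly sampled full cycle; it is illustrative, not the authors' analysis.

```python
import numpy as np

def dissipated_energy(force, displacement):
    """Energy dissipated per excitation cycle, estimated as the area enclosed
    by the force-displacement hysteresis loop. A purely elastic response
    encloses ~zero area; joint damage and visco-elastic losses both enlarge
    it, so attributing the dissipation to one cause still needs the inverse
    model described above."""
    x, f = np.asarray(displacement, float), np.asarray(force, float)
    # Shoelace formula for the area of the closed loop.
    return 0.5 * abs(np.sum(x * np.roll(f, -1) - f * np.roll(x, -1)))
```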
Influence of nuclear power unit on decreasing emissions of greenhouse gases
NASA Astrophysics Data System (ADS)
Stanek, Wojciech; Szargut, Jan; Kolenda, Zygmunt; Czarnowska, Lucyna
2015-03-01
The paper presents a comparison of selected power technologies from the point of view of greenhouse gas (GHG) emissions. Such an evaluation is most often based only on analysis of the direct emissions from combustion. However, direct analysis does not show the full picture, as significant GHG emissions also arise in the mining and transportation of fuel. It is demonstrated in the paper that a comparison of power technologies from the GHG point of view has to use a cumulative calculus covering the whole cycle of fuel mining, processing, transportation and end-use. From this point of view, coal technologies are at a level comparable to gas technologies, while nuclear power units are characterised by the lowest GHG emissions. The mentioned technologies are compared in terms of GHG emissions over the full cycle, and specific cumulative GHG emission factors per unit of generated electricity are determined. These factors have been applied to simulate the influence of introducing nuclear power units on the decrease of GHG emissions at the domestic scale. The simulations take into account the prognosis of domestic power sector development according to the Polish energy policy until 2030. The profitability of introducing nuclear power units from the point of view of decreasing GHG emissions has been demonstrated.
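The cumulative calculus reduces to adding the upstream chain to the direct combustion term; the short sketch below uses placeholder numbers only, not the paper's factors.

```python
# Illustrative only: cumulative emission factor = direct combustion emissions
# plus upstream emissions from mining, processing and transport, per MWh.
# Both numbers below are placeholders, not values from the paper.
direct_kg_per_mwh = 820.0    # combustion term (placeholder)
upstream_kg_per_mwh = 90.0   # mining + processing + transport (placeholder)
cumulative = direct_kg_per_mwh + upstream_kg_per_mwh
print(f"cumulative factor: {cumulative} kg CO2-eq/MWh")
```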
A novel mesh processing based technique for 3D plant analysis
2012-01-01
Background In recent years, imaging-based, automated, non-invasive, and non-destructive high-throughput plant phenotyping platforms have become popular tools for plant biology, underpinning the field of plant phenomics. Such platforms acquire and record large amounts of raw data that must be accurately and robustly calibrated, reconstructed, and analysed, requiring the development of sophisticated image understanding and quantification algorithms. The raw data can be processed in different ways, and the past few years have seen the emergence of two main approaches: 2D image processing and 3D mesh processing algorithms. Direct image quantification methods (usually 2D) dominate the current literature due to their comparative simplicity. However, 3D mesh analysis offers tremendous potential for accurately estimating specific morphological features cross-sectionally and monitoring them over time. Results In this paper, we present a novel 3D mesh based technique developed for temporal high-throughput plant phenomics and perform initial tests for the analysis of Gossypium hirsutum vegetative growth. Based on plant meshes previously reconstructed from multi-view images, the methodology involves several stages, including morphological mesh segmentation, phenotypic parameter estimation, and tracking of plant organs over time. The initial study focuses on presenting and validating the accuracy of the methodology on dicotyledons such as cotton, but we believe the approach will be more broadly applicable. The study involved applying our technique to a set of six Gossypium hirsutum (cotton) plants studied over four time-points. Manual measurements, performed for each plant at every time-point, were used to assess the accuracy of our pipeline and quantify the error in the estimated morphological parameters. Conclusions By directly comparing our automated mesh-based quantitative data with manual measurements of individual stem height, leaf width and leaf length, we obtained mean absolute errors of 9.34%, 5.75%, and 8.78%, and correlation coefficients of 0.88, 0.96, and 0.95, respectively. The temporal matching of leaves was accurate in 95% of the cases, and the average execution time required to analyse a plant over four time-points was 4.9 minutes. The mesh processing based methodology is thus considered suitable for quantitative 4D monitoring of plant phenotypic features. PMID:22553969
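The two validation statistics quoted above (mean absolute percentage error and correlation against manual measurements) can be reproduced in a few lines, assuming paired arrays of automated and manual values.

```python
import numpy as np
from scipy.stats import pearsonr

def validate(auto, manual):
    """Mean absolute percentage error and Pearson correlation between
    automated mesh-based measurements and manual ground truth."""
    auto, manual = np.asarray(auto, float), np.asarray(manual, float)
    mape = np.mean(np.abs(auto - manual) / manual) * 100.0
    r, _ = pearsonr(auto, manual)
    return mape, r
```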
USDA-ARS?s Scientific Manuscript database
The Hazard Analysis and Critical Control Point (HACCP) food safety inspection program is utilized by both USDA Food Safety Inspection Service (FSIS) and FDA for many of the products they regulate. This science-based program was implemented by the USDA FSIS to enhance the food safety of meat and pou...
Level set method for image segmentation based on moment competition
NASA Astrophysics Data System (ADS)
Min, Hai; Wang, Xiao-Feng; Huang, De-Shuang; Jin, Jing; Wang, Hong-Zhi; Li, Hai
2015-05-01
We propose a level set method for image segmentation which introduces moment competition and weakly supervised information into the construction of the energy functional. Unlike region-based level set methods, which use force competition, moment competition is adopted to drive the contour evolution. A so-called three-point labeling scheme is proposed in which three independent points (the weakly supervised information) are manually labeled on the image. The intensity differences between the three points and the unlabeled pixels are then used to construct the force arms for each image pixel. The corresponding force is generated from the global statistical information of a region-based method and weighted by the force arm. As a result, the moment can be constructed and incorporated into the energy functional to drive the evolving contour toward the object boundary. In our method, the force arm takes full advantage of the three-point labeling scheme to constrain the moment competition. Additionally, the global statistical information and weakly supervised information are successfully integrated, which makes the proposed method more robust than traditional methods to initial contour placement and parameter setting. Experimental results with performance analysis also show the superiority of the proposed method in segmenting different types of complicated images, such as noisy images, three-phase images, images with intensity inhomogeneity, and texture images.
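A simplified reading of the force-arm construction, with hypothetical names: each pixel's arm with respect to a labeled point is its intensity difference from that point. This is an interpretation for illustration, not the paper's exact energy functional.

```python
import numpy as np

def force_arms(image, labels):
    """Force arms for each pixel under a simplified three-point labeling
    scheme. labels: [(row, col), (row, col), (row, col)] of manually chosen
    points. Returns an array of shape (3, H, W) of intensity differences."""
    image = np.asarray(image, float)
    return np.stack([np.abs(image - image[r, c]) for r, c in labels])
```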
Estimation of distributed Fermat-point location for wireless sensor networking.
Huang, Po-Hsian; Chen, Jiann-Liang; Larosa, Yanuarius Teofilus; Chiang, Tsui-Lien
2011-01-01
This work presents a localization scheme for use in wireless sensor networks (WSNs) based on a proposed connectivity-based RF localization strategy called the distributed Fermat-point location estimation algorithm (DFPLE). DFPLE uses the triangular location-estimation area formed by the intersections of three neighboring beacon nodes. The Fermat point is determined as the point minimising the total distance to the three vertices of the triangle, and the estimated location area is then refined using the Fermat point to achieve minimum error in estimating sensor node locations. DFPLE solves the problems of large errors and poor performance encountered by localization schemes based on a bounding box algorithm. Performance analysis of a 200-node deployment environment reveals that when the number of sensor nodes is below 150, the mean error decreases rapidly as the node density increases, and when the number of sensor nodes exceeds 170, the mean error remains below 1% as the node density increases. Furthermore, when the number of beacon nodes is less than 60, normal nodes lack sufficient beacon nodes to enable their locations to be estimated; the mean error changes only slightly as the number of beacon nodes increases above 60. Simulation results revealed that the proposed algorithm estimates sensor positions more accurately than existing algorithms and improves upon conventional bounding box strategies.
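For reference, the Fermat point of a triangle (the point minimising the summed distance to its three vertices) can be computed with the classical Weiszfeld iteration for the geometric median; the sketch below is generic, not DFPLE's exact refinement step, and assumes all triangle angles are below 120° so the Fermat point lies inside the triangle.

```python
import numpy as np

def fermat_point(vertices, iters=100, eps=1e-9):
    """Weiszfeld iteration for the point minimising total distance to the
    three beacon-derived triangle vertices (shape (3, 2))."""
    vertices = np.asarray(vertices, float)
    p = vertices.mean(axis=0)              # centroid as starting guess
    for _ in range(iters):
        d = np.maximum(np.linalg.norm(vertices - p, axis=1), eps)
        w = 1.0 / d                        # inverse-distance weights
        p_new = (vertices * w[:, None]).sum(axis=0) / w.sum()
        if np.linalg.norm(p_new - p) < eps:
            break
        p = p_new
    return p
```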
Configuration Analysis of the ERS Points in Large-Volume Metrology System
Jin, Zhangjun; Yu, Cijun; Li, Jiangxiong; Ke, Yinglin
2015-01-01
In aircraft assembly, multiple laser trackers are used simultaneously to measure large-scale aircraft components. To combine the independent measurements, the transformation matrices between the laser trackers' coordinate systems and the assembly coordinate system are calculated by measuring the enhanced referring system (ERS) points. This article aims to understand how the configuration of the ERS points affects the transformation matrix errors, and then to optimize the deployment of the ERS points to reduce those errors. To this end, an explicit model is derived to estimate the transformation matrix errors; the estimation model is verified by an experiment implemented on the factory floor. Based on the proposed model, a group of sensitivity coefficients is derived to evaluate the quality of a configuration of ERS points, and several typical configurations are then analyzed in detail with the sensitivity coefficients. Finally, general guidance is established for deploying the ERS points in terms of the layout, the volume size and the number of the ERS points, as well as the position and orientation of the assembly coordinate system. PMID:26402685
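The underlying registration step (finding the rigid transformation that maps tracker-frame ERS measurements onto assembly-frame coordinates) is commonly solved in closed form with the SVD-based Kabsch method; the sketch below shows that generic solution, not the paper's error model.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ~ R @ src + t,
    from matched ERS points in two frames. src, dst: arrays of shape (n, 3)."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t
```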
Cheng, Xianfu; Lin, Yuqun
2014-01-01
The performance of the suspension system is one of the most important factors in vehicle design. For the double wishbone suspension system, conventional deterministic optimization does not consider deviations of the design parameters, so design sensitivity analysis and robust design optimization are proposed. In this study, the design parameters of the robust optimization are the positions of the key points, and the random factors are the uncertainties in manufacturing. A simplified model of the double wishbone suspension is established in the software ADAMS. Sensitivity analysis is used to determine the main design variables. Then, the simulation experiment is arranged, and Latin hypercube design is adopted to find the initial points. A Kriging model is employed to fit the mean and variance of the quality characteristics according to the simulation results. Further, a particle swarm optimization method based on simple PSO is applied, and a trade-off between the mean and the deviation of performance is made to solve the robust optimization problem of the double wishbone suspension system.
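A minimal sketch of the surrogate-modelling step, assuming a placeholder `simulate` function standing in for the ADAMS model: Latin hypercube samples of the key-point coordinates are fitted with a Gaussian-process (Kriging) regressor. The paper fits mean and variance separately and then runs PSO; both are omitted here.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor

def build_surrogate(bounds, simulate, n_samples=40):
    """bounds: list of (low, high) per design variable; simulate: callable
    returning the quality characteristic for one design point (placeholder)."""
    sampler = qmc.LatinHypercube(d=len(bounds), seed=0)
    X = qmc.scale(sampler.random(n_samples),
                  [b[0] for b in bounds], [b[1] for b in bounds])
    y = np.array([simulate(x) for x in X])
    return GaussianProcessRegressor(normalize_y=True).fit(X, y)
```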
Automatic comic page image understanding based on edge segment analysis
NASA Astrophysics Data System (ADS)
Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai
2013-12-01
Comic page image understanding aims to analyse the layout of comic page images by automatically detecting the storyboards and identifying the reading order. It is the key technique for producing digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines, and we further identify the reading order of these storyboards. The proposed method is evaluated on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.
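As a rough stand-in for the first step, the snippet below extracts contiguous edge chains from a page image with OpenCV; the paper describes its own chaining method, so tracing contours of the Canny map is only an approximation, and the thresholds are placeholders.

```python
import cv2

def edge_segments(page_bgr, min_len=20):
    """Approximate Canny edge chains: detect edges, then trace them as
    ordered point chains via contour following."""
    gray = cv2.cvtColor(page_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST,
                                   cv2.CHAIN_APPROX_NONE)
    return [c.reshape(-1, 2) for c in contours if len(c) > min_len]
```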
Arigovindan, Muthuvel; Shaevitz, Joshua; McGowan, John; Sedat, John W; Agard, David A
2010-03-29
We address the problem of computationally representing image formation in 3D widefield fluorescence microscopy with depth-varying spherical aberrations. We first represent the 3D depth-dependent point spread functions (PSFs) as a weighted sum of basis functions obtained by principal component analysis (PCA) of experimental data. This representation is then used to derive an approximating structure that compactly expresses the depth-variant response as a sum of a few depth-invariant convolutions pre-multiplied by a set of 1D depth functions, where the convolving functions are the PCA-derived basis functions. The model offers an efficient and convenient trade-off between complexity and accuracy. For a given number of approximating PSFs, the proposed method yields much better accuracy than the strata-based approximation scheme currently used in the literature. In addition to yielding better accuracy, the proposed method automatically eliminates the noise in the measured PSFs.
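A minimal numpy sketch of the PCA step, with hypothetical variable names: the measured PSF stack is decomposed so that each depth's PSF is the mean plus a few weighted basis PSFs, giving the depth-invariant convolution kernels and 1D depth weights described above.

```python
import numpy as np

def psf_basis(psf_stack, k):
    """PCA of measured depth-dependent PSFs. psf_stack: (n_depths, *shape).
    Returns (mean PSF, k basis PSFs, per-depth weights) so that
    psf(z) ~ mean + sum_j weights[z, j] * basis[j]."""
    flat = psf_stack.reshape(psf_stack.shape[0], -1)
    mean = flat.mean(0)
    U, S, Vt = np.linalg.svd(flat - mean, full_matrices=False)
    weights = U[:, :k] * S[:k]                       # 1D depth functions
    basis = Vt[:k].reshape((k,) + psf_stack.shape[1:])
    return mean.reshape(psf_stack.shape[1:]), basis, weights
```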
NASA Astrophysics Data System (ADS)
Koltsov, A. G.; Shamutdinov, A. H.; Blokhin, D. A.; Krivonos, E. V.
2018-01-01
A new classification of parallel kinematic mechanisms based on a symmetry coefficient, which is proportional to the mechanism stiffness and to the machining accuracy of products processed with the equipment under study, is proposed. A new version of the Stewart platform with a high symmetry coefficient is presented for analysis. The workspace of the mechanism under study is described; it is a complex solid figure whose boundary points are reached by the center of the mobile platform, which moves parallel to the base plate. Parameters affecting the processing accuracy, namely the static and dynamic stiffness and the natural vibration frequencies, are determined. A capability assessment of the mechanism operating under various loads, taking resonance phenomena at different points of the workspace into account, was conducted. The study proved that the stiffness, and therefore the processing accuracy, of such mechanisms is comparable with the stiffness and accuracy of medium-sized series-produced machines.
Hu, Yipeng; Morgan, Dominic; Ahmed, Hashim Uddin; Pendsé, Doug; Sahu, Mahua; Allen, Clare; Emberton, Mark; Hawkes, David; Barratt, Dean
2008-01-01
A method is described for generating a patient-specific, statistical motion model (SMM) of the prostate gland. Finite element analysis (FEA) is used to simulate the motion of the gland using an ultrasound-based 3D FE model over a range of plausible boundary conditions and soft-tissue properties. By applying principal component analysis to the displacements of the FE mesh node points inside the gland, the simulated deformations are then used as training data to construct the SMM. The SMM is used to both predict the displacement field over the whole gland and constrain a deformable surface registration algorithm, given only a small number of target points on the surface of the deformed gland. Using 3D transrectal ultrasound images of the prostates of five patients, acquired before and after imposing a physical deformation, to evaluate the accuracy of predicted landmark displacements, the mean target registration error was found to be less than 1.9 mm.
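One plausible reading of how the SMM constrains the registration, with hypothetical array names: solve least squares for the PCA mode coefficients from the few observed surface displacements, then reconstruct the whole-gland field.

```python
import numpy as np

def predict_displacements(mean_u, modes, surf_idx, surf_targets):
    """Predict the whole-gland displacement field from sparse surface targets.

    mean_u: (3n,) mean FE-node displacement; modes: (3n, k) PCA modes from
    the FEA training simulations; surf_idx: indices of the observed surface
    coordinates; surf_targets: their measured displacements."""
    A = modes[surf_idx]                        # modes restricted to observed DOFs
    b = surf_targets - mean_u[surf_idx]
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return mean_u + modes @ coeffs             # full displacement field
```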
The QACITS pointing sensor: from theory to on-sky operation on Keck/NIRC2
NASA Astrophysics Data System (ADS)
Huby, Elsa; Absil, Olivier; Mawet, Dimitri; Baudoz, Pierre; Femenıa Castellã, Bruno; Bottom, Michael; Ngo, Henry; Serabyn, Eugene
2016-07-01
Small inner working angle coronagraphs are essential to exploit the full potential of large and future extremely large ground-based telescopes, especially in the context of the detection and characterization of exoplanets. Among existing solutions, the vortex coronagraph stands as one of the most effective and promising. However, for a focal-plane coronagraph, a small inner working angle necessarily comes at the cost of high sensitivity to pointing errors. This is why a pointing control system is imperative to stabilize the star on the vortex center against pointing drifts due to mechanical flexures, which generally occur during observation because of, for instance, temperature and/or gravity variations. We have therefore developed a technique called QACITS (Quadrant Analysis of Coronagraphic Images for Tip-tilt Sensing), which infers the amount of pointing error from the shape of the coronagraphic image. It has been shown that the flux gradient in the image is directly related to the amount of tip-tilt affecting the beam. The main advantage of this technique is that it does not require any additional setup and can thus be easily implemented on all current facilities equipped with a vortex phase mask. In this paper, we focus on the implementation of the QACITS sensor at Keck/NIRC2, where an L-band AGPM was recently commissioned (June and October 2015), successfully validating the QACITS estimator in the case of a centrally obstructed pupil. The algorithm has been designed to be easily handled by any user observing in vortex mode, which has been available for science in shared-risk mode since 2016B.
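The heart of the estimator is the flux asymmetry across the coronagraphic image (here simplified to left/right and top/bottom halves); the sketch below computes the normalised differential fluxes that trace tip-tilt, leaving out QACITS's model-based conversion to physical tilt units.

```python
import numpy as np

def quadrant_tiptilt(img):
    """Normalised differential fluxes of the coronagraphic image; the flux
    gradient (dx, dy) is monotonically related to the tip-tilt error, but the
    calibration to physical tilt is model-based and not reproduced here."""
    cy, cx = np.array(img.shape) // 2
    total = img.sum()
    dx = (img[:, cx:].sum() - img[:, :cx].sum()) / total
    dy = (img[cy:, :].sum() - img[:cy, :].sum()) / total
    return dx, dy
```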
Ramírez-Vélez, R; Correa-Bautista, J E; Martínez-Torres, J; Méneses-Echavez, J F; González-Ruiz, K; González-Jiménez, E; Schmidt-RioValle, J; Lobelo, F
2016-01-01
Background/Objectives: Indices predictive of central obesity include waist circumference (WC) and waist-to-height ratio (WHtR). These data are lacking for Colombian adults. This study aims to establish smoothed centile charts and LMS tables for WC and WHtR; appropriate cutoffs were selected using receiver operating characteristic (ROC) analysis based on data from the representative sample. Subjects/Methods: We used data from the cross-sectional, nationally representative nutrition survey (ENSIN, 2010). A total of 83,220 participants (aged 20–64) were enrolled. Weight, height, body mass index (BMI), WC and WHtR were measured, and percentiles were calculated using the LMS method (L: Box-Cox power curve; M: median curve; S: coefficient-of-variation curve). ROC curve analyses were used to evaluate the optimal cutoff points of WC and WHtR for overweight and obesity based on WHO definitions. Results: Reference values for WC and WHtR are presented. Mean WC and WHtR increased with age for both genders. We found a strong positive correlation between WC and BMI (r=0.847, P<0.01) and between WHtR and BMI (r=0.878, P<0.01). The WC cutoff value is 96.6 cm in obese men and 91.0 cm in obese women. ROC curves for WHtR gave cutoff values of 0.579 in men and 0.587 in women. High sensitivity and specificity were obtained. Conclusions: This study presents the first reference values of WC and WHtR for Colombians aged 20–64. Through LMS tables for adults, we hope to provide quantitative tools for studying obesity and its complications. PMID:27026425
NASA Astrophysics Data System (ADS)
Li, Ying-jun; Ai, Chang-sheng; Men, Xiu-hua; Zhang, Cheng-liang; Zhang, Qi
2013-04-01
This paper presents a novel on-line monitoring technology for assessing forming quality in the steel ball forming process based on load signal analysis, in order to reveal the bottom die's load characteristics in the initial cold heading forging of steel balls. A mechanical model of the cold header production process is established and analyzed using the finite element method, and the maximum cold heading force is calculated. The results prove that monitoring the cold heading process via the upsetting force is reasonable and feasible. Forming defects are reflected in three feature points of the bottom die signal: the initial point, the inflection point, and the peak point. A novel PVDF piezoelectric force sensor, simple in construction and convenient to install, is designed; its sensitivity is calculated, and its characteristics are analyzed by FEM. The PVDF force sensor is fabricated to acquire the actual load signals in the cold heading process and is calibrated by a special device. An on-line monitoring measurement system is built. The characteristics of the actual signals recognized by the learning and identification algorithm are consistent with the simulation results. Identification of actual signals shows that the timing differences of all feature points for qualified products do not exceed ±6 ms, and the amplitude differences are less than ±3%. The calibration and application experiments show that the PVDF force sensor has good static and dynamic performance and is suitable for dynamic measurement of the upsetting force, greatly improving the automation level and machining precision. With the damage identification method, which depends on the grade of steel, the equipment capacity factor has been improved to 90%.
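One plausible way to locate the three feature points in a recorded load trace, using scipy; the threshold and the derivative-based inflection rule are assumptions for illustration, not the paper's identification algorithm.

```python
import numpy as np
from scipy.signal import find_peaks

def feature_points(load, threshold=0.02):
    """Return indices of (initial, inflection, peak) points in a load signal:
    initial = first sample rising past a small fraction of the peak load,
    inflection = maximum of the first derivative between start and peak,
    peak = highest local maximum."""
    load = np.asarray(load, float)
    peaks, _ = find_peaks(load)
    peak = peaks[np.argmax(load[peaks])] if len(peaks) else int(np.argmax(load))
    start = int(np.argmax(load > threshold * load[peak]))
    d = np.gradient(load[start:peak + 1])
    inflection = start + int(np.argmax(d))
    return start, inflection, peak
```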
Bayesian Inference for Functional Dynamics Exploring in fMRI Data.
Guo, Xuan; Liu, Bing; Chen, Le; Chen, Guantao; Pan, Yi; Zhang, Jing
2016-01-01
This paper reviews state-of-the-art Bayesian-inference-based methods applied to functional magnetic resonance imaging (fMRI) data. In particular, we focus on one long-standing challenge in the computational modeling of fMRI datasets: how to effectively explore typical functional interactions in fMRI time series and the corresponding boundaries of temporal segments. Bayesian inference is a method of statistical inference that has been shown to be a powerful tool for encoding dependence relationships among variables under uncertainty. Here we provide an introduction to a group of Bayesian-inference-based methods for fMRI data analysis, which were designed to detect magnitude or functional connectivity change points and to infer the functional interaction patterns of the corresponding temporal segments. We also compare three popular Bayesian models, namely the Bayesian Magnitude Change Point Model (BMCPM), the Bayesian Connectivity Change Point Model (BCCPM), and the Dynamic Bayesian Variable Partition Model (DBVPM), and give a summary of their applications. We envision that more refined Bayesian inference models will emerge and play increasingly important roles in modeling brain functions in the years to come.
Security analysis of boolean algebra based on Zhang-Wang digital signature scheme
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Jinbin, E-mail: jbzheng518@163.com
2014-10-06
In 2005, Zhang and Wang proposed an improved signature scheme that uses neither a one-way hash function nor message redundancy. In this paper, we show through an analysis based on boolean algebra (for example, bitwise exclusive-or) that this scheme has potential security weaknesses. We also point out, based on an analysis of the resulting assembly program segment, that the mapping between assembly instructions and machine code is not actually one-to-one, which may cause unknown safety problems in the software.
Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis
Zhang, Ting; Chen, Juan; Jia, Xiaofeng
2015-01-01
Background This paper aims to identify the key fields of oncology and their key technical points by patent analysis. Methodology/Principal Findings Patents of oncology applied for from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DCs accounted for 80% of all oncology patent applications; these ten fields of oncology were the ones analyzed. The number of patent applications in these ten fields was standardized based on all oncology patent applications from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006-2012), and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, a regression of the standardized values on time (year) over the seven years was conducted to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with professional knowledge of oncology, was used to determine the key fields of oncology: fields located in the quadrants with a high relative amount or an increasing trend of patent applications were identified as key ones. The key technical points in each key field were identified by the same method. Altogether 116,820 patents of oncology applied for from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified: "natural products and polymers" with nine key technical points, "fermentation industry" with twelve, "electrical medical equipment" with four, and "diagnosis, surgery" with four. Conclusions/Significance The results of this study could provide guidance on the development direction of oncology, and also help researchers broaden innovative ideas and discover new technological opportunities. PMID:26599967
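A compact sketch of the two quadrant-analysis axes, assuming per-year application counts per field; the per-year z-score standardization is one plausible reading of the procedure described above, and `linregress` supplies the trend slope.

```python
import numpy as np
from scipy.stats import linregress

def field_quadrants(counts_by_year):
    """counts_by_year: dict {field: seven yearly counts, 2006-2012}.
    Returns {field: (mean standardized count, trend slope)}: the two axes
    of the quadrant analysis (high relative amount, increasing trend)."""
    fields = list(counts_by_year)
    M = np.array([counts_by_year[f] for f in fields], float)  # (n_fields, 7)
    Z = (M - M.mean(0)) / M.std(0)                            # per-year z-score
    years = np.arange(2006, 2013)
    return {f: (Z[i].mean(), linregress(years, Z[i]).slope)
            for i, f in enumerate(fields)}
```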
Advanced analysis of forest fire clustering
NASA Astrophysics Data System (ADS)
Kanevski, Mikhail; Pereira, Mario; Golay, Jean
2017-04-01
Analysis of point pattern clustering is an important topic in spatial statistics and for many applications: biodiversity, epidemiology, natural hazards, geomarketing, etc. There are several fundamental approaches used to quantify spatial data clustering using topological, statistical and fractal measures. In the present research, the recently introduced multi-point Morisita index (mMI) is applied to study the spatial clustering of forest fires in Portugal. The data set consists of more than 30000 fire events covering the time period from 1975 to 2013. The distribution of forest fires is very complex and highly variable in space. mMI is a multi-point extension of the classical two-point Morisita index. In essence, mMI is estimated by covering the region under study with a grid and computing how many times more likely it is that m points selected at random will be from the same grid cell than would be the case under a completely random Poisson process. By changing the number (and hence the size) of the grid cells, mMI characterizes the scaling properties of spatial clustering. From mMI, the intrinsic dimension (fractal dimension) of the point distribution can also be estimated. In this study, the mMI of forest fires is compared with the mMI of random patterns (RPs) generated within the validity domain, defined as the forest area of Portugal. It turns out that the forest fires are highly clustered inside the validity domain in comparison with the RPs, and they demonstrate different scaling properties at different spatial scales. The results obtained from the mMI analysis are also compared with those of fractal measures of clustering: box-counting and sand-box-counting approaches.
REFERENCES:
Golay J., Kanevski M., Vega Orozco C., Leuenberger M., 2014: The multipoint Morisita index for the analysis of spatial patterns. Physica A, 406, 191-202.
Golay J., Kanevski M., 2015: A new estimator of intrinsic dimension based on the multipoint Morisita index. Pattern Recognition, 48, 4070-4081.
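A minimal implementation of the index for 2D points, following the standard multi-point generalisation of Morisita's formula; treat it as a sketch consistent with the definition above, not the authors' reference code.

```python
import numpy as np

def morisita(points, n_cells, m=2):
    """m-point Morisita index on an n_cells x n_cells grid over 2D points:
    I_m = Q^(m-1) * sum_i n_i(n_i-1)...(n_i-m+1) / [N(N-1)...(N-m+1)],
    i.e. how much more likely m random points share a cell than under
    complete spatial randomness. m=2 recovers the classical index."""
    pts = np.asarray(points, float)
    mins, maxs = pts.min(0), pts.max(0)
    cells = np.floor((pts - mins) / (maxs - mins + 1e-12) * n_cells).astype(int)
    _, counts = np.unique(cells[:, 0] * n_cells + cells[:, 1],
                          return_counts=True)
    N, Q = len(pts), n_cells ** 2
    num = sum(np.prod([n - j for j in range(m)]) for n in counts if n >= m)
    den = np.prod([N - j for j in range(m)])
    return Q ** (m - 1) * num / den
```

Evaluating the index over a range of `n_cells` values yields the scaling behaviour from which the intrinsic dimension is estimated.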
Detecting corpus callosum abnormalities in autism based on anatomical landmarks
He, Qing; Duan, Ye; Karsch, Kevin; Miles, Judith
2010-01-01
Autism is a severe developmental disorder whose neurological basis is largely unknown. Essential autism is a subtype of autism that displays more homogeneous features within the group. The aim of this study was to identify shape differences of the corpus callosum between patients with autism and controls. Anatomical landmarks were collected from mid-sagittal MRI of 25 patients and 18 controls. Euclidean distance matrix analysis and thin-plate splines were used to analyze the landmark forms. Point-by-point shape comparison was performed both globally and locally, and a new local shape comparison scheme was proposed that compares each part of the shape in its own local coordinate system. Point correspondence was established among individual shapes based on the inherent landmark correspondence. No significant difference was found in the landmark form between patients and controls, but the distance between the interior genu and the posterior-most point was found to be significantly shorter in patients. Thin-plate spline analysis showed a significant group difference between the landmark configurations in terms of the deformation from the overall mean configuration. Significant global shape differences were found in the anterior lower body and the posterior bottom, and a local shape difference existed in the anterior bottom. This study can serve both as a clinical reference and as a detailed procedural guideline for similar studies in the future. PMID:20620032
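To make the Euclidean distance matrix analysis step concrete, here is a miniature version comparing every inter-landmark distance between groups; the group-difference test shown (per-pair t-tests) is an assumption for illustration, not necessarily the study's inference procedure.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import ttest_ind

def edma_compare(patients, controls):
    """EDMA in miniature: patients/controls are arrays of shape
    (n_subjects, n_landmarks, 2) of mid-sagittal landmark coordinates.
    Returns a p-value per landmark pair for the group difference in
    inter-landmark distance."""
    dp = np.array([pdist(s) for s in patients])   # (n_patients, n_pairs)
    dc = np.array([pdist(s) for s in controls])   # (n_controls, n_pairs)
    return ttest_ind(dp, dc, axis=0).pvalue
```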