Sample records for point process analysis

  1. Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations

    NASA Technical Reports Server (NTRS)

    Chanchio, Kasidit; Sun, Xian-He

    1996-01-01

    This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. The new concepts of migration points, migration point analysis, and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points, whereas necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.

  2. The application of hazard analysis and critical control points and risk management in the preparation of anti-cancer drugs.

    PubMed

    Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice

    2009-02-01

    To apply the Hazard analysis and Critical Control Points method to the preparation of anti-cancer drugs. To identify critical control points in our cancer chemotherapy process and to propose control measures and corrective actions to manage these processes. The Hazard Analysis and Critical Control Points application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of the process nonconformities was performed to assess the method. According to the Hazard Analysis and Critical Control Points method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, their severity and their detectability. The team defined monitoring, control measures and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each non-conformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high-risk index. Over 10 months, 16,647 preparations were performed; 1225 nonconformities were reported during this same period. The Hazard Analysis and Critical Control Points method is relevant when it is used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps, which can have a critical influence on product quality, and led us to improve our process.

  3. Modeling fixation locations using spatial point processes.

    PubMed

    Barthelmé, Simon; Trukenbrod, Hans; Engbert, Ralf; Wichmann, Felix

    2013-10-01

    Whenever eye movements are measured, a central part of the analysis has to do with where subjects fixate and why they fixated where they fixated. To a first approximation, a set of fixations can be viewed as a set of points in space; this implies that fixations are spatial data and that the analysis of fixation locations can be beneficially thought of as a spatial statistics problem. We argue that thinking of fixation locations as arising from point processes is a very fruitful framework for eye-movement data, helping turn qualitative questions into quantitative ones. We provide a tutorial introduction to some of the main ideas of the field of spatial statistics, focusing especially on spatial Poisson processes. We show how point processes help relate image properties to fixation locations. In particular we show how point processes naturally express the idea that image features' predictability for fixations may vary from one image to another. We review other methods of analysis used in the literature, show how they relate to point process theory, and argue that thinking in terms of point processes substantially extends the range of analyses that can be performed and clarifies their interpretation.
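
    For readers who want to experiment with the spatial-statistics view sketched above, here is a minimal Python illustration (not from the paper) of the baseline model it builds on: a homogeneous spatial Poisson process over an image-sized window, with a simple quadrat-count check against complete spatial randomness. The window size, intensity and quadrat grid are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Window (e.g., an 800 x 600 "image") and a chosen intensity
width, height = 800.0, 600.0
lam = 1e-4          # expected fixations per unit area (illustrative value)
area = width * height

# Homogeneous Poisson process: Poisson-distributed count, then uniform locations
n = rng.poisson(lam * area)
x = rng.uniform(0.0, width, n)
y = rng.uniform(0.0, height, n)

# Quadrat counts: divide the window into a 4 x 3 grid and count points per cell
counts, _, _ = np.histogram2d(x, y, bins=[4, 3], range=[[0, width], [0, height]])
expected = lam * (width / 4) * (height / 3)

# Simple chi-square quadrat statistic against complete spatial randomness (CSR)
chi2 = ((counts - expected) ** 2 / expected).sum()
print(f"{n} points, quadrat chi-square vs CSR: {chi2:.2f} (df = {counts.size - 1})")
```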

  4. An analysis of neural receptive field plasticity by point process adaptive filtering

    PubMed Central

    Brown, Emery N.; Nguyen, David P.; Frank, Loren M.; Wilson, Matthew A.; Solo, Victor

    2001-01-01

    Neural receptive fields are plastic: with experience, neurons in many brain regions change their spiking responses to relevant stimuli. Analysis of receptive field plasticity from experimental measurements is crucial for understanding how neural systems adapt their representations of relevant biological information. Current analysis methods using histogram estimates of spike rate functions in nonoverlapping temporal windows do not track the evolution of receptive field plasticity on a fine time scale. Adaptive signal processing is an established engineering paradigm for estimating time-varying system parameters from experimental measurements. We present an adaptive filter algorithm for tracking neural receptive field plasticity based on point process models of spike train activity. We derive an instantaneous steepest descent algorithm by using as the criterion function the instantaneous log likelihood of a point process spike train model. We apply the point process adaptive filter algorithm in a study of spatial (place) receptive field properties of simulated and actual spike train data from rat CA1 hippocampal neurons. A stability analysis of the algorithm is sketched in the Appendix. The adaptive algorithm can update the place field parameter estimates on a millisecond time scale. It reliably tracked the migration, changes in scale, and changes in maximum firing rate characteristic of hippocampal place fields in a rat running on a linear track. Point process adaptive filtering offers an analytic method for studying the dynamics of neural receptive fields. PMID:11593043
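
    A minimal Python sketch of the general idea described above, not the authors' algorithm: spiking is modelled in small time bins with rate lambda = exp(theta), and theta is updated in each bin by instantaneous steepest descent on that bin's Poisson log likelihood. The bin width, learning rate and simulated rate drift are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

dt = 0.001                     # 1 ms bins
T = 60.0                       # 60 s of simulated data
t = np.arange(0.0, T, dt)

# Simulated "true" time-varying rate (Hz): slow drift from 5 Hz to 20 Hz
true_rate = 5.0 + 15.0 * t / T
spikes = rng.random(t.size) < true_rate * dt      # Bernoulli approximation of a Poisson process

# Adaptive filter: rate model lambda = exp(theta), steepest descent on the
# instantaneous Poisson log likelihood  dN*log(lambda*dt) - lambda*dt
theta = np.log(5.0)            # initial guess
eps = 0.05                     # learning rate (illustrative)
estimate = np.empty(t.size)
for k in range(t.size):
    lam = np.exp(theta)
    theta += eps * (spikes[k] - lam * dt)         # gradient of the instantaneous log likelihood
    estimate[k] = np.exp(theta)

print(f"true final rate {true_rate[-1]:.1f} Hz, estimated {estimate[-1]:.1f} Hz")
```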

  5. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  6. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  7. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  8. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  9. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  10. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  11. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  12. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  13. Performance analysis of a dual-tree algorithm for computing spatial distance histograms

    PubMed Central

    Chen, Shaoping; Tu, Yi-Cheng; Xia, Yuni

    2011-01-01

    Many scientific and engineering fields produce large volumes of spatiotemporal data. The storage, retrieval, and analysis of such data impose great challenges on database systems design. Analysis of scientific spatiotemporal data often involves computing functions of all point-to-point interactions. One such analytic, the Spatial Distance Histogram (SDH), is of vital importance to scientific discovery. Recently, algorithms for efficient SDH processing in large-scale scientific databases have been proposed. These algorithms adopt a recursive tree-traversing strategy to process point-to-point distances in the visited tree nodes in batches, and thus require less time than the brute-force approach in which all pairwise distances have to be computed. Despite the promising experimental results, the complexity of such algorithms has not been thoroughly studied. In this paper, we present an analysis of such algorithms based on a geometric modeling approach. The main technique is to transform the analysis of point counts into a problem of quantifying the area of regions where pairwise distances can be processed in batches by the algorithm. From the analysis, we conclude that the number of pairwise distances that are left to be processed decreases exponentially with more levels of the tree visited. This leads to the proof of a time complexity lower than the quadratic time needed for a brute-force algorithm and builds the foundation for a constant-time approximate algorithm. Our model is also general in that it works for a wide range of point spatial distributions, histogram types, and space-partitioning options in building the tree. PMID:21804753
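
    For context, a minimal Python sketch of the brute-force spatial distance histogram that the analysed algorithms are designed to avoid: all O(N^2) pairwise distances are computed and binned at a fixed bucket width. The point count, domain and bucket width are arbitrary illustrative choices; the dual-tree algorithm studied in the paper processes most of these distances in batches instead.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative data set: N random 3D points (e.g., particle positions)
N = 1000
points = rng.uniform(0.0, 100.0, size=(N, 3))

# Brute-force SDH: histogram of all N*(N-1)/2 pairwise distances, bucket width w
w = 5.0
i, j = np.triu_indices(N, k=1)
dists = np.linalg.norm(points[i] - points[j], axis=1)
max_dist = 100.0 * np.sqrt(3.0)                  # diameter of the cube
edges = np.arange(0.0, max_dist + w, w)
sdh, _ = np.histogram(dists, bins=edges)

print(f"{dists.size} pairwise distances binned into {sdh.size} buckets")
```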

  14. Frequency analysis of gaze points with CT colonography interpretation using eye gaze tracking system

    NASA Astrophysics Data System (ADS)

    Tsutsumi, Shoko; Tamashiro, Wataru; Sato, Mitsuru; Okajima, Mika; Ogura, Toshihiro; Doi, Kunio

    2017-03-01

    It is important to investigate the eye tracking gaze points of experts in order to help trainees understand the image interpretation process. We investigated gaze points during the CT colonography (CTC) interpretation process and analyzed the difference in gaze points between experts and trainees. In this study, we attempted to understand how trainees can reach the level achieved by experts in viewing CTC. We used an eye gaze point sensing system, Gazefineder (JVCKENWOOD Corporation, Tokyo, Japan), which can detect the pupil point and corneal reflection point by dark pupil eye tracking. This system provides gaze point images and Excel data files. The subjects were radiological technologists either experienced or inexperienced in reading CTC. We performed observer studies in reading virtual pathology images and examined the observers' image interpretation process using gaze point data. Furthermore, we performed a frequency analysis of the eye tracking data using the fast Fourier transform (FFT). The frequency analysis allowed us to characterize the difference in gaze points between experts and trainees. The trainee's data contained large amounts of both high-frequency and low-frequency components, whereas both components were relatively low for the expert. Regarding the amount of eye movement in each 0.02 second interval, we found that the expert tended to interpret images slowly and calmly, while the trainee moved the eyes quickly and scanned wide areas. We can thus assess the difference in gaze points on CTC between experts and trainees by using the eye gaze point sensing system together with frequency analysis. The potential improvements in CTC interpretation for trainees can be evaluated by using gaze point data.
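
    A minimal Python sketch, with simulated data and assumed settings, of the kind of frequency analysis described: the frame-to-frame gaze displacement sampled every 0.02 s is transformed with the FFT and split into low- and high-frequency energy, mirroring the expert/trainee comparison above. The traces, cut-off frequency and recording length are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

fs = 50.0                       # gaze samples per second (0.02 s per sample)
n = 1500                        # 30 s of recording (illustrative)
t = np.arange(n) / fs

# Simulated gaze x/y traces: slow drift plus small jitter (stand-in for real data)
gaze_x = 200.0 * np.sin(2 * np.pi * 0.2 * t) + rng.normal(0, 5.0, n)
gaze_y = 100.0 * np.cos(2 * np.pi * 0.1 * t) + rng.normal(0, 5.0, n)

# Eye movement per 0.02 s sample: Euclidean displacement between consecutive samples
disp = np.hypot(np.diff(gaze_x), np.diff(gaze_y))

# Amplitude spectrum of the displacement signal
spectrum = np.abs(np.fft.rfft(disp - disp.mean()))
freqs = np.fft.rfftfreq(disp.size, d=1.0 / fs)

low = spectrum[freqs < 2.0].sum()               # "slow, calm" scanning component
high = spectrum[freqs >= 2.0].sum()             # "quick, wide" scanning component
print(f"low-frequency energy {low:.0f}, high-frequency energy {high:.0f}")
```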

  15. Non-hoop winding effect on bonding temperature of laser assisted tape winding process

    NASA Astrophysics Data System (ADS)

    Zaami, Amin; Baran, Ismet; Akkerman, Remko

    2018-05-01

    One of the advanced methods for the production of thermoplastic composites is laser assisted tape winding (LATW). Predicting the temperature in the LATW process is very important since the temperature at the nip-point (the bonding line across the tape width) plays a pivotal role in proper bonding and hence in the mechanical performance. Unlike hoop winding, where the nip-point is a straight line, non-hoop winding involves a curved nip-point line. Hence, non-hoop winding causes a somewhat different power input through laser rays and their reflections and consequently generates a complex, largely unknown temperature profile on the curved nip-point line. Investigating the temperature at the nip-point line is the point of interest in this study. In order to understand this effect, a numerical model is proposed to capture the effect of laser rays and their reflections on the nip-point temperature. To this end, a 3D optical model considering the objects in the LATW process is constructed. Then, the power distribution (absorption and reflection) from the optical analysis is used as an input (heat flux distribution) for the thermal analysis. The thermal analysis employs a fully-implicit advection-diffusion model to calculate the temperature on the surfaces. The results are examined to demonstrate the effect of winding direction on the curved nip-point line (tape width), which has not previously been considered in the literature. Furthermore, the results can be used for designing a better and more efficient LATW setup.

  16. Determining the Number of Clusters in a Data Set Without Graphical Interpretation

    NASA Technical Reports Server (NTRS)

    Aguirre, Nathan S.; Davies, Misty D.

    2011-01-01

    Cluster analysis is a data mining technique that is meant to simplify the process of classifying data points. The basic clustering process requires an input of data points and the number of clusters wanted. The clustering algorithm will then pick C starting points for the clusters, which can be either random spatial points or random data points. It then assigns each data point to the nearest C point, where "nearest" usually means Euclidean distance, but some algorithms use another criterion. The next step is determining whether the clustering arrangement thus found is within a certain tolerance. If it falls within this tolerance, the process ends. Otherwise the C points are adjusted based on how many data points are in each cluster, and the steps repeat until the algorithm converges.
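
    A minimal Python sketch of the clustering loop just described (standard k-means with Euclidean distance and random data points as the starting C points); the example data, number of clusters and tolerance are arbitrary choices and not from the report.

```python
import numpy as np

rng = np.random.default_rng(4)

def kmeans(data, c, tol=1e-4, max_iter=100):
    """Basic k-means: pick C starting points, assign to nearest, adjust, repeat."""
    centers = data[rng.choice(len(data), size=c, replace=False)]   # random data points as starting C points
    for _ in range(max_iter):
        # Assign each data point to the nearest C point (Euclidean distance)
        dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Adjust each C point to the mean of the data points assigned to it
        new_centers = np.array([data[labels == k].mean(axis=0) if np.any(labels == k) else centers[k]
                                for k in range(c)])
        # Stop when the C points move less than the tolerance (convergence)
        if np.linalg.norm(new_centers - centers) < tol:
            return new_centers, labels
        centers = new_centers
    return centers, labels

# Example: three well-separated 2D blobs
data = np.vstack([rng.normal(loc, 0.5, size=(100, 2)) for loc in ([0, 0], [5, 5], [0, 5])])
centers, labels = kmeans(data, c=3)
print(np.round(centers, 2))
```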

  17. A statistical model investigating the prevalence of tuberculosis in New York City using counting processes with two change-points

    PubMed Central

    ACHCAR, J. A.; MARTINEZ, E. Z.; RUFFINO-NETTO, A.; PAULINO, C. D.; SOARES, P.

    2008-01-01

    We considered a Bayesian analysis of the prevalence of tuberculosis cases in New York City from 1970 to 2000. This counting dataset presented two change-points during this period. We modelled the counting dataset using non-homogeneous Poisson processes in the presence of the two change-points. A Bayesian analysis of the data was carried out using Markov chain Monte Carlo methods. Simulated Gibbs samples for the parameters of interest were obtained using the WinBUGS software. PMID:18346287
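
    A minimal, non-Bayesian Python sketch of the same model structure (not the authors' WinBUGS analysis): yearly counts follow a piecewise-constant Poisson intensity with two change-points, the segment rates are profiled out by their maximum-likelihood estimates (the segment means), and the change-points are found by a grid search over the profile log likelihood. The simulated counts and shift years are illustrative.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(5)

# Illustrative yearly counts, 1970-2000, with rate shifts at 1979 and 1992
years = np.arange(1970, 2001)
true_rates = np.where(years < 1979, 20, np.where(years < 1992, 45, 25))
counts = rng.poisson(true_rates)

def profile_loglik(counts, cp1, cp2):
    """Poisson log likelihood with segment rates set to their MLEs (segment means)."""
    ll = 0.0
    for seg in (counts[:cp1], counts[cp1:cp2], counts[cp2:]):
        rate = seg.mean()
        ll += np.sum(seg * np.log(rate) - rate)     # constants dropped
    return ll

# Grid search over all admissible pairs of change-point indices
best = max(((cp1, cp2) for cp1, cp2 in combinations(range(1, len(counts)), 2)),
           key=lambda cp: profile_loglik(counts, *cp))

print("estimated change-points:", years[best[0]], years[best[1]])
```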

  18. Point pattern analysis applied to flood and landslide damage events in Switzerland (1972-2009)

    NASA Astrophysics Data System (ADS)

    Barbería, Laura; Schulte, Lothar; Carvalho, Filipe; Peña, Juan Carlos

    2017-04-01

    Damage caused by meteorological and hydrological extreme events depends on many factors, not only on hazard, but also on exposure and vulnerability. In order to reach a better understanding of the relation between these complex factors, their spatial pattern and the underlying processes, the spatial dependency between values of damage recorded at sites separated by different distances can be investigated by point pattern analysis. For the Swiss flood and landslide damage database (1972-2009), first steps of point pattern analysis have been carried out. The most severe events were selected (severe, very severe and catastrophic according to the GEES classification, a total of 784 damage points) and, amongst others, Ripley's K-test and L-test were performed. For this purpose, R's spatstat library was used. The results confirm that the damage points present a statistically significant clustered pattern, which could be connected to the prevalence of damage near watercourses and also to the rainfall distribution of each event, together with other factors. On the other hand, bivariate analysis shows there is no segregated pattern depending on process type: flood/debris flow vs landslide. This close relation points to a coupling between slope and fluvial processes, connectivity between small-size and middle-size catchments and the influence of the spatial distribution of precipitation, temperature (snow melt and snow line) and other predisposing factors such as soil moisture, land-cover and environmental conditions. Therefore, further studies will investigate the relationship between the spatial pattern and one or more covariates, such as elevation, distance from watercourse or land use. The final goal will be to fit a regression model to the data, so that the adjusted model predicts the intensity of the point process as a function of the above mentioned covariates.
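
    A minimal Python sketch of the kind of statistic mentioned: a naive Ripley's K and L estimate (no edge correction) for a clustered point pattern in a rectangular window, compared with the value expected under complete spatial randomness. The study itself used R's spatstat, which additionally handles edge correction and simulation envelopes; the coordinates below are simulated.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(6)

# Simulated damage locations in a 100 x 100 km window: a clustered pattern
centres = rng.uniform(0, 100, size=(15, 2))
points = np.vstack([c + rng.normal(0, 2.0, size=(20, 2)) for c in centres])

n = len(points)
area = 100.0 * 100.0
lam = n / area                                   # estimated intensity

# Naive Ripley's K (no edge correction): K(r) = (1/lam) * mean number of
# further points within distance r of a typical point
d = pdist(points)                                # all pairwise distances
radii = np.linspace(1.0, 20.0, 20)
K = np.array([2.0 * np.sum(d <= r) / (n * lam) for r in radii])
L = np.sqrt(K / np.pi)                           # L-function, equals r under CSR

for r, l_val in zip(radii[::5], L[::5]):
    print(f"r = {r:5.1f} km   L(r) - r = {l_val - r:6.2f}   (> 0 suggests clustering)")
```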

  19. The detection and analysis of point processes in biological signals

    NASA Technical Reports Server (NTRS)

    Anderson, D. J.; Correia, M. J.

    1977-01-01

    A pragmatic approach to the detection and analysis of discrete events in biomedical signals is taken. Examples from both clinical and basic research are provided. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information-carrying signals. The primary considerations are the factors governing event-time resolution and the effects that limits on this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.
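
    A minimal Python sketch, using a simulated recording and an arbitrary threshold, of the first steps described above: extract event times with a simple threshold detector, then compute inter-event intervals and a crude check of rate stationarity by comparing the two halves of the record. The signal, pulse shape and threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

fs = 10_000.0                                    # sampling rate (Hz)
t = np.arange(0, 10.0, 1.0 / fs)                 # 10 s record

# Simulated recording: noise plus brief positive deflections ("events")
signal = rng.normal(0.0, 0.2, t.size)
event_times_true = np.sort(rng.uniform(0, 10.0, 80))
for et in event_times_true:
    idx = int(et * fs)
    signal[idx:idx + 20] += 1.5                  # 2 ms rectangular pulse

# Threshold detector: an event is a threshold crossing from below
thresh = 1.0
above = signal > thresh
crossings = np.flatnonzero(above[1:] & ~above[:-1]) + 1
event_times = t[crossings]

# Inter-event intervals and a crude stationarity check (first half vs second half)
intervals = np.diff(event_times)
rate_first = np.sum(event_times < 5.0) / 5.0
rate_second = np.sum(event_times >= 5.0) / 5.0
print(f"{event_times.size} events, mean interval {intervals.mean()*1e3:.1f} ms, "
      f"rates {rate_first:.1f}/s vs {rate_second:.1f}/s")
```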

  20. What's the Point of a Raster ? Advantages of 3D Point Cloud Processing over Raster Based Methods for Accurate Geomorphic Analysis of High Resolution Topography.

    NASA Astrophysics Data System (ADS)

    Lague, D.

    2014-12-01

    High Resolution Topographic (HRT) datasets are predominantly stored and analyzed as 2D raster grids of elevations (i.e., Digital Elevation Models). Raster grid processing is common in GIS software and benefits from a large library of fast algorithms dedicated to geometrical analysis, drainage network computation and topographic change measurement. Yet, all instruments or methods currently generating HRT datasets (e.g., ALS, TLS, SFM, stereo satellite imagery) natively output 3D unstructured point clouds that are (i) non-regularly sampled, (ii) incomplete (e.g., submerged parts of river channels are rarely measured), and (iii) include 3D elements (e.g., vegetation, vertical features such as river banks or cliffs) that cannot be accurately described in a DEM. Interpolating the raw point cloud onto a 2D grid generally results in a loss of position accuracy and spatial resolution, and in more or less controlled interpolation. Here I demonstrate how studying earth surface topography and processes directly on native 3D point cloud datasets offers several advantages over raster based methods: point cloud methods preserve the accuracy of the original data, can better handle the evaluation of uncertainty associated with topographic change measurements and are more suitable to study vegetation characteristics and steep features of the landscape. In this presentation, I will illustrate and compare point cloud based and raster based workflows with various examples involving ALS, TLS and SFM for the analysis of bank erosion processes in bedrock and alluvial rivers, rockfall statistics (including rockfall volume estimates directly from point clouds) and the interaction of vegetation/hydraulics and sedimentation in salt marshes. These workflows use two recently published algorithms for point cloud classification (CANUPO) and point cloud comparison (M3C2), now implemented in the open source software CloudCompare.

  1. Multiobjective Sensitivity Analysis Of Sediment And Nitrogen Processes With A Watershed Model

    EPA Science Inventory

    This paper presents a computational analysis for evaluating critical non-point-source sediment and nutrient (specifically nitrogen) processes and management actions at the watershed scale. In the analysis, model parameters that bear key uncertainties were presumed to reflect the ...

  2. Estimating animal resource selection from telemetry data using point process models

    USGS Publications Warehouse

    Johnson, Devin S.; Hooten, Mevin B.; Kuhn, Carey E.

    2013-01-01

    To demonstrate the analysis of telemetry data with the point process approach, we analysed a data set of telemetry locations from northern fur seals (Callorhinus ursinus) in the Pribilof Islands, Alaska. Both a space–time and an aggregated space-only model were fitted. At the individual level, the space–time analysis showed little selection relative to the habitat covariates. However, at the study area level, the space-only model showed strong selection relative to the covariates.

  3. Method for cold stable biojet fuel

    DOEpatents

    Seames, Wayne S.; Aulich, Ted

    2015-12-08

    Plant or animal oils are processed to produce a fuel that operates at very cold temperatures and is suitable as an aviation turbine fuel, a diesel fuel, a fuel blendstock, or any fuel having a low cloud point, pour point or freeze point. The process is based on the cracking of plant or animal oils or their associated esters, known as biodiesel, to generate lighter chemical compounds that have substantially lower cloud, pour, and/or freeze points than the original oil or biodiesel. Cracked oil is processed using separation steps together with analysis to collect fractions with desired low temperature properties by removing undesirable compounds that do not possess the desired temperature properties.

  4. Monitoring urban subsidence based on SAR interferometric point target analysis

    USGS Publications Warehouse

    Zhang, Y.; Zhang, Jiahua; Gong, W.; Lu, Z.

    2009-01-01

    Interferometric point target analysis (IPTA) is one of the latest developments in radar interferometric processing. It is achieved by analysing the interferometric phases of individual point targets, which are discrete and present temporally stable backscattering characteristics, in long temporal series of interferometric SAR images. This paper analyzes the interferometric phase model of point targets and then addresses two key issues within the IPTA process. First, a spatial searching method is proposed to unwrap the interferometric phase difference between two neighboring point targets. The height residual error and linear deformation rate of each point target can then be calculated once a global reference point with known height correction and deformation history is chosen. Second, a spatial-temporal filtering scheme is proposed to further separate the atmospheric phase and nonlinear deformation phase from the residual interferometric phase. Finally, an experiment with the developed IPTA methodology is conducted over the Suzhou urban area. In total, 38 ERS-1/2 SAR scenes were analyzed, and deformation information for 3,546 point targets spanning 1992-2002 was generated. The IPTA-derived deformation shows very good agreement with the published result, which demonstrates that the IPTA technique can be developed into an operational tool to map ground subsidence over urban areas.

  5. Steelmaking process control using remote ultraviolet atomic emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Arnold, Samuel

    Steelmaking in North America is a multi-billion dollar industry that has faced tremendous economic and environmental pressure over the past few decades. Fierce competition has driven steel manufacturers to improve process efficiency through the development of real-time sensors to reduce operating costs. In particular, much attention has been focused on end point detection through furnace off-gas analysis. Typically, off-gas analysis is done with extractive sampling and gas analyzers such as non-dispersive infrared (NDIR) sensors. Passive emission spectroscopy offers a more attractive approach to end point detection as the equipment can be set up remotely. Using high resolution UV spectroscopy and applying sophisticated emission line detection software, a correlation was observed between metal emissions and the process end point during field trials. This correlation indicates a relationship between the metal emissions and the status of a steelmaking melt which can be used to improve overall process efficiency.

  6. 75 FR 14361 - Notification, Documentation, and Recordkeeping Requirements for Inspected Establishments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-25

    ... establishment's process control plans, that is, its Hazard Analysis and Critical Control Point plans. DATES... control plans, i.e., its Hazard Analysis and Critical Control Point (HACCP) plans; and (3) make the recall... systematic prevention of biological, chemical, and physical hazards. HACCP plans are establishment-developed...

  7. The application of quality risk management to the bacterial endotoxins test: use of hazard analysis and critical control points.

    PubMed

    Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti

    2013-01-01

    Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can be also integrated with other tools, such as fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide data templates necessary to document and useful to follow current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. The aim of this article is to show how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.

  8. The business process management software for successful quality management and organization: A case study from the University of Split School of Medicine.

    PubMed

    Sapunar, Damir; Grković, Ivica; Lukšić, Davor; Marušić, Matko

    2016-05-01

    Our aim was to describe a comprehensive model of internal quality management (QM) at a medical school founded on the business process analysis (BPA) software tool. BPA software tool was used as the core element for description of all working processes in our medical school, and subsequently the system served as the comprehensive model of internal QM. The quality management system at the University of Split School of Medicine included the documentation and analysis of all business processes within the School. The analysis revealed 80 weak points related to one or several business processes. A precise analysis of medical school business processes allows identification of unfinished, unclear and inadequate points in these processes, and subsequently the respective improvements and increase of the QM level and ultimately a rationalization of the institution's work. Our approach offers a potential reference model for development of common QM framework allowing a continuous quality control, i.e. the adjustments and adaptation to contemporary educational needs of medical students. Copyright © 2016 by Academy of Sciences and Arts of Bosnia and Herzegovina.

  9. Inertial navigation sensor integrated motion analysis for autonomous vehicle navigation

    NASA Technical Reports Server (NTRS)

    Roberts, Barry; Bhanu, Bir

    1992-01-01

    Recent work on INS integrated motion analysis is described. Results were obtained with a maximally passive system of obstacle detection (OD) for ground-based vehicles and rotorcraft. The OD approach involves motion analysis of imagery acquired by a passive sensor in the course of vehicle travel to generate range measurements to world points within the sensor FOV. INS data and scene analysis results are used to enhance interest point selection, the matching of the interest points, and the subsequent motion-based computations, tracking, and OD. The most important lesson learned from the research described here is that the incorporation of inertial data into the motion analysis program greatly improves the analysis and makes the process more robust.

  10. Quantitative naturalistic methods for detecting change points in psychotherapy research: an illustration with alliance ruptures.

    PubMed

    Eubanks-Carter, Catherine; Gorman, Bernard S; Muran, J Christopher

    2012-01-01

    Analysis of change points in psychotherapy process could increase our understanding of mechanisms of change. In particular, naturalistic change point detection methods that identify turning points or breakpoints in time series data could enhance our ability to identify and study alliance ruptures and resolutions. This paper presents four categories of statistical methods for detecting change points in psychotherapy process: criterion-based methods, control chart methods, partitioning methods, and regression methods. Each method's utility for identifying shifts in the alliance is illustrated using a case example from the Beth Israel Psychotherapy Research program. Advantages and disadvantages of the various methods are discussed.
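
    A minimal Python sketch (not from the paper) of one member of the control chart family of methods listed above: a two-sided CUSUM on session-level alliance scores that flags a change point when the cumulative deviation from a baseline mean exceeds a decision limit. The simulated ratings and the chart parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)

# Simulated alliance ratings over 40 sessions with a drop (rupture) after session 25
scores = np.concatenate([rng.normal(5.0, 0.4, 25), rng.normal(3.8, 0.4, 15)])

# Two-sided CUSUM: baseline mean/sd from the early sessions, slack k, decision limit h
mu0, sigma = scores[:10].mean(), scores[:10].std(ddof=1)
k, h = 0.5 * sigma, 4.0 * sigma

s_hi = s_lo = 0.0
change_point = None
for i, x in enumerate(scores):
    s_hi = max(0.0, s_hi + (x - mu0) - k)        # accumulates upward shifts
    s_lo = max(0.0, s_lo + (mu0 - x) - k)        # accumulates downward shifts
    if s_hi > h or s_lo > h:
        change_point = i
        break

print("change flagged at session", change_point)
```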

  11. Implementation of hazard analysis and critical control point (HACCP) in dried anchovy production process

    NASA Astrophysics Data System (ADS)

    Citraresmi, A. D. P.; Wahyuni, E. E.

    2018-03-01

    The aim of this study was to examine the implementation of Hazard Analysis and Critical Control Point (HACCP) for the identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain the product's physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole process of making dried anchovy from the receiving of raw materials to the packaging of the final product. The data were analyzed using a descriptive method. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Pre-Requisite Programs (PRP) and a preparation stage consisting of 5 initial stages and the 7 principles of HACCP. The results showed that CCPs were found in the boiling process flow, with the significant hazard of Listeria monocytogenes bacteria, and in the final sorting process, with the significant hazard of foreign material contamination in the product. Actions taken were controlling the boiling temperature at 100-105 °C for 3-5 minutes and training the sorting process employees.

  12. Public Participation Process for Registration Actions

    EPA Pesticide Factsheets

    Describes the process for registration actions which provides the opportunity for the public to comment on major registration decisions at a point in the registration process when comprehensive information and analysis are available.

  13. State Analysis: A Control Architecture View of Systems Engineering

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert D.

    2005-01-01

    A viewgraph presentation on the state analysis process is shown. The topics include: 1) Issues with growing complexity; 2) Limits of common practice; 3) Exploiting a control point of view; 4) A glimpse at the State Analysis process; 5) Synergy with model-based systems engineering; and 6) Bridging the systems to software gap.

  14. Conducting a Competitive Prototype Acquisition Program: An Account of the Joint Light Tactical Vehicle (JLTV) Technology Development Phase

    DTIC Science & Technology

    2013-03-01

    [Only table-of-contents fragments are preserved in this record: Requirements Analysis Process; Requirements Management and Analysis Plan; Knowledge Point Reviews; RMAP/CDD Process Analysis and Results; TD Phase Begins.]

  15. Effect of processing conditions on oil point pressure of moringa oleifera seed.

    PubMed

    Aviara, N A; Musa, W B; Owolarafe, O K; Ogunsina, B S; Oluwole, F A

    2015-07-01

    Seed oil expression is an important economic venture in rural Nigeria. The traditional techniques of carrying out the operation are not only energy sapping and time consuming but also wasteful. In order to reduce the tedium involved in the expression of oil from Moringa oleifera seed and to develop efficient equipment for carrying out the operation, the oil point pressure of the seed was determined under different processing conditions using a laboratory press. The processing conditions employed were moisture content (4.78, 6.00, 8.00 and 10.00 % wet basis), heating temperature (50, 70, 85 and 100 °C) and heating time (15, 20, 25 and 30 min). Results showed that the oil point pressure increased with increase in seed moisture content, but decreased with increase in heating temperature and heating time within the above ranges. The highest oil point pressure of 1.1239 MPa was obtained at the processing conditions of 10.00 % moisture content, 50 °C heating temperature and 15 min heating time. The lowest oil point pressure obtained was 0.3164 MPa, and it occurred at a moisture content of 4.78 %, a heating temperature of 100 °C and a heating time of 30 min. Analysis of variance (ANOVA) showed that all the processing variables and their interactions had a significant effect on the oil point pressure of Moringa oleifera seed at the 1 % level of significance. This was further demonstrated using response surface methodology (RSM). Tukey's test and Duncan's multiple range analysis successfully separated the means, and a multiple regression equation was used to express the relationship between the oil point pressure of Moringa oleifera seed and its moisture content, processing temperature, heating time and their interactions. The model yielded coefficients that enabled the oil point pressure of the seed to be predicted with a very high coefficient of determination.

  16. Error analysis in stereo vision for location measurement of 3D point

    NASA Astrophysics Data System (ADS)

    Li, Yunting; Zhang, Jun; Tian, Jinwen

    2015-12-01

    Location measurement of a 3D point in stereo vision is subject to different sources of uncertainty that propagate to the final result. Most current methods of error analysis are based on an ideal intersection model that calculates the uncertainty region of the point location by intersecting the two pixel fields of view, which may produce loose bounds. Moreover, only a few sources of error, such as pixel error or camera position, are taken into account in the analysis. In this paper we present a straightforward and practical method to estimate the location error that takes most sources of error into account. We sum up and simplify all the input errors to five parameters by a rotation transformation. Then we use the fast midpoint method algorithm to derive the mathematical relationships between the target point and these parameters. Thus, the expectation and covariance matrix of the 3D point location are obtained, which constitute the uncertainty region of the point location. Afterwards, we trace the propagation of the primitive input errors through the stereo system and throughout the whole analysis process, from primitive input errors to localization error. Our method has the same level of computational complexity as the state-of-the-art method. Finally, extensive experiments are performed to verify the performance of our method.

  17. Analysis of backward error recovery for concurrent processes with recovery blocks

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Lee, Y. H.

    1982-01-01

    Three different methods of implementing recovery blocks (RBs) are considered: the asynchronous, synchronous, and pseudo recovery point implementations. Pseudo recovery points are proposed so that unbounded rollback may be avoided while maintaining process autonomy. Probabilistic models for analyzing these three methods were developed under standard assumptions in computer performance analysis, i.e., exponential distributions for the related random variables. The interval between two successive recovery lines for asynchronous RBs, the mean loss in computation power for the synchronized method, and the additional overhead and rollback distance when pseudo recovery points are used were estimated.

  18. A COMPARISON OF TRANSIENT INFINITE ELEMENTS AND TRANSIENT KIRCHHOFF INTEGRAL METHODS FOR FAR FIELD ACOUSTIC ANALYSIS

    DOE PAGES

    WALSH, TIMOTHY F.; JONES, ANDREA; BHARDWAJ, MANOJ; ...

    2013-04-01

    Finite element analysis of transient acoustic phenomena on unbounded exterior domains is very common in engineering analysis. In these problems there is a common need to compute the acoustic pressure at points outside of the acoustic mesh, since meshing to points of interest is impractical in many scenarios. In aeroacoustic calculations, for example, the acoustic pressure may be required at tens or hundreds of meters from the structure. In these cases, a method is needed for post-processing the acoustic results to compute the response at far-field points. In this paper, we compare two methods for computing far-field acoustic pressures, one derived directly from the infinite element solution, and the other from the transient version of the Kirchhoff integral. Here, we show that the infinite element approach alleviates the large storage requirements that are typical of Kirchhoff integral and related procedures, and also does not suffer from loss of accuracy that is an inherent part of computing numerical derivatives in the Kirchhoff integral. In order to further speed up and streamline the process of computing the acoustic response at points outside of the mesh, we also address the nonlinear iterative procedure needed for locating parametric coordinates within the host infinite element of far-field points, the parallelization of the overall process, linear solver requirements, and system stability considerations.

  19. Points of View Analysis Revisited: Fitting Multidimensional Structures to Optimal Distance Components with Cluster Restrictions on the Variables.

    ERIC Educational Resources Information Center

    Meulman, Jacqueline J.; Verboon, Peter

    1993-01-01

    Points of view analysis, as a way to deal with individual differences in multidimensional scaling, was largely supplanted by the weighted Euclidean model. It is argued that the approach deserves new attention, especially as a technique to analyze group differences. A streamlined and integrated process is proposed. (SLD)

  20. Investigation Of In-Line Monitoring Options At H Canyon/HB Line For Plutonium Oxide Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sexton, L.

    2015-10-14

    H Canyon and HB Line have a production goal of 1 MT per year of plutonium oxide feedstock for the MOX facility by FY17 (AFS-2 mission). In order to meet this goal, steps will need to be taken to improve processing efficiency. One concept for achieving this goal is to implement in-line process monitoring at key measurement points within the facilities. In-line monitoring during operations has the potential to increase throughput and efficiency while reducing costs associated with laboratory sample analysis. In the work reported here, we mapped the plutonium oxide process, identified key measurement points, investigated alternate technologies that could be used for in-line analysis, and initiated a throughput benefit analysis.

  1. Safety Analysis of Soybean Processing for Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Hentges, Dawn L.

    1999-01-01

    Soybean (cv. Hoyt) is one of the crops planned for food production within the Advanced Life Support System Integration Testbed (ALSSIT), a proposed habitat simulation for long duration lunar/Mars missions. Soybeans may be processed into a variety of food products, including soymilk, tofu, and tempeh. Due to the closed environmental system and the importance of crew health maintenance, food safety is a primary concern on long duration space missions. Identification of the food safety hazards and critical control points associated with the closed ALSSIT system is essential for the development of safe food processing techniques and equipment. A Hazard Analysis Critical Control Point (HACCP) model was developed to reflect proposed production and processing protocols for ALSSIT soybeans. Soybean processing was placed in the type III risk category. During the processing of ALSSIT-grown soybeans, critical control points were identified to control microbiological hazards, particularly mycotoxins, and chemical hazards from antinutrients. Critical limits were suggested at each CCP. Food safety recommendations regarding the hazards and risks associated with growing, harvesting, and processing soybeans; biomass management; and the use of multifunctional equipment were made in consideration of the limitations and restraints of the closed ALSSIT.

  2. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant.

    PubMed

    Hung, Yu-Ting; Liu, Chi-Te; Peng, I-Chen; Hsu, Chin; Yu, Roch-Chui; Cheng, Kuan-Chen

    2015-09-01

    To ensure the safety of the peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. Critical control points for the peanut butter ice cream were then determined as the pasteurization and freezing process. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping were followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving the production management. Copyright © 2015. Published by Elsevier B.V.

  3. Comparative study of building footprint estimation methods from LiDAR point clouds

    NASA Astrophysics Data System (ADS)

    Rozas, E.; Rivera, F. F.; Cabaleiro, J. C.; Pena, T. F.; Vilariño, D. L.

    2017-10-01

    Building area calculation from LiDAR points is still a difficult task with no clear solution. The varied characteristics of buildings, such as shape and size, make the process too complex to automate fully. However, several algorithms and techniques have been used in order to obtain an approximated hull. 3D building reconstruction and urban planning are examples of important applications that benefit from accurate building footprint estimation. In this paper, we have carried out a study of the accuracy of building footprint estimation from LiDAR points. The analysis focuses on the processing steps following object recognition and classification, assuming that labeling of building points has already been performed. We then perform an in-depth analysis of the influence of point density on the accuracy of the building area estimation. In addition, a set of buildings of different sizes and shapes was manually classified so that it can be used as a benchmark.

  4. "From that moment on my life changed": turning points in the healing process for men recovering from child sexual abuse.

    PubMed

    Easton, Scott D; Leone-Sheehan, Danielle M; Sophis, Ellen J; Willis, Danny G

    2015-01-01

    Recent research indicates that child sexual abuse often undermines the health of boys and men across the lifespan. However, some male survivors experience a turning point marking a positive change in their health trajectories and healing process. Although frequently discussed in reference to physical health problems or addictions, very little is known about turning points with respect to child sexual abuse for men. The purpose of this secondary qualitative analysis was to describe the different types of turning points experienced by male survivors who completed the 2010 Health and Well-Being Survey (N = 250). Using conventional content analysis, researchers identified seven types of turning points that were classified into three broad categories: influential relationships (professional and group support, personal relationships), insights and new meanings (cognitive realizations, necessity to change, spiritual transformation), and action-oriented communication (disclosure of CSA, pursuit of justice). Implications for clinical practice and future research are discussed.

  5. JPL-ANTOPT antenna structure optimization program

    NASA Technical Reports Server (NTRS)

    Strain, D. M.

    1994-01-01

    New antenna path-length error and pointing-error structure optimization codes were recently added to the MSC/NASTRAN structural analysis computer program. Path-length and pointing errors are important measures of structure-related antenna performance. The path-length and pointing errors are treated as scalar displacements for static loading cases. These scalar displacements can be subject to constraint during the optimization process. Path-length and pointing-error calculations supplement the other optimization and sensitivity capabilities of NASTRAN. The analysis and design functions were implemented as 'DMAP ALTERs' to the Design Optimization (SOL 200) Solution Sequence of MSC/NASTRAN, Version 67.5.

  6. [Risk management--a new aspect of quality assessment in intensive care medicine: first results of an analysis of the DIVI's interdisciplinary quality assessment research group].

    PubMed

    Stiletto, R; Röthke, M; Schäfer, E; Lefering, R; Waydhas, Ch

    2006-10-01

    Patient safety has become one of the major aspects of clinical management in recent years, with research focused largely on malpractice. In contrast to process analysis in non-medical fields, the analysis of errors during in-patient treatment has been neglected. Patient risk management can be defined as a structured procedure in a clinical unit with the aim of reducing harmful events. A risk point model was created based on a Delphi process and founded on the DIVI data register. The risk point model was evaluated in clinically working ICU departments participating in the register database. The results of the risk point evaluation will be integrated in the next database update. This might be a step towards improving the reliability of the register for measuring quality in the ICU.

  7. Fast Edge Detection and Segmentation of Terrestrial Laser Scans Through Normal Variation Analysis

    NASA Astrophysics Data System (ADS)

    Che, E.; Olsen, M. J.

    2017-09-01

    Terrestrial Laser Scanning (TLS) utilizes light detection and ranging (lidar) to effectively and efficiently acquire point cloud data for a wide variety of applications. Segmentation is a common post-processing procedure that groups the point cloud into a number of clusters to simplify the data for the subsequent modelling and analysis needed in most applications. This paper presents a novel method to rapidly segment TLS data based on edge detection and region growing. First, by computing the projected incidence angles and performing normal variation analysis, the silhouette edges and intersection edges are separated from the smooth surfaces. Then a modified region growing algorithm groups the points lying on the same smooth surface. The proposed method efficiently exploits the gridded scan pattern produced during acquisition of TLS data by most sensors and takes advantage of parallel programming to process approximately 1 million points per second. Moreover, the proposed segmentation does not require estimation of the normal at each point, which prevents errors in normal estimation from propagating into the segmentation. Both an indoor and an outdoor scene are used in an experiment to demonstrate and discuss the effectiveness and robustness of the proposed segmentation method.
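
    A minimal Python sketch of region growing on a gridded scan, in the spirit of the second stage described above but greatly simplified and not the authors' algorithm: neighbouring grid cells are merged into the same segment while the angle between their normals stays below a threshold, so segments stop at sharp edges. The synthetic two-plane "scan" and the angle threshold are illustrative assumptions (and, unlike the paper's method, this toy version does use per-cell normals).

```python
import numpy as np
from collections import deque

# Gridded scan of surface normals: a 40 x 40 "scan" covering two planes that
# meet along a sharp edge (left half faces +z, right half is tilted 60 degrees)
H, W = 40, 40
normals = np.zeros((H, W, 3))
normals[:, :20] = [0.0, 0.0, 1.0]
normals[:, 20:] = [np.sin(np.radians(60)), 0.0, np.cos(np.radians(60))]

angle_thresh = np.radians(10.0)                  # max normal deviation within one segment
labels = -np.ones((H, W), dtype=int)

def grow(seed, label):
    """Flood-fill region growing: add 4-neighbours whose normals are close to the current cell's."""
    queue = deque([seed])
    labels[seed] = label
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < H and 0 <= nc < W and labels[nr, nc] < 0:
                cos_angle = np.clip(np.dot(normals[r, c], normals[nr, nc]), -1.0, 1.0)
                if np.arccos(cos_angle) < angle_thresh:
                    labels[nr, nc] = label
                    queue.append((nr, nc))

next_label = 0
for r in range(H):
    for c in range(W):
        if labels[r, c] < 0:
            grow((r, c), next_label)
            next_label += 1

print("number of segments:", next_label)         # expect 2: one per smooth surface
```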

  8. Application of multi response optimization with grey relational analysis and fuzzy logic method

    NASA Astrophysics Data System (ADS)

    Winarni, Sri; Wahyu Indratno, Sapto

    2018-01-01

    Multi-response optimization is an optimization process that considers multiple responses simultaneously. The purpose of this research is to find the optimum point in a multi-response optimization process using grey relational analysis and the fuzzy logic method. The optimum point is determined from the Fuzzy-GRG (Grey Relational Grade) variable, which is converted from the signal-to-noise ratios of the responses involved. The case study used in this research is the optimization of electrical process parameters in electrical discharge machining. It was found that the combination of treatments resulting in optimum MRR and SR was a gap voltage of 70 V, a peak current of 9 A and a duty factor of 0.8.
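
    A minimal Python sketch of the grey relational part of the analysis (the fuzzy step is omitted, and the data are invented): each response is normalized (larger-the-better for MRR, smaller-the-better for SR), grey relational coefficients are computed with a distinguishing coefficient of 0.5, and their average gives the grade used to rank the treatment combinations.

```python
import numpy as np

# Illustrative experimental results for 4 treatment combinations:
# MRR (material removal rate, larger is better) and SR (surface roughness, smaller is better)
mrr = np.array([12.0, 18.5, 15.2, 20.1])
sr = np.array([3.2, 4.1, 2.8, 3.9])

def normalize(x, larger_is_better):
    return (x - x.min()) / (x.max() - x.min()) if larger_is_better \
        else (x.max() - x) / (x.max() - x.min())

zeta = 0.5                                       # distinguishing coefficient
responses = np.column_stack([normalize(mrr, True), normalize(sr, False)])
deviation = 1.0 - responses                      # deviation from the ideal (1.0)
grc = (deviation.min() + zeta * deviation.max()) / (deviation + zeta * deviation.max())
grade = grc.mean(axis=1)                         # grey relational grade per treatment

best = grade.argmax()
print("grey relational grades:", np.round(grade, 3), "-> best treatment:", best + 1)
```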

  9. Teacher training of secondary - Orient from the point of view practice

    NASA Astrophysics Data System (ADS)

    Hai, Tuong Duy; Huong, Nguyen Thanh

    2018-01-01

    The article presents a point of view on teacher training based on an analysis of teaching and learning practices in the disciplinary subjects taught in high school. Drawing on the analysis of the teaching faculty and of the students' learning process in these subjects, it offers reference points for the ongoing training of secondary teachers so that they can adapt to educational practice.

  10. Relationship between quality improvement processes and clinical performance.

    PubMed

    Damberg, Cheryl L; Shortell, Stephen M; Raube, Kristiana; Gillies, Robin R; Rittenhouse, Diane; McCurdy, Rodney K; Casalino, Lawrence P; Adams, John

    2010-08-01

    To examine the association between performance on clinical process measures and intermediate outcomes and the use of chronic care management processes (CMPs), electronic medical record (EMR) capabilities, and participation in external quality improvement (QI) initiatives. Cross-sectional analysis of linked 2006 clinical performance scores from the Integrated Healthcare Association's pay-for-performance program and survey data from the 2nd National Study of Physician Organizations among 108 California physician organizations (POs). Controlling for differences in PO size, organization type (medical group or independent practice association), and Medicaid revenue, we used ordinary least squares regression analysis to examine the association between the use of CMPs, EMR capabilities, and external QI initiatives and performance on the following 3 clinical composite measures: diabetes management, processes of care, and intermediate outcomes (diabetes and cardiovascular). Greater use of CMPs was significantly associated with clinical performance: among POs using more than 5 CMPs, we observed a 3.2-point higher diabetes management score on a performance scale with scores ranging from 0 to 100 (P <.001), while for each 1.0-point increase on the CMP index, we observed a 1.0-point gain in intermediate outcomes (P <.001). Participation in external QI initiatives was positively associated with improved delivery of clinical processes of care: a 1.0-point increase on the QI index translated into a 1.4-point gain in processes-of-care performance (P = .02). No relationship was observed between EMR capabilities and performance. Greater investments in CMPs and QI interventions may help POs raise clinical performance and achieve success under performance-based accountability schemes.

  11. An information-based approach to change-point analysis with applications to biophysics and cell biology.

    PubMed

    Wiggins, Paul A

    2015-07-21

    This article describes the application of a change-point algorithm to the analysis of stochastic signals in biological systems whose underlying state dynamics consist of transitions between discrete states. Applications of this analysis include molecular-motor stepping, fluorophore bleaching, electrophysiology, particle and cell tracking, detection of copy number variation by sequencing, tethered-particle motion, etc. We present a unified approach to the analysis of processes whose noise can be modeled by Gaussian, Wiener, or Ornstein-Uhlenbeck processes. To fit the model, we exploit explicit, closed-form algebraic expressions for maximum-likelihood estimators of model parameters and estimated information loss of the generalized noise model, which can be computed extremely efficiently. We implement change-point detection using the frequentist information criterion (which, to our knowledge, is a new information criterion). The frequentist information criterion specifies a single, information-based statistical test that is free from ad hoc parameters and requires no prior probability distribution. We demonstrate this information-based approach in the analysis of simulated and experimental tethered-particle-motion data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
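
    A minimal Python sketch of the same flavour of analysis, but not the authors' frequentist-information-criterion implementation: a single change point in the mean of Gaussian noise is located with closed-form maximum-likelihood estimators (segment means and a pooled variance) and accepted only if it overcomes an information-criterion penalty, here a BIC-style penalty used as a stand-in. The simulated step signal and the minimum segment length are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)

# Simulated signal: a step in the mean at index 120 buried in Gaussian noise
y = np.concatenate([rng.normal(0.0, 1.0, 120), rng.normal(1.5, 1.0, 80)])
n = y.size

def gauss_loglik(segments):
    """Gaussian log likelihood with per-segment MLE means and a pooled MLE variance."""
    rss = sum(np.sum((s - s.mean()) ** 2) for s in segments)
    var = rss / n
    return -0.5 * n * (np.log(2 * np.pi * var) + 1.0)

# No-change model: 2 parameters (mean, variance)
bic0 = -2 * gauss_loglik([y]) + 2 * np.log(n)

# One-change model: profile over the change index; 4 parameters (2 means, variance, change index)
ks = range(5, n - 5)
logliks = [gauss_loglik([y[:k], y[k:]]) for k in ks]
k_hat = ks[int(np.argmax(logliks))]
bic1 = -2 * max(logliks) + 4 * np.log(n)

if bic1 < bic0:
    print(f"change point accepted at index {k_hat}")
else:
    print("no change point supported by the penalized likelihood")
```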

  12. Digital analyzer for point processes based on first-in-first-out memories

    NASA Astrophysics Data System (ADS)

    Basano, Lorenzo; Ottonello, Pasquale; Schiavi, Enore

    1992-06-01

    We present an entirely new version of a multipurpose instrument designed for the statistical analysis of point processes, especially those characterized by high bunching. A long sequence of pulses can be recorded in the RAM bank of a personal computer via a suitably designed front end which employs a pair of first-in-first-out (FIFO) memories; these allow one to build an analyzer that, besides being simpler from the electronic point of view, is capable of sustaining much higher intensity fluctuations of the point process. The overflow risk of the device is evaluated by treating the FIFO pair as a queueing system. The apparatus was tested using both a deterministic signal and a sequence of photoelectrons obtained from laser light scattered by random surfaces.
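    The overflow-risk argument (treating the FIFO as a queueing system) can be illustrated with a toy discrete-event simulation of Poisson pulse arrivals into a finite buffer drained at a fixed readout rate. Buffer depth and rates are arbitrary illustrative values, not the instrument's parameters:

    ```python
    import numpy as np

    def fifo_overflow_fraction(arrival_rate, readout_rate, depth, t_end, seed=0):
        """Fraction of pulses lost to overflow in a finite FIFO fed by a Poisson
        pulse train and emptied at a fixed readout rate (toy queueing model)."""
        rng = np.random.default_rng(seed)
        t, queue, lost, total = 0.0, 0, 0, 0
        next_read = 1.0 / readout_rate
        while t < t_end:
            t += rng.exponential(1.0 / arrival_rate)
            # process all readout ticks that occurred before this arrival
            while next_read <= t:
                queue = max(queue - 1, 0)
                next_read += 1.0 / readout_rate
            total += 1
            if queue < depth:
                queue += 1
            else:
                lost += 1
        return lost / total

    # Bunched input running close to the readout rate, with a shallow buffer
    print(fifo_overflow_fraction(arrival_rate=9e5, readout_rate=1e6,
                                 depth=16, t_end=0.05))
    ```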

  13. Theories of State Analyzing the Policy Process,

    DTIC Science & Technology

    1973-11-01

    values and goals - which is the heart of the rational process - in reality cannot be separated from the actor's empirical analysis of the situation... rigorous and objective in analysis. How different would our foreign policy actually be? Would it necessarily be better? In fact, would one even need... State, but the fact is that much of the outside research and analysis of policy process is pointed at the... As Robert Rothstein says in his valuable...

  14. Characterisation of titanium-titanium boride composites processed by powder metallurgy techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Selva Kumar, M., E-mail: sel_mcet@yahoo.co.in; Chandrasekar, P.; Chandramohan, P.

    2012-11-15

    In this work, a detailed characterisation of titanium-titanium boride composites processed by three powder metallurgy techniques, namely, hot isostatic pressing, spark plasma sintering and vacuum sintering, was conducted. Two composites with different volume percents of titanium boride reinforcement were used for the investigation. One was titanium with 20% titanium boride, and the other was titanium with 40% titanium boride (by volume). Characterisation was performed using X-ray diffraction, electron probe micro analysis - energy dispersive spectroscopy and wavelength dispersive spectroscopy, image analysis and scanning electron microscopy. The characterisation results confirm the completion of the titanium boride reaction. The results reveal the presence of titanium boride reinforcement in different morphologies such as needle-shaped whiskers, short agglomerated whiskers and fine plates. The paper also discusses how mechanical properties such as microhardness, elastic modulus and Poisson's ratio are influenced by the processing techniques as well as the volume fraction of the titanium boride reinforcement. Highlights: Ti-TiB composites were processed by HIP, SPS and vacuum sintering. The completion of the Ti-TiB2 reaction was confirmed by XRD, SEM and EPMA studies. Hardness and elastic properties of Ti-TiB composites were discussed. Processing techniques were compared with respect to their microstructure.

  15. Data processing workflows from low-cost digital survey to various applications: three case studies of Chinese historic architecture

    NASA Astrophysics Data System (ADS)

    Sun, Z.; Cao, Y. K.

    2015-08-01

    The paper focuses on the versatility of data processing workflows ranging from BIM-based survey to structural analysis and reverse modeling. In China today, a large number of historic buildings are in need of restoration, reinforcement and renovation, but architects are not prepared for the conversion from the booming AEC industry to architectural preservation. As surveyors working with architects in such projects, we have to develop an efficient, low-cost digital survey workflow that is robust to various types of architecture, and to process the captured data for the architects. Although laser scanning yields high accuracy in architectural heritage documentation and the workflow is quite straightforward, its cost and portability hinder it from being used in projects where budget and efficiency are of prime concern. We integrate Structure from Motion techniques with UAV and total station in data acquisition. The captured data are processed for various purposes, illustrated with three case studies: the first is an as-built BIM for a historic building based on point clouds registered to Ground Control Points; the second concerns structural analysis for a damaged bridge using Finite Element Analysis software; the last relates to parametric automated feature extraction from captured point clouds for reverse modeling and fabrication.

  16. Structure Line Detection from LIDAR Point Clouds Using Topological Elevation Analysis

    NASA Astrophysics Data System (ADS)

    Lo, C. Y.; Chen, L. C.

    2012-07-01

    Airborne LIDAR point clouds, which contain a considerable number of points on object surfaces, are essential to building modeling. In the last two decades, studies have identified structure lines using two main approaches, data-driven and model-driven. These studies have shown that automatic modeling processes depend on certain considerations, such as the thresholds used, initial values, designed formulas, and predefined cues. Following the development of laser scanning systems, scanning rates have increased and can provide point clouds with higher point density. Therefore, this study proposes using topological elevation analysis (TEA) to detect structure lines instead of threshold-dependent concepts and predefined constraints. This analysis contains two parts: data pre-processing and structure line detection. To preserve the original elevation information, a pseudo-grid for generating digital surface models is produced during the first part. The highest point in each grid cell is set as the elevation value, and its original three-dimensional position is preserved. In the second part, using TEA, the structure lines are identified based on the topology of local elevation changes in two directions. Because structure lines have certain geometric properties, their locations show small relief in the radial direction and steep elevation changes in the circular direction. Following the proposed approach, TEA can be used to determine 3D line information without selecting thresholds. For validation, the TEA results are compared with those of a region-growing approach. The results indicate that the proposed method can produce structure lines using dense point clouds.
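    A minimal sketch of the pseudo-grid step described above, under the assumption of a simple dictionary-based grid: each point cloud is binned into a regular XY grid, the highest return per cell supplies the elevation value, and that point's original 3D coordinates are preserved (the cell size is an assumed parameter):

    ```python
    import numpy as np

    def pseudo_grid_highest(points, cell=0.5):
        """points: (N, 3) array of x, y, z. Returns {(ix, iy): (x, y, z)},
        keeping, per cell, the original coordinates of the highest point."""
        ix = np.floor(points[:, 0] / cell).astype(int)
        iy = np.floor(points[:, 1] / cell).astype(int)
        grid = {}
        for key, p in zip(zip(ix, iy), points):
            if key not in grid or p[2] > grid[key][2]:
                grid[key] = p
        return grid

    rng = np.random.default_rng(0)
    pts = rng.uniform(0, 10, size=(10000, 3))
    dsm = pseudo_grid_highest(pts, cell=1.0)
    print(len(dsm), "grid cells with preserved highest points")
    ```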

  17. Earthquake Forecasting Through Semi-periodicity Analysis of Labeled Point Processes

    NASA Astrophysics Data System (ADS)

    Quinteros Cartaya, C. B. M.; Nava Pichardo, F. A.; Glowacka, E.; Gomez-Trevino, E.

    2015-12-01

    Large earthquakes have semi-periodic behavior as a result of critically self-organized processes of stress accumulation and release in some seismogenic region. Thus, large earthquakes in a region constitute semi-periodic sequences with recurrence times varying slightly from periodicity. Nava et al. (2013) and Quinteros et al. (2013) realized that not all earthquakes in a given region need to belong to the same sequence, since there can be more than one process of stress accumulation and release in it; they also proposed a method to identify semi-periodic sequences through analytic Fourier analysis. This work presents improvements on the above-mentioned method: the influence of earthquake size on the spectral analysis and its importance in identifying semi-periodic events, which means that earthquake occurrence times are treated as a labeled point process; the estimation of appropriate upper-limit uncertainties to use in forecasts; and the use of Bayesian analysis to evaluate forecast performance. This improved method is applied to specific regions: the southwestern coast of Mexico, the northeastern Japan Arc, the San Andreas Fault zone at Parkfield, and northeastern Venezuela.
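    A sketch of the spectral step, assuming a simple magnitude-weighted Fourier (Schuster-type) amplitude over candidate periods; the weighting scheme and the toy catalogue below are illustrative, not the authors' exact formulation:

    ```python
    import numpy as np

    def labeled_spectrum(times, weights, periods):
        """Amplitude spectrum of a labeled point process: each event at time t_k
        with label w_k (e.g. a size proxy) contributes w_k * exp(2*pi*i*t_k/T)."""
        times = np.asarray(times, float)
        weights = np.asarray(weights, float)
        amps = []
        for T in periods:
            z = np.sum(weights * np.exp(2j * np.pi * times / T))
            amps.append(np.abs(z) / weights.sum())
        return np.array(amps)

    # Toy catalogue: a ~30-year semi-periodic sequence of large events plus background
    rng = np.random.default_rng(2)
    seq = np.arange(1900, 2015, 30) + rng.normal(0, 2, 4)
    bg = rng.uniform(1900, 2015, 20)
    t = np.concatenate([seq, bg])
    w = np.concatenate([np.full(len(seq), 5.0), np.ones(len(bg))])
    periods = np.linspace(10, 60, 200)
    amps = labeled_spectrum(t, w, periods)
    print("spectral peak at period ~", round(periods[np.argmax(amps)], 1), "years")
    ```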

  18. An analysis of the Petri net based model of the human body iron homeostasis process.

    PubMed

    Sackmann, Andrea; Formanowicz, Dorota; Formanowicz, Piotr; Koch, Ina; Blazewicz, Jacek

    2007-02-01

    In the paper a Petri net based model of the human body iron homeostasis is presented and analyzed. The body iron homeostasis is an important but not fully understood complex process. The modeling of the process presented in the paper is expressed in the language of Petri net theory. An application of this theory to the description of biological processes allows for very precise analysis of the resulting models. Here, such an analysis of the body iron homeostasis model from a mathematical point of view is given.

  19. Automatic generation of endocardial surface meshes with 1-to-1 correspondence from cine-MR images

    NASA Astrophysics Data System (ADS)

    Su, Yi; Teo, S.-K.; Lim, C. W.; Zhong, L.; Tan, R. S.

    2015-03-01

    In this work, we develop an automatic method to generate a set of 4D 1-to-1 corresponding surface meshes of the left ventricle (LV) endocardial surface which are motion registered over the whole cardiac cycle. These 4D meshes have 1-to-1 point correspondence over the entire set and are suitable for advanced computational processing, such as shape analysis, motion analysis and finite element modelling. The inputs to the method are the set of 3D LV endocardial surface meshes of the different frames/phases of the cardiac cycle. Each of these meshes is reconstructed independently from border-delineated MR images, and they have no correspondence in terms of number of vertices/points or mesh connectivity. To generate point correspondence, the first frame of the LV mesh model is used as a template to be matched to the shape of the meshes in the subsequent phases. There are two stages in the mesh correspondence process: (1) a coarse matching phase, and (2) a fine matching phase. In the coarse matching phase, an initial rough matching between the template and the target is achieved using a radial basis function (RBF) morphing process. The feature points on the template and target meshes are automatically identified using a 16-segment nomenclature of the LV. In the fine matching phase, a progressive mesh projection process is used to conform the rough estimate to fit the exact shape of the target. In addition, an optimization-based smoothing process is used to achieve superior mesh quality and continuous point motion.
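    A compressed sketch of the coarse RBF-morphing stage, assuming SciPy's RBFInterpolator (SciPy >= 1.7) and hypothetical landmark arrays; in the actual pipeline the paired landmarks are identified automatically from the 16-segment LV nomenclature:

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def rbf_morph(template_vertices, template_landmarks, target_landmarks):
        """Warp all template vertices so that the template landmarks map onto the
        corresponding target landmarks, using a thin-plate-spline RBF fitted to
        the landmark displacements (coarse matching only)."""
        displacements = target_landmarks - template_landmarks
        warp = RBFInterpolator(template_landmarks, displacements,
                               kernel="thin_plate_spline")
        return template_vertices + warp(template_vertices)

    # Hypothetical data: 500 template vertices and 17 paired landmarks in 3D
    rng = np.random.default_rng(0)
    verts = rng.normal(size=(500, 3))
    lm_src = rng.normal(size=(17, 3))
    lm_dst = lm_src + 0.1 * rng.normal(size=(17, 3))
    morphed = rbf_morph(verts, lm_src, lm_dst)
    print(morphed.shape)
    ```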

  20. Research on on-line monitoring technology for steel ball's forming process based on load signal analysis method

    NASA Astrophysics Data System (ADS)

    Li, Ying-jun; Ai, Chang-sheng; Men, Xiu-hua; Zhang, Cheng-liang; Zhang, Qi

    2013-04-01

    This paper presents a novel on-line monitoring technology for assessing forming quality in the steel-ball forming process based on a load-signal analysis method, in order to reveal the bottom die's load characteristics in the initial cold-heading forging of steel balls. A mechanical model of the cold-header production process is established and analyzed using the finite element method, and the maximum cold-heading force is calculated. The results show that monitoring the cold-heading process through the upsetting force is reasonable and feasible. Forming defects are reflected in three feature points of the bottom-die signal: the initial point, the inflection point, and the peak point. A novel PVDF piezoelectric force sensor, simple in construction and convenient to install, is designed; its sensitivity is calculated and its characteristics are analyzed by FEM. The PVDF piezoelectric force sensor is fabricated to acquire the actual load signals in the cold-heading process and is calibrated with a special device. An on-line monitoring measuring system is built. The characteristics of the actual signals recognized by the learning and identification algorithm are consistent with the simulation results. Identification of the actual signals shows that the timing differences of all feature points for qualified products do not exceed ±6 ms, and the amplitude differences are less than ±3%. The calibration and application experiments show that the PVDF force sensor has good static and dynamic performance and is suitable for dynamic measurement of the upsetting force. The method greatly improves the level of automation and machining precision; using a damage-identification method that depends on the steel grade, the equipment capacity factor has been improved to 90%.

  1. Genetic interaction analysis of point mutations enables interrogation of gene function at a residue-level resolution

    PubMed Central

    Braberg, Hannes; Moehle, Erica A.; Shales, Michael; Guthrie, Christine; Krogan, Nevan J.

    2014-01-01

    We have achieved a residue-level resolution of genetic interaction mapping – a technique that measures how the function of one gene is affected by the alteration of a second gene – by analyzing point mutations. Here, we describe how to interpret point mutant genetic interactions, and outline key applications for the approach, including interrogation of protein interaction interfaces and active sites, and examination of post-translational modifications. Genetic interaction analysis has proven effective for characterizing cellular processes; however, to date, systematic high-throughput genetic interaction screens have relied on gene deletions or knockdowns, which limits the resolution of gene function analysis and poses problems for multifunctional genes. Our point mutant approach addresses these issues, and further provides a tool for in vivo structure-function analysis that complements traditional biophysical methods. We also discuss the potential for genetic interaction mapping of point mutations in human cells and its application to personalized medicine. PMID:24842270

  2. About plasma points' generation in Z-pinch

    NASA Astrophysics Data System (ADS)

    Afonin, V. I.; Potapov, A. V.; Lazarchuk, V. P.; Murugov, V. M.; Senik, A. V.

    1997-05-01

    Streak-tube study results (in the visible and x-ray ranges) on the dynamics of a fast Z-pinch formed by the explosion of a metal wire in the diode of a high-current generator are presented. The current amplitude in the load reached ~180 kA with a rise time of ~50 ns. Analysis of the results points to the capability of controlling the generation of hot plasma points in the Z-pinch.

  3. Calculation of the ELISA's cut-off based on the change-point analysis method for detection of Trypanosoma cruzi infection in Bolivian dogs in the absence of controls.

    PubMed

    Lardeux, Frédéric; Torrico, Gino; Aliaga, Claudia

    2016-07-04

    In ELISAs, sera of individuals infected by Trypanosoma cruzi show absorbance values above a cut-off value. The cut-off is generally computed by means of formulas that need absorbance readings of negative (and sometimes positive) controls, which are included in the titer plates amongst the unknown samples. When no controls are available, other techniques should be employed, such as change-point analysis. The method was applied to Bolivian dog sera processed by ELISA to diagnose T. cruzi infection. In each titer plate, the change-point analysis estimated a step point which correctly discriminated between known positive and known negative sera, unlike some of the six usual cut-off formulas tested. For analysing the ELISA results, the change-point method was as good as the usual cut-off formula of the form "mean + 3 standard deviations of the negative controls". Change-point analysis is therefore an efficient alternative method for analysing ELISA absorbance values when no controls are available.
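    A minimal sketch of a cut-off computed without controls, assuming the step point can be located by sorting the plate's absorbance values and minimizing the pooled within-group variance around a single split (an Otsu-like criterion used here as a stand-in for the authors' exact change-point method):

    ```python
    import numpy as np

    def changepoint_cutoff(absorbances):
        """Sort the plate's absorbance values and find the split that minimizes
        the pooled within-group variance; return the midpoint as the cut-off."""
        x = np.sort(np.asarray(absorbances, float))
        n = len(x)
        best_k, best_cost = None, np.inf
        for k in range(2, n - 1):
            cost = x[:k].var() * k + x[k:].var() * (n - k)
            if cost < best_cost:
                best_k, best_cost = k, cost
        return 0.5 * (x[best_k - 1] + x[best_k])

    # Toy plate: negatives around 0.10 OD, positives around 0.80 OD
    rng = np.random.default_rng(3)
    plate = np.concatenate([rng.normal(0.10, 0.03, 60), rng.normal(0.80, 0.15, 30)])
    print("cut-off ~", round(changepoint_cutoff(plate), 3))
    ```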

  4. Congruence analysis of point clouds from unstable stereo image sequences

    NASA Astrophysics Data System (ADS)

    Jepping, C.; Bethmann, F.; Luhmann, T.

    2014-06-01

    This paper deals with the correction of exterior orientation parameters of stereo image sequences over deformed free-form surfaces without control points. Such an imaging situation can occur, for example, during photogrammetric car crash test recordings, where onboard high-speed stereo cameras are used to measure 3D surfaces. As a result of such measurements, 3D point clouds of deformed surfaces are generated for a complete stereo sequence. The first objective of this research focuses on the development and investigation of methods for the detection of corresponding spatial and temporal tie points within the stereo image sequences (by stereo image matching and 3D point tracking) that are robust enough to reliably handle occlusions and other disturbances that may occur. The second objective is the analysis of object deformations in order to detect stable areas (congruence analysis). For this purpose a RANSAC-based method for congruence analysis has been developed. This process is based on the sequential transformation of randomly selected point groups from one epoch to another using a 3D similarity transformation. The paper gives a detailed description of the congruence analysis. The approach has been tested successfully on synthetic and real image data.
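    A sketch of the RANSAC-style congruence idea under simplifying assumptions (point correspondences between epochs are already known, and a closed-form Umeyama fit supplies the 3D similarity transform); thresholds and sample sizes are illustrative:

    ```python
    import numpy as np

    def similarity_transform(src, dst):
        """Least-squares 3D similarity transform (scale, rotation, translation)
        mapping src onto dst, via Umeyama's closed-form solution."""
        mu_s, mu_d = src.mean(0), dst.mean(0)
        xs, xd = src - mu_s, dst - mu_d
        cov = xd.T @ xs / len(src)
        U, D, Vt = np.linalg.svd(cov)
        S = np.eye(3)
        if np.linalg.det(U) * np.linalg.det(Vt) < 0:
            S[2, 2] = -1                      # avoid reflections
        R = U @ S @ Vt
        scale = np.trace(np.diag(D) @ S) / xs.var(axis=0).sum()
        t = mu_d - scale * R @ mu_s
        return scale, R, t

    def congruent_points(epoch_a, epoch_b, n_iter=500, tol=0.01, seed=0):
        """RANSAC-style search for the largest point subset that moves rigidly
        (up to a similarity transform) between two epochs of corresponding points."""
        rng = np.random.default_rng(seed)
        best_inliers = np.zeros(len(epoch_a), bool)
        for _ in range(n_iter):
            idx = rng.choice(len(epoch_a), size=4, replace=False)
            s, R, t = similarity_transform(epoch_a[idx], epoch_b[idx])
            resid = np.linalg.norm(epoch_b - (s * (epoch_a @ R.T) + t), axis=1)
            inliers = resid < tol
            if inliers.sum() > best_inliers.sum():
                best_inliers = inliers
        return best_inliers

    # Toy example: 200 stable points plus 40 deformed ones
    rng = np.random.default_rng(1)
    A = rng.uniform(-1, 1, (240, 3))
    B = A.copy()
    B[200:] += rng.normal(0, 0.1, (40, 3))    # local deformation
    print(congruent_points(A, B).sum(), "points flagged as congruent")
    ```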

  5. Food safety and nutritional quality for the prevention of non communicable diseases: the Nutrient, hazard Analysis and Critical Control Point process (NACCP).

    PubMed

    Di Renzo, Laura; Colica, Carmen; Carraro, Alberto; Cenci Goga, Beniamino; Marsella, Luigi Tonino; Botta, Roberto; Colombo, Maria Laura; Gratteri, Santo; Chang, Ting Fa Margherita; Droli, Maurizio; Sarlo, Francesca; De Lorenzo, Antonino

    2015-04-23

    The important role of food and nutrition in public health is being increasingly recognized as crucial for its potential impact on health-related quality of life and the economy, both at the societal and individual levels. The prevalence of non-communicable diseases calls for a reformulation of our view of food. The Hazard Analysis and Critical Control Point (HACCP) system, first implemented in the EU with Directive 43/93/CEE, later replaced by Regulation CE 178/2002 and Regulation CE 852/2004, is the internationally agreed approach for food safety control. Our aim is to develop a new procedure for the assessment of the Nutrient, hazard Analysis and Critical Control Point (NACCP) process, for total quality management (TQM), and to optimize nutritional levels. NACCP was based on four general principles: i) guarantee health maintenance; ii) evaluate and assure the nutritional quality of food and TQM; iii) give correct information to consumers; iv) ensure an ethical profit. There are three stages in the application of the NACCP process: 1) application of NACCP for quality principles; 2) application of NACCP for health principles; 3) implementation of the NACCP process. The actions are: 1) identification of nutritional markers, which must remain intact throughout the food supply chain; 2) identification of critical control points, which must be monitored in order to minimize the likelihood of a reduction in quality; 3) establishment of critical limits to maintain adequate nutrient levels; 4) establishment and implementation of effective monitoring procedures for critical control points; 5) establishment of corrective actions; 6) identification of metabolic biomarkers; 7) evaluation of the effects of food intake through the application of specific clinical trials; 8) establishment of procedures for consumer information; 9) implementation of the Health Claim Regulation EU 1924/2006; 10) starting a training program. We calculate the risk assessment as follows: Risk (R) = probability (P) × damage (D). The NACCP process considers the entire food supply chain "from farm to consumer"; at each point of the chain it is necessary to implement tight monitoring in order to guarantee optimal nutritional quality.

  6. Delivering cognitive processing therapy in a community health setting: The influence of Latino culture and community violence on posttraumatic cognitions.

    PubMed

    Marques, Luana; Eustis, Elizabeth H; Dixon, Louise; Valentine, Sarah E; Borba, Christina P C; Simon, Naomi; Kaysen, Debra; Wiltsey-Stirman, Shannon

    2016-01-01

    Despite the applicability of cognitive processing therapy (CPT) for posttraumatic stress disorder (PTSD) to addressing sequelae of a range of traumatic events, few studies have evaluated whether the treatment itself is applicable across diverse populations. The present study examined differences and similarities among non-Latino, Latino Spanish-speaking, and Latino English-speaking clients in rigid beliefs-or "stuck points"-associated with PTSD symptoms in a sample of community mental health clients. We utilized the procedures of content analysis to analyze stuck point logs and impact statements of 29 participants enrolled in a larger implementation trial for CPT. Findings indicated that the content of stuck points was similar across Latino and non-Latino clients, although fewer total stuck points were identified for Latino clients compared to non-Latino clients. Given that identification of stuck points is central to implementing CPT, difficulty identifying stuck points could pose significant challenges for implementing CPT among Latino clients and warrants further examination. Thematic analysis of impact statements revealed the importance of family, religion, and the urban context (e.g., poverty, violence exposure) in understanding how clients organize beliefs and emotions associated with trauma. Clinical recommendations for implementing CPT in community settings and the identification of stuck points are provided. (c) 2016 APA, all rights reserved).

  7. Asymmetric simple exclusion process with position-dependent hopping rates: Phase diagram from boundary-layer analysis.

    PubMed

    Mukherji, Sutapa

    2018-03-01

    In this paper, we study a one-dimensional totally asymmetric simple exclusion process with position-dependent hopping rates. Under open boundary conditions, this system exhibits boundary-induced phase transitions in the steady state. Similarly to totally asymmetric simple exclusion processes with uniform hopping, the phase diagram consists of low-density, high-density, and maximal-current phases. In various phases, the shape of the average particle density profile across the lattice including its boundary-layer parts changes significantly. Using the tools of boundary-layer analysis, we obtain explicit solutions for the density profile in different phases. A detailed analysis of these solutions under different boundary conditions helps us obtain the equations for various phase boundaries. Next, we show how the shape of the entire density profile including the location of the boundary layers can be predicted from the fixed points of the differential equation describing the boundary layers. We discuss this in detail through several examples of density profiles in various phases. The maximal-current phase appears to be an especially interesting phase where the boundary layer flows to a bifurcation point on the fixed-point diagram.
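    A toy Monte Carlo sketch of the model class described in the abstract above - an open-boundary TASEP with a site-dependent hopping rate - which is useful for eyeballing the density profiles that the boundary-layer analysis predicts. The rate profile, system size, and boundary rates are illustrative choices:

    ```python
    import numpy as np

    def tasep_density_profile(L=100, alpha=0.3, beta=0.3, sweeps=20000, seed=0):
        """Random-sequential-update TASEP with position-dependent hopping rates
        p[i] (hop from site i to i+1), injection rate alpha at the left boundary
        and extraction rate beta at the right one. Returns the time-averaged
        density profile after discarding the first half of the sweeps."""
        rng = np.random.default_rng(seed)
        p = 1.0 - 0.3 * np.sin(np.pi * np.arange(L) / L)   # illustrative rate profile
        occ = np.zeros(L, dtype=int)
        density = np.zeros(L)
        samples = 0
        for sweep in range(sweeps):
            for _ in range(L + 1):
                i = rng.integers(-1, L)        # -1 plays the role of the left reservoir
                if i == -1:
                    if occ[0] == 0 and rng.random() < alpha:
                        occ[0] = 1
                elif i == L - 1:
                    if occ[i] == 1 and rng.random() < beta:
                        occ[i] = 0
                elif occ[i] == 1 and occ[i + 1] == 0 and rng.random() < p[i]:
                    occ[i], occ[i + 1] = 0, 1
            if sweep > sweeps // 2:
                density += occ
                samples += 1
        return density / samples

    profile = tasep_density_profile(L=100, alpha=0.2, beta=0.4, sweeps=5000)
    print(profile[:3].round(2), "...", profile[-3:].round(2))
    ```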

  8. Asymmetric simple exclusion process with position-dependent hopping rates: Phase diagram from boundary-layer analysis

    NASA Astrophysics Data System (ADS)

    Mukherji, Sutapa

    2018-03-01

    In this paper, we study a one-dimensional totally asymmetric simple exclusion process with position-dependent hopping rates. Under open boundary conditions, this system exhibits boundary-induced phase transitions in the steady state. Similarly to totally asymmetric simple exclusion processes with uniform hopping, the phase diagram consists of low-density, high-density, and maximal-current phases. In various phases, the shape of the average particle density profile across the lattice including its boundary-layer parts changes significantly. Using the tools of boundary-layer analysis, we obtain explicit solutions for the density profile in different phases. A detailed analysis of these solutions under different boundary conditions helps us obtain the equations for various phase boundaries. Next, we show how the shape of the entire density profile including the location of the boundary layers can be predicted from the fixed points of the differential equation describing the boundary layers. We discuss this in detail through several examples of density profiles in various phases. The maximal-current phase appears to be an especially interesting phase where the boundary layer flows to a bifurcation point on the fixed-point diagram.

  9. The Shuttle processing contractors (SPC) reliability program at the Kennedy Space Center - The real world

    NASA Astrophysics Data System (ADS)

    McCrea, Terry

    The Shuttle Processing Contract (SPC) workforce consists of Lockheed Space Operations Co. as prime contractor, with Grumman, Thiokol Corporation, and Johnson Controls World Services as subcontractors. During the design phase, reliability engineering is instrumental in influencing the development of systems that meet the Shuttle fail-safe program requirements. Reliability engineers accomplish this objective by performing FMEA (failure modes and effects analysis) to identify potential single failure points. When technology, time, or resources do not permit a redesign to eliminate a single failure point, the single failure point information is formatted into a change request and presented to senior management of SPC and NASA for risk acceptance. In parallel with the FMEA, safety engineering conducts a hazard analysis to assure that potential hazards to personnel are assessed. The combined effort (FMEA and hazard analysis) is published as a system assurance analysis. Special ground rules and techniques are developed to perform and present the analysis. The reliability program at KSC is vigorously pursued, and has been extremely successful. The ground support equipment and facilities used to launch and land the Space Shuttle maintain an excellent reliability record.

  10. Linear and quadratic models of point process systems: contributions of patterned input to output.

    PubMed

    Lindsay, K A; Rosenberg, J R

    2012-08-01

    In the 1880's Volterra characterised a nonlinear system using a functional series connecting continuous input and continuous output. Norbert Wiener, in the 1940's, circumvented problems associated with the application of Volterra series to physical problems by deriving from it a new series of terms that are mutually uncorrelated with respect to Gaussian processes. Subsequently, Brillinger, in the 1970's, introduced a point-process analogue of Volterra's series connecting point-process inputs to the instantaneous rate of point-process output. We derive here a new series from this analogue in which its terms are mutually uncorrelated with respect to Poisson processes. This new series expresses how patterned input in a spike train, represented by third-order cross-cumulants, is converted into the instantaneous rate of an output point-process. Given experimental records of suitable duration, the contribution of arbitrary patterned input to an output process can, in principle, be determined. Solutions for linear and quadratic point-process models with one and two inputs and a single output are investigated. Our theoretical results are applied to isolated muscle spindle data in which the spike trains from the primary and secondary endings from the same muscle spindle are recorded in response to stimulation of one and then two static fusimotor axons in the absence and presence of a random length change imposed on the parent muscle. For a fixed mean rate of input spikes, the analysis of the experimental data makes explicit which patterns of two input spikes contribute to an output spike. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, taking advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experimental results show that the proposed framework can efficiently manage and process big LiDAR data.

  12. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  13. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  14. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  15. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  16. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  17. Unsupervised Detection of Planetary Craters by a Marked Point Process

    NASA Technical Reports Server (NTRS)

    Troglio, G.; Benediktsson, J. A.; Le Moigne, J.; Moser, G.; Serpico, S. B.

    2011-01-01

    With the launch of several planetary missions in the last decade, a large number of planetary images are being acquired. Preferably, automatic and robust processing techniques need to be used for data analysis because of the huge amount of acquired data. Here, the aim is to achieve a robust and general methodology for crater detection. A novel technique based on a marked point process is proposed. First, the contours in the image are extracted. The object boundaries are modeled as a configuration of an unknown number of random ellipses, i.e., the contour image is considered as a realization of a marked point process. Then, an energy function is defined, containing both an a priori energy and a likelihood term. The global minimum of this function is estimated by using reversible-jump Markov chain Monte Carlo dynamics and a simulated annealing scheme. The main idea behind marked point processes is to model objects within a stochastic framework: marked point processes represent a very promising current approach in stochastic image modeling and provide a powerful and methodologically rigorous framework to efficiently map and detect objects and structures in an image with excellent robustness to noise. The proposed method for crater detection has several feasible applications. One such application area is image registration by matching the extracted features.

  18. Sample to answer visualization pipeline for low-cost point-of-care blood cell counting

    NASA Astrophysics Data System (ADS)

    Smith, Suzanne; Naidoo, Thegaran; Davies, Emlyn; Fourie, Louis; Nxumalo, Zandile; Swart, Hein; Marais, Philip; Land, Kevin; Roux, Pieter

    2015-03-01

    We present a visualization pipeline from sample to answer for point-of-care blood cell counting applications. Effective and low-cost point-of-care medical diagnostic tests provide developing countries and rural communities with accessible healthcare solutions [1], and can be particularly beneficial for blood cell count tests, which are often the starting point in the process of diagnosing a patient [2]. The initial focus of this work is on total white and red blood cell counts, using a microfluidic cartridge [3] for sample processing. Analysis of the processed samples has been implemented by means of two main optical visualization systems developed in-house: 1) a fluidic operation analysis system using high speed video data to determine volumes, mixing efficiency and flow rates, and 2) a microscopy analysis system to investigate homogeneity and concentration of blood cells. Fluidic parameters were derived from the optical flow [4] as well as color-based segmentation of the different fluids using a hue-saturation-value (HSV) color space. Cell count estimates were obtained using automated microscopy analysis and were compared to a widely accepted manual method for cell counting using a hemocytometer [5]. The results using the first-iteration microfluidic device [3] showed that the simplest - and thus lowest-cost - approach for microfluidic component implementation was not adequate compared to techniques based on manual cell counting principles. An improved microfluidic design has been developed to incorporate enhanced mixing and metering components, which together with this work provides the foundation on which to successfully implement automated, rapid and low-cost blood cell counting tests.
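    A compressed sketch of the HSV-based fluid segmentation step, assuming OpenCV (cv2) is available and using an illustrative hue range for a dyed reagent; the actual thresholds, the optical-flow measurements, and the cell-counting stage are outside this snippet:

    ```python
    import cv2
    import numpy as np

    def fluid_area_fraction(frame_bgr, hue_lo=100, hue_hi=130):
        """Segment a colored fluid in a video frame by thresholding in HSV space
        and return the fraction of pixels classified as fluid (toy thresholds)."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        lower = np.array([hue_lo, 50, 50], np.uint8)
        upper = np.array([hue_hi, 255, 255], np.uint8)
        mask = cv2.inRange(hsv, lower, upper)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        return mask.mean() / 255.0

    # Usage on a synthetic frame: a blue patch (the "fluid") on a grey background
    frame = np.full((240, 320, 3), 128, np.uint8)
    frame[60:180, 80:240] = (200, 60, 60)      # BGR: predominantly blue patch
    print(round(fluid_area_fraction(frame), 3))
    ```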

  19. Magnetic topological analysis of coronal bright points

    NASA Astrophysics Data System (ADS)

    Galsgaard, K.; Madjarska, M. S.; Moreno-Insertis, F.; Huang, Z.; Wiegelmann, T.

    2017-10-01

    Context. We report on the first of a series of studies on coronal bright points which investigate the physical mechanism that generates these phenomena. Aims: The aim of this paper is to understand the magnetic-field structure that hosts the bright points. Methods: We use longitudinal magnetograms taken by the Solar Optical Telescope with the Narrowband Filter Imager. For a single case, magnetograms from the Helioseismic and Magnetic Imager were added to the analysis. The longitudinal magnetic field component is used to derive the potential magnetic fields of the large regions around the bright points. A magneto-static field extrapolation method is tested to verify the accuracy of the potential field modelling. The three-dimensional magnetic fields are investigated for the presence of magnetic null points and their influence on the local magnetic domain. Results: In nine out of ten cases the bright point resides in areas where the coronal magnetic field contains an opposite-polarity intrusion defining a magnetic null point above it. We find that X-ray bright points reside, in these nine cases, in a limited part of the projected fan-dome area, either fully inside the dome or expanding over a limited area below which typically a dominant flux concentration resides. The tenth bright point is located in a bipolar loop system without an overlying null point. Conclusions: All bright points in coronal holes and two out of three bright points in quiet Sun regions are seen to reside in regions containing a magnetic null point. One or more as yet unidentified processes generate the bright points in specific regions of the fan-dome structure. The movies are available at http://www.aanda.org

  20. Automation of the Image Analysis for Thermographic Inspection

    NASA Technical Reports Server (NTRS)

    Plotnikov, Yuri A.; Winfree, William P.

    1998-01-01

    Several data processing procedures for pulse thermal inspection require preliminary determination of an unflawed region. Typically, an initial analysis of the thermal images is performed by an operator to determine the locations of the unflawed and defective areas. In the present work an algorithm is developed for automatically determining a reference point corresponding to an unflawed region. Results are obtained for defects which are arbitrarily located in the inspection region. A comparison is presented of the distributions of derived values with correct and incorrect localization of the reference point. Different algorithms for automatic determination of the reference point are compared.

  1. Point-point and point-line moving-window correlation spectroscopy and its applications

    NASA Astrophysics Data System (ADS)

    Zhou, Qun; Sun, Suqin; Zhan, Daqi; Yu, Zhiwu

    2008-07-01

    In this paper, we present a new extension of generalized two-dimensional (2D) correlation spectroscopy. Two new algorithms, namely point-point (P-P) correlation and point-line (P-L) correlation, have been introduced to perform the moving-window 2D correlation (MW2D) analysis. The new method has been applied to a spectral model consisting of two different processes. The results indicate that P-P correlation spectroscopy can unveil the details and reconstitute the entire process, whilst P-L correlation provides the general features of the processes concerned. The phase transition behavior of dimyristoylphosphatidylethanolamine (DMPE) has been studied using MW2D correlation spectroscopy. The newly proposed method verifies that the phase transition temperature is 56 °C, the same as the result obtained from a differential scanning calorimeter. To illustrate the new method further, a lysine and lactose mixture has been studied under thermal perturbation. Using the P-P MW2D correlation, the Maillard reaction of the mixture was clearly monitored, which is very difficult using a conventional display of FTIR spectra.

  2. Global-to-local, shape-based, real and virtual landmarks for shape modeling by recursive boundary subdivision

    NASA Astrophysics Data System (ADS)

    Rueda, Sylvia; Udupa, Jayaram K.

    2011-03-01

    Landmark based statistical object modeling techniques, such as Active Shape Model (ASM), have proven useful in medical image analysis. Identification of the same homologous set of points in a training set of object shapes is the most crucial step in ASM, which has encountered challenges such as (C1) defining and characterizing landmarks; (C2) ensuring homology; (C3) generalizing to n > 2 dimensions; (C4) achieving practical computations. In this paper, we propose a novel global-to-local strategy that attempts to address C3 and C4 directly and works in Rn. The 2D version starts from two initial corresponding points determined in all training shapes via a method α, and subsequently by subdividing the shapes into connected boundary segments by a line determined by these points. A shape analysis method β is applied on each segment to determine a landmark on the segment. This point introduces more pairs of points, the lines defined by which are used to further subdivide the boundary segments. This recursive boundary subdivision (RBS) process continues simultaneously on all training shapes, maintaining synchrony of the level of recursion, and thereby keeping correspondence among generated points automatically by the correspondence of the homologous shape segments in all training shapes. The process terminates when no subdividing lines are left to be considered that indicate (as per method β) that a point can be selected on the associated segment. Examples of α and β are presented based on (a) distance; (b) Principal Component Analysis (PCA); and (c) the novel concept of virtual landmarks.

  3. Fast ground filtering for TLS data via Scanline Density Analysis

    NASA Astrophysics Data System (ADS)

    Che, Erzhuo; Olsen, Michael J.

    2017-07-01

    Terrestrial Laser Scanning (TLS) efficiently collects 3D information based on lidar (light detection and ranging) technology. TLS has been widely used in topographic mapping, engineering surveying, forestry, industrial facilities, cultural heritage, and so on. Ground filtering is a common procedure in lidar data processing, which separates the point cloud data into ground points and non-ground points. Effective ground filtering is helpful for subsequent procedures such as segmentation, classification, and modeling. Numerous ground filtering algorithms have been developed for Airborne Laser Scanning (ALS) data. However, many of these are error prone when applied to TLS data because of its different angle of view and highly variable resolution. Further, many ground filtering techniques are of limited use in challenging topography and have difficulty coping with objects such as short vegetation and steep slopes. Lastly, due to the large size of point cloud data, operations such as data traversal, multiple iterations, and neighbor searching significantly affect the computational efficiency. In order to overcome these challenges, we present an efficient ground filtering method for TLS data via a Scanline Density Analysis, which is very fast because it exploits the grid structure storing TLS data. The process first separates the ground candidates, density features, and unidentified points based on an analysis of point density within each scanline. Second, a region growing using the scan pattern is performed to cluster the ground candidates and further refine the ground points (clusters). In the experiments, the effectiveness, parameter robustness, and efficiency of the proposed method are demonstrated with datasets collected from an urban scene and a natural scene, respectively.
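    A strongly simplified sketch of the scanline-density idea, assuming the points of one scanline are available in acquisition order: ground returns in a TLS scan tend to form dense, smoothly spaced runs along a scanline, so an inverse-spacing (density) threshold flags ground candidates. The data layout and threshold are assumptions, not the authors' parameters:

    ```python
    import numpy as np

    def ground_candidates_per_scanline(scanline_xyz, density_thresh=5.0):
        """scanline_xyz: (N, 3) points of one scanline in acquisition order.
        A point is a ground candidate when its local point density (inverse of
        the spacing to its neighbours along the scanline) is high."""
        d = np.linalg.norm(np.diff(scanline_xyz, axis=0), axis=1)
        spacing = np.empty(len(scanline_xyz))
        spacing[0], spacing[-1] = d[0], d[-1]
        spacing[1:-1] = 0.5 * (d[:-1] + d[1:])
        density = 1.0 / np.maximum(spacing, 1e-6)
        return density > density_thresh

    # Toy scanline: tightly spaced "ground" returns, then sparse off-ground returns
    ground = np.column_stack([np.arange(0, 10, 0.05), np.zeros(200), np.zeros(200)])
    sparse = np.column_stack([10 + np.arange(0, 10, 0.8), np.zeros(13), 2 + np.zeros(13)])
    line = np.vstack([ground, sparse])
    mask = ground_candidates_per_scanline(line)
    print("ground fraction flagged:", mask[:200].mean(), "| off-ground:", mask[200:].mean())
    ```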

  4. Elastohydrodynamic lubrication theory

    NASA Technical Reports Server (NTRS)

    Hamrock, B. J.; Dowson, D.

    1982-01-01

    The isothermal elastohydrodynamic lubrication (EHL) of a point contact was analyzed numerically by simultaneously solving the elasticity and Reynolds equations. In the elasticity analysis the contact zone was divided into equal rectangular areas, and it was assumed that a uniform pressure was applied over each area. In the numerical analysis of the Reynolds equation, a phi analysis (where phi = p·h^(3/2), i.e., the pressure times the film thickness to the 3/2 power) was used to help the relaxation process. The EHL point contact analysis is applicable for the entire range of elliptical parameters and is valid for any combination of rolling and sliding within the contact.

  5. Systems view of adipogenesis via novel omics-driven and tissue-specific activity scoring of network functional modules

    NASA Astrophysics Data System (ADS)

    Nassiri, Isar; Lombardo, Rosario; Lauria, Mario; Morine, Melissa J.; Moyseos, Petros; Varma, Vijayalakshmi; Nolen, Greg T.; Knox, Bridgett; Sloper, Daniel; Kaput, Jim; Priami, Corrado

    2016-07-01

    The investigation of the complex processes involved in cellular differentiation must be based on unbiased, high throughput data processing methods to identify relevant biological pathways. A number of bioinformatics tools are available that can generate lists of pathways ranked by statistical significance (i.e. by p-value), while ideally it would be desirable to functionally score the pathways relative to each other or to other interacting parts of the system or process. We describe a new computational method (Network Activity Score Finder - NASFinder) to identify tissue-specific, omics-determined sub-networks and the connections with their upstream regulator receptors to obtain a systems view of the differentiation of human adipocytes. Adipogenesis of human SBGS pre-adipocyte cells in vitro was monitored with a transcriptomic data set comprising six time points (0, 6, 48, 96, 192, 384 hours). To elucidate the mechanisms of adipogenesis, NASFinder was used to perform time-point analysis by comparing each time point against the control (0 h) and time-lapse analysis by comparing each time point with the previous one. NASFinder identified the coordinated activity of seemingly unrelated processes between each comparison, providing the first systems view of adipogenesis in culture. NASFinder has been implemented into a web-based, freely available resource associated with novel, easy to read visualization of omics data sets and network modules.

  6. Voxel-Based Neighborhood for Spatial Shape Pattern Classification of Lidar Point Clouds with Supervised Learning.

    PubMed

    Plaza-Leiva, Victoria; Gomez-Ruiz, Jose Antonio; Mandow, Anthony; García-Cerezo, Alfonso

    2017-03-15

    Improving the effectiveness of spatial shape feature classification from 3D lidar data is highly relevant because such classification is widely used as a fundamental step towards higher-level scene understanding in autonomous vehicles and terrestrial robots. In this setting, computing neighborhoods for points in dense scans becomes a costly process for both training and classification. This paper proposes a new general framework for implementing and comparing different supervised learning classifiers with a simple voxel-based neighborhood computation, where points in each non-overlapping voxel in a regular grid are assigned to the same class by considering features within a support region defined by the voxel itself. The contribution provides offline training and online classification procedures as well as five alternative feature vector definitions based on principal component analysis for scatter, tubular and planar shapes. Moreover, the feasibility of this approach is evaluated by implementing a neural network (NN) method previously proposed by the authors as well as three other supervised learning classifiers found in scene processing methods: support vector machines (SVM), Gaussian processes (GP), and Gaussian mixture models (GMM). A comparative performance analysis is presented using real point clouds from both natural and urban environments and two different 3D rangefinders (a tilting Hokuyo UTM-30LX and a Riegl). Classification performance metrics and processing time measurements confirm the benefits of the NN classifier and the feasibility of the voxel-based neighborhood.
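    A minimal sketch of the voxel-based neighborhood and the PCA-derived shape features (linear/tubular, planar, scatter), assuming the common eigenvalue-ratio definitions; the supervised classifiers themselves (NN, SVM, GP, GMM) are not reproduced here:

    ```python
    import numpy as np
    from collections import defaultdict

    def voxel_shape_features(points, voxel=0.3):
        """Group points into non-overlapping voxels and compute, per voxel,
        eigenvalue-based shape features: linearity (tubular), planarity, scattering."""
        keys = np.floor(points / voxel).astype(int)
        voxels = defaultdict(list)
        for k, p in zip(map(tuple, keys), points):
            voxels[k].append(p)
        feats = {}
        for k, pts in voxels.items():
            pts = np.asarray(pts)
            if len(pts) < 4:
                continue                      # too few points for a stable covariance
            ev = np.sort(np.linalg.eigvalsh(np.cov(pts.T)))[::-1]   # l1 >= l2 >= l3
            l1, l2, l3 = np.maximum(ev, 1e-12)
            feats[k] = {"linearity": (l1 - l2) / l1,
                        "planarity": (l2 - l3) / l1,
                        "scattering": l3 / l1}
        return feats

    rng = np.random.default_rng(0)
    plane = np.column_stack([rng.uniform(0, 1, 500), rng.uniform(0, 1, 500),
                             0.01 * rng.normal(size=500)])
    print(list(voxel_shape_features(plane, voxel=0.5).values())[0])
    ```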

  7. Application of ISO22000 and Failure Mode and Effect Analysis (fmea) for Industrial Processing of Poultry Products

    NASA Astrophysics Data System (ADS)

    Varzakas, Theodoros H.; Arvanitoyannis, Ioannis S.

    The Failure Mode and Effect Analysis (FMEA) model has been applied to the risk assessment of poultry slaughtering and manufacturing. In this work, a comparison of ISO 22000 analysis with HACCP is carried out for poultry slaughtering, processing and packaging. Critical Control Points and Prerequisite Programs (PrPs) have been identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree or fishbone diagram).

  8. Discovering Implicit Networks from Point Process Data

    DTIC Science & Technology

    2013-08-03

    Social network analysis (Szell et al., Nature 2012). Point processes arise in seismology, epidemiology, and economics; modeling dependence is challenging ("beyond Poisson"), e.g. via Strauss and Gibbs processes and determinantal processes.

  9. A Practical Decision-Analysis Process for Forest Ecosystem Management

    Treesearch

    H. Michael Rauscher; F. Thomas Lloyd; David L. Loftis; Mark J. Twery

    2000-01-01

    Many authors have pointed out the need to firm up the 'fuzzy' ecosystem management paradigm and develop operationally practical processes to allow forest managers to accommodate more effectively the continuing rapid change in societal perspectives and goals. There are three spatial scales where clear, precise, practical ecosystem management processes are...

  10. 21 CFR 120.24 - Process controls.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Process controls. 120.24 Section 120.24 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process...

  11. Competing spreading processes on multiplex networks: awareness and epidemics.

    PubMed

    Granell, Clara; Gómez, Sergio; Arenas, Alex

    2014-07-01

    Epidemiclike spreading processes on top of multilayered interconnected complex networks reveal a rich phase diagram of intertwined competition effects. A recent study by the authors [C. Granell et al., Phys. Rev. Lett. 111, 128701 (2013).] presented an analysis of the interrelation between two processes accounting for the spreading of an epidemic, and the spreading of information awareness to prevent infection, on top of multiplex networks. The results in the case in which awareness implies total immunization to the disease revealed the existence of a metacritical point at which the critical onset of the epidemics starts, depending on completion of the awareness process. Here we present a full analysis of these critical properties in the more general scenario where the awareness spreading does not imply total immunization, and where infection does not imply immediate awareness of it. We find the critical relation between the two competing processes for a wide spectrum of parameters representing the interaction between them. We also analyze the consequences of a massive broadcast of awareness (mass media) on the final outcome of the epidemic incidence. Importantly enough, the mass media make the metacritical point disappear. The results reveal that the main finding, i.e., existence of a metacritical point, is rooted in the competition principle and holds for a large set of scenarios.

  12. Audible acoustics in high-shear wet granulation: application of frequency filtering.

    PubMed

    Hansuld, Erin M; Briens, Lauren; McCann, Joe A B; Sayani, Amyn

    2009-08-13

    Previous work has shown that analysis of audible acoustic emissions from high-shear wet granulation has potential as a technique for end-point detection. In this research, audible acoustic emissions (AEs) from three different formulations were studied to further develop this technique as a process analytical technology. Condenser microphones were attached at three different locations on a PMA-10 high-shear granulator (air exhaust, bowl and motor) to target different sound sources. Size, flowability and tablet break-load data were collected to support formulator end-point ranges and the interpretation of the AE analysis. Each formulation had a unique total power spectral density (PSD) profile that was sensitive to granule formation and end-point. Analyzing the total PSD in 10 Hz segments identified profiles with reduced run variability and distinct maxima and minima suitable for routine granulation monitoring and end-point control. A partial least squares discriminant analysis method was developed to automate selection of key 10 Hz frequency groups using variable importance to projection. The results support the use of frequency refinement as a way forward in the development of acoustic emission analysis for granulation monitoring and end-point control.
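    A sketch of the frequency-refinement step, assuming a sampled microphone signal and SciPy's Welch estimator; the 10 Hz grouping mirrors the segmentation described above, while the PLS-DA variable selection is not reproduced:

    ```python
    import numpy as np
    from scipy.signal import welch

    def psd_in_10hz_bins(signal, fs):
        """Welch power spectral density of an audio signal, summed into 10 Hz bins.
        Returns (bin_start_frequencies, binned_power)."""
        f, pxx = welch(signal, fs=fs, nperseg=4096)
        edges = np.arange(0, f[-1] + 10, 10)
        idx = np.digitize(f, edges) - 1
        binned = np.bincount(idx, weights=pxx, minlength=len(edges) - 1)[:len(edges) - 1]
        return edges[:-1], binned

    # Toy acoustic signal: a 440 Hz tone buried in noise, sampled at 22.05 kHz
    fs = 22050
    t = np.arange(0, 2.0, 1 / fs)
    x = 0.5 * np.sin(2 * np.pi * 440 * t) + np.random.default_rng(0).normal(0, 1, len(t))
    freqs, power = psd_in_10hz_bins(x, fs)
    print("strongest 10 Hz band starts at", freqs[np.argmax(power)], "Hz")
    ```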

  13. Dynamical Interplay between Awareness and Epidemic Spreading in Multiplex Networks

    NASA Astrophysics Data System (ADS)

    Granell, Clara; Gómez, Sergio; Arenas, Alex

    2013-09-01

    We present the analysis of the interrelation between two processes accounting for the spreading of an epidemic, and the information awareness to prevent its infection, on top of multiplex networks. This scenario is representative of an epidemic process spreading on a network of persistent real contacts, and a cyclic information awareness process diffusing in the network of virtual social contacts between the same individuals. The topology corresponds to a multiplex network where two diffusive processes are interacting affecting each other. The analysis using a microscopic Markov chain approach reveals the phase diagram of the incidence of the epidemics and allows us to capture the evolution of the epidemic threshold depending on the topological structure of the multiplex and the interrelation with the awareness process. Interestingly, the critical point for the onset of the epidemics has a critical value (metacritical point) defined by the awareness dynamics and the topology of the virtual network, from which the onset increases and the epidemics incidence decreases.

  14. Dynamical interplay between awareness and epidemic spreading in multiplex networks.

    PubMed

    Granell, Clara; Gómez, Sergio; Arenas, Alex

    2013-09-20

    We present the analysis of the interrelation between two processes accounting for the spreading of an epidemic, and the information awareness to prevent its infection, on top of multiplex networks. This scenario is representative of an epidemic process spreading on a network of persistent real contacts, and a cyclic information awareness process diffusing in the network of virtual social contacts between the same individuals. The topology corresponds to a multiplex network where two diffusive processes are interacting affecting each other. The analysis using a microscopic Markov chain approach reveals the phase diagram of the incidence of the epidemics and allows us to capture the evolution of the epidemic threshold depending on the topological structure of the multiplex and the interrelation with the awareness process. Interestingly, the critical point for the onset of the epidemics has a critical value (metacritical point) defined by the awareness dynamics and the topology of the virtual network, from which the onset increases and the epidemics incidence decreases.

  15. Critical point analysis of phase envelope diagram

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soetikno, Darmadi; Siagian, Ucok W. R.; Kusdiantara, Rudy, E-mail: rkusdiantara@s.itb.ac.id

    2014-03-24

    A phase diagram, or phase envelope, is a relation between temperature and pressure that shows the conditions of equilibrium between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoir engineering, and it is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, the dew line, and the critical point. The bubble line and dew line are composed of bubble points and dew points, respectively. The bubble point is the first point at which gas is formed when a liquid is heated; the dew point is the first point at which liquid is formed when a gas is cooled. The critical point is the point where all of the properties of the gas and liquid phases, such as temperature, pressure, and amount of substance, become equal. The critical point is very useful in fuel processing and in the dissolution of certain chemicals. In this paper, we determine the critical point analytically and compare it with numerical calculations based on the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.
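    A small sketch of the numerical side mentioned above - Newton-Raphson applied to the Peng-Robinson equation of state - reduced here to solving the cubic in the compressibility factor for a single pure component (methane-like constants as illustrative input); the full phase-envelope and mixture critical-point calculation involves considerably more machinery:

    ```python
    import numpy as np

    R = 8.314  # J / (mol K)

    def pr_compressibility(T, P, Tc, Pc, omega, Z0=1.0, tol=1e-10):
        """Solve the Peng-Robinson cubic in Z by Newton-Raphson for a pure component."""
        kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
        alpha = (1 + kappa * (1 - np.sqrt(T / Tc)))**2
        a = 0.45724 * R**2 * Tc**2 / Pc * alpha
        b = 0.07780 * R * Tc / Pc
        A = a * P / (R * T)**2
        B = b * P / (R * T)
        # Z^3 - (1 - B) Z^2 + (A - 3B^2 - 2B) Z - (A B - B^2 - B^3) = 0
        f = lambda Z: (Z**3 - (1 - B) * Z**2 + (A - 3 * B**2 - 2 * B) * Z
                       - (A * B - B**2 - B**3))
        df = lambda Z: 3 * Z**2 - 2 * (1 - B) * Z + (A - 3 * B**2 - 2 * B)
        Z = Z0
        for _ in range(100):
            step = f(Z) / df(Z)
            Z -= step
            if abs(step) < tol:
                break
        return Z

    # Methane-like constants (illustrative): Tc = 190.6 K, Pc = 4.599 MPa, omega = 0.011
    print(pr_compressibility(T=300.0, P=5e6, Tc=190.6, Pc=4.599e6, omega=0.011))
    ```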

  16. Delivering Cognitive Processing Therapy in a Community Health Setting: The Influence of Latino Culture and Community Violence on Posttraumatic Cognitions

    PubMed Central

    Marques, Luana; Eustis, Elizabeth H.; Dixon, Louise; Valentine, Sarah E.; Borba, Christina; Simon, Naomi; Kaysen, Debra; Wiltsey-Stirman, Shannon

    2015-01-01

    Despite the applicability of Cognitive Processing Therapy (CPT) for Posttraumatic Stress Disorder (PTSD) to addressing sequelae of a range of traumatic events, few studies have evaluated whether the treatment itself is applicable across diverse populations. The present study examined differences and similarities amongst non-Latino, Latino Spanish-speaking, and Latino English-speaking clients in rigid beliefs – or “stuck points” – associated with PTSD symptoms in a sample of community mental health clients. We utilized the procedures of content analysis to analyze stuck point logs and impact statements of 29 participants enrolled in a larger implementation trial for CPT. Findings indicated that the content of stuck points was similar across Latino and non-Latino clients, although fewer total stuck points were identified for Latino clients compared to non-Latino clients. Given that identification of stuck points is central to implementing CPT, difficulty identifying stuck points could pose significant challenges for implementing CPT among Latino clients and warrants further examination. Thematic analysis of impact statements revealed the importance of family, religion, and the urban context (e.g., poverty, violence exposure) in understanding how clients organize beliefs and emotions associated with trauma. Clinical recommendations for implementing CPT in community settings and the identification of stuck points are provided. PMID:25961865

  17. Analysis of Spatial Point Patterns in Nuclear Biology

    PubMed Central

    Weston, David J.; Adams, Niall M.; Russell, Richard A.; Stephens, David A.; Freemont, Paul S.

    2012-01-01

    There is considerable interest in cell biology in determining whether, and to what extent, the spatial arrangement of nuclear objects affects nuclear function. A common approach to address this issue involves analyzing a collection of images produced using some form of fluorescence microscopy. We assume that these images have been successfully pre-processed and that a spatial point pattern representation of the objects of interest within the nuclear boundary is available. Typically in these scenarios the number of objects per nucleus is low, which has consequences for the ability of standard analysis procedures to demonstrate the existence of spatial preference in the pattern. There are broadly two common approaches to look for structure in these spatial point patterns: either the spatial point pattern for each image is analyzed individually, or a simple normalization is performed and the patterns are aggregated. In this paper we demonstrate, using synthetic spatial point patterns drawn from predefined point processes, how difficult it is to distinguish a pattern from complete spatial randomness using these techniques and hence how easy it is to miss interesting spatial preferences in the arrangement of nuclear objects. The impact of this problem is also illustrated on data related to the configuration of PML nuclear bodies in mammalian fibroblast cells. PMID:22615822
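
    As a rough sketch of the underlying difficulty (not the methodology of the paper), the snippet below runs a Monte Carlo test of complete spatial randomness for a sparse point pattern, using the mean nearest-neighbour distance as the summary statistic; the unit-square window and the point counts are assumptions made only for illustration.

```python
# Illustrative sketch only: a Monte Carlo test of complete spatial randomness
# (CSR) for a small point pattern, using the mean nearest-neighbour distance
# as the summary statistic.  The window is assumed to be the unit square;
# the paper works with irregular nuclear boundaries and aggregated patterns.
import numpy as np
from scipy.spatial import cKDTree

def mean_nn_distance(points):
    tree = cKDTree(points)
    d, _ = tree.query(points, k=2)      # k=2: the first neighbour is the point itself
    return d[:, 1].mean()

def csr_monte_carlo_test(points, n_sim=999, rng=None):
    """Two-sided Monte Carlo p-value for CSR in the unit square."""
    rng = np.random.default_rng(rng)
    n = len(points)
    obs = mean_nn_distance(points)
    sims = np.array([mean_nn_distance(rng.random((n, 2))) for _ in range(n_sim)])
    # rank-based two-sided p-value
    p_lo = (np.sum(sims <= obs) + 1) / (n_sim + 1)
    p_hi = (np.sum(sims >= obs) + 1) / (n_sim + 1)
    return obs, 2 * min(p_lo, p_hi)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pattern = rng.random((10, 2))       # 10 points, as sparse as a typical nucleus
    print(csr_monte_carlo_test(pattern, rng=1))
```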

  18. Point processes in arbitrary dimension from fermionic gases, random matrix theory, and number theory

    NASA Astrophysics Data System (ADS)

    Torquato, Salvatore; Scardicchio, A.; Zachary, Chase E.

    2008-11-01

    It is well known that one can map certain properties of random matrices, fermionic gases, and zeros of the Riemann zeta function to a unique point process on the real line R. Here we analytically provide exact generalizations of such a point process in d-dimensional Euclidean space R^d for any d, which are special cases of determinantal processes. In particular, we obtain the n-particle correlation functions for any n, which completely specify the point processes in R^d. We also demonstrate that spin-polarized fermionic systems in R^d have these same n-particle correlation functions in each dimension. The point processes for any d are shown to be hyperuniform, i.e., infinite-wavelength density fluctuations vanish, and the structure factor (or power spectrum) S(k) has a non-analytic behavior at the origin given by S(k) ~ |k| as k → 0. The latter result implies that the pair correlation function g2(r) tends to unity for large pair distances with a decay rate that is controlled by the power law 1/r^(d+1), which is a well-known property of bosonic ground states and more recently has been shown to characterize maximally random jammed sphere packings. We graphically display one- and two-dimensional realizations of the point processes in order to vividly reveal their 'repulsive' nature. Indeed, we show that the point processes can be characterized by an effective 'hard core' diameter that grows like the square root of d. The nearest-neighbor distribution functions for these point processes are also evaluated and rigorously bounded. Among other results, this analysis reveals that the probability of finding a large spherical cavity of radius r in dimension d behaves like a Poisson point process but in dimension d+1, i.e., this probability is given by exp[-κ(d) r^(d+1)] for large r and finite d, where κ(d) is a positive d-dependent constant. We also show that as d increases, the point process behaves effectively like a sphere packing with a coverage fraction of space that is no denser than 1/2^d. This coverage fraction has a special significance in the study of sphere packings in high-dimensional Euclidean spaces.
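
    For orientation, the d = 1 special case (the sine-kernel process of free fermions) has simple closed forms; the short check below of the stated limiting behaviour is an illustration only and does not reproduce the paper's d-dimensional Bessel-function expressions.

```python
# Quick numerical check of the d = 1 special case (the sine-kernel process of
# free fermions / CUE eigenvalues), which the paper generalizes to R^d.
# g2(r) = 1 - [sin(pi rho r)/(pi rho r)]^2 tends to 1 with a 1/r^2 envelope,
# and S(k) = |k|/(2 pi rho) for |k| <= 2 pi rho vanishes linearly at the
# origin (hyperuniformity).  These closed forms are standard for d = 1 only.
import numpy as np

rho = 1.0  # number density

def g2(r):
    # np.sinc(t) = sin(pi t)/(pi t), so np.sinc(rho*r) = sin(pi rho r)/(pi rho r)
    return 1.0 - np.sinc(rho * np.asarray(r, dtype=float))**2

def S(k):
    k = np.abs(np.asarray(k, dtype=float))
    return np.where(k <= 2 * np.pi * rho, k / (2 * np.pi * rho), 1.0)

if __name__ == "__main__":
    r = np.array([0.5, 1.0, 5.0, 50.0])
    print("g2(r):", g2(r))                  # approaches 1 at large r
    print("S(k) near 0:", S([1e-3, 1e-2, 1e-1]))   # vanishes linearly
```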

  19. Fulfillment of GMP standard, halal standard, and applying HACCP for production process of beef floss (Case study: Ksatria enterprise)

    NASA Astrophysics Data System (ADS)

    A'diat, Arkan Addien Al; Liquiddanu, Eko; Laksono, Pringgo Widyo; Sutopo, Wahyudi; Suletra, I. Wayan

    2018-02-01

    The increasing number of modern retail businesses in Indonesia gives small and medium enterprises (SMEs) an opportunity to sell their products through modern retailers. SMEs face several obstacles, one of which concerns product standards: the standards that SMEs must hold are the GMP standard and the halal standard. This research was conducted to assess how well a beef floss enterprise in Jagalan fulfills the GMP and halal standards. In addition, a Hazard Analysis and Critical Control Points (HACCP) system was applied to analyze the process. The HACCP approach used in this research was based on the seven principles of SNI (Indonesian National Standard) 01-4852-1998: hazard analysis, critical control point (CCP) determination, critical limit establishment, CCP monitoring system establishment, corrective action establishment, verification, and documentation, all of which must be applied in preparing a HACCP plan. Based on this case study, it is concluded that there are five CCPs: the boiling process, the roasting process, the frying process, the beef floss draining process, and the packaging process.

  20. [Design of a HACCP Plan for the Gouda-type cheesemaking process in a milk processing plant].

    PubMed

    Dávila, Jacqueline; Reyes, Genara; Corzo, Otoniel

    2006-03-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a preventive and systematic method used to identify, assess, and control the hazards related to raw materials, ingredients, processing, marketing, and the intended consumer, in order to assure the safety of the food. The aim of this study was to design a HACCP plan to be implemented in a Gouda-type cheese-making process in a dairy processing plant. The methodology was based on the application of the seven HACCP principles, information from the plant about compliance with the prerequisite programs (70-80%), the experience of the HACCP team, and the sequence of stages established by the COVENIN standard 3802 for implementing the HACCP system. A HACCP plan was proposed, comprising the scope, the selection of the HACCP team, the description of the product and its intended use, the flow diagram of the process, the hazard analysis, and the control table of the plan with the critical control points (CCP). The following CCPs were identified in the process: pasteurization, coagulation, and ripening.

  1. Knowledge-Based Object Detection in Laser Scanning Point Clouds

    NASA Astrophysics Data System (ADS)

    Boochs, F.; Karmacharya, A.; Marbs, A.

    2012-07-01

    Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific features of the objects, being accessible to numerical models. We present an approach that brings the human expert knowledge about the scene, the objects inside, and their representation by the data and the behavior of algorithms to the machine. This "understanding" enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand possibilities and limitations of algorithms and to take this into account within the processing chain. This not only assists the researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from the advancement in knowledge technologies within the Semantic Web framework. This advancement has provided a strong base for applications based on knowledge management. In the article we will present and describe the knowledge technologies used for our approach such as Web Ontology Language (OWL), used for formulating the knowledge base and the Semantic Web Rule Language (SWRL) with 3D processing and topologic built-ins, aiming to combine geometrical analysis of 3D point clouds, and specialists' knowledge of the scene and algorithmic processing.

  2. Online elemental analysis of process gases with ICP-OES: A case study on waste wood combustion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wellinger, Marco, E-mail: marco.wellinger@gmail.com; Ecole Polytechnique Federale de Lausanne; Wochele, Joerg

    2012-10-15

    Highlights: Simultaneous measurements of 23 elements in the process gases of a waste wood combustor; a mobile ICP spectrometer allows high-quality measurements at industrial plants; continuous online measurements with high temporal resolution; linear correlations among element concentrations in the raw flue gas were detected; novel sampling and calibration methods for ICP-OES analysis of process gases. - Abstract: A mobile sampling and measurement system for the analysis of gaseous and liquid samples in the field was developed. An inductively coupled plasma optical emission spectrometer (ICP-OES), built into a van, was used as the detector. The analytical system was calibrated with liquid and/or gaseous standards. It was shown that identical mass flows of either gaseous or liquid standards resulted in identical ICP-OES signal intensities. In a field measurement campaign, trace and minor elements in the raw flue gas of a waste wood combustor were monitored. Sampling was performed with a highly transport-efficient liquid quench system, which made it possible to observe temporal variations in the elemental process gas composition. After a change in feedstock, an immediate change of the element concentrations in the flue gas was detected. A comparison of the average element concentrations during the combustion of the two feedstocks showed a high reproducibility for matrix elements that are expected to be present in similar concentrations. On the other hand, elements that showed strong differences in their concentration in the feedstock were also represented by a higher concentration in the flue gas. Following the temporal variations of different elements revealed strong correlations between a number of elements, such as chlorine with sodium, potassium and zinc, as well as arsenic with lead, and calcium with strontium.

  3. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
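
    A minimal sketch of the kind of calculation involved (generic Shewhart X-bar/R limits and Cp/Cpk capability indices for subgroups of size five, not KSC's actual procedures or data) is given below.

```python
# Hedged illustration (not KSC's actual procedure): X-bar/R control limits and
# process capability indices for subgroups of size 5, using the standard
# Shewhart constants A2 = 0.577, D3 = 0, D4 = 2.114 and d2 = 2.326.
import numpy as np

def xbar_r_limits(data, usl, lsl, A2=0.577, D3=0.0, D4=2.114, d2=2.326):
    """data: array of shape (n_subgroups, 5) of a measured processing parameter."""
    data = np.asarray(data, dtype=float)
    xbar = data.mean(axis=1)
    rng = data.max(axis=1) - data.min(axis=1)
    xbarbar, rbar = xbar.mean(), rng.mean()
    limits = {
        "xbar_UCL": xbarbar + A2 * rbar, "xbar_LCL": xbarbar - A2 * rbar,
        "R_UCL": D4 * rbar, "R_LCL": D3 * rbar,
    }
    sigma = rbar / d2                       # within-subgroup sigma estimate
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - xbarbar, xbarbar - lsl) / (3 * sigma)
    return limits, cp, cpk

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    samples = rng.normal(10.0, 0.1, size=(25, 5))   # 25 subgroups of 5 measurements
    print(xbar_r_limits(samples, usl=10.5, lsl=9.5))
```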

  4. [Purifying process of gynostemma pentaphyllum saponins based on "adjoint marker" online control technology and identification of their compositions by UPLC-QTOF-MS].

    PubMed

    Fan, Dong-Dong; Kuang, Yan-Hui; Dong, Li-Hua; Ye, Xiao; Chen, Liang-Mian; Zhang, Dong; Ma, Zhen-Shan; Wang, Jin-Yu; Zhu, Jing-Jing; Wang, Zhi-Min; Wang, De-Qin; Li, Chu-Yuan

    2017-04-01

    The aim was to optimize the purification process of Gynostemma pentaphyllum saponins (GPS) based on "adjoint marker" online control technology, with GPS as the testing index; UPLC-QTOF-MS was used for qualitative analysis. The "adjoint marker" online control results showed that the end point of sample loading was reached when the UV absorbance of the effluent was equal to half that of the loaded sample solution, and the absorbance was essentially stable once the end point was stable. In the UPLC-QTOF-MS qualitative analysis, 16 saponins were identified in GPS, including 13 known gynostemma saponins and 3 new saponins. The optimized method proved to be simple, scientific, reasonable, and easy to apply for online determination and real-time recording, and it can be readily applied to mass production and production automation. The results of the qualitative analysis indicated that the "adjoint marker" online control technology retains the main efficacious components of the medicinal materials well, and provides analysis tools for process control and quality traceability. Copyright© by the Chinese Pharmaceutical Association.

  5. Wind Tunnel Simulations of the Mock Urban Setting Test - Experimental Procedures and Data Analysis

    DTIC Science & Technology

    2004-07-01

    depends on the subjective choice of points to include in the constant stress region. This is demonstrated by the marked difference in the slope for the two...designed explicitly for the analysis of time series and signal processing, particularly for atmospheric dispersion experiments. The scripts developed...below. Processing scripts are available for all these analyses in the /scripts directory. All files of figures and processed data resulting from these

  6. Validation of acid washes as critical control points in hazard analysis and critical control point systems.

    PubMed

    Dormedy, E S; Brashears, M M; Cutter, C N; Burson, D E

    2000-12-01

    A 2% lactic acid wash used in a large meat-processing facility was validated as an effective critical control point (CCP) in a hazard analysis and critical control point (HACCP) plan. We examined the microbial profiles of beef carcasses before the acid wash, beef carcasses immediately after the acid wash, beef carcasses 24 h after the acid wash, beef subprimal cuts from the acid-washed carcasses, and on ground beef made from acid-washed carcasses. Total mesophilic, psychrotrophic, coliforms, generic Escherichia coli, lactic acid bacteria, pseudomonads, and acid-tolerant microorganisms were enumerated on all samples. The presence of Salmonella spp. was also determined. Acid washing significantly reduced all counts except for pseudomonads that were present at very low numbers before acid washing. All other counts continued to stay significantly lower (P < 0.05) than those on pre-acid-washed carcasses throughout all processing steps. Total bacteria, coliforms, and generic E. coli enumerated on ground beef samples were more than 1 log cycle lower than those reported in the U.S. Department of Agriculture Baseline data. This study suggests that acid washes may be effective CCPs in HACCP plans and can significantly reduce the total number of microorganisms present on the carcass and during further processing.

  7. HACCP (Hazard Analysis and Critical Control Points) to guarantee safe water reuse and drinking water production--a case study.

    PubMed

    Dewettinck, T; Van Houtte, E; Geenens, D; Van Hege, K; Verstraete, W

    2001-01-01

    To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was used to guarantee hygienically safe drinking water production. Taking into account the literature data on the removal efficiencies of the proposed advanced treatment steps with regard to enteric viruses and protozoa and after setting high quality limits based on the recent progress in quantitative risk assessment, the critical control points (CCPs) and points of attention (POAs) were identified. Based on the HACCP analysis a specific monitoring strategy was developed which focused on the control of these CCPs and POAs.

  8. The Use of Uas for Rapid 3d Mapping in Geomatics Education

    NASA Astrophysics Data System (ADS)

    Teo, Tee-Ann; Tian-Yuan Shih, Peter; Yu, Sz-Cheng; Tsai, Fuan

    2016-06-01

    UAS is an advanced technology that supports rapid mapping for disaster response. The aim of this study is to develop educational modules for UAS data processing in rapid 3D mapping. The modules designed for this study focus on UAV data processing with freely available or trial software for educational purposes. The key modules include orientation modelling, 3D point cloud generation, image georeferencing, and visualization. The orientation modelling module adopts VisualSFM to determine the projection matrix for each image station; in addition, approximate ground control points are measured from OpenStreetMap for absolute orientation. The second module uses SURE and the orientation files from the previous module for 3D point cloud generation; ground point selection and digital terrain model generation can then be achieved with LAStools. The third module stitches individual rectified images into a mosaic using Microsoft ICE (Image Composite Editor). The last module visualizes and measures the generated dense point clouds in CloudCompare. These comprehensive UAS processing modules allow students to gain the skills to process and deliver UAS photogrammetric products in rapid 3D mapping, and to apply the photogrammetric products for analysis in practice.

  9. ASIC For Complex Fixed-Point Arithmetic

    NASA Technical Reports Server (NTRS)

    Petilli, Stephen G.; Grimm, Michael J.; Olson, Erlend M.

    1995-01-01

    The application-specific integrated circuit (ASIC) performs 24-bit, fixed-point arithmetic operations on arrays of complex-valued input data. The high-performance, wide-band arithmetic logic unit (ALU) is designed for use in computing fast Fourier transforms (FFTs) and for performing digital filtering functions. Other applications include general computations involved in the analysis of spectra and digital signal processing.
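
    A hedged software model of the core operation such an ALU performs (a Q1.23 fixed-point complex multiply with rounding and saturation, as used in FFT butterflies) is sketched below; the number format and rounding scheme are assumptions, not the ASIC's documented design.

```python
# Illustrative software model (not the ASIC's design) of a 24-bit fixed-point
# complex multiply in Q1.23 format: multiply, round the wide product back to
# 23 fractional bits, and saturate to the signed 24-bit range.
FRAC = 23
MAX24 = (1 << 23) - 1
MIN24 = -(1 << 23)

def to_q23(x):   return max(MIN24, min(MAX24, int(round(x * (1 << FRAC)))))
def from_q23(x): return x / (1 << FRAC)

def q23_mul(a, b):
    """Multiply two Q1.23 integers with rounding and saturation."""
    p = a * b                                  # wide product (Q2.46)
    p = (p + (1 << (FRAC - 1))) >> FRAC        # round half up, back to 23 fractional bits
    return max(MIN24, min(MAX24, p))

def cmul_q23(ar, ai, br, bi):
    """Complex multiply of (ar + j*ai) and (br + j*bi), all Q1.23."""
    rr = max(MIN24, min(MAX24, q23_mul(ar, br) - q23_mul(ai, bi)))
    ri = max(MIN24, min(MAX24, q23_mul(ar, bi) + q23_mul(ai, br)))
    return rr, ri

if __name__ == "__main__":
    a = (to_q23(0.5), to_q23(-0.25))
    b = (to_q23(0.125), to_q23(0.75))
    rr, ri = cmul_q23(*a, *b)
    print(from_q23(rr), from_q23(ri))          # close to (0.5-0.25j)*(0.125+0.75j)
```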

  10. Deformation analysis of a sinkhole in Thuringia using multi-temporal multi-view stereo 3D reconstruction data

    NASA Astrophysics Data System (ADS)

    Petschko, Helene; Goetz, Jason; Schmidt, Sven

    2017-04-01

    Sinkholes are a serious threat to life, personal property and infrastructure in large parts of Thuringia. Over 9000 sinkholes have been documented by the Geological Survey of Thuringia; they are caused by collapsing hollows which formed due to solution processes within the local bedrock material. However, little is known about surface processes and their dynamics at the flanks of the sinkhole once the sinkhole has formed. These processes are of high interest as they might lead to dangerous situations at or within the vicinity of the sinkhole. Our objective was the analysis of these deformations over time in 3D by applying terrestrial photogrammetry with a simple DSLR camera. Within this study, we performed an analysis of deformations within a sinkhole close to Bad Frankenhausen (Thuringia) using terrestrial photogrammetry and multi-view stereo 3D reconstruction to obtain a 3D point cloud describing the morphology of the sinkhole. This was performed for multiple data collection campaigns over a 6-month period. The photos of the sinkhole were taken with a Nikon D3000 SLR camera. For the comparison of the point clouds, the Multiscale Model to Model Comparison (M3C2) plugin of the software CloudCompare was used. It allows the application of advanced methods of point cloud difference calculation that consider the co-registration error between two point clouds when assessing the significance of the calculated difference (given in meters). Three Styrofoam cuboids of known dimensions (16 cm wide/29 cm high/11.5 cm deep) were placed within the sinkhole to test the accuracy of the point cloud difference calculation. The multi-view stereo 3D reconstruction was performed with Agisoft Photoscan. Preliminary analysis indicates that about 26% of the sinkhole showed changes exceeding the co-registration error of the point clouds. The areas of change can mainly be detected on the flanks of the sinkhole and on an earth pillar that formed in the center of the sinkhole. These changes describe toppling (positive change of a few centimeters at the earth pillar) and a few erosion processes along the flanks (negative change of a few centimeters) compared to the first date of data acquisition. Additionally, the Styrofoam cuboids were successfully detected, with an observed depth change of 10 cm. However, the limitations of this approach related to the co-registration of the point clouds and to data acquisition (windy conditions) have to be analyzed in more detail.
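
    As a simplified stand-in for the M3C2 comparison used above (M3C2 additionally averages differences along local surface normals and tests them against the co-registration error), the sketch below computes plain nearest-neighbour cloud-to-cloud distances between two synthetic epochs; the file-free synthetic clouds and the 1 cm registration error are assumptions for illustration only.

```python
# Rough sketch only: nearest-neighbour cloud-to-cloud distances between two
# epochs of a point cloud.  This is a crude proxy for the M3C2 comparison used
# in the study, which projects differences along local normals and tests them
# against the co-registration error.
import numpy as np
from scipy.spatial import cKDTree

def c2c_distance(reference, compared):
    """Unsigned distance from every point of `compared` to its nearest
    neighbour in `reference` (both arrays of shape (N, 3))."""
    tree = cKDTree(reference)
    d, _ = tree.query(compared, k=1)
    return d

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    epoch1 = rng.random((10000, 3))
    epoch2 = epoch1 + np.array([0.0, 0.0, 0.02])   # pretend 2 cm of settlement
    d = c2c_distance(epoch1, epoch2)
    reg_error = 0.01                               # assumed co-registration error (m)
    print("fraction of points with significant change:", np.mean(d > reg_error))
```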

  11. [Powdered infant formulae preparation guide for hospitals based on Hazard Analysis and Critical Control Points (HACCP) principles].

    PubMed

    Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M

    2009-06-01

    This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international experts meetings and the recommendations of scientific societies. This guide also uses the Hazard Analysis and Critical Control Point principles proposed by Codex Alimentarius and emphasises effective verifying measures, microbiological controls of the process and the corrective actions when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.

  12. Application of Failure Mode and Effect Analysis (FMEA), cause and effect analysis, and Pareto diagram in conjunction with HACCP to a corn curl manufacturing plant.

    PubMed

    Varzakas, Theodoros H; Arvanitoyannis, Ioannis S

    2007-01-01

    The Failure Mode and Effect Analysis (FMEA) model has been applied to the risk assessment of corn curl manufacturing. A tentative FMEA application to the snacks industry was attempted in an effort to exclude the presence of GMOs in the final product, which is of crucial importance from both the ethical and the legislative (Regulations EC 1829/2003; EC 1830/2003; Directive EC 18/2001) points of view. Preliminary Hazard Analysis and Fault Tree Analysis were used to analyze and predict the occurring failure modes in a food chain system (a corn curl processing plant), based on the functions, characteristics, and/or interactions of the ingredients or processes upon which the system depends. Critical control points were identified and incorporated in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). Finally, Pareto diagrams were employed to optimize the GMO detection potential of FMEA.
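
    The FMEA arithmetic itself is simple; the sketch below computes the Risk Priority Number (RPN = severity x occurrence x detectability) for a few hypothetical failure modes and ranks them Pareto-style. The failure modes and scores are invented for illustration and are not taken from the study.

```python
# Hypothetical failure modes and 1-10 scores, for illustration only.
failure_modes = [
    # (name, severity, occurrence, detectability)
    ("GMO contamination of corn grits", 9, 3, 6),
    ("Metal fragments from extruder", 8, 2, 4),
    ("Under-frying / excess moisture", 5, 5, 3),
    ("Mislabelled allergen", 7, 2, 5),
]

# RPN = severity x occurrence x detectability
ranked = sorted(((name, s * o * d) for name, s, o, d in failure_modes),
                key=lambda item: item[1], reverse=True)

# Pareto view: cumulative share of total risk, highest RPN first
total = sum(rpn for _, rpn in ranked)
cumulative = 0
for name, rpn in ranked:
    cumulative += rpn
    print(f"{name:32s} RPN={rpn:4d}  cumulative={100 * cumulative / total:5.1f}%")
```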

  13. A method for the processing and analysis of digital terrain elevation data. [Shiprock and Gallup Quadrangles, Arizona and New Mexico

    NASA Technical Reports Server (NTRS)

    Junkin, B. G. (Principal Investigator)

    1979-01-01

    A method is presented for the processing and analysis of digital topography data that can subsequently be entered in an interactive data base in the form of slope, slope length, elevation, and aspect angle. A discussion of the data source and specific descriptions of the data processing software programs are included. In addition, the mathematical considerations involved in the registration of raw digitized coordinate points to the UTM coordinate system are presented. Scale factor considerations are also included. Results of the processing and analysis are illustrated using the Shiprock and Gallup Quadrangle test data.

  14. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    PubMed

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, their frequency, and their detectability, seeking to prevent the most harmful hazards. The Hazard Analysis and Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HPC manipulation process carried out at our blood center. The data analysis showed that the hazards with the highest RPN values and the greatest impact on the process are loss of dose and tracking; the technical skills of operators and the manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system, with forms designed to comply with the standards in force, and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.

  15. Known Good Substrates Year 1

    DTIC Science & Technology

    2007-12-05

    yield record setting carrier lifetime values and very low concentrations of point defects. Epiwafers delivered for fabrication of RF static induction ...boules and on improved furnace uniformity (adding rotation, etc.). Pareto analysis was performed on wafer yield loss at the start of every quarter...100mm PVT process. Work focused on modeling the process for longer (50 mm) boules and on improved furnace uniformity. Pareto analysis was performed

  16. 21 CFR 120.24 - Process controls.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Process controls. 120.24 Section 120.24 Food and... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process controls. (a) In order to meet the requirements of subpart A of this part, processors of juice products...

  17. 21 CFR 120.24 - Process controls.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Process controls. 120.24 Section 120.24 Food and... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process controls. (a) In order to meet the requirements of subpart A of this part, processors of juice products...

  18. 21 CFR 120.24 - Process controls.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Process controls. 120.24 Section 120.24 Food and... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process controls. (a) In order to meet the requirements of subpart A of this part, processors of juice products...

  19. 21 CFR 120.24 - Process controls.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Process controls. 120.24 Section 120.24 Food and... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process controls. (a) In order to meet the requirements of subpart A of this part, processors of juice products...

  20. Analysis and Evaluation of the LANDSAT-4 MSS and TM Sensors and Ground Data Processing Systems: Early Results

    NASA Technical Reports Server (NTRS)

    Bernstein, R.; Lotspiech, J. B.

    1985-01-01

    The MSS and TM sensor performances were evaluated by studying both the sensors and the characteristics of the data. Information content analysis, image statistics, band-to-band registration, the presence of failed or failing detectors, and sensor resolution are discussed. The TM data were explored from the point of view of adequacy of the ground processing and improvements that could be made to compensate for sensor problems and deficiencies. Radiometric correction processing, compensation for a failed detector, and geometric correction processing are also considered.

  1. Fractal analysis of multiscale spatial autocorrelation among point data

    USGS Publications Warehouse

    De Cola, L.

    1991-01-01

    The analysis of spatial autocorrelation among point-data quadrats is a well-developed technique that has made limited but intriguing use of the multiscale aspects of pattern. In this paper, theoretical and algorithmic approaches are presented for the analysis of aggregations of quadrats at or above a given density, in which these sets are treated as multifractal regions whose fractal dimension, D, may vary with phenomenon intensity, scale, and location. The technique is illustrated with Matui's quadrat house-count data, which yield measurements consistent with a nonautocorrelated simulated Poisson process but not with an orthogonal unit-step random walk. The paper concludes with a discussion of the implications of such analysis for multiscale geographic analysis systems.
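
    A sketch under simplifying assumptions is given below: a plain box-counting estimate of the fractal dimension D of a planar point set, in the spirit of multiscale quadrat analysis. The paper's multifractal treatment of quadrat aggregations is more elaborate, and the unit-square window is assumed.

```python
# Box-counting estimate of the fractal dimension of a point set in the unit
# square (illustrative only; not the paper's multifractal methodology).
import numpy as np

def box_counting_dimension(points, n_scales=6):
    points = np.asarray(points, dtype=float)
    sizes, counts = [], []
    for k in range(1, n_scales + 1):
        m = 2**k                                   # m x m grid of quadrats
        idx = np.clip((points * m).astype(int), 0, m - 1)
        occupied = np.unique(idx, axis=0).shape[0]
        sizes.append(1.0 / m)
        counts.append(occupied)
    # slope of log N(eps) versus log(1/eps) gives D
    D = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)[0]
    return D

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    print(box_counting_dimension(rng.random((5000, 2))))   # close to 2 for a dense Poisson pattern
```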

  2. A Semiparametric Change-Point Regression Model for Longitudinal Observations.

    PubMed

    Xing, Haipeng; Ying, Zhiliang

    2012-12-01

    Many longitudinal studies involve relating an outcome process to a set of possibly time-varying covariates, giving rise to the usual regression models for longitudinal data. When the purpose of the study is to investigate the covariate effects when the experimental environment undergoes abrupt changes, or to locate the periods with different levels of covariate effects, a simple and easy-to-interpret approach is to introduce change-points in the regression coefficients. In this connection, we propose a semiparametric change-point regression model, in which the error process (stochastic component) is nonparametric, the baseline mean function (functional part) is completely unspecified, the observation times are allowed to be subject-specific, and the number, locations, and magnitudes of the change-points are unknown and need to be estimated. We further develop an estimation procedure which combines recent advances in semiparametric analysis based on counting-process arguments with multiple change-point inference, and discuss its large-sample properties, including consistency and asymptotic normality, under suitable regularity conditions. Simulation results show that the proposed methods work well under a variety of scenarios. An application to a real data set is also given.
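
    A toy least-squares illustration of the change-point idea (a single split in a regression coefficient located by scanning the residual sum of squares) is sketched below; it is not the authors' semiparametric estimator, and the data are synthetic.

```python
# Toy illustration, not the authors' semiparametric procedure: locate a single
# change-point in a regression coefficient by minimizing the total residual
# sum of squares over all candidate split times.
import numpy as np

def single_changepoint(t, x, y, min_seg=5):
    """Return the split index giving the best piecewise fit y ~ a + b*x."""
    best_tau, best_rss = None, np.inf
    for tau in range(min_seg, len(t) - min_seg):
        rss = 0.0
        for sl in (slice(0, tau), slice(tau, None)):
            X = np.column_stack([np.ones(len(x[sl])), x[sl]])
            beta, res, *_ = np.linalg.lstsq(X, y[sl], rcond=None)
            rss += res[0] if res.size else np.sum((y[sl] - X @ beta) ** 2)
        if rss < best_rss:
            best_tau, best_rss = tau, rss
    return best_tau, best_rss

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.arange(200)
    x = rng.normal(size=200)
    beta = np.where(t < 120, 0.5, 2.0)             # coefficient jumps at t = 120
    y = 1.0 + beta * x + rng.normal(scale=0.3, size=200)
    print(single_changepoint(t, x, y))             # split index near 120
```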

  3. Orientational analysis of planar fibre systems observed as a Poisson shot-noise process.

    PubMed

    Kärkkäinen, Salme; Lantuéjoul, Christian

    2007-10-01

    We consider two-dimensional fibrous materials observed as a digital greyscale image. The problem addressed is to estimate the orientation distribution of unobservable thin fibres from a greyscale image modelled by a planar Poisson shot-noise process. The classical stereological approach is not straightforward, because the point intensities of thin fibres along sampling lines may not be observable. For such cases, Kärkkäinen et al. (2001) suggested the use of scaled variograms determined from grey values along sampling lines in several directions. Their method is based on the assumption that the ratio between the scaled variograms and the point intensities is constant across all directions of the sampling lines. This assumption is proved to be valid asymptotically for Boolean models and dead leaves models, under some regularity conditions. In this work, we derive the scaled variogram and its approximations for a planar Poisson shot-noise process using the modified Bessel function. In the case of reasonably high resolution of the observed image, the scaled variogram has an approximate functional relation to the point intensity, and in the case of high resolution the relation is proportional. As the obtained relations are approximate, they are tested on simulations. The existing orientation analysis method based on the proportional relation is further tested on images with different resolutions. The new result, the asymptotic proportionality between the scaled variograms and the point intensities for a Poisson shot-noise process, completes the earlier results for the Boolean models and the dead leaves models.

  4. Voxel-Based Neighborhood for Spatial Shape Pattern Classification of Lidar Point Clouds with Supervised Learning

    PubMed Central

    Plaza-Leiva, Victoria; Gomez-Ruiz, Jose Antonio; Mandow, Anthony; García-Cerezo, Alfonso

    2017-01-01

    Improving the effectiveness of spatial shape features classification from 3D lidar data is very relevant because it is largely used as a fundamental step towards higher level scene understanding challenges of autonomous vehicles and terrestrial robots. In this sense, computing neighborhood for points in dense scans becomes a costly process for both training and classification. This paper proposes a new general framework for implementing and comparing different supervised learning classifiers with a simple voxel-based neighborhood computation where points in each non-overlapping voxel in a regular grid are assigned to the same class by considering features within a support region defined by the voxel itself. The contribution provides offline training and online classification procedures as well as five alternative feature vector definitions based on principal component analysis for scatter, tubular and planar shapes. Moreover, the feasibility of this approach is evaluated by implementing a neural network (NN) method previously proposed by the authors as well as three other supervised learning classifiers found in scene processing methods: support vector machines (SVM), Gaussian processes (GP), and Gaussian mixture models (GMM). A comparative performance analysis is presented using real point clouds from both natural and urban environments and two different 3D rangefinders (a tilting Hokuyo UTM-30LX and a Riegl). Classification performance metrics and processing time measurements confirm the benefits of the NN classifier and the feasibility of voxel-based neighborhood. PMID:28294963
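
    A condensed sketch of the voxel-based neighbourhood idea is given below: points are binned into a regular grid, and per-voxel PCA eigenvalues yield the usual linearity/planarity/scattering features that could feed any of the classifiers mentioned. It is a simplified illustration under assumed voxel sizes, not the authors' full framework.

```python
# Simplified sketch of voxel-based neighbourhoods: group points into a regular
# voxel grid and compute PCA-based shape features per voxel.
import numpy as np
from collections import defaultdict

def voxel_shape_features(points, voxel_size=0.5, min_pts=4):
    voxels = defaultdict(list)
    for p in points:
        voxels[tuple(np.floor(p / voxel_size).astype(int))].append(p)
    feats = {}
    for key, pts in voxels.items():
        if len(pts) < min_pts:
            continue
        pts = np.asarray(pts)
        cov = np.cov(pts.T)
        w = np.sort(np.linalg.eigvalsh(cov))[::-1]       # l1 >= l2 >= l3 >= 0
        l1, l2, l3 = np.maximum(w, 1e-12)
        feats[key] = {
            "linearity":  (l1 - l2) / l1,
            "planarity":  (l2 - l3) / l1,
            "scattering": l3 / l1,
        }
    return feats

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # a noisy horizontal plane: planarity should dominate in most voxels
    plane = np.column_stack([rng.uniform(0, 5, 2000),
                             rng.uniform(0, 5, 2000),
                             rng.normal(0, 0.01, 2000)])
    features = voxel_shape_features(plane)
    print(list(features.values())[0])
```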

  5. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    PubMed

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne diseases for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. Other than microbial hazards, the pathogens previously found in Nham, sodium nitrite and metal were identified as chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points. These steps are the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step. The critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm. This level is high enough to control Clostridium botulinum but does not cause chemical hazards to the consumer. The physical hazard from metal clips could be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham could be reduced in the fermentation process. The critical limit of the pH of Nham was set at lower than 4.6. Since this product is not cooked during processing, finally, educating the consumer, by providing information on the label such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.

  6. Effect of Pointing Error on the BER Performance of an Optical CDMA FSO Link with SIK Receiver

    NASA Astrophysics Data System (ADS)

    Nazrul Islam, A. K. M.; Majumder, S. P.

    2017-12-01

    An analytical approach is presented for an optical code division multiple access (OCDMA) system over a free space optical (FSO) channel, considering the effect of pointing error between the transmitter and the receiver. The analysis is carried out with an optical sequence inverse keying (SIK) correlator receiver with intensity modulation and direct detection (IM/DD) to find the bit error rate (BER) with pointing error. The results are evaluated numerically in terms of signal-to-noise plus multi-access interference (MAI) ratio, BER, and power penalty due to pointing error. It is noticed that the OCDMA FSO system is highly affected by pointing error, with significant power penalty at BERs of 10^-6 and 10^-9. For example, the penalty at a BER of 10^-9 is found to be 9 dB at a normalized pointing error of 1.4 for 16 users with a processing gain of 256, and is reduced to 6.9 dB when the processing gain is increased to 1,024.

  7. Universal statistics of terminal dynamics before collapse

    NASA Astrophysics Data System (ADS)

    Lenner, Nicolas; Eule, Stephan; Wolf, Fred

    Recent developments in biology have drastically increased both the precision and the amount of generated data, allowing a shift from pure mean-value characterization of the process under consideration to an analysis of the whole ensemble, exploiting the stochastic nature of biology. We focus on the general class of non-equilibrium processes with distinguished terminal points, as found in cell fate decisions, checkpoints, or cognitive neuroscience. Aligning the data to a terminal point (e.g., represented as an absorbing boundary) allows us to devise a general methodology to characterize and reverse engineer the terminating history. Using a small-noise approximation we derive the mean, variance, and covariance of the aligned data for general finite-time singularities.

  8. Information Presentation in Decision and Risk Analysis: Answered, Partly Answered, and Unanswered Questions.

    PubMed

    Keller, L Robin; Wang, Yitong

    2017-06-01

    For the last 30 years, researchers in risk analysis, decision analysis, and economics have consistently proven that decisionmakers employ different processes for evaluating and combining anticipated and actual losses, gains, delays, and surprises. Although rational models generally prescribe a consistent response, people's heuristic processes will sometimes lead them to be inconsistent in the way they respond to information presented in theoretically equivalent ways. We point out several promising future research directions by listing and detailing a series of answered, partly answered, and unanswered questions. © 2016 Society for Risk Analysis.

  9. Process for structural geologic analysis of topography and point data

    DOEpatents

    Eliason, Jay R.; Eliason, Valerie L. C.

    1987-01-01

    A quantitative method of geologic structural analysis of digital terrain data is described for implementation on a computer. Assuming selected valley segments are controlled by the underlying geologic structure, topographic lows in the terrain data, defining valley bottoms, are detected, filtered, and accumulated into a series of line segments defining contiguous valleys. The line segments are then vectorized to produce vector segments, defining valley segments, which may be indicative of the underlying geologic structure. Coplanar analysis is performed on pairs of vector segments to determine which vectors define planes that represent the underlying geologic structure. Point data, such as fracture phenomena that can be related to fracture planes in 3-dimensional space, can be analyzed to define common plane orientations and locations. The vectors, points, and planes are displayed in various formats for interpretation.
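
    The geometric core of the coplanar analysis can be illustrated as follows (a hedged sketch, not the patented algorithm): two vector segments define a common plane when the scalar triple product of their directions and offset vanishes, and the plane orientation can then be summarized, for example, by its dip angle.

```python
# Coplanarity check for two 3D vector segments and the dip of their common
# plane (illustrative geometry only).
import numpy as np

def coplanar_plane(p1, d1, p2, d2, tol=1e-6):
    """p*, d*: segment start points and direction vectors (3-vectors).
    Returns (unit normal, dip in degrees) if coplanar, else None."""
    p1, d1, p2, d2 = (np.asarray(v, dtype=float) for v in (p1, d1, p2, d2))
    offset = p2 - p1
    n = np.cross(d1, d2)
    if np.linalg.norm(n) < tol:                              # parallel segments: skip
        return None
    if abs(np.dot(offset, n)) / np.linalg.norm(n) > tol:     # segments not coplanar
        return None
    n = n / np.linalg.norm(n)
    dip = np.degrees(np.arccos(abs(n[2])))                   # 0 deg = horizontal plane
    return n, dip

if __name__ == "__main__":
    # two segments lying in a plane that dips at 45 degrees
    print(coplanar_plane([0, 0, 0], [1, 0, -1], [0, 1, 0], [1, 1, -1]))
```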

  10. The problem of the periodicity of the epidemic process. [solar activity effects on diphtheria outbreak

    NASA Technical Reports Server (NTRS)

    Yagodinskiy, V. N.; Konovalenko, Z. P.; Druzhinin, I. P.

    1974-01-01

    An analysis of data from epidemics makes it possible to determine their principal causes, governed by environmental factors (solar activity, etc.). The results of an analysis of the periodicity of the epidemic process in the case of diphtheria, conducted with the aid of autocorrelation and spectral methods of analysis, are presented. Numerical data (annual figures) on the dynamics of diphtheria in 50 regions (points), with a total duration of 2,777 years, are used.
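
    A minimal example of the autocorrelation part of such an analysis is sketched below, on synthetic annual incidence data rather than the historical diphtheria series; the 11-year component stands in for a solar-activity-like cycle.

```python
# Autocorrelation of a synthetic annual incidence series with an assumed
# 11-year component (illustrative only; not the historical data).
import numpy as np

def autocorrelation(x, max_lag):
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.sum(x * x)
    return np.array([np.sum(x[:len(x) - k] * x[k:]) / denom for k in range(max_lag + 1)])

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    years = np.arange(120)
    incidence = 100 + 30 * np.sin(2 * np.pi * years / 11) + rng.normal(0, 10, years.size)
    acf = autocorrelation(incidence, max_lag=30)
    print("lag of the largest ACF peak:", 1 + np.argmax(acf[1:]))   # about 11
```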

  11. Analysis of parameters for technological equipment of parallel kinematics based on rods of variable length for processing accuracy assurance

    NASA Astrophysics Data System (ADS)

    Koltsov, A. G.; Shamutdinov, A. H.; Blokhin, D. A.; Krivonos, E. V.

    2018-01-01

    A new classification of parallel kinematics mechanisms based on a symmetry coefficient, which is proportional to the mechanism stiffness and to the machining accuracy achievable with the technological equipment under study, is proposed. A new version of the Stewart platform with a high symmetry coefficient is presented for analysis. The workspace of the mechanism under study, a complex solid figure, is described; its end points are reached by the center of the mobile platform, which moves parallel to the base plate. Parameters affecting the processing accuracy, namely the static and dynamic stiffness and the natural vibration frequencies, are determined. An assessment of the mechanism's capability to operate under various loads, taking into account resonance phenomena at different points of the workspace, was conducted. The study showed that the stiffness, and therefore the processing accuracy, of the above-mentioned mechanisms is comparable with the stiffness and accuracy of medium-sized series-produced machines.

  12. Point source detection in infrared astronomical surveys

    NASA Technical Reports Server (NTRS)

    Pelzmann, R. F., Jr.

    1977-01-01

    Data processing techniques useful for infrared astronomy data analysis systems are reported. This investigation is restricted to consideration of data from space-based telescope systems operating as survey instruments. In this report the theoretical background for specific point-source detection schemes is completed, and the development of specific algorithms and software for the broad range of requirements is begun.

  13. Forest structure analysis combining laser scanning with digital airborne photogrammetry

    NASA Astrophysics Data System (ADS)

    Lissak, Candide; Onda, Yuichi; Kato, Hiroaki

    2017-04-01

    The value of Light Detection and Ranging (LiDAR) for vegetation structure analysis has been demonstrated in several research contexts. Indeed, airborne or ground lidar surveys can provide detailed three-dimensional data of the forest structure, from the understorey to the canopy. To characterize the vegetation components of dense cedar forests at different timescales, we can combine point clouds from several sources: lidar surveys and photogrammetry data. For our study, terrestrial laser scanning (TLS; Leica ScanStation C10, processed with Cyclone software) was carried out in three forest areas (≈ 200 m2 each) mainly composed of Japanese cedar (Cryptomeria japonica) in the Fukushima region (Japan). The study areas are characterized by various vegetation densities. For the three areas, terrestrial laser scanning was performed from several locations and several heights. Scanning from several levels (ground, 4 m, 6 m and 18 m high) was made possible by a tower several meters high, installed to study the canopy evolution following the Fukushima Daiichi nuclear power plant accident. The combination of all scans provides a very dense 3D point cloud of ground and canopy structure (on average 300,000,000 points). For the Tochigi forest area, a first test of low-cost Unmanned Aerial Vehicle (UAV) photogrammetry was carried out and calibrated with ground GPS measurements to determine the coordinates of points. TLS combined with UAV photogrammetry makes it possible to obtain information on the vertical and horizontal structure of the Tochigi forest. This combination of technologies will allow forest structure mapping, morphometric analysis, and the assessment of biomass volume evolution from multi-temporal point clouds. In our research, we used a low-cost UAV 3 Advanced (200 m2 cover, 1300 pictures...). Data processing was performed using PhotoScan Pro software to obtain very dense point clouds to combine with the TLS data set. This low-cost UAV photogrammetry data has been successfully used to derive information on the canopy cover. The purpose of this poster is to present the usability of combined remote sensing methods for forest structure analysis and 3D model reconstruction for a trend analysis of forest changes.

  14. nSTAT: Open-Source Neural Spike Train Analysis Toolbox for Matlab

    PubMed Central

    Cajigas, I.; Malik, W.Q.; Brown, E.N.

    2012-01-01

    Over the last decade there has been a tremendous advance in the analytical tools available to neuroscientists to understand and model neural function. In particular, the point-process generalized linear model (PP-GLM) framework has been applied successfully to problems ranging from neuro-endocrine physiology to neural decoding. However, the lack of freely distributed software implementations of published PP-GLM algorithms, together with the problem-specific modifications required for their use, limits wide application of these techniques. In an effort to make existing PP-GLM methods more accessible to the neuroscience community, we have developed nSTAT – an open source neural spike train analysis toolbox for Matlab®. By adopting an Object-Oriented Programming (OOP) approach, nSTAT allows users to easily manipulate data by performing operations on objects that have an intuitive connection to the experiment (spike trains, covariates, etc.), rather than by dealing with data in vector/matrix form. The algorithms implemented within nSTAT address a number of common problems, including computation of peri-stimulus time histograms, quantification of the temporal response properties of neurons, and characterization of neural plasticity within and across trials. nSTAT provides a starting point for exploratory data analysis, allows for simple and systematic building and testing of point process models, and supports decoding of stimulus variables based on point process models of neural function. By providing an open-source toolbox, we hope to establish a platform that can be easily used, modified, and extended by the scientific community to address limitations of current techniques and to extend available techniques to more complex problems. PMID:22981419
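
    nSTAT itself is a Matlab toolbox; as a hedged Python analogue of its core operation, the sketch below fits a Poisson (point-process) GLM to binned spike counts with a single stimulus covariate using statsmodels. The bin size, rates, and covariate are invented for illustration.

```python
# Illustrative Python analogue (not nSTAT): a Poisson GLM on binned spike
# counts with a stimulus indicator covariate and a log(bin width) offset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_bins, dt = 2000, 0.001                    # 1 ms bins
stim = (np.arange(n_bins) % 500) < 100      # a repeating 100 ms stimulus
true_rate = 5.0 + 45.0 * stim               # spikes/s: 5 baseline, 50 during stimulus
counts = rng.poisson(true_rate * dt)

X = sm.add_constant(stim.astype(float))     # [intercept, stimulus indicator]
fit = sm.GLM(counts, X, family=sm.families.Poisson(),
             offset=np.full(n_bins, np.log(dt))).fit()

# back-transform the coefficients to firing rates (spikes/s)
baseline = np.exp(fit.params[0])
during_stim = np.exp(fit.params[0] + fit.params[1])
print(baseline, during_stim)                # roughly 5 and 50
```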

  15. Competing spreading processes on multiplex networks: Awareness and epidemics

    NASA Astrophysics Data System (ADS)

    Granell, Clara; Gómez, Sergio; Arenas, Alex

    2014-07-01

    Epidemiclike spreading processes on top of multilayered interconnected complex networks reveal a rich phase diagram of intertwined competition effects. A recent study by the authors [C. Granell et al., Phys. Rev. Lett. 111, 128701 (2013)., 10.1103/PhysRevLett.111.128701] presented an analysis of the interrelation between two processes accounting for the spreading of an epidemic, and the spreading of information awareness to prevent infection, on top of multiplex networks. The results in the case in which awareness implies total immunization to the disease revealed the existence of a metacritical point at which the critical onset of the epidemics starts, depending on completion of the awareness process. Here we present a full analysis of these critical properties in the more general scenario where the awareness spreading does not imply total immunization, and where infection does not imply immediate awareness of it. We find the critical relation between the two competing processes for a wide spectrum of parameters representing the interaction between them. We also analyze the consequences of a massive broadcast of awareness (mass media) on the final outcome of the epidemic incidence. Importantly enough, the mass media make the metacritical point disappear. The results reveal that the main finding, i.e., existence of a metacritical point, is rooted in the competition principle and holds for a large set of scenarios.
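
    A simplified Monte Carlo counterpart of this coupled awareness-epidemic dynamics is sketched below (the papers instead analyse it with a microscopic Markov chain approach). The discrete-time update rules, the Erdos-Renyi layers, and the parameter values are assumptions chosen only to illustrate the competition between the two processes; awareness scales the infection probability by a factor gamma, with gamma = 0 corresponding to total immunization.

```python
# Simplified Monte Carlo sketch of coupled UAU (awareness) + SIS (epidemic)
# dynamics on a two-layer multiplex; infected nodes become aware.
import numpy as np

def simulate(adj_virtual, adj_physical, lam=0.15, delta=0.6, beta=0.2, mu=0.4,
             gamma=0.25, steps=200, rng=None):
    rng = np.random.default_rng(rng)
    n = adj_virtual.shape[0]
    aware = np.zeros(n, bool)
    infected = np.zeros(n, bool)
    infected[rng.choice(n, max(1, n // 100), replace=False)] = True
    for _ in range(steps):
        # UAU on the virtual layer: become aware via aware neighbours, forget with prob delta
        p_not_informed = np.prod(1 - lam * (adj_virtual * aware), axis=1)
        new_aware = (aware & (rng.random(n) > delta)) | (~aware & (rng.random(n) > p_not_informed))
        # SIS on the physical layer: awareness attenuates the infection probability
        eff_beta = np.where(new_aware, gamma * beta, beta)
        p_not_infected = np.prod(1 - eff_beta[:, None] * (adj_physical * infected), axis=1)
        new_infected = (infected & (rng.random(n) > mu)) | (~infected & (rng.random(n) > p_not_infected))
        aware = new_aware | new_infected          # infection implies awareness
        infected = new_infected
    return infected.mean(), aware.mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 1000
    A_virt = np.triu(rng.random((n, n)) < 8 / n, 1).astype(float); A_virt += A_virt.T
    A_phys = np.triu(rng.random((n, n)) < 4 / n, 1).astype(float); A_phys += A_phys.T
    print(simulate(A_virt, A_phys, rng=1))        # (infected fraction, aware fraction)
```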

  16. Application of ICH Q9 Quality Risk Management Tools for Advanced Development of Hot Melt Coated Multiparticulate Systems.

    PubMed

    Stocker, Elena; Becker, Karin; Hate, Siddhi; Hohl, Roland; Schiemenz, Wolfgang; Sacher, Stephan; Zimmer, Andreas; Salar-Behzadi, Sharareh

    2017-01-01

    This study aimed to apply quality risk management based on the International Conference on Harmonisation guideline Q9 to the early development stage of hot melt coated multiparticulate systems for oral administration. N-acetylcysteine crystals were coated with a formulation comprising tripalmitin and polysorbate 65. The critical quality attributes (CQAs) were initially prioritized using failure mode and effects analysis. The CQAs of the coated material were defined as particle size, taste-masking efficiency, and immediate release profile. The hot melt coating process was characterized via a flowchart, based on the identified potential critical process parameters (CPPs) and their impact on the CQAs. These CPPs were prioritized using a process failure mode, effects, and criticality analysis, and their critical impact on the CQAs was experimentally confirmed using a statistical design of experiments. Spray rate, atomization air pressure, and air flow rate were identified as CPPs. Coating amount and the content of polysorbate 65 in the coating formulation were identified as critical material attributes. A hazard analysis and critical control points analysis was applied to define control strategies at the critical process points. A fault tree analysis evaluated causes of potential process failures. We successfully demonstrated that a standardized quality risk management approach improves the sustainability of product development and supports regulatory aspects. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  17. Analysis of the medication-use process in North American hospital systems: underlining key points for adoption to improve patient safety in French hospitals.

    PubMed

    Brouard, Agnes; Fagon, Jean Yves; Daniels, Charles E

    2011-01-01

    This project was designed to identify actions for medication error prevention and patient safety improvement implemented in North American hospitals that could be adopted in Parisian hospitals in France. A literature review and analysis of the medication-use process in North American hospitals, together with a validation survey of hospital pharmacist managers in the San Diego area, was performed to assess the main points of the hospital medication-use process. The literature and survey analyses highlighted the main differences between the two countries at three levels: nationwide, hospital, and pharmaceutical service. Accordingly, proposals for optimizing the medication-use process in the French system include the following topics: implementation of expanded use of information technology and robotics; increased pharmaceutical human resources allowing expansion of clinical pharmacy activities; a focus on high-risk medications and high-risk patient populations; and development of a collective sense of responsibility for medication error prevention in hospital settings, involving medical, pharmaceutical, and administrative teams. Along with the strong emphasis that should be put on the identified topics to improve the quality and safety of hospital care in France, consideration of patient safety as a priority at the nationwide level needs to be reinforced.

  18. Visualization and Image Analysis of Yeast Cells.

    PubMed

    Bagley, Steve

    2016-01-01

    When converting real-life data via visualization to numbers, and then on to statistics, the whole system needs to be considered so that the conversion from the analogue to the digital is accurate and repeatable. Here we describe the points to consider when approaching the visualization, processing, and analysis of a yeast cell population by screening techniques.

  19. Nonlinear digital signal processing in mental health: characterization of major depression using instantaneous entropy measures of heartbeat dynamics.

    PubMed

    Valenza, Gaetano; Garcia, Ronald G; Citi, Luca; Scilingo, Enzo P; Tomaz, Carlos A; Barbieri, Riccardo

    2015-01-01

    Nonlinear digital signal processing methods that address system complexity have provided useful computational tools for helping in the diagnosis and treatment of a wide range of pathologies. More specifically, nonlinear measures have been successful in characterizing patients with mental disorders such as Major Depression (MD). In this study, we propose the use of instantaneous measures of entropy, namely the inhomogeneous point-process approximate entropy (ipApEn) and the inhomogeneous point-process sample entropy (ipSampEn), to describe a novel characterization of MD patients undergoing affective elicitation. Because these measures are built within a nonlinear point-process model, they allow for the assessment of complexity in cardiovascular dynamics at each moment in time. Heartbeat dynamics were characterized from 48 healthy controls and 48 patients with MD while emotionally elicited through either neutral or arousing audiovisual stimuli. Experimental results coming from the arousing tasks show that ipApEn measures are able to instantaneously track heartbeat complexity as well as discern between healthy subjects and MD patients. Conversely, standard heart rate variability (HRV) analysis performed in both time and frequency domains did not show any statistical significance. We conclude that measures of entropy based on nonlinear point-process models might contribute to devising useful computational tools for care in mental health.
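
    As a simplified stand-in for the instantaneous measures above, the sketch below computes the standard (non-instantaneous) sample entropy of an RR-interval series; the inhomogeneous point-process versions ipApEn/ipSampEn are defined within a point-process model of the heartbeat and are not reproduced here.

```python
# Standard sample entropy (SampEn) of an RR-interval series; a simplified
# stand-in for the instantaneous point-process entropies in the paper.
import numpy as np

def sampen(x, m=2, r_frac=0.2):
    x = np.asarray(x, dtype=float)
    r = r_frac * np.std(x)
    def match_probability(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        # Chebyshev distance between all pairs of templates
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        n = len(templates)
        return (np.sum(d <= r) - n) / (n * (n - 1))   # exclude self-matches
    B, A = match_probability(m), match_probability(m + 1)
    return -np.log(A / B)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rr = 0.8 + 0.05 * np.sin(np.arange(300) / 10.0) + rng.normal(0, 0.01, 300)
    print(sampen(rr))
```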

  20. Erosion and Channel Incision Analysis with High-Resolution Lidar

    NASA Astrophysics Data System (ADS)

    Potapenko, J.; Bookhagen, B.

    2013-12-01

    High-resolution LiDAR (LIght Detection And Ranging) provides a new generation of sub-meter topographic data that has yet to be fully exploited by the Earth science communities. We make use of multi-temporal airborne and terrestrial lidar scans in the south-central California and Santa Barbara area. Specifically, we investigated the Mission Canyon and Channel Islands regions from 2009-2011 to study changes in erosion and channel incision on the landscape. In addition to gridding the lidar data into digital elevation models (DEMs), we also make use of raw lidar point clouds and triangulated irregular networks (TINs) for detailed analysis of heterogeneously spaced topographic data. Using recent advancements in lidar point cloud processing from information technology disciplines, we have employed novel point cloud processing and feature detection algorithms to automate the detection of deeply incised channels and gullies, vegetation, and other derived metrics (e.g., estimates of eroded volume). Our analysis compares topographically derived erosion volumes to field-derived cosmogenic radionuclide ages and in-situ sediment-flux measurements. First results indicate that gully erosion accounts for up to 60% of the sediment volume removed from the Mission Canyon region. Furthermore, we observe that gully erosion and upstream arroyo propagation accelerated after fires, especially in regions where vegetation was heavily burned. The use of high-resolution lidar point cloud data for topographic analysis is still a novel approach with little precedent, and we hope to provide a cogent example of it with our research.

  1. Pre-processing by data augmentation for improved ellipse fitting.

    PubMed

    Kumar, Pankaj; Belchamber, Erika R; Miklavcic, Stanley J

    2018-01-01

    Ellipse fitting is a highly researched and mature topic. Surprisingly, however, no existing method has thus far considered the data point eccentricity in its ellipse fitting procedure. Here, we introduce the concept of eccentricity of a data point, in analogy with the idea of ellipse eccentricity. We then show empirically that, irrespective of ellipse fitting method used, the root mean square error (RMSE) of a fit increases with the eccentricity of the data point set. The main contribution of the paper is based on the hypothesis that if the data point set were pre-processed to strategically add additional data points in regions of high eccentricity, then the quality of a fit could be improved. Conditional validity of this hypothesis is demonstrated mathematically using a model scenario. Based on this confirmation we propose an algorithm that pre-processes the data so that data points with high eccentricity are replicated. The improvement of ellipse fitting is then demonstrated empirically in real-world application of 3D reconstruction of a plant root system for phenotypic analysis. The degree of improvement for different underlying ellipse fitting methods as a function of data noise level is also analysed. We show that almost every method tested, irrespective of whether it minimizes algebraic error or geometric error, shows improvement in the fit following data augmentation using the proposed pre-processing algorithm.

  2. Influence of nuclear power unit on decreasing emissions of greenhouse gases

    NASA Astrophysics Data System (ADS)

    Stanek, Wojciech; Szargut, Jan; Kolenda, Zygmunt; Czarnowska, Lucyna

    2015-03-01

    The paper presents a comparison of selected power technologies from the point of view of greenhouse gas (GHG) emissions. Such evaluations are most often based only on an analysis of direct emissions from combustion. However, direct analysis does not show the full picture, as significant GHG emissions also arise during the mining and transportation of fuel. The paper demonstrates that a GHG-based comparison of power technologies has to use a cumulative calculus covering the whole cycle of fuel mining, processing, transportation and end-use. From this point of view, coal technologies are at a comparable level to gas technologies, while nuclear power units are characterised by the lowest GHG emissions. The technologies are compared in terms of full-cycle GHG emissions, and specific cumulative GHG emission factors per unit of generated electricity are determined. These factors are applied to simulate the influence of introducing nuclear power units on the reduction of GHG emissions at the national scale. The simulations take into account the prognosis of domestic power sector development according to the Polish energy policy until 2030. The benefit of introducing nuclear power units for decreasing GHG emissions is demonstrated.

  3. Contextual Classification of Point Cloud Data by Exploiting Individual 3D Neighbourhoods

    NASA Astrophysics Data System (ADS)

    Weinmann, M.; Schmidt, A.; Mallet, C.; Hinz, S.; Rottensteiner, F.; Jutzi, B.

    2015-03-01

    The fully automated analysis of 3D point clouds is of great importance in photogrammetry, remote sensing and computer vision. For reliably extracting objects such as buildings, road inventory or vegetation, many approaches rely on the results of a point cloud classification, where each 3D point is assigned a respective semantic class label. Such an assignment, in turn, typically involves statistical methods for feature extraction and machine learning. Whereas the different components in the processing workflow have extensively, but separately been investigated in recent years, the respective connection by sharing the results of crucial tasks across all components has not yet been addressed. This connection not only encapsulates the interrelated issues of neighborhood selection and feature extraction, but also the issue of how to involve spatial context in the classification step. In this paper, we present a novel and generic approach for 3D scene analysis which relies on (i) individually optimized 3D neighborhoods for (ii) the extraction of distinctive geometric features and (iii) the contextual classification of point cloud data. For a labeled benchmark dataset, we demonstrate the beneficial impact of involving contextual information in the classification process and that using individual 3D neighborhoods of optimal size significantly increases the quality of the results for both pointwise and contextual classification.
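
    To make the neighborhood-based feature extraction concrete, the sketch below computes common covariance-eigenvalue features (linearity, planarity, scattering) from a fixed-size k-neighborhood using NumPy and SciPy; the paper instead optimizes the neighborhood size per point, so this is only a simplified, assumed variant.

        import numpy as np
        from scipy.spatial import cKDTree

        def eigen_features(points, k=20):
            """Per-point geometric features from the covariance of the k nearest
            neighbours: linearity, planarity and scattering (sphericity)."""
            tree = cKDTree(points)
            _, idx = tree.query(points, k=k)
            feats = np.zeros((len(points), 3))
            for i, nbrs in enumerate(idx):
                cov = np.cov(points[nbrs].T)
                # Eigenvalues sorted in descending order and normalised.
                w = np.sort(np.linalg.eigvalsh(cov))[::-1]
                w = w / w.sum()
                l1, l2, l3 = w
                feats[i] = [(l1 - l2) / l1, (l2 - l3) / l1, l3 / l1]
            return feats

        pts = np.random.default_rng(2).normal(size=(1000, 3))
        print(eigen_features(pts, k=15)[:3])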

  4. Modification and fixed-point analysis of a Kalman filter for orientation estimation based on 9D inertial measurement unit data.

    PubMed

    Brückner, Hans-Peter; Spindeldreier, Christian; Blume, Holger

    2013-01-01

    A common approach for high-accuracy sensor fusion based on 9D inertial measurement unit data is Kalman filtering. State-of-the-art floating-point filter algorithms differ in their computational complexity; nevertheless, real-time operation on a low-power microcontroller at high sampling rates is not possible. This work presents algorithmic modifications to reduce the computational demands of a two-step minimum order Kalman filter. Furthermore, the required bit-width of a fixed-point filter version is explored. For evaluation, real-world data captured using an Xsens MTx inertial sensor are used. Changes in computational latency and orientation estimation accuracy due to the proposed algorithmic modifications and fixed-point number representation are evaluated in detail on a variety of processing platforms enabling on-board processing on wearable sensor platforms.
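
    As a minimal illustration of the fixed-point trade-off studied here, the sketch below quantizes a simple complementary-filter-style orientation update to Q1.15 arithmetic and compares it with the floating-point result; it substitutes a much simpler update for the paper's two-step Kalman filter, and the number format and values are assumptions.

        import numpy as np

        Q = 15                      # fractional bits of the assumed Q1.15 format
        SCALE = 1 << Q

        def to_fixed(x):
            """Quantize a float to a Q1.15 integer with saturation."""
            return int(np.clip(round(x * SCALE), -SCALE, SCALE - 1))

        def fixed_mul(a, b):
            """Multiply two Q1.15 numbers, rescaling the product back to Q15."""
            return (a * b) >> Q

        # Compare a floating-point complementary-filter style update with its
        # fixed-point counterpart for one orientation state.
        alpha = 0.98
        gyro_est, acc_angle = 0.12, 0.08

        float_update = alpha * gyro_est + (1.0 - alpha) * acc_angle
        fixed_update = (fixed_mul(to_fixed(alpha), to_fixed(gyro_est))
                        + fixed_mul(to_fixed(1.0 - alpha), to_fixed(acc_angle)))

        print(float_update, fixed_update / SCALE, abs(float_update - fixed_update / SCALE))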

  5. Ground-state proton decay of 69Br and implications for the rp -process 68Se waiting-point

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, Andrew M; Shapira, Dan; Lynch, William

    2011-01-01

    The first direct measurement of the proton separation energy, Sp, for the proton-unbound nucleus 69Br is reported. Of interest is the exponential dependence of the 2p-capture rate on Sp, which can bypass the 68Se waiting-point in the astrophysical rp process. An analysis of the observed proton decay spectrum is given in terms of the 69Se mirror nucleus, and the influence of Sp is explored within the context of a single-zone X-ray burst model.

  6. 32 CFR 989.16 - Environmental impact statement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.16 Environmental impact statement. (a) Certain...) Development of major new weapons systems (at decision points that involve demonstration, validation...

  7. 32 CFR 989.16 - Environmental impact statement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.16 Environmental impact statement. (a) Certain...) Development of major new weapons systems (at decision points that involve demonstration, validation...

  8. 32 CFR 989.16 - Environmental impact statement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.16 Environmental impact statement. (a) Certain...) Development of major new weapons systems (at decision points that involve demonstration, validation...

  9. 32 CFR 989.16 - Environmental impact statement.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.16 Environmental impact statement. (a) Certain...) Development of major new weapons systems (at decision points that involve demonstration, validation...

  10. Automated analysis of plethysmograms for functional studies of hemodynamics

    NASA Astrophysics Data System (ADS)

    Zatrudina, R. Sh.; Isupov, I. B.; Gribkov, V. Yu.

    2018-04-01

    The most promising method for the quantitative determination of indicators of cardiovascular tone and of cerebral hemodynamics is impedance plethysmography. The accurate determination of these indicators requires the correct identification of the characteristic points in the thoracic and cranial impedance plethysmograms, respectively. An algorithm for the automatic analysis of these plethysmograms is presented. The algorithm is based on the strict temporal relationships between the phases of the cardiac cycle and the characteristic points of the plethysmogram. The proposed algorithm does not require estimation of initial data or selection of processing parameters. Use of the method on healthy subjects showed a very low error in the detection of characteristic points.

  11. Process Mechanics Analysis in Single Point Incremental Forming

    NASA Astrophysics Data System (ADS)

    Ambrogio, G.; Filice, L.; Fratini, L.; Micari, F.

    2004-06-01

    The demand for highly differentiated products and the need for process flexibility have led researchers to focus their attention on innovative sheet forming processes. Industrial application of conventional processes is, in fact, economically convenient only for large-scale production; furthermore, conventional processes cannot fully satisfy the demand for flexibility. In this context, single point incremental forming (SPIF) is an innovative and flexible answer to market requests. The process is characterized by a peculiar mechanics, the sheet being plastically deformed only through a localised stretching mechanism. Some recent experimental studies have shown that SPIF permits a relevant increase in formability limits as a consequence of this peculiar deformation mechanics. The research addressed here focuses on the theoretical investigation of the process mechanics; the aim is to achieve a deeper understanding of the basic phenomena involved in SPIF that justify the above-mentioned formability enhancement.

  12. A study on using pre-forming blank in single point incremental forming process by finite element analysis

    NASA Astrophysics Data System (ADS)

    Abass, K. I.

    2016-11-01

    Single point incremental forming (SPIF) is a sheet forming technique based on layered manufacturing principles. The edges of the sheet are clamped while the forming tool is moved along the tool path; a CNC milling machine is used to manufacture the product. SPIF involves extensive plastic deformation, and the description of the process is further complicated by highly nonlinear boundary conditions, namely contact and frictional effects. Due to the complex nature of these models, numerical approaches dominated by finite element analysis (FEA) are now in widespread use. The paper presents the data and main results of a study on the effect of using a pre-formed blank in SPIF through FEA. The considered SPIF process has been studied under certain process conditions referring to the test workpiece, tool, etc., using ANSYS 11. The results show that the simulation model can predict an ideal profile of the processing track, the behaviour of the tool-workpiece contact, and the product accuracy, by evaluating its thickness, surface strain and the stress distribution along the deformed blank section during the deformation stages.

  13. 40 CFR 408.11 - Specialized definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... STANDARDS CANNED AND PRESERVED SEAFOOD PROCESSING POINT SOURCE CATEGORY Farm-Raised Catfish Processing... apply to this subpart. (b) The term oil and grease shall mean those components of a waste water amenable to measurement by the method described in Methods for Chemical Analysis of Water and Wastes, 1971...

  14. [Power, interdependence and complementarity in hospital work: an analysis from the nursing point of view].

    PubMed

    Lopes, M J

    1997-01-01

    This essay discusses recent transformations in hospital work and, specifically, in nursing work. The analysis privileges the inter- and intra-relations of the multidisciplinary teams whose practices constitute the therapeutic process present in hospital space-time.

  15. Numerical analysis of stress effects on Frank loop evolution during irradiation in austenitic Fe-Cr-Ni alloy

    NASA Astrophysics Data System (ADS)

    Tanigawa, Hiroyasu; Katoh, Yutai; Kohyama, Akira

    1995-08-01

    Effects of applied stress on the early stages of interstitial-type Frank loop evolution were investigated by both numerical calculation and irradiation experiments. The final objective of this research is to propose a comprehensive model of complex stress effects on microstructural evolution under various conditions. In the experimental part of this work, the microstructural analysis revealed that differences in resolved normal stress caused differences in the nucleation rates of Frank loops on the {111} family of crystallographic planes, and that the total nucleation rate of Frank loops increased with increasing externally applied stress. A numerical calculation was carried out primarily to evaluate the validity of models of stress effects on the nucleation processes of Frank loop evolution. The calculation is based on rate equations which describe the evolution of point defects, small point defect clusters and Frank loops. The rate equations of Frank loop evolution were formulated for the {111} planes, considering the effects of resolved normal stress on the clustering of small point defects and on the growth of Frank loops separately. The experimental results and the predictions of the numerical calculation were in good qualitative agreement.
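
    For readers unfamiliar with rate-equation modelling, the sketch below integrates a deliberately simplified vacancy/interstitial rate-equation pair with SciPy; the coefficients and equations are illustrative assumptions and do not include the stress-dependent clustering and loop-growth terms of the study.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Minimal vacancy/interstitial rate equations: production by irradiation,
        # mutual recombination and loss to fixed sinks. Parameters are illustrative.
        G   = 1e-6     # defect production rate
        Riv = 1e2      # recombination coefficient
        Sv  = 1e-3     # vacancy sink strength
        Si  = 2e-3     # interstitial sink strength (biased)

        def rates(t, y):
            cv, ci = y
            dcv = G - Riv * cv * ci - Sv * cv
            dci = G - Riv * cv * ci - Si * ci
            return [dcv, dci]

        sol = solve_ivp(rates, (0.0, 1e4), [0.0, 0.0], method="LSODA", rtol=1e-8)
        print(sol.y[:, -1])   # quasi-steady-state defect concentrations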

  16. Point process methods in epidemiology: application to the analysis of human immunodeficiency virus/acquired immunodeficiency syndrome mortality in urban areas.

    PubMed

    Quesada, Jose Antonio; Melchor, Inmaculada; Nolasco, Andreu

    2017-05-26

    The analysis of spatio-temporal patterns of disease or death in urban areas has been developed mainly from the ecological studies approach. These designs may have some limitations like the ecological fallacy and instability with few cases. The objective of this study was to apply the point process methodology, as a complement to that of aggregated data, to study HIV/AIDS mortality in men in the city of Alicante (Spain). A case-control study in residents in the city during the period 2004-2011 was designed. Cases were men who died from HIV/AIDS and controls represented the general population, matched by age to cases. The risk surfaces of death over the city were estimated using the log-risk function of intensities, and we contrasted their temporal variations over the two periods. High risk significant areas of death by HIV/AIDS, which coincide with the most deprived areas in the city, were detected. Significant spatial change of the areas at risk between the periods studied was not detected. The point process methodology is a useful tool to analyse the patterns of death by HIV/AIDS in urban areas.
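
    A common way to estimate such a risk surface from case and control locations is a kernel-based log relative risk estimate; the sketch below shows that generic idea with SciPy on synthetic data and is not necessarily the estimator used in the study.

        import numpy as np
        from scipy.stats import gaussian_kde

        def log_relative_risk(cases, controls, grid_x, grid_y):
            """Estimate log(f_case / f_control) on a grid from two 2-D point
            patterns, using Gaussian kernel density estimates for both."""
            kde_case = gaussian_kde(cases.T)
            kde_ctrl = gaussian_kde(controls.T)
            xx, yy = np.meshgrid(grid_x, grid_y)
            pts = np.vstack([xx.ravel(), yy.ravel()])
            eps = 1e-12                       # avoid log(0) in empty regions
            rr = np.log(kde_case(pts) + eps) - np.log(kde_ctrl(pts) + eps)
            return rr.reshape(xx.shape)

        rng = np.random.default_rng(3)
        controls = rng.normal(0, 1, size=(500, 2))               # background population
        cases = np.vstack([rng.normal(0.5, 0.6, size=(60, 2)),   # clustered excess risk
                           rng.normal(0, 1, size=(60, 2))])
        grid = np.linspace(-3, 3, 50)
        surface = log_relative_risk(cases, controls, grid, grid)
        print(surface.shape, surface.max())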

  17. Framework for adaptive multiscale analysis of nonhomogeneous point processes.

    PubMed

    Helgason, Hannes; Bartroff, Jay; Abry, Patrice

    2011-01-01

    We develop the methodology for hypothesis testing and model selection in nonhomogeneous Poisson processes, with an eye toward the application of modeling and variability detection in heart beat data. Modeling the process's non-constant rate function using templates of simple basis functions, we develop the generalized likelihood ratio statistic for a given template and a multiple testing scheme to model-select from a family of templates. A dynamic programming algorithm inspired by network flows is used to compute the maximum likelihood template in a multiscale manner. In a numerical example, the proposed procedure is nearly as powerful as the super-optimal procedures that know the true template size and true partition, respectively. Extensions to general history-dependent point processes are discussed.
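
    The sketch below shows the flavour of the generalized likelihood ratio statistic for the simplest possible template, a two-segment piecewise-constant rate tested against a constant rate on simulated event times; the multiscale template search and dynamic programming of the paper are omitted, and all names are assumptions.

        import numpy as np

        def glr_two_segment(events, T, change):
            """Generalized likelihood ratio statistic comparing a constant-rate
            Poisson process on [0, T] against a two-segment piecewise-constant
            rate with a break at `change`. Larger values favour the break."""
            events = np.asarray(events)
            n = len(events)
            n1 = np.sum(events <= change)
            n2 = n - n1

            def ll(count, length):
                # Maximized Poisson log-likelihood for `count` events on `length`.
                return count * np.log(count / length) - count if count > 0 else 0.0

            ll0 = ll(n, T)
            ll1 = ll(n1, change) + ll(n2, T - change)
            return 2.0 * (ll1 - ll0)

        # Simulate a rate step from 1.0 to 3.0 events per unit time at t = 50.
        rng = np.random.default_rng(4)
        t1 = np.cumsum(rng.exponential(1.0, 200)); t1 = t1[t1 <= 50]
        t2 = 50 + np.cumsum(rng.exponential(1.0 / 3.0, 600)); t2 = t2[t2 <= 100]
        events = np.concatenate([t1, t2])
        print(glr_two_segment(events, T=100.0, change=50.0))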

  18. 40 CFR 161.170 - Preliminary analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... REQUIREMENTS FOR REGISTRATION OF ANTIMICROBIAL PESTICIDES Product Chemistry Data Requirements § 161.170... point in the production process after which no further chemical reactions designed to produce or purify...

  19. Point Cloud Analysis for Conservation and Enhancement of Modernist Architecture

    NASA Astrophysics Data System (ADS)

    Balzani, M.; Maietti, F.; Mugayar Kühl, B.

    2017-02-01

    Documentation of cultural assets through improved acquisition processes for advanced 3D modelling is one of the main challenges to be faced in order to address, through digital representation, advanced analysis of the shape, appearance and conservation condition of cultural heritage. 3D modelling can open new avenues in the way tangible cultural heritage is studied, visualized, curated, displayed and monitored, improving key features such as the analysis and visualization of material degradation and state of conservation. Applied research focused on the analysis of surface specifications and material properties by means of a 3D laser scanner survey has been developed within the Digital Preservation project of the FAUUSP building, Faculdade de Arquitetura e Urbanismo da Universidade de São Paulo, Brazil. The integrated 3D survey has been performed by the DIAPReM Center of the Department of Architecture of the University of Ferrara in cooperation with the FAUUSP. The 3D survey has allowed the realization of a point cloud model of the external surfaces, as the basis to investigate in detail the formal characteristics, geometric textures and surface features. The digital geometric model was also the basis for processing the intensity values acquired by the laser scanning instrument; this method of analysis was an essential complement to the macroscopic investigations, providing additional information on surface characteristics that can be displayed on the point cloud.

  20. An analysis of the least-squares problem for the DSN systematic pointing error model

    NASA Technical Reports Server (NTRS)

    Alvarez, L. S.

    1991-01-01

    A systematic pointing error model is used to calibrate antennas in the Deep Space Network. The least-squares problem is described and analyzed, along with the solution methods used to determine the model's parameters. Specifically studied are the rank degeneracy problems resulting from beam pointing error measurement sets that incorporate inadequate sky coverage. A least-squares parameter subset selection method is described, and its applicability to the systematic error modeling process is demonstrated on a Voyager 2 measurement distribution.
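
    To illustrate how rank degeneracy from poor sky coverage can be detected and a parameter subset selected, the sketch below applies a rank-revealing pivoted QR factorization to a synthetic design matrix with NumPy/SciPy; the matrix, threshold and column meanings are assumptions and not the DSN pointing model.

        import numpy as np
        from scipy.linalg import qr

        rng = np.random.default_rng(5)

        # Synthetic design matrix with nearly dependent columns: over a very
        # narrow elevation range, sin(el), cos(el) and the constant term are
        # almost collinear, mimicking degeneracy from poor sky coverage.
        az = rng.uniform(0, 2 * np.pi, 40)
        el = rng.uniform(0.70, 0.72, 40)
        A = np.column_stack([np.ones_like(az), np.sin(az), np.cos(az),
                             np.sin(el), np.cos(el)])
        x_true = np.array([0.01, 0.002, -0.003, 0.004, 0.001])
        b = A @ x_true + 1e-5 * rng.standard_normal(40)

        # Rank-revealing QR with column pivoting: small trailing diagonal entries
        # of R flag parameter directions the data cannot separate.
        Q_, R, piv = qr(A, pivoting=True)
        tol = np.abs(R[0, 0]) * 1e-3
        rank = int(np.sum(np.abs(np.diag(R)) > tol))
        keep = piv[:rank]                        # well-determined parameter subset
        x_sub, *_ = np.linalg.lstsq(A[:, keep], b, rcond=None)
        print("diag(R):", np.diag(R))
        print("estimated rank:", rank, "kept columns:", keep)
        print("subset solution (well-conditioned fit, not unique truth):", x_sub)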

  1. The influence of biological and technical factors on quantitative analysis of amyloid PET: Points to consider and recommendations for controlling variability in longitudinal data.

    PubMed

    Schmidt, Mark E; Chiao, Ping; Klein, Gregory; Matthews, Dawn; Thurfjell, Lennart; Cole, Patricia E; Margolin, Richard; Landau, Susan; Foster, Norman L; Mason, N Scott; De Santi, Susan; Suhy, Joyce; Koeppe, Robert A; Jagust, William

    2015-09-01

    In vivo imaging of amyloid burden with positron emission tomography (PET) provides a means for studying the pathophysiology of Alzheimer's and related diseases. Measurement of subtle changes in amyloid burden requires quantitative analysis of image data. Reliable quantitative analysis of amyloid PET scans acquired at multiple sites and over time requires rigorous standardization of acquisition protocols, subject management, tracer administration, image quality control, and image processing and analysis methods. We review critical points in the acquisition and analysis of amyloid PET, identify ways in which technical factors can contribute to measurement variability, and suggest methods for mitigating these sources of noise. Improved quantitative accuracy could reduce the sample size necessary to detect intervention effects when amyloid PET is used as a treatment end point and allow more reliable interpretation of change in amyloid burden and its relationship to clinical course. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Using Microsoft PowerPoint as an Astronomical Image Analysis Tool

    NASA Astrophysics Data System (ADS)

    Beck-Winchatz, Bernhard

    2006-12-01

    Engaging students in the analysis of authentic scientific data is an effective way to teach them about the scientific process and to develop their problem solving, teamwork and communication skills. In astronomy several image processing and analysis software tools have been developed for use in school environments. However, the practical implementation in the classroom is often difficult because the teachers may not have the comfort level with computers necessary to install and use these tools, they may not have adequate computer privileges and/or support, and they may not have the time to learn how to use specialized astronomy software. To address this problem, we have developed a set of activities in which students analyze astronomical images using basic tools provided in PowerPoint. These include measuring sizes, distances, and angles, and blinking images. In contrast to specialized software, PowerPoint is broadly available on school computers. Many teachers are already familiar with PowerPoint, and the skills developed while learning how to analyze astronomical images are highly transferable. We will discuss several practical examples of measurements, including the following:
    - Variations in the distances to the sun and moon from their angular sizes
    - Magnetic declination from images of shadows
    - Diameter of the moon from lunar eclipse images
    - Sizes of lunar craters
    - Orbital radii of the Jovian moons and mass of Jupiter
    - Supernova and comet searches
    - Expansion rate of the universe from images of distant galaxies

  3. Algorithms used in the Airborne Lidar Processing System (ALPS)

    USGS Publications Warehouse

    Nagle, David B.; Wright, C. Wayne

    2016-05-23

    The Airborne Lidar Processing System (ALPS) analyzes Experimental Advanced Airborne Research Lidar (EAARL) data—digitized laser-return waveforms, position, and attitude data—to derive point clouds of target surfaces. A full-waveform airborne lidar system, the EAARL seamlessly and simultaneously collects mixed environment data, including submerged, sub-aerial bare earth, and vegetation-covered topographies. ALPS uses three waveform target-detection algorithms to determine target positions within a given waveform: centroid analysis, leading edge detection, and bottom detection using water-column backscatter modeling. The centroid analysis algorithm detects opaque hard surfaces. The leading edge algorithm detects topography beneath vegetation and shallow, submerged topography. The bottom detection algorithm uses water-column backscatter modeling for deeper submerged topography in turbid water. The report describes slant range calculations and explains how ALPS uses laser range and orientation measurements to project measurement points into the Universal Transverse Mercator coordinate system. Parameters used for coordinate transformations in ALPS are described, as are Interactive Data Language-based methods for gridding EAARL point cloud data to derive digital elevation models. Noise reduction in point clouds through use of a random consensus filter is explained, and detailed pseudocode, mathematical equations, and Yorick source code accompany the report.
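
    As a toy version of the centroid target-detection idea, the sketch below estimates a range from a synthetic digitized return by taking the amplitude-weighted centroid of the above-noise samples; it is not ALPS code, and the noise handling and constants are assumptions.

        import numpy as np

        def centroid_range(waveform, sample_ns=1.0, c=0.299792458):
            """Estimate the target range from a digitized laser return by taking
            the amplitude-weighted centroid of the above-noise samples.
            sample_ns: sample spacing in nanoseconds; c: speed of light in m/ns."""
            w = np.asarray(waveform, dtype=float)
            noise = np.median(w)                     # crude noise-floor estimate
            sig = np.clip(w - noise, 0.0, None)
            t = np.arange(len(w)) * sample_ns
            t_centroid = np.sum(t * sig) / np.sum(sig)
            return 0.5 * c * t_centroid              # two-way travel time -> range

        # A synthetic return: Gaussian pulse centred at sample 120 over a noise floor.
        t = np.arange(256)
        wave = 5.0 + 40.0 * np.exp(-0.5 * ((t - 120) / 4.0) ** 2)
        print(centroid_range(wave, sample_ns=1.0))   # about 0.5 * 0.3 m/ns * 120 ns = 18 m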

  4. Exact Performance Analysis of Two Distributed Processes with Multiple Synchronization Points.

    DTIC Science & Technology

    1987-05-01

    number of processes with straight-line sequences of semaphore operations. We use the geometric model for performance analysis, in contrast to proving...distribution unlimited.

  5. [Introduction of hazard analysis and critical control points (HACCP) principles at the flight catering food production plant].

    PubMed

    Popova, A Yu; Trukhina, G M; Mikailova, O M

    The article considers the quality control and safety system implemented at one of the largest flight catering food production plants serving airline passengers and flight crews. The control system is based on Hazard Analysis and Critical Control Points (HACCP) principles and on the hygienic and anti-epidemic measures developed. The identification of hazard factors at each stage of the technical process is considered. Results of the analysis of monitoring data for 6 critical control points over a five-year period are presented. The quality control and safety system permits a reduction of the food contamination risk during acceptance, preparation and supply of in-flight meals. The efficiency of the implemented system was demonstrated. Further ways of harmonizing and implementing HACCP principles at the plant are determined.

  6. Optimizing Web-Based Instruction: A Case Study Using Poultry Processing Unit Operations

    ERIC Educational Resources Information Center

    O' Bryan, Corliss A.; Crandall, Philip G.; Shores-Ellis, Katrina; Johnson, Donald M.; Ricke, Steven C.; Marcy, John

    2009-01-01

    Food companies and supporting industries need inexpensive, revisable training methods for large numbers of hourly employees due to continuing improvements in Hazard Analysis Critical Control Point (HACCP) programs, new processing equipment, and high employee turnover. HACCP-based food safety programs have demonstrated their value by reducing the…

  7. 21 CFR 120.25 - Process verification for certain processors.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...

  8. 21 CFR 120.25 - Process verification for certain processors.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...

  9. 21 CFR 120.25 - Process verification for certain processors.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...

  10. Computer-Aided Modeling and Analysis of Power Processing Systems (CAMAPPS), phase 1

    NASA Technical Reports Server (NTRS)

    Kim, S.; Lee, J.; Cho, B. H.; Lee, F. C.

    1986-01-01

    The large-signal behaviors of a regulator depend largely on the type of power circuit topology and control. Thus, for maximum flexibility, it is best to develop models for each functional block as independent modules. A regulator can then be configured by collecting appropriate pre-defined modules for each functional block. In order to complete the component model generation for a comprehensive spacecraft power system, the following modules were developed: solar array switching unit and control; shunt regulators; and battery discharger. The capability of each module is demonstrated using a simplified Direct Energy Transfer (DET) system. Large-signal behaviors of solar array power systems were analyzed. The stability of the solar array system operating points with a nonlinear load is analyzed; the state-plane analysis illustrates trajectories of the system operating point under various conditions. Stability and transient responses of the system operating near the solar array's maximum power point are also analyzed. The solar array system mode of operation is described using the DET spacecraft power system. The DET system is simulated for various operating conditions. Transfer of the software program CAMAPPS (Computer Aided Modeling and Analysis of Power Processing Systems) to NASA/GSFC (Goddard Space Flight Center) was accomplished.

  11. Object-oriented productivity metrics

    NASA Technical Reports Server (NTRS)

    Connell, John L.; Eller, Nancy

    1992-01-01

    Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.

  12. Trade-off analysis of modes of data handling for earth resources (ERS), volume 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Data handling requirements are reviewed for earth observation missions along with likely technology advances. Parametric techniques for synthesizing potential systems are developed. Major tasks include: (1) review of the sensors under development and extensions of or improvements in these sensors; (2) development of mission models for missions spanning land, ocean, and atmosphere observations; (3) summary of data handling requirements including the frequency of coverage, timeliness of dissemination, and geographic relationships between points of collection and points of dissemination; (4) review of data routing to establish ways of getting data from the collection point to the user; (5) on-board data processing; (6) communications link; and (7) ground data processing. A detailed synthesis of three specific missions is included.

  13. Tracking prominent points in image sequences

    NASA Astrophysics Data System (ADS)

    Hahn, Michael

    1994-03-01

    Measuring image motion and inferring scene geometry and camera motion are main aspects of image sequence analysis. The determination of image motion and the structure-from-motion problem are tasks that can be addressed independently or in cooperative processes. In this paper we focus on tracking prominent points. High stability, reliability, and accuracy are criteria for the extraction of prominent points. This implies that tracking should work quite well with those features; unfortunately, the reality looks quite different. In the experimental investigations we processed a long sequence of 128 images. This mono sequence is taken in an outdoor environment at the experimental field of Mercedes Benz in Rastatt. Different tracking schemes are explored and the results with respect to stability and quality are reported.

  14. A 640-MHz 32-megachannel real-time polyphase-FFT spectrum analyzer

    NASA Technical Reports Server (NTRS)

    Zimmerman, G. A.; Garyantes, M. F.; Grimm, M. J.; Charny, B.

    1991-01-01

    A polyphase fast Fourier transform (FFT) spectrum analyzer being designed for NASA's Search for Extraterrestrial Intelligence (SETI) Sky Survey at the Jet Propulsion Laboratory is described. By replacing the time domain multiplicative window preprocessing with polyphase filter processing, much of the processing loss of windowed FFTs can be eliminated. Polyphase coefficient memory costs are minimized by effective use of run length compression. Finite word length effects are analyzed, producing a balanced system with 8 bit inputs, 16 bit fixed point polyphase arithmetic, and 24 bit fixed point FFT arithmetic. Fixed point renormalization midway through the computation is seen to be naturally accommodated by the matrix FFT algorithm proposed. Simulation results validate the finite word length arithmetic analysis and the renormalization technique.
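
    The sketch below shows, in floating point for clarity, one output frame of a weighted overlap-add polyphase filter-bank channelizer of the kind described, with an assumed Hamming-windowed sinc prototype; the fixed-point word lengths and run-length-compressed coefficient storage of the hardware design are not modelled.

        import numpy as np

        def polyphase_fft(x, n_ch=32, taps_per_branch=4):
            """One output frame of a critically sampled polyphase filter-bank
            spectrum analyzer: weight n_ch*taps_per_branch samples with a
            prototype low-pass filter, fold the branches together, then take
            a single n_ch-point FFT."""
            L = n_ch * taps_per_branch
            proto = np.sinc(np.arange(L) / n_ch - taps_per_branch / 2) * np.hamming(L)
            segment = x[:L] * proto                     # apply the prototype filter
            folded = segment.reshape(taps_per_branch, n_ch).sum(axis=0)
            return np.fft.fft(folded)

        # A tone centred in channel 5 of a 32-channel analyzer.
        n_ch = 32
        n = np.arange(4 * n_ch)
        x = np.exp(2j * np.pi * (5.0 / n_ch) * n)
        spec = polyphase_fft(x, n_ch=n_ch)
        print(np.argmax(np.abs(spec)))                  # expected: 5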

  15. Image detection and compression for memory efficient system analysis

    NASA Astrophysics Data System (ADS)

    Bayraktar, Mustafa

    2015-02-01

    Advances in digital signal processing have progressed towards efficient use of memory and processing power. Both factors can be exploited by feasible image storage techniques that retain only the minimal information about an image, which speeds up later processing. The Scale Invariant Feature Transform (SIFT) can be used to characterize and retrieve an image. In computer vision, SIFT can be used to recognize an image by comparing its key features against saved SIFT keypoint descriptors. The main advantage of SIFT is that it not only removes redundant information from an image but also reduces the number of key points by matching their orientations and merging them within different windows of the image [1]. Another key property of this approach is that it works more efficiently on highly contrasted images, because its design is based on collecting key points from the contrasting shades of the image.
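
    The sketch below extracts SIFT keypoints and descriptors from a synthetic image with OpenCV (SIFT is in the main cv2 module for OpenCV 4.4 and later); storing only the keypoints and descriptors, rather than the pixels, is the memory-saving idea the abstract alludes to. The image and parameters are assumptions.

        import cv2
        import numpy as np

        # A synthetic high-contrast test image; in practice this would be a stored frame.
        img = np.zeros((200, 200), dtype=np.uint8)
        cv2.rectangle(img, (40, 40), (160, 160), 255, -1)
        cv2.circle(img, (100, 100), 30, 80, -1)

        sift = cv2.SIFT_create()                         # available in OpenCV >= 4.4
        keypoints, descriptors = sift.detectAndCompute(img, None)

        # Only the keypoints and 128-D descriptors need to be stored to later
        # recognise or retrieve the image, not the full pixel data.
        print(len(keypoints), None if descriptors is None else descriptors.shape)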

  16. Spectrum of classes of point emitters of electromagnetic wave fields.

    PubMed

    Castañeda, Román

    2016-09-01

    The spectrum of classes of point emitters has been introduced as a numerical tool suitable for the design, analysis, and synthesis of non-paraxial optical fields in arbitrary states of spatial coherence. In this paper, the polarization state of planar electromagnetic wave fields is included in the spectrum of classes, thus increasing its modeling capabilities. In this context, optical processing is realized as a filtering on the spectrum of classes of point emitters, performed by the complex degree of spatial coherence and the two-point correlation of polarization, which could be implemented dynamically by using programmable optical devices.

  17. Research on an uplink carrier sense multiple access algorithm of large indoor visible light communication networks based on an optical hard core point process.

    PubMed

    Nan, Zhufen; Chi, Xuefen

    2016-12-20

    The IEEE 802.15.7 protocol suggests that it could coordinate the channel access process based on the competitive method of carrier sensing. However, the directionality of light and randomness of diffuse reflection would give rise to a serious imperfect carrier sense (ICS) problem [e.g., hidden node (HN) problem and exposed node (EN) problem], which brings great challenges in realizing the optical carrier sense multiple access (CSMA) mechanism. In this paper, the carrier sense process implemented by diffuse reflection light is modeled as the choice of independent sets. We establish an ICS model with the presence of ENs and HNs for the multi-point to multi-point visible light communication (VLC) uplink communications system. Considering the severe optical ICS problem, an optical hard core point process (OHCPP) is developed, which characterizes the optical CSMA for the indoor VLC uplink communications system. Due to the limited coverage of the transmitted optical signal, in our OHCPP, the ENs within the transmitters' carrier sense region could be retained provided that they could not corrupt the ongoing communications. Moreover, because of the directionality of both light emitting diode (LED) transmitters and receivers, theoretical analysis of the HN problem becomes difficult. In this paper, we derive the closed-form expression for approximating the outage probability and transmission capacity of VLC networks with the presence of HNs and ENs. Simulation results validate the analysis and also show the existence of an optimal physical carrier-sensing threshold that maximizes the transmission capacity for a given emission angle of LED.
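
    As a generic illustration of how a hard-core point process retains only non-conflicting transmitters, the sketch below samples a Matérn type II hard-core process by dependent thinning of a Poisson pattern; it is not the optical HCPP of the paper (no LED directionality, diffuse reflection or EN/HN handling), and the intensity and radius are assumptions.

        import numpy as np

        def matern_type_ii(lam, radius, size=1.0, seed=0):
            """Sample a Matérn type II hard-core process on a [0, size]^2 window:
            start from a Poisson parent process, give each point a random mark,
            and keep a point only if no other point with a smaller mark lies
            within the hard-core radius (a proxy for a carrier-sense exclusion
            region around each active transmitter)."""
            rng = np.random.default_rng(seed)
            n = rng.poisson(lam * size * size)
            pts = rng.uniform(0, size, (n, 2))
            marks = rng.uniform(size=n)
            keep = np.ones(n, dtype=bool)
            for i in range(n):
                d = np.hypot(*(pts - pts[i]).T)
                rivals = (d < radius) & (d > 0) & (marks < marks[i])
                if np.any(rivals):
                    keep[i] = False
            return pts[keep]

        retained = matern_type_ii(lam=200, radius=0.08)
        print(len(retained), "transmitters retained from the denser parent process")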

  18. The Iqmulus Urban Showcase: Automatic Tree Classification and Identification in Huge Mobile Mapping Point Clouds

    NASA Astrophysics Data System (ADS)

    Böhm, J.; Bredif, M.; Gierlinger, T.; Krämer, M.; Lindenberg, R.; Liu, K.; Michel, F.; Sirmacek, B.

    2016-06-01

    Current 3D data capture, as implemented on, for example, airborne or mobile laser scanning systems, can efficiently sample the surface of a city with billions of unselective points in a single working day. What is still difficult is to extract and visualize the meaningful information hidden in these point clouds with the same efficiency. This is where the FP7 IQmulus project enters the scene. IQmulus is an interactive facility for processing and visualizing big spatial data. In this study the potential of IQmulus is demonstrated on a laser mobile mapping point cloud of 1 billion points sampling ~10 km of street environment in Toulouse, France. After the data is uploaded to the IQmulus Hadoop Distributed File System, a workflow is defined by the user, consisting of retiling the data followed by a PCA-driven local dimensionality analysis, which runs efficiently on the IQmulus cloud facility using a Spark implementation. Points scattering in three directions are clustered into the tree class and are then separated into individual trees. Five hours of processing on the 12-node computing cluster results in the automatic identification of 4000+ urban trees. Visualization of the results in the IQmulus fat client helps users to appreciate the results, and developers to identify remaining flaws in the processing workflow.

  19. Implementation of Steiner point of fuzzy set.

    PubMed

    Liang, Jiuzhen; Wang, Dejiang

    2014-01-01

    This paper deals with the implementation of the Steiner point of a fuzzy set. Some definitions and properties of the Steiner point are investigated and extended to fuzzy sets. The paper focuses on establishing efficient methods to compute the Steiner point of a fuzzy set, and two strategies are proposed. One is a linear combination of the Steiner points computed from a series of crisp α-cut sets of the fuzzy set. The other is an approximate method that seeks the optimal α-cut set approximating the fuzzy set. Stability analysis of the Steiner point of a fuzzy set is also studied. Some experiments on image processing are given, in which the two methods are applied to implement the Steiner point of a fuzzy image; both strategies show their own advantages in computing the Steiner point of a fuzzy set.

  20. Specifications of a Simulation Model for a Local Area Network Design in Support of a Stock Point Logistics Integrated Communication Environment (SPLICE).

    DTIC Science & Technology

    1983-06-01

    constrained at each step. Use of discrete simulation can be a powerful tool in this process if its role is carefully planned. The gross behavior of the...by projecting: - the arrival of units of work at SPLICE processing facilities (workload analysis) - the amount of processing resources consumed in

  1. 300 Area treated effluent disposal facility sampling schedule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loll, C.M.

    1994-10-11

    This document is the interface between the 300 Area Liquid Effluent Process Engineering (LEPE) group and the Waste Sampling and Characterization Facility (WSCF), concerning process control samples. It contains a schedule for process control samples at the 300 Area TEDF which describes the parameters to be measured, the frequency of sampling and analysis, the sampling point, and the purpose for each parameter.

  2. Neurosurgery certification in member societies of the World Federation of Neurosurgical Societies: Asia.

    PubMed

    Gasco, Jaime; Braun, Jonathan D; McCutcheon, Ian E; Black, Peter M

    2011-01-01

    To objectively compare the complexity and diversity of the certification process in neurological surgery in member societies of the World Federation of Neurosurgical Societies. This study centers on continental Asia. We provide here an analysis based on the responses provided to a 13-item survey. The data received were analyzed, and three Regional Complexity Scores (RCS) were designed. To compare national board experience, eligibility requirements for access to the certification process, and the obligatory nature of the examinations, an RCS-Organizational score was created (20 points maximum). To analyze the complexity of the examination, an RCS-Components score was designed (20 points maximum). The sum of both is presented in a Global RCS score. Only those countries that responded to the survey and presented nationwide homogeneity in the conduct of neurosurgery examinations could be included within the scoring system. In addition, a descriptive summary of the certification process per responding society is also provided. On the basis of the data provided by our RCS system, the highest global RCS was achieved by South Korea and Malaysia (21/40 points), followed by the joint examination of Singapore and Hong Kong (FRCS-Ed) (20/40 points), Japan (17/40 points), the Philippines (15/40 points), and Taiwan (13 points). The experience from these leading countries should be of value to all countries within Asia. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. Succinonitrile Purification Facility

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The Succinonitrile (SCN) Purification Facility provides succinonitrile and succinonitrile alloys to several NRA selected investigations for flight and ground research at various levels of purity. The purification process employed includes both distillation and zone refining. Once the appropriate purification process is completed, samples are characterized to determine the liquidus and/or solidus temperature, which is then related to sample purity. The lab has various methods for measuring these temperatures with accuracies in the milliKelvin to tenths of milliKelvin range. The ultra-pure SCN produced in our facility is indistinguishable from the standard material provided by NIST to well within the stated +/- 1.5mK of the NIST triple point cells. In addition to delivering material to various investigations, our current activities include process improvement, characterization of impurities and triple point cell design and development. The purification process is being evaluated for each of the four vendors to determine the efficacy of each purification step. We are also collecting samples of the remainder from distillation and zone refining for analysis of the constituent impurities. The large triple point cells developed will contain SCN with a melting point of 58.0642 C +/- 1.5mK for use as a calibration standard for Standard Platinum Resistance Thermometers (SPRTs).

  4. Human intronless genes: Functional groups, associated diseases, evolution, and mRNA processing in absence of splicing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grzybowska, Ewa A., E-mail: ewag@coi.waw.pl

    2012-07-20

    Highlights: Functional characteristics of intronless genes (IGs). Diseases associated with IGs. Origin and evolution of IGs. mRNA processing without splicing. -- Abstract: Intronless genes (IGs) constitute approximately 3% of the human genome. Human IGs are essentially different in evolution and functionality from the IGs of unicellular eukaryotes, which represent the majority in their genomes. Functional analysis of IGs has revealed a massive over-representation of signal transduction genes and genes encoding regulatory proteins important for growth, proliferation, and development. IGs also often display tissue-specific expression, usually in the nervous system and testis. These characteristics translate into IG-associated diseases, mainly neuropathies, developmental disorders, and cancer. IGs represent recent additions to the genome, created mostly by retroposition of processed mRNAs with retained functionality. Processing, nuclear export, and translation of these mRNAs should be hampered dramatically by the lack of splice factors, which normally tightly cover mature transcripts and govern their fate. However, natural IGs manage to maintain satisfactory expression levels. Different mechanisms by which IGs solve the problem of mRNA processing and nuclear export are discussed here, along with their possible impact on reporter studies.

  5. High‐resolution trench photomosaics from image‐based modeling: Workflow and error analysis

    USGS Publications Warehouse

    Reitman, Nadine G.; Bennett, Scott E. K.; Gold, Ryan D.; Briggs, Richard; Duross, Christopher

    2015-01-01

    Photomosaics are commonly used to construct maps of paleoseismic trench exposures, but the conventional process of manually using image‐editing software is time consuming and produces undesirable artifacts and distortions. Herein, we document and evaluate the application of image‐based modeling (IBM) for creating photomosaics and 3D models of paleoseismic trench exposures, illustrated with a case‐study trench across the Wasatch fault in Alpine, Utah. Our results include a structure‐from‐motion workflow for the semiautomated creation of seamless, high‐resolution photomosaics designed for rapid implementation in a field setting. Compared with conventional manual methods, the IBM photomosaic method provides a more accurate, continuous, and detailed record of paleoseismic trench exposures in approximately half the processing time and 15%–20% of the user input time. Our error analysis quantifies the effect of the number and spatial distribution of control points on model accuracy. For this case study, an ~87 m² exposure of a benched trench photographed at viewing distances of 1.5–7 m yields a model with <2 cm root mean square error (RMSE) with as few as six control points. RMSE decreases as more control points are implemented, but the gains in accuracy are minimal beyond 12 control points. Spreading control points throughout the target area helps to minimize error. We propose that 3D digital models and corresponding photomosaics should be standard practice in paleoseismic exposure archiving. The error analysis serves as a guide for future investigations that seek balance between speed and accuracy during photomosaic and 3D model construction.

  6. Automatic Target Recognition Classification System Evaluation Methodology

    DTIC Science & Technology

    2002-09-01

    Testing Set of Two-Class XOR Data (250 Samples); Decision Analysis Process Flow Chart...ROC curve meta-analysis, which is the estimation of the true ROC curve of a given diagnostic system through ROC analysis across many studies or...technique can be very effective in sensitivity analysis; trying to determine which data points have the most effect on the solution, and in

  7. Feature Geo Analytics and Big Data Processing: Hybrid Approaches for Earth Science and Real-Time Decision Support

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Raad, M.; Hoel, E.; Park, M.; Mollenkopf, A.; Trujillo, R.

    2016-12-01

    Introduced is a new approach for processing spatiotemporal big data by leveraging distributed analytics and storage. A suite of temporally-aware analysis tools summarizes data nearby or within variable windows, aggregates points (e.g., for various sensor observations or vessel positions), reconstructs time-enabled points into tracks (e.g., for mapping and visualizing storm tracks), joins features (e.g., to find associations between features based on attributes, spatial relationships, temporal relationships or all three simultaneously), calculates point densities, finds hot spots (e.g., in species distributions), and creates space-time slices and cubes (e.g., in microweather applications with temperature, humidity, and pressure, or within human mobility studies). These "feature geo analytics" tools run in both batch and streaming spatial analysis mode as distributed computations across a cluster of servers on typical "big" data sets, where static data exist in traditional geospatial formats (e.g., shapefile) locally on a disk or file share, attached as static spatiotemporal big data stores, or streamed in near-real-time. In other words, the approach registers large datasets or data stores with ArcGIS Server, then distributes analysis across a cluster of machines for parallel processing. Several brief use cases will be highlighted based on a 16-node server cluster at 14 Gb RAM per node, allowing, for example, the buffering of over 8 million points or thousands of polygons in 1 minute. The approach is "hybrid" in that ArcGIS Server integrates open-source big data frameworks such as Apache Hadoop and Apache Spark on the cluster in order to run the analytics. In addition, the user may devise and connect custom open-source interfaces and tools developed in Python or Python Notebooks; the common denominator being the familiar REST API.

  8. Advancing School-Based Interventions through Economic Analysis

    ERIC Educational Resources Information Center

    Olsson, Tina M.; Ferrer-Wreder, Laura; Eninger, Lilianne

    2014-01-01

    Commentators interested in school-based prevention programs point to the importance of economic issues for the future of prevention efforts. Many of the processes and aims of prevention science are dependent upon prevention resources. Although economic analysis is an essential tool for assessing resource use, the attention given economic analysis…

  9. 75 FR 40839 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-14

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2010-N-0357] Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis and Critical Control Point Procedures for the Safe and Sanitary Processing and Importing of Juice AGENCY: Food...

  10. The stability analysis of the nutrition restricted dynamic model of the microalgae biomass growth

    NASA Astrophysics Data System (ADS)

    Ratianingsih, R.; Fitriani, Nacong, N.; Resnawati, Mardlijah, Widodo, B.

    2018-03-01

    Biomass production is essential in microalgae farming, so the growth rate of the biomass is very important to determine. This paper proposes a dynamic model of biomass growth restricted by nutrition. The model is developed by considering several related processes: photosynthesis, respiration, nutrient absorption, stabilization, lipid synthesis and CO2 mobilization. The stability of the dynamical system that represents these processes is analyzed using the Jacobian matrix of the linearized system in the neighborhood of its critical point. A lipid formation threshold is required for the critical point to exist; in that case, the absorption rate of the respiration process has to be inversely proportional to the absorption rate of CO2 due to the photosynthesis process. The Pontryagin minimum principle also shows that certain requirements are needed to obtain a stable critical point, such as a CO2 release rate due to the stabilization process restricted to 50%, and a threshold on the shifted critical point. If the CO2 release rate due to the photosynthesis process is restricted to such an interval, the stability of the model at the critical point can no longer be satisfied. The simulation shows that external nutrition plays a role in glucose formation sufficient for biomass growth and lipid production.
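
    The sketch below shows, on a deliberately simple two-state nutrient/biomass system, how stability at a critical point is checked from the eigenvalues of the Jacobian with SymPy; the equations and parameters are illustrative assumptions and not the six-process microalgae model of the paper.

        import sympy as sp

        # Illustrative nutrient (N) / biomass (B) system, not the paper's model:
        #   dN/dt = u - a*N*B,   dB/dt = c*a*N*B - m*B
        N, B = sp.symbols("N B", positive=True)
        u, a, c, m = sp.symbols("u a c m", positive=True)
        f = sp.Matrix([u - a * N * B, c * a * N * B - m * B])

        # Non-trivial critical point and the Jacobian evaluated there.
        crit = sp.solve(f, [N, B], dict=True)[0]
        J = f.jacobian([N, B]).subs(crit)
        eigs = J.eigenvals()

        # Stability requires every eigenvalue to have a negative real part;
        # substitute numbers to check one concrete parameter set.
        vals = {u: 1.0, a: 0.5, c: 0.6, m: 0.3}
        print([complex(sp.N(e.subs(vals))) for e in eigs])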

  11. Analysis of residual stress state in sheet metal parts processed by single point incremental forming

    NASA Astrophysics Data System (ADS)

    Maaß, F.; Gies, S.; Dobecki, M.; Brömmelhoff, K.; Tekkaya, A. E.; Reimers, W.

    2018-05-01

    The mechanical properties of formed metal components are highly affected by the prevailing residual stress state. A selective induction of residual compressive stresses in the component can improve product properties such as the fatigue strength. By means of single point incremental forming (SPIF), the residual stress state can be influenced by adjusting the process parameters during manufacturing. To achieve a fundamental understanding of the residual stress formation caused by the SPIF process, a valid numerical process model is essential. Within the scope of this paper, the significance of kinematic hardening effects on the determined residual stress state is presented based on numerical simulations. The effect of the unclamping step after the manufacturing process is also analyzed. An average deviation of 18% between the residual stress amplitudes in the clamped and unclamped conditions reveals that the unclamping step needs to be considered to reach a high numerical prediction quality.

  12. Transformative Processes in Marriage: An Analysis of Emerging Trends

    ERIC Educational Resources Information Center

    Fincham, Frank D.; Stanley, Scott M.; Beach, Steven R. H.

    2007-01-01

    The study of conflict has dominated psychological research on marriage. This article documents its move from center stage, outlining how a broader canvas accommodates a richer picture of marriage. A brief sampling of new constructs such as forgiveness and sacrifice points to an organizing theme of transformative processes in emerging marital…

  13. Morphological Processing of Ultraviolet Emissions of Electrical Corona Discharge for Analysis and Diagnostic Use

    NASA Technical Reports Server (NTRS)

    Schubert, Matthew R.; Moore, Andrew J.

    2015-01-01

    Electron cascades from electrical discharge produce secondary emissions from atmospheric plasma in the ultraviolet band. For a single point of discharge, these emissions exhibit a stereotypical discharge morphology, with latent information about the discharge location. Morphological processing can uncover the location and therefore can have diagnostic utility.

  14. Morphological processing of ultraviolet emissions of electrical corona discharge for analysis and diagnostic use.

    PubMed

    Schubert, Matthew; Moore, Andrew J

    2016-03-01

    Electron cascades from electrical discharge produce secondary emissions from atmospheric plasma in the ultraviolet band. For a single point of discharge, these emissions exhibit a stereotypical discharge morphology, with latent information about the discharge location. Morphological processing can uncover the location and therefore have diagnostic utility.

  15. [Report quality evaluation of systematic review or Meta-analysis published in China Journal of Chinese Materia Medica].

    PubMed

    Zhang, Yan; Yu, Dan-Dan; Cui, De-Hua; Liao, Xing; Guo, Hua

    2018-03-01

    To evaluate the reporting quality of intervention-related systematic reviews or Meta-analyses published in China Journal of Chinese Materia Medica, we searched CNKI and the China Journal of Chinese Materia Medica webpages to collect intervention-related systematic reviews or Meta-analyses published since the first issue of the journal. A total of 40 systematic review or Meta-analysis reports were included, including one network Meta-analysis. The report quality of the systematic reviews or Meta-analyses was evaluated according to the PRISMA statement published in 2009. According to the results, 3 reports were of low quality, 30 of medium quality, and 7 of high quality. The average score across all items was 30 points, within the medium-quality range of 21-30.5 points. The 17 high-quality (31-40 points) report items were title, rationale, objectives, information sources, study selection, data collection process, data items, risk of bias in individual studies, summary measures, risk of bias across studies, study selection, study characteristics, risk of bias within studies, results of individual studies, synthesis of results, risk of bias across studies and funding; the 4 medium-quality (21-30.5 points) reporting items were eligibility criteria, search, limitations and conclusions; and the 6 low-quality (<=20.5 points) reporting items were structured summary, protocol and registration, synthesis of results, additional analysis (No.16), additional analysis (No.23) and summary of evidence. The analysis shows that the reporting quality of intervention-related systematic reviews or Meta-analyses published in China Journal of Chinese Materia Medica is medium and needs to be improved. Copyright© by the Chinese Pharmaceutical Association.

  16. Image Registration Algorithm Based on Parallax Constraint and Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Wang, Zhe; Dong, Min; Mu, Xiaomin; Wang, Song

    2018-01-01

    To address slow computation and low matching accuracy in image registration, a new image registration algorithm based on parallax constraints and clustering analysis is proposed. First, the Harris corner detector is used to extract feature points from the two images. Second, the Normalized Cross Correlation (NCC) function performs approximate matching of the feature points, yielding an initial set of feature pairs. Then, according to the parallax constraint condition, the initial feature pairs are preprocessed with the K-means clustering algorithm to remove pairs with obvious errors introduced during the approximate matching step. Finally, the Random Sample Consensus (RANSAC) algorithm optimizes the remaining feature points to obtain the final matching result, achieving fast and accurate image registration. The experimental results show that the proposed algorithm improves matching accuracy while preserving real-time performance.
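    A minimal sketch of a pipeline in this spirit is given below, assuming OpenCV in Python: Harris-based corners, NCC template matching for the approximate correspondences, and RANSAC for outlier rejection. The parallax-constrained K-means pre-filtering step described in the abstract is omitted, and all names, window sizes, and thresholds are illustrative.

```python
# Sketch of a Harris + NCC + RANSAC registration pipeline (the K-means
# parallax pre-filtering from the paper is omitted); grayscale uint8 inputs
# and enough confident matches (>= 4) are assumed.
import cv2
import numpy as np

def register(img1, img2, win=15, max_pts=500, ncc_min=0.8):
    pts = cv2.goodFeaturesToTrack(img1, max_pts, 0.01, 10, useHarrisDetector=True)
    src, dst = [], []
    r = win // 2
    for x, y in pts.reshape(-1, 2).astype(int):
        if x - r < 0 or y - r < 0 or x + r >= img1.shape[1] or y + r >= img1.shape[0]:
            continue
        tpl = img1[y - r:y + r + 1, x - r:x + r + 1]
        res = cv2.matchTemplate(img2, tpl, cv2.TM_CCOEFF_NORMED)  # NCC-style matching
        _, score, _, loc = cv2.minMaxLoc(res)
        if score > ncc_min:
            src.append((x, y))
            dst.append((loc[0] + r, loc[1] + r))
    # RANSAC removes remaining mismatches and estimates the transform
    H, _ = cv2.findHomography(np.float32(src), np.float32(dst), cv2.RANSAC, 3.0)
    return H
```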

  17. Evaluation of methods for rapid determination of freezing point of aviation fuels

    NASA Technical Reports Server (NTRS)

    Mathiprakasam, B.

    1982-01-01

    Methods for identification of the more promising concepts for the development of a portable instrument to rapidly determine the freezing point of aviation fuels are described. The evaluation process consisted of: (1) collection of information on techniques previously used for the determination of the freezing point, (2) screening and selection of these techniques for further evaluation of their suitability in a portable unit for rapid measurement, and (3) an extensive experimental evaluation of the selected techniques and a final selection of the most promising technique. Test apparatuses employing differential thermal analysis and the change in optical transparency during phase change were evaluated and tested. A technique similar to differential thermal analysis using no reference fuel was investigated. In this method, the freezing point was obtained by digitizing the data and locating the point of inflection. Results obtained using this technique compare well with those obtained elsewhere using different techniques. A conceptual design of a portable instrument incorporating this technique is presented.
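    The inflection-point idea lends itself to a short numerical sketch: given a digitized cooling curve, smooth it lightly, compute the curvature, and take the first sign change as the freezing-point estimate. This illustrates the general approach only; the smoothing window and the function name are assumptions, not the instrument's algorithm.

```python
# Illustrative only: locate the inflection of a digitized cooling curve
# (temperature vs. time) as a freezing-point estimate.
import numpy as np

def freezing_point(time_s, temp_c, smooth=5):
    t = np.convolve(temp_c, np.ones(smooth) / smooth, mode="same")  # light smoothing
    d2 = np.gradient(np.gradient(t, time_s), time_s)                # curvature
    crossings = np.where(np.diff(np.sign(d2)) != 0)[0]              # curvature sign changes
    idx = crossings[0] if crossings.size else int(np.argmin(np.abs(d2)))
    return float(temp_c[idx])
```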

  18. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting of risks that may affects method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: It is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternatives techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and can assess and manage method performance risk using such tools. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the larger number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology in residual cellular DNA analysis. © PDA, Inc. 2018.

  19. In Vivo Assessment of Protease Dynamics in Cutaneous Wound Healing by Degradomics Analysis of Porcine Wound Exudates*

    PubMed Central

    Sabino, Fabio; Hermes, Olivia; Egli, Fabian E.; Kockmann, Tobias; Schlage, Pascal; Croizat, Pierre; Kizhakkedathu, Jayachandran N.; Smola, Hans; auf dem Keller, Ulrich

    2015-01-01

    Proteases control complex tissue responses by modulating inflammation, cell proliferation and migration, and matrix remodeling. All these processes are orchestrated in cutaneous wound healing to restore the skin's barrier function upon injury. Altered protease activity has been implicated in the pathogenesis of healing impairments, and proteases are important targets in diagnosis and therapy of this pathology. Global assessment of proteolysis at critical turning points after injury will define crucial events in acute healing that might be disturbed in healing disorders. As optimal biospecimens, wound exudates contain an ideal proteome to detect extracellular proteolytic events, are noninvasively accessible, and can be collected at multiple time points along the healing process from the same wound in the clinics. In this study, we applied multiplexed Terminal Amine Isotopic Labeling of Substrates (TAILS) to globally assess proteolysis in early phases of cutaneous wound healing. By quantitative analysis of proteins and protein N termini in wound fluids from a clinically relevant pig wound model, we identified more than 650 proteins and discerned major healing phases through distinctive abundance clustering of markers of inflammation, granulation tissue formation, and re-epithelialization. TAILS revealed a high degree of proteolysis at all time points after injury by detecting almost 1300 N-terminal peptides in ∼450 proteins. Quantitative positional proteomics mapped pivotal interdependent processing events in the blood coagulation and complement cascades, temporally discerned clotting and fibrinolysis during the healing process, and detected processing of complement C3 at distinct time points after wounding and by different proteases. Exploiting data on primary cleavage specificities, we related candidate proteases to cleavage events and revealed processing of the integrin adapter protein kindlin-3 by caspase-3, generating new hypotheses for protease-substrate relations in the healing skin wound in vivo. The data have been deposited to the ProteomeXchange Consortium with identifier PXD001198. PMID:25516628

  20. 300 Area treated effluent disposal facility sampling schedule. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loll, C.M.

    1995-03-28

    This document is the interface between the 300 Area liquid effluent process engineering (LEPE) group and the waste sampling and characterization facility (WSCF), concerning process control samples. It contains a schedule for process control samples at the 300 Area TEDF which describes the parameters to be measured, the frequency of sampling and analysis, the sampling point, and the purpose for each parameter.

  1. A numerical analysis on forming limits during spiral and concentric single point incremental forming

    NASA Astrophysics Data System (ADS)

    Gipiela, M. L.; Amauri, V.; Nikhare, C.; Marcondes, P. V. P.

    2017-01-01

    Sheet metal forming is one of the major manufacturing industries, building numerous parts for the aerospace, automotive and medical sectors. Driven by high demand in the vehicle industry and by environmental regulations requiring lower fuel consumption, researchers are developing energy-efficient sheet metal forming processes that produce lightweight parts without the conventionally used punch and die. One of the most recognized manufacturing processes in this category is Single Point Incremental Forming (SPIF). SPIF is a die-less sheet metal forming process in which a single-point tool incrementally forces individual points of the sheet metal into the plastic deformation zone. In the present work, the finite element method (FEM) is applied to analyze the forming limits of a high strength low alloy steel formed by single point incremental forming (SPIF) with spiral and concentric tool paths. SPIF numerical simulations were modeled with 24 and 29 mm cup depths, and the results were compared with Nakajima results obtained by experiments and FEM. It was found that the cup formed with the Nakajima tool failed at 24 mm, while the cups formed by SPIF surpassed this limit for both depths and both tool-path profiles. It was also noticed that the strains achieved with the concentric profile are lower than those with the spiral profile.

  2. Efficient Open Source Lidar for Desktop Users

    NASA Astrophysics Data System (ADS)

    Flanagan, Jacob P.

    Lidar --- Light Detection and Ranging --- is a remote sensing technology that uses a device similar to a rangefinder to determine the distance to a target. A laser pulse is shot at an object and the time it takes for the pulse to return is measured; the distance to the object is then easily calculated from the speed of light. In lidar, the laser is moved (primarily in a rotational movement, usually accompanied by a translational movement) and records distances to objects several thousand times per second. From this, a 3-dimensional structure can be produced in the form of a point cloud. A point cloud is a collection of 3-dimensional points with at least x, y and z attributes, which together represent the position of a single point in 3-dimensional space. Other attributes can be associated with the points, such as the intensity of the return pulse, the color of the target, or the time the point was recorded. Another very useful, post-processed attribute is point classification, in which a point is associated with the type of object it represents (e.g., ground). Lidar has gained popularity, and advancements in the technology have made its collection easier and cheaper, creating larger and denser datasets. Handling these data more efficiently has become a necessity: processing, visualizing or even simply loading lidar can be computationally intensive due to its very large size. Standard remote sensing and geographical information systems (GIS) software (ENVI, ArcGIS, etc.) was not originally built for optimized point cloud processing; its point cloud support is an afterthought and therefore inefficient. Newer, more optimized point cloud software (QTModeler, TopoDOT, etc.) usually lacks more advanced processing tools, requires higher-end computers and is very costly. Existing open source lidar tools approach loading and processing in an iterative fashion that requires batch coding and processing times that can stretch to months for a standard lidar dataset. This project attempts to build software with an effective approach to creating, importing and exporting, manipulating and processing lidar, especially for the environmental field. Development of this software is described in 3 sections: (1) explanation of the search methods for efficiently extracting the "area of interest" (AOI) data from disk (file space); (2) using file space (for storage), budgeting memory space (for efficient processing) and moving between the two; and (3) method development for creating lidar products (usually raster based) used in environmental modeling and analysis (e.g., hydrology feature extraction, geomorphological studies, ecology modeling).
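    As a small, hedged example of the AOI-extraction idea in section (1) above, the snippet below clips a LAS point cloud to a bounding box; it assumes the laspy 2.x reader and is not the project's own file-space search method.

```python
# Minimal AOI clip for a LAS/LAZ point cloud (assumes laspy 2.x).
import laspy
import numpy as np

def clip_to_aoi(las_path, xmin, ymin, xmax, ymax):
    las = laspy.read(las_path)
    keep = (las.x >= xmin) & (las.x <= xmax) & (las.y >= ymin) & (las.y <= ymax)
    return np.column_stack([las.x[keep], las.y[keep], las.z[keep]])  # x, y, z of the AOI
```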

  3. Making Sense of Sensemaking: Requirements of a Cognitive Analysis to Support C2 Decision Support System Design

    DTIC Science & Technology

    2006-06-01

    Excerpt fragments only: "…heart of a distinction within the CSE community with respect to the differences between Cognitive Task Analysis (CTA) and Cognitive Work Analysis…"; Pirolli, P. and Card, S. (2005). The sensemaking process and leverage points for analyst technology as identified through cognitive task analysis. In…; "…D. D., and Elm, W. C. (2000). Cognitive task analysis as bootstrapping multiple converging techniques. In Schraagen, Chipman, and Shalin (Eds.…"

  4. The potential of cloud point system as a novel two-phase partitioning system for biotransformation.

    PubMed

    Wang, Zhilong

    2007-05-01

    Although extractive biotransformation in two-phase partitioning systems, such as water-organic solvent two-phase systems, aqueous two-phase systems, reverse micelle systems, and room-temperature ionic liquids, has been studied extensively, this has not yet resulted in widespread industrial application. After discussing the main obstacles, this review analyzes topical examples of exploiting the cloud point system, which has already been applied in the separation field as cloud point extraction, as a novel two-phase partitioning system for biotransformation. At the end of the review, process control and downstream processing in the application of this novel two-phase partitioning system for biotransformation are also briefly discussed.

  5. Topological data analysis of contagion maps for examining spreading processes on networks.

    PubMed

    Taylor, Dane; Klimm, Florian; Harrington, Heather A; Kramár, Miroslav; Mischaikow, Konstantin; Porter, Mason A; Mucha, Peter J

    2015-07-21

    Social and biological contagions are influenced by the spatial embeddedness of networks. Historically, many epidemics spread as a wave across part of the Earth's surface; however, in modern contagions long-range edges-for example, due to airline transportation or communication media-allow clusters of a contagion to appear in distant locations. Here we study the spread of contagions on networks through a methodology grounded in topological data analysis and nonlinear dimension reduction. We construct 'contagion maps' that use multiple contagions on a network to map the nodes as a point cloud. By analysing the topology, geometry and dimensionality of manifold structure in such point clouds, we reveal insights to aid in the modelling, forecast and control of spreading processes. Our approach highlights contagion maps also as a viable tool for inferring low-dimensional structure in networks.

  6. Topological data analysis of contagion maps for examining spreading processes on networks

    NASA Astrophysics Data System (ADS)

    Taylor, Dane; Klimm, Florian; Harrington, Heather A.; Kramár, Miroslav; Mischaikow, Konstantin; Porter, Mason A.; Mucha, Peter J.

    2015-07-01

    Social and biological contagions are influenced by the spatial embeddedness of networks. Historically, many epidemics spread as a wave across part of the Earth's surface; however, in modern contagions long-range edges--for example, due to airline transportation or communication media--allow clusters of a contagion to appear in distant locations. Here we study the spread of contagions on networks through a methodology grounded in topological data analysis and nonlinear dimension reduction. We construct `contagion maps' that use multiple contagions on a network to map the nodes as a point cloud. By analysing the topology, geometry and dimensionality of manifold structure in such point clouds, we reveal insights to aid in the modelling, forecast and control of spreading processes. Our approach highlights contagion maps also as a viable tool for inferring low-dimensional structure in networks.

  7. Predicting Bradycardia in Preterm Infants Using Point Process Analysis of Heart Rate.

    PubMed

    Gee, Alan H; Barbieri, Riccardo; Paydarfar, David; Indic, Premananda

    2017-09-01

    Episodes of bradycardia are common and recur sporadically in preterm infants, posing a threat to the developing brain and other vital organs. We hypothesize that bradycardias are a result of transient temporal destabilization of the cardiac autonomic control system and that fluctuations in the heart rate signal might contain information that precedes bradycardia. We investigate infant heart rate fluctuations with a novel application of point process theory. In ten preterm infants, we estimate instantaneous linear measures of the heart rate signal, use these measures to extract statistical features of bradycardia, and propose a simplistic framework for prediction of bradycardia. We present the performance of a prediction algorithm using instantaneous linear measures (mean area under the curve = 0.79 ± 0.018) for over 440 bradycardia events. The algorithm achieves an average forecast time of 116 s prior to bradycardia onset (FPR = 0.15). Our analysis reveals that increased variance in the heart rate signal is a precursor of severe bradycardia. This increase in variance is associated with an increase in power from low content dynamics in the LF band (0.04-0.2 Hz) and lower multiscale entropy values prior to bradycardia. Point process analysis of the heartbeat time series reveals instantaneous measures that can be used to predict infant bradycardia prior to onset. Our findings are relevant to risk stratification, predictive monitoring, and implementation of preventative strategies for reducing morbidity and mortality associated with bradycardia in neonatal intensive care units.
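    The finding that rising heart-rate variance precedes severe bradycardia can be illustrated, very loosely, with a rolling-variance feature on RR intervals, as sketched below. This is not the authors' point-process estimator; the window length and any threshold a user might apply are arbitrary assumptions.

```python
# Crude stand-in for an instantaneous variance measure: rolling variance of
# RR intervals; rising values may flag instability preceding bradycardia.
import numpy as np

def rolling_rr_variance(rr_seconds, window=30):
    rr = np.asarray(rr_seconds, dtype=float)
    out = np.full(rr.size, np.nan)
    for i in range(window, rr.size):
        out[i] = rr[i - window:i].var()
    return out
```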

  8. A Doppler centroid estimation algorithm for SAR systems optimized for the quasi-homogeneous source

    NASA Technical Reports Server (NTRS)

    Jin, Michael Y.

    1989-01-01

    Radar signal processing applications frequently require an estimate of the Doppler centroid of a received signal. The Doppler centroid estimate is required for synthetic aperture radar (SAR) processing. It is also required for some applications involving target motion estimation and antenna pointing direction estimation. In some cases, the Doppler centroid can be accurately estimated based on available information regarding the terrain topography, the relative motion between the sensor and the terrain, and the antenna pointing direction. Often, the accuracy of the Doppler centroid estimate can be improved by analyzing the characteristics of the received SAR signal. This kind of signal processing is also referred to as clutterlock processing. A Doppler centroid estimation (DCE) algorithm is described which contains a linear estimator optimized for the type of terrain surface that can be modeled by a quasi-homogeneous source (QHS). Information on the following topics is presented: (1) an introduction to the theory of Doppler centroid estimation; (2) analysis of the performance characteristics of previously reported DCE algorithms; (3) comparison of these analysis results with experimental results; (4) a description and performance analysis of a Doppler centroid estimator which is optimized for a QHS; and (5) comparison of the performance of the optimal QHS Doppler centroid estimator with that of previously reported methods.
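    For orientation, the classic pulse-pair (correlation-based) Doppler centroid estimate is sketched below: the phase of the lag-one azimuth autocorrelation, scaled by the PRF, gives the centroid modulo the PRF. This generic estimator is shown only for context and is not the QHS-optimized estimator described in the report.

```python
# Generic pulse-pair Doppler centroid estimate (not the QHS-optimized one).
import numpy as np

def doppler_centroid(az_samples, prf_hz):
    z = np.asarray(az_samples)           # complex azimuth samples of one range bin
    acf1 = np.vdot(z[:-1], z[1:])        # lag-1 autocorrelation: sum conj(z[n]) * z[n+1]
    return prf_hz * np.angle(acf1) / (2 * np.pi)   # Hz, ambiguous to +/- PRF/2
```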

  9. Tipping point analysis of a large ocean ambient sound record

    NASA Astrophysics Data System (ADS)

    Livina, Valerie N.; Harris, Peter; Brower, Albert; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen

    2017-04-01

    We study a long (2003-2015) high-resolution (250Hz) sound pressure record provided by the Comprehensive Nuclear-Test-Ban Treaty Organisation (CTBTO) from the hydro-acoustic station Cape Leeuwin (Australia). We transform the hydrophone waveforms into five bands of 10-min-average sound pressure levels (including the third-octave band) and apply tipping point analysis techniques [1-3]. We report the results of the analysis of fluctuations and trends in the data and discuss the BigData challenges in processing this record, including handling data segments of large size and possible HPC solutions. References: [1] Livina et al, GRL 2007, [2] Livina et al, Climate of the Past 2010, [3] Livina et al, Chaos 2015.
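    The reduction of the raw waveform to 10-minute average sound pressure levels can be sketched as below, assuming a calibrated pressure series in micropascals sampled at 250 Hz; band filtering and the tipping-point analysis itself are outside the scope of this fragment.

```python
# Sketch: 10-minute average sound pressure levels (dB re 1 µPa) from a
# calibrated 250 Hz pressure record; band filtering omitted.
import numpy as np

FS = 250                   # samples per second
BLOCK = 10 * 60 * FS       # samples per 10-minute block

def ten_minute_spl(pressure_upa):
    p = np.asarray(pressure_upa, dtype=float)
    n_blocks = p.size // BLOCK
    p = p[:n_blocks * BLOCK].reshape(n_blocks, BLOCK)
    rms = np.sqrt((p ** 2).mean(axis=1))
    return 20 * np.log10(rms)            # reference pressure 1 µPa
```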

  10. Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian

    2011-01-01

    Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.

  11. Filtering Raw Terrestrial Laser Scanning Data for Efficient and Accurate Use in Geomorphologic Modeling

    NASA Astrophysics Data System (ADS)

    Gleason, M. J.; Pitlick, J.; Buttenfield, B. P.

    2011-12-01

    Terrestrial laser scanning (TLS) represents a new and particularly effective remote sensing technique for investigating geomorphologic processes. Unfortunately, TLS data are commonly characterized by extremely large volume, heterogeneous point distribution, and erroneous measurements, raising challenges for applied researchers. To facilitate efficient and accurate use of TLS in geomorphology, and to improve accessibility for TLS processing in commercial software environments, we are developing a filtering method for raw TLS data to: eliminate data redundancy; produce a more uniformly spaced dataset; remove erroneous measurements; and maintain the ability of the TLS dataset to accurately model terrain. Our method conducts local aggregation of raw TLS data using a 3-D search algorithm based on the geometrical expression of expected random errors in the data. This approach accounts for the estimated accuracy and precision limitations of the instruments and procedures used in data collection, thereby allowing for identification and removal of potential erroneous measurements prior to data aggregation. Initial tests of the proposed technique on a sample TLS point cloud required a modest processing time of approximately 100 minutes to reduce dataset volume over 90 percent (from 12,380,074 to 1,145,705 points). Preliminary analysis of the filtered point cloud revealed substantial improvement in homogeneity of point distribution and minimal degradation of derived terrain models. We will test the method on two independent TLS datasets collected in consecutive years along a non-vegetated reach of the North Fork Toutle River in Washington. We will evaluate the tool using various quantitative, qualitative, and statistical methods. The crux of this evaluation will include a bootstrapping analysis to test the ability of the filtered datasets to model the terrain at roughly the same accuracy as the raw datasets.
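    A much-simplified stand-in for the aggregation step is sketched below: thinning a point cloud on a fixed voxel grid, keeping one representative point per occupied cell. The authors' method uses an error-based 3-D search rather than a fixed cell size, so the cell parameter here is purely an assumption for illustration.

```python
# Simplified voxel thinning (one point per occupied cell); the paper's
# error-based 3-D search is replaced by a fixed cell size for brevity.
import numpy as np

def voxel_thin(points_xyz, cell=0.05):
    pts = np.asarray(points_xyz, dtype=float)
    keys = np.floor(pts / cell).astype(np.int64)           # voxel index of each point
    _, first_idx = np.unique(keys, axis=0, return_index=True)
    return pts[np.sort(first_idx)]                         # keep first point per voxel
```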

  12. Use of parallel computing in mass processing of laser data

    NASA Astrophysics Data System (ADS)

    Będkowski, J.; Bratuś, R.; Prochaska, M.; Rzonca, A.

    2015-12-01

    The first part of the paper includes a description of the rules used to generate the algorithm needed for the purpose of parallel computing and also discusses the origins of the idea of research on the use of graphics processors in large scale processing of laser scanning data. The next part of the paper includes the results of an efficiency assessment performed for an array of different processing options, all of which were substantially accelerated with parallel computing. The processing options were divided into the generation of orthophotos using point clouds, coloring of point clouds, transformations, and the generation of a regular grid, as well as advanced processes such as the detection of planes and edges, point cloud classification, and the analysis of data for the purpose of quality control. Most algorithms had to be formulated from scratch in the context of the requirements of parallel computing. A few of the algorithms were based on existing technology developed by the Dephos Software Company and then adapted to parallel computing in the course of this research study. Processing time was determined for each process employed for a typical quantity of data processed, which helped confirm the high efficiency of the solutions proposed and the applicability of parallel computing to the processing of laser scanning data. The high efficiency of parallel computing yields new opportunities in the creation and organization of processing methods for laser scanning data.

  13. Improving the quality of extracting dynamics from interspike intervals via a resampling approach

    NASA Astrophysics Data System (ADS)

    Pavlova, O. N.; Pavlov, A. N.

    2018-04-01

    We address the problem of improving the quality of characterizing chaotic dynamics based on point processes produced by different types of neuron models. Despite the presence of embedding theorems for non-uniformly sampled dynamical systems, the case of short data analysis requires additional attention because the selection of algorithmic parameters may have an essential influence on estimated measures. We consider how the preliminary processing of interspike intervals (ISIs) can increase the precision of computing the largest Lyapunov exponent (LE). We report general features of characterizing chaotic dynamics from point processes and show that independently of the selected mechanism for spike generation, the performed preprocessing reduces computation errors when dealing with a limited amount of data.

  14. Applicability Analysis of Cloth Simulation Filtering Algorithm for Mobile LIDAR Point Cloud

    NASA Astrophysics Data System (ADS)

    Cai, S.; Zhang, W.; Qi, J.; Wan, P.; Shao, J.; Shen, A.

    2018-04-01

    Classifying the original point clouds into ground and non-ground points is a key step in LiDAR (light detection and ranging) data post-processing. The cloth simulation filtering (CSF) algorithm, which is based on a physical process, has been validated as an accurate, automatic and easy-to-use algorithm for airborne LiDAR point clouds. As a newer technique for three-dimensional data collection, mobile laser scanning (MLS) has gradually been applied in various fields, such as reconstruction of digital terrain models (DTM), 3D building modeling, and forest inventory and management. Compared with airborne LiDAR point clouds, mobile LiDAR point clouds differ in several respects, such as point density, spatial distribution and scene complexity. Some filtering algorithms designed for airborne LiDAR data have been applied directly to mobile LiDAR point clouds, but did not give satisfactory results. In this paper, we explore the ability of the CSF algorithm to handle mobile LiDAR point clouds. Three samples with different terrain shapes are selected to test the performance of the algorithm, yielding total errors of 0.44%, 0.77% and 1.20%, respectively. Additionally, a large-area dataset is also tested to further validate the effectiveness of the algorithm, and the results show that it can quickly and accurately separate point clouds into ground and non-ground points. In summary, this algorithm is efficient and reliable for mobile LiDAR point clouds.

  15. STREAM PROCESSING ALGORITHMS FOR DYNAMIC 3D SCENE ANALYSIS

    DTIC Science & Technology

    2018-02-15

    Excerpt fragments only (figure captions): ground truth creation based on marked building feature points in two different views 50 frames apart; epipolar lines for point correspondences between two views (BA4S), with each row of a later figure representing a similar assessment between one camera and all other cameras within the dataset.

  16. Traffic sign detection in MLS acquired point clouds for geometric and image-based semantic inventory

    NASA Astrophysics Data System (ADS)

    Soilán, Mario; Riveiro, Belén; Martínez-Sánchez, Joaquín; Arias, Pedro

    2016-04-01

    Nowadays, mobile laser scanning has become a well-established technology for infrastructure inspection. It permits collecting accurate 3D point clouds of urban and road environments, and the geometric and semantic analysis of these data has become an active research topic in recent years. This paper focuses on the detection of vertical traffic signs in 3D point clouds acquired by a LYNX Mobile Mapper system comprising laser scanners and RGB cameras. Each traffic sign is automatically detected in the LiDAR point cloud, and its main geometric parameters can be automatically extracted, thereby aiding the inventory process. Furthermore, the 3D positions of the traffic signs are reprojected onto the 2D images, which are spatially and temporally synced with the point cloud. Image analysis then allows the traffic sign semantics to be recognized using machine learning approaches. The presented method was tested in road and urban scenarios in Galicia (Spain). The recall results for traffic sign detection are close to 98%, and remaining false positives can be easily filtered after point cloud projection. Finally, the lack of a large, publicly available Spanish traffic sign database is pointed out.

  17. The linkage between fluvial meander-belt morphodynamics and the depositional record improves paleoenvironmental interpretations, Western Interior Basin, Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Durkin, P.; Hubbard, S. M.

    2016-12-01

    Enhanced stratigraphic interpretations are possible when linkages between morphodynamic processes and the depositional record are resolved. Recent studies of modern and ancient meander-belt deposits have emphasized morphodynamic processes that are commonly understated in the analysis of stratigraphic products, such as intra-point bar erosion and rotation, counter-point-bar (concave bank-bench) development and meander-bend abandonment. On a larger scale, longitudinal changes in meander-belt morphology and processes such as changes in meander-bend migration rate, channel-belt width/depth ratio and sinuosity have been observed as rivers flow through the tidal backwater zone. However, few studies have attempted to recognize the impact of the backwater zone in the stratigraphic record. We consider ancient meander-belt deposits of the Cretaceous McMurray Formation and document linkages between morphodynamic processes and their stratigraphic product to resolve more detailed paleoenvironmental interpretations. The ancient meander belt was characterized by paleochannels that were 600 m wide and up to 50 m deep, resolved in a particularly high quality subsurface dataset consisting of 600 km2 of high-quality 3-D seismic data and over 1000 wellbores. A 3-D geocellular model and reconstructed paleochannel migration patterns reveal the evolutionary history of seventeen individual meander belt elements, including point bars, counter point bars and their associated abandoned channel fills. At the meander-bend scale, intra-point-bar erosion surfaces bound accretion packages characterized by unique accretion directions, internal stratigraphic architecture and lithologic properties. Erosion surfaces and punctuated bar rotation are linked to upstream changes in channel planform geometry (meander cut-offs). We provide evidence for downstream translation and development of counter-point bars that formed in response to valley-edge and intra-meander-belt confinement. At the meander-belt scale, analysis of changes in morphology over time reveal a decrease in channel-belt width/thickness ratio and sinuosity, which we attribute to the landward migration of the paleo-backwater limit due to the oncoming and overlying transgression of the Cretaceous Boreal Sea into the Western Interior Basin.

  18. Fast Computation and Assessment Methods in Power System Analysis

    NASA Astrophysics Data System (ADS)

    Nagata, Masaki

    Power system analysis is essential for efficient and reliable power system operation and control. Recently, online security assessment systems have become important, as more efficient use of power networks is increasingly required. In this article, fast power system analysis techniques such as contingency screening, parallel processing and intelligent systems applications are briefly surveyed from the viewpoint of their application to online dynamic security assessment.

  19. Within-subject template estimation for unbiased longitudinal image analysis.

    PubMed

    Reuter, Martin; Schmansky, Nicholas J; Rosas, H Diana; Fischl, Bruce

    2012-07-16

    Longitudinal image analysis has become increasingly important in clinical studies of normal aging and neurodegenerative disorders. Furthermore, there is a growing appreciation of the potential utility of longitudinally acquired structural images and reliable image processing to evaluate disease modifying therapies. Challenges have been related to the variability that is inherent in the available cross-sectional processing tools, to the introduction of bias in longitudinal processing and to potential over-regularization. In this paper we introduce a novel longitudinal image processing framework, based on unbiased, robust, within-subject template creation, for automatic surface reconstruction and segmentation of brain MRI of arbitrarily many time points. We demonstrate that it is essential to treat all input images exactly the same as removing only interpolation asymmetries is not sufficient to remove processing bias. We successfully reduce variability and avoid over-regularization by initializing the processing in each time point with common information from the subject template. The presented results show a significant increase in precision and discrimination power while preserving the ability to detect large anatomical deviations; as such they hold great potential in clinical applications, e.g. allowing for smaller sample sizes or shorter trials to establish disease specific biomarkers or to quantify drug effects. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. Interactions between Individual Differences, Treatments, and Structures in SLA

    ERIC Educational Resources Information Center

    DeKeyser, Robert

    2012-01-01

    For decades educational psychologists have bemoaned the black box approach of much research on learning, that is, the focus on product rather than process, and the absence of fine-grained analysis of the learning process in the individual. One way that progress has been made on this point in the last couple of decades is through cognitive…

  1. Developing Leaders of Character at the United States Military Academy: A Relational Developmental Systems Analysis

    ERIC Educational Resources Information Center

    Callina, Kristina Schmid; Ryan, Diane; Murray, Elise D.; Colby, Anne; Damon, William; Matthews, Michael; Lerner, Richard M.

    2017-01-01

    A paucity of literature exists on the processes of character development within diverse contexts. In this article, the authors use the United States Military Academy at West Point (USMA) as a sample case for understanding character development processes within an institution of higher education. The authors present a discussion of relational…

  2. Treatment of electronic waste to recover metal values using thermal plasma coupled with acid leaching - A response surface modeling approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rath, Swagat S., E-mail: swagat.rath@gmail.com; Nayak, Pradeep; Mukherjee, P.S.

    2012-03-15

    The global crisis of hazardous electronic waste (E-waste) is on the rise due to increasing usage and disposal of electronic devices. A process was developed to treat E-waste in an environmentally benign manner, consisting of thermal plasma treatment followed by recovery of metal values through mineral acid leaching. In the thermal step, the E-waste was melted to recover the metal values as a metallic mixture, which was then subjected to acid leaching in the presence of a depolarizer. The leached liquor mainly contained copper, as the other elements such as Al and Fe were mostly in alloy form according to the XRD and phase diagram studies. A response surface model was used to optimize the leaching conditions. More than 90% leaching efficiency at room temperature was observed for Cu, Ni and Co with HCl as the solvent, whereas Fe and Al showed less than 40% efficiency.

  3. Optimal Synthesis of Compliant Mechanisms using Subdivision and Commercial FEA (DETC2004-57497)

    NASA Technical Reports Server (NTRS)

    Hull, Patrick V.; Canfield, Stephen

    2004-01-01

    The field of distributed-compliance mechanisms has seen significant work in developing suitable topology optimization tools for their design. These optimal design tools have grown out of the techniques of structural optimization. This paper will build on the previous work in topology optimization and compliant mechanism design by proposing an alternative design space parameterization through control points and adding another step to the process, that of subdivision. The control points allow a specific design to be represented as a solid model during the optimization process. The process of subdivision creates an additional number of control points that help smooth the surface (for example a C(sup 2) continuous surface depending on the method of subdivision chosen) creating a manufacturable design free of some traditional numerical instabilities. Note that these additional control points do not add to the number of design parameters. This alternative parameterization and description as a solid model effectively and completely separates the design variables from the analysis variables during the optimization procedure. The motivation behind this work is to create an automated design tool from task definition to functional prototype created on a CNC or rapid-prototype machine. This paper will describe the proposed compliant mechanism design process and will demonstrate the procedure on several examples common in the literature.

  4. A novel automatic segmentation workflow of axial breast DCE-MRI

    NASA Astrophysics Data System (ADS)

    Besbes, Feten; Gargouri, Norhene; Damak, Alima; Sellami, Dorra

    2018-04-01

    In this paper we propose a novel, fully automatic breast tissue segmentation process that is independent of expert calibration and of contrast. The proposed algorithm consists of two major steps. The first step is the detection of breast boundaries, based on image content analysis and the Moore-Neighbour tracing algorithm; Otsu thresholding and a neighborhood algorithm are applied as processing steps, and the area external to the breast is then removed to obtain an approximate breast region. The second step is the delineation of the chest wall, treated as the lowest-cost path linking three key points located automatically on the breast: the left and right boundary points and the middle upper point, placed at the sternum region using a statistical method. The minimum-cost path search is solved with Dijkstra's algorithm. Evaluation results show the robustness of our process in the face of different breast densities, complex shapes and challenging cases: the mean overlap between manual and automatic segmentation with our method is 96.5%. A comparative study shows that the proposed process is competitive with and faster than existing methods, segmenting 120 slices in 20.57+/-5.2 s.
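    The chest-wall step reduces to a shortest-path search over an image-derived cost map, which can be sketched with a plain Dijkstra implementation as below. The 4-connectivity, the cost definition, and the function names are simplifying assumptions, not the authors' exact formulation.

```python
# Illustrative Dijkstra search for a minimum-cost path across a 2-D cost map
# (e.g., an edge-strength image) between two pixel coordinates.
import heapq
import numpy as np

def min_cost_path(cost, start, goal):
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = cost[start]
    pq = [(cost[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue                       # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and d + cost[nr, nc] < dist[nr, nc]:
                dist[nr, nc] = d + cost[nr, nc]
                prev[(nr, nc)] = (r, c)
                heapq.heappush(pq, (dist[nr, nc], (nr, nc)))
    # Walk back from goal to start to recover the path (assumes goal reached).
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]
```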

  5. SU-F-J-180: A Reference Data Set for Testing Two Dimension Registration Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dankwa, A; Castillo, E; Guerrero, T

    Purpose: To create and characterize a reference data set for testing image registration algorithms that transform a portal image (PI) to a digitally reconstructed radiograph (DRR). Methods: Anterior-posterior (AP) and lateral (LAT) projection and DRR image pairs from nine cases representing four different anatomical sites (head and neck, thoracic, abdominal, and pelvic) were selected for this study. Five experts will perform manual registration by placing landmark points (LMPs) on the DRR and finding their corresponding points on the PI using the computer-assisted manual point selection tool (CAMPST), a custom MATLAB software tool developed in house. The landmark selection process will be repeated on both the PI and the DRR in order to characterize the inter- and intra-observer variations associated with the point selection process. Inter- and intra-observer variation in LMPs was assessed using Bland-Altman (B&A) analysis and one-way analysis of variance. We set our limit such that the absolute value of the mean difference between the readings should not exceed 3 mm. Later in this project we will test different two-dimensional (2D) image registration algorithms and quantify the uncertainty associated with their registration. Results: Using one-way analysis of variance (ANOVA), there was no variation within the readers. When Bland-Altman analysis was used, the variation within the readers was acceptable. The variation was higher in the PI compared to the DRR. Conclusion: The variation seen for the PI arises because, although the PI has much better spatial resolution, the poor resolution of the DRR makes it difficult to locate the corresponding anatomical feature on the PI. We expect this to become more evident when all the readers complete the point selection. Quantifying inter- and intra-observer variation tells us to what degree of accuracy a manual registration can be done. Research supported by the William Beaumont Hospital Research Start Up Fund.
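    A minimal Bland-Altman summary of the kind described, for one coordinate of paired landmark readings, is sketched below; the 3 mm acceptance limit follows the text, while the function name and inputs are illustrative.

```python
# Bland-Altman bias and 95% limits of agreement for paired readings (mm).
import numpy as np

def bland_altman(reader_a, reader_b, limit_mm=3.0):
    a = np.asarray(reader_a, dtype=float)
    b = np.asarray(reader_b, dtype=float)
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)       # 95% limits of agreement half-width
    within_limit = abs(bias) <= limit_mm       # acceptance rule described in the text
    return bias, (bias - half_width, bias + half_width), within_limit
```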

  6. Prevalence of Campylobacter and Salmonella species on farm, after transport, and at processing in specialty market poultry.

    PubMed

    McCrea, B A; Tonooka, K H; VanWorth, C; Boggs, C L; Atwill, E R; Schrader, J S

    2006-01-01

    The prevalence of Campylobacter and Salmonella spp. was determined from live bird to prepackaged carcass for 3 flocks from each of 6 types of California niche-market poultry. Commodities sampled included squab, quail, guinea fowl, duck, poussin (young chicken), and free-range broiler chickens. Campylobacter on-farm prevalence was lowest for squab, followed by guinea fowl, duck, quail, and free-range chickens. Poussin had the highest prevalence of Campylobacter. No Salmonella was isolated from guinea fowl or quail flocks. A few positive samples were observed in duck and squab, predominately of S. Typhimurium. Free-range and poussin chickens had the highest prevalence of Salmonella. Post-transport prevalence was not significantly higher than on-farm, except in free-range flocks, where a higher prevalence of positive chickens was found after 6 to 8 h holding before processing. In most cases, the prevalence of Campylobacter- and Salmonella-positive birds was lower on the final product than on-farm or during processing. Odds ratio analysis indicated that the risk of a positive final product carcass was not increased by the prevalence of a positive sample at an upstream point in the processing line, or by on-farm prevalence (i.e., none of the common sampling stations among the 6 commodities could be acknowledged as critical control points). This suggests that hazard analysis critical control point plans for Campylobacter and Salmonella control in the niche-market poultry commodities will need to be specifically determined for each species and each processing facility.
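    The odds-ratio analysis mentioned above boils down to a 2x2 table computation, sketched here with placeholder counts that are not the study's data.

```python
# Odds ratio from a 2x2 table relating upstream and final-product positivity.
def odds_ratio(a, b, c, d):
    """a: upstream+/final+, b: upstream+/final-, c: upstream-/final+, d: upstream-/final-."""
    return (a * d) / (b * c)

print(odds_ratio(12, 30, 8, 50))  # hypothetical counts only
```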

  7. Real-time monitoring and massive inversion of source parameters of very long period seismic signals: An application to Stromboli Volcano, Italy

    USGS Publications Warehouse

    Auger, E.; D'Auria, L.; Martini, M.; Chouet, B.; Dawson, P.

    2006-01-01

    We present a comprehensive processing tool for the real-time analysis of the source mechanism of very long period (VLP) seismic data based on waveform inversions performed in the frequency domain for a point source. A search for the source providing the best-fitting solution is conducted over a three-dimensional grid of assumed source locations, in which the Green's functions associated with each point source are calculated by finite differences using the reciprocal relation between source and receiver. Tests performed on 62 nodes of a Linux cluster indicate that the waveform inversion and search for the best-fitting signal over 100,000 point sources require roughly 30 s of processing time for a 2-min-long record. The procedure is applied to post-processing of a data archive and to continuous automatic inversion of real-time data at Stromboli, providing insights into different modes of degassing at this volcano. Copyright 2006 by the American Geophysical Union.

  8. Spatial analysis of groundwater levels using Fuzzy Logic and geostatistical tools

    NASA Astrophysics Data System (ADS)

    Theodoridou, P. G.; Varouchakis, E. A.; Karatzas, G. P.

    2017-12-01

    The spatial variability evaluation of the water table of an aquifer provides useful information in water resources management plans. Geostatistical methods are often employed to map the free surface of an aquifer. In geostatistical analysis using Kriging techniques the selection of the optimal variogram is very important for the optimal method performance. This work compares three different criteria to assess the theoretical variogram that fits to the experimental one: the Least Squares Sum method, the Akaike Information Criterion and the Cressie's Indicator. Moreover, variable distance metrics such as the Euclidean, Minkowski, Manhattan, Canberra and Bray-Curtis are applied to calculate the distance between the observation and the prediction points, that affects both the variogram calculation and the Kriging estimator. A Fuzzy Logic System is then applied to define the appropriate neighbors for each estimation point used in the Kriging algorithm. The two criteria used during the Fuzzy Logic process are the distance between observation and estimation points and the groundwater level value at each observation point. The proposed techniques are applied to a data set of 250 hydraulic head measurements distributed over an alluvial aquifer. The analysis showed that the Power-law variogram model and Manhattan distance metric within ordinary kriging provide the best results when the comprehensive geostatistical analysis process is applied. On the other hand, the Fuzzy Logic approach leads to a Gaussian variogram model and significantly improves the estimation performance. The two different variogram models can be explained in terms of a fractional Brownian motion approach and of aquifer behavior at local scale. Finally, maps of hydraulic head spatial variability and of predictions uncertainty are constructed for the area with the two different approaches comparing their advantages and drawbacks.
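    The alternative distance metrics compared in the study are simple to state in code; a hedged sketch for a pair of coordinate vectors follows (the Minkowski order r is an arbitrary choice here).

```python
# Distance metrics between an observation point p and a prediction point q.
import numpy as np

def manhattan(p, q):
    return np.abs(np.asarray(p, float) - np.asarray(q, float)).sum()

def minkowski(p, q, r=3):
    return (np.abs(np.asarray(p, float) - np.asarray(q, float)) ** r).sum() ** (1.0 / r)

def canberra(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    return (np.abs(p - q) / (np.abs(p) + np.abs(q))).sum()

def bray_curtis(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.abs(p - q).sum() / np.abs(p + q).sum()
```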

  9. Point process analyses of variations in smoking rate by setting, mood, gender, and dependence

    PubMed Central

    Shiffman, Saul; Rathbun, Stephen L.

    2010-01-01

    The immediate emotional and situational antecedents of ad libitum smoking are still not well understood. We re-analyzed data from Ecological Momentary Assessment using novel point-process analyses, to assess how craving, mood, and social setting influence smoking rate, as well as assessing the moderating effects of gender and nicotine dependence. 304 smokers recorded craving, mood, and social setting using electronic diaries when smoking and at random nonsmoking times over 16 days of smoking. Point-process analysis, which makes use of the known random sampling scheme for momentary variables, examined main effects of setting and interactions with gender and dependence. Increased craving was associated with higher rates of smoking, particularly among women. Negative affect was not associated with smoking rate, even in interaction with arousal, but restlessness was associated with substantially higher smoking rates. Women's smoking tended to be less affected by negative affect. Nicotine dependence had little moderating effect on situational influences. Smoking rates were higher when smokers were alone or with others smoking, and smoking restrictions reduced smoking rates. However, the presence of others smoking undermined the effects of restrictions. The more sensitive point-process analyses confirmed earlier findings, including the surprising conclusion that negative affect by itself was not related to smoking rates. Contrary to hypothesis, men's and not women's smoking was influenced by negative affect. Both smoking restrictions and the presence of others who are not smoking suppress smoking, but others’ smoking undermines the effects of restrictions. Point-process analyses of EMA data can bring out even small influences on smoking rate. PMID:21480683

  10. Models of formation and some algorithms of hyperspectral image processing

    NASA Astrophysics Data System (ADS)

    Achmetov, R. N.; Stratilatov, N. R.; Yudakov, A. A.; Vezenov, V. I.; Eremeev, V. V.

    2014-12-01

    Algorithms and information technologies for processing Earth hyperspectral imagery are presented, and several new approaches are discussed. Specific aspects of processing hyperspectral imagery were studied, such as multifold signal-to-noise reduction, atmospheric distortions, access to the spectral characteristics of every image point, and the high dimensionality of the data. Different measures of similarity between individual hyperspectral image points and the effect of additive uncorrelated noise on these measures were analyzed. It was shown that these measures are substantially affected by noise, and a new measure free of this disadvantage was proposed. The problem of detecting the boundaries of observed scene objects by comparing the spectral characteristics of image points is considered; it was shown that contours are extracted much better when spectral characteristics are used instead of energy brightness. A statistical approach to the correction of atmospheric distortions was proposed, which makes it possible to solve the problem through analysis of the distorted image itself, in contrast to analytical multiparametric models. Several algorithms for integrating spectral zonal images with data from other survey systems, which make it possible to image observed scene objects with higher quality, are considered. Quality characteristics of hyperspectral data processing were proposed and studied.
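    As one concrete example of a similarity measure between hyperspectral image points, the spectral angle is sketched below; it is insensitive to a multiplicative brightness factor but is not the noise-robust measure proposed in the paper.

```python
# Spectral angle between two pixel spectra (radians; 0 means identical shape).
import numpy as np

def spectral_angle(s1, s2):
    s1, s2 = np.asarray(s1, float), np.asarray(s2, float)
    cos = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
    return np.arccos(np.clip(cos, -1.0, 1.0))
```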

  11. Identifying, Assessing, and Mitigating Risk of Single-Point Inspections on the Space Shuttle Reusable Solid Rocket Motor

    NASA Technical Reports Server (NTRS)

    Greenhalgh, Phillip O.

    2004-01-01

    In the production of each Space Shuttle Reusable Solid Rocket Motor (RSRM), over 100,000 inspections are performed. ATK Thiokol Inc. reviewed these inspections to ensure a robust inspection system is maintained. The principal effort within this endeavor was the systematic identification and evaluation of inspections considered to be single-point. Single-point inspections are those accomplished on components, materials, and tooling by only one person, involving no other check. The purpose was to more accurately characterize risk and ultimately address and/or mitigate risk associated with single-point inspections. After the initial review of all inspections and identification/assessment of single-point inspections, review teams applied risk prioritization methodology similar to that used in a Process Failure Modes Effects Analysis to derive a Risk Prioritization Number for each single-point inspection. After the prioritization of risk, all single-point inspection points determined to have significant risk were provided either with risk-mitigating actions or rationale for acceptance. This effort gave confidence to the RSRM program that the correct inspections are being accomplished, that there is appropriate justification for those that remain as single-point inspections, and that risk mitigation was applied to further reduce risk of higher risk single-point inspections. This paper examines the process, results, and lessons learned in identifying, assessing, and mitigating risk associated with single-point inspections accomplished in the production of the Space Shuttle RSRM.
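    The risk prioritization step follows the familiar PFMEA pattern of multiplying ordinal scores, which can be sketched as below; the scales, the example inspection names, and the scores are illustrative assumptions, not RSRM program data.

```python
# PFMEA-style risk prioritization number: occurrence x severity x detectability.
def risk_priority_number(occurrence, severity, detectability):
    return occurrence * severity * detectability

# Hypothetical single-point inspections with (occurrence, severity, detectability) scores.
inspections = {"leak check": (3, 9, 4), "dimension gauge": (2, 5, 2)}
ranked = sorted(inspections.items(),
                key=lambda kv: risk_priority_number(*kv[1]), reverse=True)
print(ranked)   # highest-risk inspections first
```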

  12. 1/f Noise from nonlinear stochastic differential equations.

    PubMed

    Ruseckas, J; Kaulakys, B

    2010-03-01

    We consider a class of nonlinear stochastic differential equations giving power-law behavior of the power spectral density in any desirably wide range of frequency. Such equations were obtained starting from the point process models of 1/f^β noise. In this article the power-law behavior of the spectrum is derived directly from the stochastic differential equations, without using the point process models. The analysis reveals that the power spectrum may be represented as a sum of Lorentzian spectra. Such a derivation provides additional justification of the equations, expands the class of equations generating 1/f^β noise, and provides further insights into the origin of 1/f^β noise.
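    The abstract's statement that the spectrum can be represented as a sum of Lorentzian contributions has the generic form sketched below; the weights and relaxation rates depend on the particular stochastic differential equation and are left symbolic here.

```latex
% Generic sum-of-Lorentzians form (weights c_k and rates \gamma_k unspecified):
S(f) \;\approx\; \sum_{k} \frac{c_k\,\gamma_k}{\gamma_k^{2} + (2\pi f)^{2}}
```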

  13. A new computer code for discrete fracture network modelling

    NASA Astrophysics Data System (ADS)

    Xu, Chaoshui; Dowd, Peter

    2010-03-01

    The authors describe a comprehensive software package for two- and three-dimensional stochastic rock fracture simulation using marked point processes. Fracture locations can be modelled by a Poisson, a non-homogeneous, a cluster or a Cox point process; fracture geometries and properties are modelled by their respective probability distributions. Virtual sampling tools such as plane, window and scanline sampling are included in the software together with a comprehensive set of statistical tools including histogram analysis, probability plots, rose diagrams and hemispherical projections. The paper describes in detail the theoretical basis of the implementation and provides a case study in rock fracture modelling to demonstrate the application of the software.
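    A minimal marked-point-process sketch in the spirit of the simplest option (a homogeneous Poisson process with independent marks) is given below; the intensity, mark distributions, and function name are illustrative, and the package's non-homogeneous, cluster, and Cox options are not reproduced.

```python
# Homogeneous Poisson point process for 2-D fracture centres with
# exponential length marks and uniform orientations (illustrative choices).
import numpy as np

rng = np.random.default_rng(0)

def simulate_fractures(intensity, width, height, mean_length=2.0):
    n = rng.poisson(intensity * width * height)        # number of fractures
    centres = rng.uniform([0.0, 0.0], [width, height], size=(n, 2))
    lengths = rng.exponential(mean_length, size=n)     # marks: fracture length
    orientations = rng.uniform(0.0, np.pi, size=n)     # marks: strike angle
    return centres, lengths, orientations
```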

  14. Summary of Skylab S-193 altimeter altitude results. [orbit calculation and studies of the ocean bottom

    NASA Technical Reports Server (NTRS)

    Mcgoogan, J. T.; Leitao, C. D.; Wells, W. T.

    1975-01-01

    The SKYLAB S-193 altimeter altitude results are presented in a concise format for further use and analysis by the scientific community. The altimeter mission and instrumentation are described along with the altimeter processing techniques and the values of parameters used for processing. The determination of reference orbits is discussed, and the tracking systems utilized are tabulated. Techniques for determining satellite pointing are presented and a tabulation of pointing for each data mission is included. The geographical location, the ocean bottom topography, the altimeter-determined ocean surface topography, and the altimeter automatic gain control history are presented. Some typical applications of these data are suggested.

  15. GEOS 3 data processing for the recovery of geoid undulations and gravity anomalies

    NASA Technical Reports Server (NTRS)

    Rapp, R. H.

    1979-01-01

    The paper discusses the analysis of GEOS 3 altimeter data for the determination of geoid heights and point and mean gravity anomalies. Methods are presented for determining the mean anomalies and mean undulations from the GEOS 3 altimeter data available by the end of September 1977 without having a complete set of precise orbits. The data are edited extensively to remove questionable values, although no filtering is carried out. An adjustment process is carried out to eliminate orbit error and altimeter bias. Representative point anomaly values are computed to investigate anomaly behavior across the Bonin Trench and over the Patton seamounts.

  16. Ka-band monopulse antenna-pointing systems analysis and simulation

    NASA Technical Reports Server (NTRS)

    Lo, V. Y.

    1996-01-01

    NASA's Deep Space Network (DSN) has been using both 70-m and 34-m reflector antennas to communicate with spacecraft at S-band (2.3 GHz) and X-band (8.45 GHz). To improve the quality of telecommunication and to meet future mission requirements, JPL has been developing 34-m Ka-band (32-GHz) beam waveguide antennas. Presently, antenna pointing operates in either the open-loop mode with blind pointing using navigation predicts or the closed-loop mode with conical scan (conscan). Pointing accuracy under normal conscan operating conditions is in the neighborhood of 5 mdeg. This is acceptable at S- and X-bands, but not sufficient at Ka-band. Due to the narrow beamwidth at Ka-band, it is important to improve pointing accuracy significantly (to approximately 2 mdeg). Monopulse antenna tracking is one scheme being developed to meet the stringent pointing-accuracy requirement at Ka-band. Other advantages of monopulse tracking include low sensitivity to signal amplitude fluctuations as well as single-pulse processing for acquisition and tracking. This article presents system modeling, signal processing, simulation, and implementation of a Ka-band monopulse tracking feed for antennas in NASA/DSN ground stations.

  17. Point Cloud Based Change Detection - an Automated Approach for Cloud-based Services

    NASA Astrophysics Data System (ADS)

    Collins, Patrick; Bahr, Thomas

    2016-04-01

    The fusion of stereo photogrammetric point clouds with LiDAR data or terrain information derived from SAR interferometry has a significant potential for 3D topographic change detection. In the present case study, the latest point cloud generation and analysis capabilities are used to examine a landslide that occurred in the village of Malin in Maharashtra, India, on 30 July 2014, and affected an area of ca. 44,000 m². It focuses on Pléiades high resolution satellite imagery and the Airbus DS WorldDEM™ as a product of the TanDEM-X mission. This case study was performed using the COTS software package ENVI 5.3. Integration of custom processes and automation is supported by IDL (Interactive Data Language). Thus, ENVI analytics runs via the object-oriented, IDL-based ENVITask API. The pre-event topography is represented by the WorldDEM™ product, delivered with a raster of 12 m x 12 m and based on the EGM2008 geoid (called pre-DEM). For the post-event situation a Pléiades 1B stereo image pair of the affected AOI was obtained. The ENVITask "GeneratePointCloudsByDenseImageMatching" was implemented to extract passive point clouds in LAS format from the panchromatic stereo datasets: • A dense image-matching algorithm is used to identify corresponding points in the two images. • A block adjustment is applied to refine the 3D coordinates that describe the scene geometry. • Additionally, the WorldDEM™ was input to constrain the range of heights in the matching area, and subsequently the length of the epipolar line. The "PointCloudFeatureExtraction" task was executed to generate the post-event digital surface model from the photogrammetric point clouds (called post-DEM). Post-processing consisted of the following steps: • Adding the geoid component (EGM 2008) to the post-DEM. • Pre-DEM reprojection to the UTM Zone 43N (WGS-84) coordinate system and resizing. • Subtraction of the pre-DEM from the post-DEM. • Filtering and threshold-based classification of the DEM difference to analyze the surface changes in 3D. The automated point cloud generation and analysis introduced here can be embedded in virtually any existing geospatial workflow for operational applications. Three integration options were implemented in this case study: • Integration within any ArcGIS environment whether deployed on the desktop, in the cloud, or online. Execution uses a customized ArcGIS script tool. A Python script file retrieves the parameters from the user interface and runs the precompiled IDL code. That IDL code is used to interface between the Python script and the relevant ENVITasks. • Publishing the point cloud processing tasks as services via the ENVI Services Engine (ESE). ESE is a cloud-based image analysis solution to publish and deploy advanced ENVI image and data analytics to existing enterprise infrastructures. For this purpose the entire IDL code can be encapsulated in a single ENVITask. • Integration in an existing geospatial workflow using the Python-to-IDL Bridge. This mechanism allows calling IDL code within Python on a user-defined platform. The results of this case study allow a 3D estimation of the topographic changes within the tectonically active and anthropogenically invaded Malin area after the landslide event. Accordingly, the point cloud analysis was correlated successfully with modelled displacement contours of the slope.
Based on optical satellite imagery, such point clouds of high precision and density distribution can be obtained in a few minutes to support the operational monitoring of landslide processes.
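
    A hedged sketch of the DEM-differencing step only (the ENVI/IDL workflow itself is not reproduced): subtract the pre-event DEM from the post-event DSM on a common grid and classify cells whose elevation change exceeds a threshold. The grid size and the ±2 m threshold are illustrative assumptions.

```python
# Subtract pre-event DEM from post-event DSM and classify 3D surface change.
import numpy as np

def classify_change(pre_dem: np.ndarray, post_dem: np.ndarray, threshold: float = 2.0):
    """Return the elevation difference and a 3-class change map.

    -1 = material loss (e.g. landslide scarp), 0 = no significant change,
    +1 = material accumulation (e.g. deposition zone).
    """
    diff = post_dem - pre_dem
    change = np.zeros(diff.shape, dtype=np.int8)
    change[diff > threshold] = 1
    change[diff < -threshold] = -1
    return diff, change

# Toy example on a 100 x 100 grid (a 12 m posting is assumed, matching the pre-DEM).
rng = np.random.default_rng(0)
pre = 600.0 + rng.normal(0.0, 0.5, size=(100, 100))       # pre-event terrain + noise
post = pre.copy()
post[40:60, 40:60] -= 8.0                                   # simulated landslide scarp
diff, change = classify_change(pre, post)
print("cells classified as loss:", int(np.sum(change == -1)))
```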

  18. Scalets, wavelets and (complex) turning point quantization

    NASA Astrophysics Data System (ADS)

    Handy, C. R.; Brooks, H. A.

    2001-05-01

    Despite the many successes of wavelet analysis in image and signal processing, the incorporation of continuous wavelet transform theory within quantum mechanics has lacked a compelling, first principles, motivating analytical framework, until now. For arbitrary one-dimensional rational fraction Hamiltonians, we develop a simple, unified formalism, which clearly underscores the complementary, and mutually interdependent, role played by moment quantization theory (i.e. via scalets, as defined herein) and wavelets. This analysis involves no approximation of the Hamiltonian within the (equivalent) wavelet space, and emphasizes the importance of (complex) multiple turning point contributions in the quantization process. We apply the method to three illustrative examples. These include the (double-well) quartic anharmonic oscillator potential problem, V(x) = Z^2x^2 + gx^4, the quartic potential, V(x) = x^4, and the very interesting and significant non-Hermitian potential V(x) = -(ix)^3, recently studied by Bender and Boettcher.

  19. Thermal properties of polyethylene reinforced with recycled–poly (ethylene terephthalate) flakes.

    NASA Astrophysics Data System (ADS)

    Ruqiyah Nik Hassan, Nik; Mazni Ismail, Noor; Ghazali, Suriati; Nuruzzaman, Dewan Muhammad

    2018-04-01

    In this study, recycled poly(ethylene terephthalate) (RPET) flakes from plastic bottles were used as a filler in high density polyethylene (HDPE) thermoplastic. The RPET/HDPE plastic sheet was prepared using a hot and cold press machine. The effects of RPET addition and of the hot press process on the thermal properties of the RPET/HDPE composite were investigated using differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA). Results from the DSC analysis show that the melting point of HDPE shifted slightly to a higher temperature, by about 2°C to 4°C, with the addition of RPET as a filler. The onset degradation temperature of the RPET/HDPE composite determined from the TGA analysis was also seen to increase slightly. It was observed that the incorporation of recycled PET flakes into HDPE is achievable using a hot press process, with slight improvements in both the melting point and the thermal stability of the composite compared to neat HDPE.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martinho, Graca; Pires, Ana, E-mail: ana.lourenco.pires@gmail.com; Saraiva, Luanha

    Highlights: • The article shows WEEE plastics characterization from a recycling unit in Portugal. • The recycling unit has low machinery, with hand sorting of plastics elements. • Most common polymers are PS, ABS, PC/ABS, HIPS and PP. • Most plastics found have no identification of plastic type or flame retardants. • Ecodesign is still not practiced for EEE, with repercussions in end of life stage. - Abstract: This paper describes a direct analysis study carried out in a recycling unit for waste electrical and electronic equipment (WEEE) in Portugal to characterize the plastic constituents of WEEE. Approximately 3400 items, including cooling appliances, small WEEE, printers, copying equipment, central processing units, cathode ray tube (CRT) monitors and CRT televisions were characterized, with the analysis finding around 6000 kg of plastics with several polymer types. The most common polymers are polystyrene, acrylonitrile-butadiene-styrene, polycarbonate blends, high-impact polystyrene and polypropylene. Additives to darken color are common contaminants in these plastics when used in CRT televisions and small WEEE. These additives can make plastic identification difficult, along with missing polymer identification and flame retardant identification marks. These drawbacks contribute to the inefficiency of manual dismantling of WEEE, which is the typical recycling process in Portugal. The information found here can be used to set a baseline for the plastics recycling industry and provide information for ecodesign in electrical and electronic equipment production.

  1. Using stochastic models to incorporate spatial and temporal variability [Exercise 14

    Treesearch

    Carolyn Hull Sieg; Rudy M. King; Fred Van Dyke

    2003-01-01

    To this point, our analysis of population processes and viability in the western prairie fringed orchid has used only deterministic models. In this exercise, we conduct a similar analysis, using a stochastic model instead. This distinction is of great importance to population biology in general and to conservation biology in particular. In deterministic models,...

  2. Fitting a Point Cloud to a 3d Polyhedral Surface

    NASA Astrophysics Data System (ADS)

    Popov, E. V.; Rotkov, S. I.

    2017-05-01

    The ability to measure parameters of large-scale objects in a contactless fashion has a tremendous potential in a number of industrial applications. However, this problem is usually associated with the ambiguous task of comparing two data sets specified in two different coordinate systems. This paper deals with the study of fitting a set of unorganized points to a polyhedral surface. The developed approach uses Principal Component Analysis (PCA) and the Stretched Grid Method (SGM) to substitute several linear steps for a non-linear problem solution. The squared distance (SD) is the general criterion used to control the convergence of the set of points to a target surface. The described numerical experiment concerns the remote measurement of a large-scale aerial in the form of a frame with a parabolic shape. The experiment shows that the fitting of a point cloud to a target surface converges in several linear steps. The method is applicable to the remote, contactless measurement of the geometry of large-scale objects.
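
    A hedged sketch of the PCA step only (the Stretched Grid Method refinement described in the paper is not reproduced): centre the unorganized cloud and rotate it onto its principal axes, then track a squared-distance criterion to a target surface, here assumed to be the plane z = 0. The toy cloud and rotation are illustrative assumptions.

```python
# PCA alignment of an unorganized point cloud, with a squared-distance (SD) criterion.
import numpy as np

def pca_align(points: np.ndarray) -> np.ndarray:
    """Centre an (N, 3) cloud and rotate it so its principal axes become x, y, z."""
    centered = points - points.mean(axis=0)
    cov = np.cov(centered, rowvar=False)           # 3 x 3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)         # eigenvalues in ascending order
    axes = eigvecs[:, ::-1]                        # largest-variance axis first
    return centered @ axes

def squared_distance_to_plane_z0(points: np.ndarray) -> float:
    """Mean squared distance from the points to the plane z = 0 (the SD criterion here)."""
    return float(np.mean(points[:, 2] ** 2))

# Toy cloud: a flat, elongated patch given an arbitrary tilt and offset.
rng = np.random.default_rng(1)
flat = rng.normal(size=(500, 3)) * np.array([10.0, 3.0, 0.2]) + 5.0
theta = np.radians(30.0)
rot = np.array([[1, 0, 0],
                [0, np.cos(theta), -np.sin(theta)],
                [0, np.sin(theta), np.cos(theta)]])
cloud = flat @ rot.T

aligned = pca_align(cloud)
print(f"SD before: {squared_distance_to_plane_z0(cloud):.2f}, "
      f"after: {squared_distance_to_plane_z0(aligned):.3f}")
```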

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amdursky, Nadav; Gazit, Ehud; Rosenman, Gil, E-mail: gilr@eng.tau.ac.il

    Highlights: • We observe lag-phase crystallization process in insulin. • The crystallization is a result of the formation of higher order oligomers. • The crystallization also changes the secondary structure of the protein. • The spectroscopic signature can be used for amyloid inhibitors assay. -- Abstract: Insulin, as other amyloid proteins, can form amyloid fibrils at certain conditions. The self-assembled aggregation process of insulin can result in a variety of conformations, starting from small oligomers, going through various types of protofibrils, and finishing with bundles of fibrils. One of the most common consensuses among the various self-assembly processes that are suggested in the literature is the formation of an early stage nucleus conformation. Here we present an additional insight for the self-assembly process of insulin. We show that at the early lag phase of the process (prior to fibril formation) the insulin monomers self-assemble into ordered nanostructures. The most notable feature of this early self-assembly process is the formation of nanocrystalline nucleus regions with a strongly bound electron-hole confinement, which also change the secondary structure of the protein. Each step in the self-assembly process is characterized by an optical spectroscopic signature, and possesses a narrow size distribution. By following the spectroscopic signature we can measure the potency of amyloid fibrils inhibitors already at the lag phase. We further demonstrate it by the use of epigallocatechin gallate, a known inhibitor for insulin fibrils. The findings can result in a spectroscopic-based application for the analysis of amyloid fibrils inhibitors.

  4. [Multiple time scales analysis of spatial differentiation characteristics of non-point source nitrogen loss within watershed].

    PubMed

    Liu, Mei-bing; Chen, Xing-wei; Chen, Ying

    2015-07-01

    Identification of the critical source areas of non-point source pollution is an important means to control non-point source pollution within a watershed. In order to further reveal the impact of multiple time scales on the spatial differentiation characteristics of non-point source nitrogen loss, a SWAT model of the Shanmei Reservoir watershed was developed. Based on the simulation of total nitrogen (TN) loss intensity of all 38 subbasins, spatial distribution characteristics of nitrogen loss and critical source areas were analyzed at three time scales: yearly average, monthly average and rainstorm flood process. Furthermore, multiple linear correlation analysis was conducted to analyze the contribution of natural environment and anthropogenic disturbance to nitrogen loss. The results showed that there were significant spatial differences of TN loss in the Shanmei Reservoir watershed at different time scales, and the degree of spatial differentiation of nitrogen loss was in the order of monthly average > yearly average > rainstorm flood process. TN loss load mainly came from the upland Taoxi subbasin, which was identified as the critical source area. At different time scales, land use types (such as farmland and forest) were always the dominant factor affecting the spatial distribution of nitrogen loss, while precipitation and runoff affected nitrogen loss only in months without fertilization and in several storm flood processes occurring on dates without fertilization. This was mainly due to the significant spatial variation of land use and fertilization, as well as the low spatial variability of precipitation and runoff.

  5. Combining Static Analysis and Model Checking for Software Analysis

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2003-01-01

    We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial order information which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer, which can then refine its analysis. The result of this refined analysis is then fed back to the model checker, which updates its partial order reduction. At each step of this iterative process, the static analysis computes optimistic information which results in an unsafe reduction of the state space. However, we show that the process converges to a fixed point, at which time the partial order information is safe and the whole state space is explored.

  6. Thermal processing of a poorly water-soluble drug substance exhibiting a high melting point: the utility of KinetiSol® Dispersing.

    PubMed

    Hughey, Justin R; Keen, Justin M; Brough, Chris; Saeger, Sophie; McGinity, James W

    2011-10-31

    Poorly water-soluble drug substances that exhibit high melting points are often difficult to successfully process by fusion-based techniques. The purpose of this study was to identify a suitable polymer system for meloxicam (MLX), a high melting point class II BCS compound, and investigate thermal processing techniques for the preparation of chemically stable single phase solid dispersions. Thermal and solution based screening techniques were utilized to screen hydrophilic polymers suitable for immediate release formulations. Results of the screening studies demonstrated that Soluplus(®)(SOL) provided the highest degree of miscibility and solubility enhancement. A hot-melt extrusion feasibility study demonstrated that high temperatures and extended residence times were required in order to render compositions amorphous, causing significant degradation of MLX. A design of experiments (DOE) was conducted on the KinetiSol(®) Dispersing (KSD) process to evaluate the effect of processing conditions on the chemical stability and amorphous character of MLX. The study demonstrated that ejection temperature significantly impacted MLX stability. All samples prepared by KSD were substantially amorphous. Dissolution analysis of the KSD processed solid dispersions showed increased dissolution rates and extent of supersaturation over the marketed generic MLX tablets. Copyright © 2011 Elsevier B.V. All rights reserved.

  7. A Lidar Point Cloud Based Procedure for Vertical Canopy Structure Analysis And 3D Single Tree Modelling in Forest

    PubMed Central

    Wang, Yunsheng; Weinacker, Holger; Koch, Barbara

    2008-01-01

    A procedure for both vertical canopy structure analysis and 3D single tree modelling based on Lidar point cloud is presented in this paper. The whole area of research is segmented into small study cells by a raster net. For each cell, a normalized point cloud whose point heights represent the absolute heights of the ground objects is generated from the original Lidar raw point cloud. The main tree canopy layers and the height ranges of the layers are detected according to a statistical analysis of the height distribution probability of the normalized raw points. For the 3D modelling of individual trees, individual trees are detected and delineated not only from the top canopy layer but also from the sub canopy layer. The normalized points are resampled into a local voxel space. A series of horizontal 2D projection images at the different height levels are then generated respect to the voxel space. Tree crown regions are detected from the projection images. Individual trees are then extracted by means of a pre-order forest traversal process through all the tree crown regions at the different height levels. Finally, 3D tree crown models of the extracted individual trees are reconstructed. With further analyses on the 3D models of individual tree crowns, important parameters such as crown height range, crown volume and crown contours at the different height levels can be derived. PMID:27879916
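
    A hedged sketch of the layer-detection idea only (not the authors' implementation): smooth the height histogram of a normalized point cloud and take its local maxima as approximate canopy layer centres. The bin width, smoothing window, and density floor are assumptions.

```python
# Detect approximate canopy layer centres from normalized Lidar point heights.
import numpy as np

def detect_canopy_layers(heights: np.ndarray, bin_width: float = 1.0, smooth: int = 3):
    """Return approximate layer centre heights (m) from normalized point heights."""
    bins = np.arange(0.0, heights.max() + bin_width, bin_width)
    hist, edges = np.histogram(heights, bins=bins, density=True)
    kernel = np.ones(smooth) / smooth
    smoothed = np.convolve(hist, kernel, mode="same")
    centres = 0.5 * (edges[:-1] + edges[1:])
    # local maxima of the smoothed height-probability curve, above a small density floor
    return [centres[i] for i in range(1, len(smoothed) - 1)
            if smoothed[i] > smoothed[i - 1] and smoothed[i] >= smoothed[i + 1]
            and smoothed[i] > 0.01]

rng = np.random.default_rng(2)
# Toy study cell: a sub-canopy layer around 8 m and a top canopy layer around 22 m.
heights = np.concatenate([rng.normal(8.0, 1.5, 2000), rng.normal(22.0, 2.5, 3000)])
heights = heights[heights > 0]
print("detected layer centres (m):",
      [round(h, 1) for h in detect_canopy_layers(heights)])
```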

  8. Component Provider’s and Tool Developer’s Handbook. Central Archive for Reusable Defense Software (CARDS)

    DTIC Science & Technology

    1994-03-25

    metrics [DISA93b]. The Software Engineering Institute (SEI) has developed a domain analysis process (Feature-Oriented Domain Analysis - FODA) and is...and expresses the range of variability of these decisions. 3.2.2.3 Feature Oriented Domain Analysis Feature Oriented Domain Analysis (FODA) is a domain...documents created in this phase. From a purely profit-oriented business point of view, a company may develop its own analysis of a government or commercial

  9. Characterization of ceramic powders by an X-ray measuring method

    NASA Technical Reports Server (NTRS)

    Ziegler, B.

    1983-01-01

    X-ray line broadening analysis gives quantitative data on structural changes of ceramic powders after different processing steps. Various Al2O3 powders were investigated and the following points are discussed on the basis of these results: X-ray line broadening analysis, structural changes during grinding, structural changes during annealing, influence of structural properties on sintering behavior and application of line broadening analysis to quality control of powders.
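
    As a hedged illustration (the abstract does not state the exact evaluation procedure used), the Scherrer relation is one standard way to turn X-ray line broadening into a crystallite-size estimate; the numerical values below are made-up Al2O3-like inputs.

```python
# Scherrer estimate of crystallite size from X-ray line broadening.
import math

def scherrer_crystallite_size(fwhm_deg: float, two_theta_deg: float,
                              wavelength_nm: float = 0.15406, k: float = 0.9) -> float:
    """Crystallite size D = K * lambda / (beta * cos(theta)), in nm.

    fwhm_deg is the line broadening (FWHM, degrees 2-theta) after instrumental correction;
    the default wavelength is Cu K-alpha.
    """
    beta = math.radians(fwhm_deg)                 # broadening in radians
    theta = math.radians(two_theta_deg / 2.0)     # Bragg angle
    return k * wavelength_nm / (beta * math.cos(theta))

# Example: a ground Al2O3 powder line at 2-theta = 43.4 deg with 0.25 deg broadening.
print(f"estimated crystallite size: {scherrer_crystallite_size(0.25, 43.4):.1f} nm")
```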

  10. [Image processing applied in the analysis of motion features of cultured cardiac myocytes in the rat].

    PubMed

    Teng, Qizhi; He, Xiaohai; Luo, Daisheng; Wang, Zhengrong; Zhou, Beiyi; Yuan, Zhirun; Tao, Dachang

    2007-02-01

    The study of the mechanisms of drug action through quantitative analysis of cultured cardiac myocytes is one of the cutting-edge research areas in myocyte dynamics and molecular biology. The fact that cardiac myocytes beat spontaneously without external stimulation makes such research possible. Studying myocyte morphology and motion using image analysis can reveal the fundamental mechanisms of drug action, increase the accuracy of drug screening, and help design optimal drug formulations for the best treatment. A system of hardware and software has been built with a complete set of functions including living cardiac myocyte image acquisition, image processing, motion image analysis, and image recognition. In this paper, theories and approaches are introduced for analyzing living cardiac myocyte motion images and implementing quantitative analysis of cardiac myocyte features. A motion estimation algorithm is used to detect motion vectors of selected points and to measure the beating amplitude and frequency of a cardiac myocyte. The beating of cardiac myocytes is sometimes very small; in such cases, it is difficult to detect the motion vectors of the selected points in a time sequence of images. For this reason, image correlation is employed to detect the beating frequencies. An active contour algorithm based on an energy function is proposed to approximate the boundary and detect changes in the edge of the myocyte.
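
    A hedged sketch of the correlation idea only (not the authors' system): correlate each frame with a reference frame and read the beating frequency from the dominant spectral peak of the correlation time series. The synthetic test data and frame rate are assumptions.

```python
# Estimate the beating frequency of a myocyte video from frame-to-reference correlation.
import numpy as np

def beating_frequency(frames: np.ndarray, frame_rate_hz: float) -> float:
    """frames: array of shape (n_frames, h, w); returns the dominant frequency in Hz."""
    ref = frames[0].ravel().astype(float)
    series = np.array([np.corrcoef(ref, f.ravel().astype(float))[0, 1] for f in frames])
    series -= series.mean()                       # remove the DC component before the FFT
    spectrum = np.abs(np.fft.rfft(series))
    freqs = np.fft.rfftfreq(len(series), d=1.0 / frame_rate_hz)
    return float(freqs[1:][np.argmax(spectrum[1:])])   # skip the zero-frequency bin

# Synthetic check: frame content morphs between two random patterns at 1.5 Hz, 30 fps, 10 s.
rng = np.random.default_rng(3)
a, b = rng.random((32, 32)), rng.random((32, 32))
t = np.arange(300) / 30.0
w = 0.5 * (1.0 - np.cos(2 * np.pi * 1.5 * t))    # oscillates at 1.5 Hz, w(0) = 0
frames = np.array([(1 - wi) * a + wi * b for wi in w])
print(f"estimated beating frequency: {beating_frequency(frames, 30.0):.2f} Hz")
```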

  11. SCATHA (Spacecraft Charging AT High Altitudes) Plasma Interaction Experiment: SC-3 High Energy Particle Spectrometer; SC-8 Energetic Ion Composition Experiment.

    DTIC Science & Technology

    1984-11-30

    fluxes have been processed into a computer data base, ready for further analysis. This data base has been the starting point for several of the above...distance from the point of observation. One very common distribution consists of field-aligned ions at energies below several keV, with more energetic...

  12. Report on 3 and 4-point correlation statistics in the COBE DMR anisotropy maps

    NASA Technical Reports Server (NTRS)

    Hinshaw, Gary (Principal Investigator); Gorski, Krzystof M.; Banday, Anthony J.; Bennett, Charles L.

    1996-01-01

    As part of the work performed under NASA contract # NAS5-32648, we have computed the 3-point and 4-point correlation functions of the COBE-DMR 2-year and 4-year anisotropy maps. The motivation for this study was to search for evidence of non-Gaussian statistical fluctuations in the temperature maps: skewness or asymmetry in the case of the 3-point function, kurtosis in the case of the 4-point function. Such behavior would have very significant implications for our understanding of the processes of galaxy formation, because our current models of galaxy formation predict that non-Gaussian features should not be present in the DMR maps. The results of our work showed that the 3-point correlation function is consistent with zero and that the 4-point function is not a very sensitive probe of non-Gaussian behavior in the COBE-DMR data. Our computation and analysis of 3-point correlations in the 2-year DMR maps was published in the Astrophysical Journal Letters, volume 446, page L67, 1995. Our computation and analysis of 3-point correlations in the 4-year DMR maps will be published, together with some additional tests, in the June 10, 1996 issue of the Astrophysical Journal Letters. Copies of both of these papers are attached as an appendix to this report.

  13. Wavelets and molecular structure

    NASA Astrophysics Data System (ADS)

    Carson, Mike

    1996-08-01

    The wavelet method offers possibilities for display, editing, and topological comparison of proteins at a user-specified level of detail. Wavelets are a mathematical tool that first found application in signal processing. The multiresolution analysis of a signal via wavelets provides a hierarchical series of 'best' lower-resolution approximations. B-spline ribbons model the protein fold, with one control point per residue. Wavelet analysis sets limits on the information required to define the winding of the backbone through space, suggesting that a recognizable fold is generated from a number of points equal to one-quarter or less of the number of residues. Wavelets applied to surfaces and volumes show promise in structure-based drug design.
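
    A hedged sketch of the general idea (not the paper's B-spline ribbon code): one level of a Haar wavelet transform splits a signal, such as per-residue control-point coordinates, into a half-resolution approximation plus detail coefficients; repeating the split gives the hierarchy of 'best' lower-resolution approximations mentioned above.

```python
# Haar multiresolution decomposition of a 1D signal.
import numpy as np

def haar_step(signal: np.ndarray):
    """One Haar analysis step: returns (approximation, detail), each half the length."""
    signal = signal[: len(signal) // 2 * 2]              # drop an odd trailing sample
    even, odd = signal[0::2], signal[1::2]
    approx = (even + odd) / np.sqrt(2.0)
    detail = (even - odd) / np.sqrt(2.0)
    return approx, detail

def multiresolution(signal: np.ndarray, levels: int):
    """Return successive approximations of the signal at 1/2, 1/4, ... resolution."""
    approximations = []
    current = np.asarray(signal, dtype=float)
    for _ in range(levels):
        current, _ = haar_step(current)
        approximations.append(current)
    return approximations

# Toy "backbone coordinate" trace of 64 residues, viewed at 1/2 and 1/4 resolution.
x = np.cumsum(np.random.default_rng(4).normal(size=64))
for level, approx in enumerate(multiresolution(x, 2), start=1):
    print(f"level {level}: {len(approx)} coefficients")
```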

  14. Complex eigenvalue extraction in NASTRAN by the tridiagonal reduction (FEER) method

    NASA Technical Reports Server (NTRS)

    Newman, M.; Mann, F. I.

    1977-01-01

    An extension of the Tridiagonal Reduction (FEER) method to complex eigenvalue analysis in NASTRAN is described. As in the case of real eigenvalue analysis, the eigensolutions closest to a selected point in the eigenspectrum are extracted from a reduced, symmetric, tridiagonal eigenmatrix whose order is much lower than that of the full size problem. The reduction process is effected automatically, and thus avoids the arbitrary lumping of masses and other physical quantities at selected grid points. The statement of the algebraic eigenvalue problem admits mass, damping and stiffness matrices which are unrestricted in character, i.e., they may be real, complex, symmetric or unsymmetric, singular or non-singular.
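
    A hedged analogy rather than the FEER algorithm itself: the idea of extracting the eigensolutions closest to a selected point sigma in a complex eigenspectrum from a small reduced problem can be illustrated with a shift-invert Krylov solver; the test matrix below is an illustrative stand-in, not a NASTRAN model.

```python
# Shift-invert extraction of the eigenvalues nearest a selected point sigma.
import numpy as np
from scipy.sparse import identity, random as sparse_random
from scipy.sparse.linalg import eigs

rng = np.random.default_rng(5)
n = 300
# Modest unsymmetric complex test matrix (a stand-in for a damped structural eigenproblem).
A = (sparse_random(n, n, density=0.02, random_state=rng, format="csc")
     + 1j * sparse_random(n, n, density=0.02, random_state=rng, format="csc")
     + 2.0 * identity(n, format="csc"))

sigma = 2.0 + 0.1j                        # selected point in the eigenspectrum
vals, vecs = eigs(A, k=4, sigma=sigma)    # four eigenvalues nearest sigma (shift-invert mode)
print("eigenvalues nearest sigma:", np.round(vals, 3))
```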

  15. An objective isobaric/isentropic technique for upper air analysis

    NASA Technical Reports Server (NTRS)

    Mancuso, R. L.; Endlich, R. M.; Ehernberger, L. J.

    1981-01-01

    An objective meteorological analysis technique is presented whereby both horizontal and vertical upper air analyses are performed. The same process is used to interpolate grid-point values from the upper-air station data for grid points both on an isobaric surface and on a vertical cross-sectional plane. The nearby data surrounding each grid point are used in the interpolation by means of an anisotropic weighting scheme, which is described. The interpolation for a grid-point potential temperature is performed isobarically, whereas wind, mixing-ratio, and pressure height values are interpolated from data that lie on the isentropic surface that passes through the grid point. Two versions (A and B) of the technique are evaluated by qualitatively comparing computer analyses with subjective hand-drawn analyses. The objective products of version A generally correspond fairly well with the subjective analyses and with the station data, and depict the structure of the upper fronts, tropopauses, and jet streams fairly well. The version B objective products correspond more closely to the subjective analyses, and show the same strong gradients across the upper front with only minor smoothing.
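
    A hedged sketch of the interpolation idea only (the paper's exact weighting scheme is not given here): nearby station values are combined with distance weights, and anisotropy is introduced by stretching separations along one direction before weighting. The stretch factor, power, and station values are assumptions.

```python
# Anisotropic inverse-distance weighting of nearby station data to a grid point.
import numpy as np

def anisotropic_idw(grid_point, station_xy, station_values,
                    stretch: float = 2.0, power: float = 2.0) -> float:
    """Inverse-distance weighting with the x-separation stretched by `stretch`."""
    d = station_xy - np.asarray(grid_point, dtype=float)
    d[:, 0] *= stretch                               # penalize x-separation more strongly
    dist = np.sqrt(np.sum(d ** 2, axis=1)) + 1e-9    # avoid division by zero at a station
    weights = 1.0 / dist ** power
    return float(np.sum(weights * station_values) / np.sum(weights))

# Toy example: four stations (x, y in km) reporting potential temperature (K).
stations = np.array([[0.0, 0.0], [100.0, 20.0], [-80.0, 60.0], [30.0, -90.0]])
theta = np.array([300.0, 304.0, 298.0, 301.0])
print(f"interpolated value: {anisotropic_idw((10.0, 10.0), stations, theta):.2f} K")
```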

  16. Reflective Practice: The Scholarship of Teaching and Learning. The CEET Faculty Development Program on Teaching and Learning. Second Edition: 2009 College Portfolio. Volumes I-IV

    ERIC Educational Resources Information Center

    Scarborough, Jule Dee

    2009-01-01

    "2009 Portfolio: The Second Edition of the College of Engineering's Portfolio" presents the 2009 Faculty Development Program on Teaching & Learning (TL) new content, modified models, new process and procedures, especially the new Instructional Analysis and Design Process Map, new PowerPoint presentations, modified teaching and…

  17. Pattern analysis of community health center location in Surabaya using spatial Poisson point process

    NASA Astrophysics Data System (ADS)

    Kusumaningrum, Choriah Margareta; Iriawan, Nur; Winahju, Wiwiek Setya

    2017-11-01

    A community health center (puskesmas) is one of the health service facilities closest to the community, providing healthcare at the sub-district level as one of the government-mandated community health clinics located across Indonesia. The increasing number of puskesmas does not automatically fulfill the basic health service needs of each region. Ideally, a puskesmas should cover at most 30,000 people. The number of puskesmas in Surabaya indicates an unbalanced spread across the area. This research aims to analyze the spread of puskesmas in Surabaya using a spatial Poisson point process model in order to identify effective locations for Surabaya's puskesmas. The results show that the distribution pattern of puskesmas in Surabaya follows a non-homogeneous Poisson process and is approximated by a mixture Poisson model. Based on the model estimated using a Bayesian mixture model coupled with an MCMC process, some characteristics of each puskesmas have no significant influence as factors in deciding whether to add a health center at a given location. Some factors related to the sub-district areas have to be considered as covariates when deciding where to add puskesmas in Surabaya.
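
    A hedged illustrative sketch (not the paper's Bayesian mixture model): a non-homogeneous spatial Poisson process can be simulated by Lewis-Shedler thinning; the intensity surface below is an assumed stand-in for population-driven demand across a city.

```python
# Simulate a non-homogeneous spatial Poisson process by thinning.
import numpy as np

def simulate_nhpp(intensity, lam_max, window=(0.0, 1.0, 0.0, 1.0), rng=None):
    """Thinning: simulate points with intensity(x, y) <= lam_max inside a rectangle."""
    rng = rng or np.random.default_rng()
    x0, x1, y0, y1 = window
    area = (x1 - x0) * (y1 - y0)
    n = rng.poisson(lam_max * area)                          # candidate homogeneous points
    xs = rng.uniform(x0, x1, n)
    ys = rng.uniform(y0, y1, n)
    keep = rng.uniform(0.0, lam_max, n) < intensity(xs, ys)  # accept with prob lambda/lam_max
    return xs[keep], ys[keep]

# Assumed intensity: demand concentrated around a city centre at (0.5, 0.5).
intensity = lambda x, y: 200.0 * np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.05)
xs, ys = simulate_nhpp(intensity, lam_max=200.0, rng=np.random.default_rng(6))
print(f"{len(xs)} simulated facility locations")
```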

  18. Spatial Statistics for Tumor Cell Counting and Classification

    NASA Astrophysics Data System (ADS)

    Wirjadi, Oliver; Kim, Yoo-Jin; Breuel, Thomas

    To count and classify cells in histological sections is a standard task in histology. One example is the grading of meningiomas, benign tumors of the meninges, which requires assessing the fraction of proliferating cells in an image. As this process is very time consuming when performed manually, automation is required. To address such problems, we propose a novel application of Markov point process methods in computer vision, leading to algorithms for computing the locations of circular objects in images. In contrast to previous algorithms using such spatial statistics methods in image analysis, the present one is fully trainable. This is achieved by combining point process methods with statistical classifiers. Using simulated data, the method proposed in this paper is shown to be more accurate and more robust to noise than standard image processing methods. On the publicly available SIMCEP benchmark for cell image analysis algorithms, the cell-counting performance of the present method is significantly more accurate than previously published results, especially when cells form dense clusters. Furthermore, the proposed system performs as well as a state-of-the-art algorithm for the computer-aided histological grading of meningiomas when combined with a simple k-nearest neighbor classifier for identifying proliferating cells.

  19. Simultaneous reconstruction of multiple depth images without off-focus points in integral imaging using a graphics processing unit.

    PubMed

    Yi, Faliu; Lee, Jieun; Moon, Inkyu

    2014-05-01

    The reconstruction of multiple depth images with a ray back-propagation algorithm in three-dimensional (3D) computational integral imaging is computationally burdensome. Further, a reconstructed depth image consists of a focus and an off-focus area. Focus areas are 3D points on the surface of an object that are located at the reconstructed depth, while off-focus areas include 3D points in free-space that do not belong to any object surface in 3D space. Generally, without being removed, the presence of an off-focus area would adversely affect the high-level analysis of a 3D object, including its classification, recognition, and tracking. Here, we use a graphics processing unit (GPU) that supports parallel processing with multiple processors to simultaneously reconstruct multiple depth images using a lookup table containing the shifted values along the x and y directions for each elemental image in a given depth range. Moreover, each 3D point on a depth image can be measured by analyzing its statistical variance with its corresponding samples, which are captured by the two-dimensional (2D) elemental images. These statistical variances can be used to classify depth image pixels as either focus or off-focus points. At this stage, the measurement of focus and off-focus points in multiple depth images is also implemented in parallel on a GPU. Our proposed method is conducted based on the assumption that there is no occlusion of the 3D object during the capture stage of the integral imaging process. Experimental results have demonstrated that this method is capable of removing off-focus points in the reconstructed depth image. The results also showed that using a GPU to remove the off-focus points could greatly improve the overall computational speed compared with using a CPU.
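
    A hedged sketch of the variance test only (the GPU lookup-table reconstruction is not reproduced): intensity samples gathered from the elemental images agree for focus points on an object surface and disagree for off-focus points in free space, so a variance threshold separates them. The threshold and sample counts are assumptions.

```python
# Classify reconstructed 3D points as focus or off-focus from sample variance.
import numpy as np

def classify_focus(samples: np.ndarray, threshold: float = 50.0) -> np.ndarray:
    """samples: (n_points, n_elemental_images) intensities; True means a focus point."""
    return samples.var(axis=1) < threshold

rng = np.random.default_rng(7)
n_images = 25
# Focus points: consistent intensity across elemental images, plus small sensor noise.
focus = np.tile(rng.uniform(0, 255, (100, 1)), (1, n_images)) + rng.normal(0, 3, (100, n_images))
# Off-focus points: unrelated scene content sampled from each elemental image.
off_focus = rng.uniform(0, 255, (100, n_images))
samples = np.vstack([focus, off_focus])
labels = classify_focus(samples)
print(f"classified as focus: {int(labels.sum())} of {len(labels)} points")
```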

  20. A Bayesian cluster analysis method for single-molecule localization microscopy data.

    PubMed

    Griffié, Juliette; Shannon, Michael; Bromley, Claire L; Boelen, Lies; Burn, Garth L; Williamson, David J; Heard, Nicholas A; Cope, Andrew P; Owen, Dylan M; Rubin-Delanchy, Patrick

    2016-12-01

    Cell function is regulated by the spatiotemporal organization of the signaling machinery, and a key facet of this is molecular clustering. Here, we present a protocol for the analysis of clustering in data generated by 2D single-molecule localization microscopy (SMLM)-for example, photoactivated localization microscopy (PALM) or stochastic optical reconstruction microscopy (STORM). Three features of such data can cause standard cluster analysis approaches to be ineffective: (i) the data take the form of a list of points rather than a pixel array; (ii) there is a non-negligible unclustered background density of points that must be accounted for; and (iii) each localization has an associated uncertainty in regard to its position. These issues are overcome using a Bayesian, model-based approach. Many possible cluster configurations are proposed and scored against a generative model, which assumes Gaussian clusters overlaid on a completely spatially random (CSR) background, before every point is scrambled by its localization precision. We present the process of generating simulated and experimental data that are suitable to our algorithm, the analysis itself, and the extraction and interpretation of key cluster descriptors such as the number of clusters, cluster radii and the number of localizations per cluster. Variations in these descriptors can be interpreted as arising from changes in the organization of the cellular nanoarchitecture. The protocol requires no specific programming ability, and the processing time for one data set, typically containing 30 regions of interest, is ∼18 h; user input takes ∼1 h.
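
    A hedged sketch of the data-generation step described above (not the full Bayesian scoring): Gaussian clusters overlaid on a completely spatially random background, with every localization then scrambled by its localization precision. All parameter values below are illustrative assumptions.

```python
# Simulate one SMLM-like region of interest: Gaussian clusters on a CSR background.
import numpy as np

def simulate_smlm_roi(n_clusters=5, locs_per_cluster=50, cluster_radius=30.0,
                      background_density=5e-5, roi_size=3000.0, precision=20.0, rng=None):
    """Return an (N, 2) array of localizations (nm) for one region of interest."""
    rng = rng or np.random.default_rng()
    centres = rng.uniform(0.0, roi_size, size=(n_clusters, 2))
    clustered = np.vstack([c + rng.normal(0.0, cluster_radius, size=(locs_per_cluster, 2))
                           for c in centres])
    n_background = rng.poisson(background_density * roi_size ** 2)   # CSR background
    background = rng.uniform(0.0, roi_size, size=(n_background, 2))
    points = np.vstack([clustered, background])
    return points + rng.normal(0.0, precision, size=points.shape)    # localization error

roi = simulate_smlm_roi(rng=np.random.default_rng(8))
print(f"simulated ROI with {len(roi)} localizations")
```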

  1. A generic Transcriptomics Reporting Framework (TRF) for 'omics data processing and analysis.

    PubMed

    Gant, Timothy W; Sauer, Ursula G; Zhang, Shu-Dong; Chorley, Brian N; Hackermüller, Jörg; Perdichizzi, Stefania; Tollefsen, Knut E; van Ravenzwaay, Ben; Yauk, Carole; Tong, Weida; Poole, Alan

    2017-12-01

    A generic Transcriptomics Reporting Framework (TRF) is presented that lists parameters that should be reported in 'omics studies used in a regulatory context. The TRF encompasses the transcriptome profiling process from data generation to a processed list of differentially expressed genes (DEGs) ready for interpretation. Included within the TRF is a reference baseline analysis (RBA) that encompasses raw data selection; data normalisation; recognition of outliers; and statistical analysis. The TRF itself does not dictate the methodology for data processing, but deals with what should be reported. Its principles are also applicable to sequencing data and other 'omics. In contrast, the RBA specifies a simple data processing and analysis methodology that is designed to provide a comparison point for other approaches and is exemplified here by a case study. By providing transparency on the steps applied during 'omics data processing and analysis, the TRF will increase confidence in the processing of 'omics data and in their regulatory use. Applicability of the TRF is ensured by its simplicity and generality. The TRF can be applied to all types of regulatory 'omics studies, and it can be executed using different commonly available software tools. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
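
    A hedged sketch of a simple baseline analysis in the spirit of the RBA steps listed above (normalisation, a per-gene test, multiple-testing correction); it is not the published RBA specification, and the synthetic counts are assumptions.

```python
# Baseline differential-expression analysis: normalise, test, and BH-adjust.
import numpy as np
from scipy import stats

def baseline_deg_analysis(counts: np.ndarray, group: np.ndarray, alpha: float = 0.05):
    """counts: (genes, samples) raw counts; group: boolean array marking treated samples."""
    # library-size normalisation and log transform
    cpm = counts / counts.sum(axis=0, keepdims=True) * 1e6
    logcpm = np.log2(cpm + 1.0)
    # per-gene two-sample t-test between treated and control samples
    pvals = stats.ttest_ind(logcpm[:, group], logcpm[:, ~group], axis=1).pvalue
    # Benjamini-Hochberg adjustment
    order = np.argsort(pvals)
    ranked = pvals[order] * len(pvals) / (np.arange(len(pvals)) + 1)
    adjusted = np.minimum.accumulate(ranked[::-1])[::-1]
    qvals = np.empty_like(pvals)
    qvals[order] = np.clip(adjusted, 0.0, 1.0)
    return np.flatnonzero(qvals < alpha)          # indices of putative DEGs

rng = np.random.default_rng(9)
counts = rng.poisson(50, size=(1000, 8))
counts[:20, 4:] *= 3                              # spike in 20 "responsive" genes
degs = baseline_deg_analysis(counts, group=np.array([False] * 4 + [True] * 4))
print(f"{len(degs)} genes called differentially expressed")
```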

  2. 40 CFR 63.526 - Monitoring requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... vent. (D) Design analysis based on accepted chemical engineering principles, measurable process.... (i) For the purpose of determining de minimis status for emission points, engineering assessment may... operating conditions expected to yield the highest flow rate and concentration. Engineering assessment...

  3. 40 CFR 63.526 - Monitoring requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... vent. (D) Design analysis based on accepted chemical engineering principles, measurable process.... (i) For the purpose of determining de minimis status for emission points, engineering assessment may... operating conditions expected to yield the highest flow rate and concentration. Engineering assessment...

  4. 40 CFR 63.526 - Monitoring requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... vent. (D) Design analysis based on accepted chemical engineering principles, measurable process.... (i) For the purpose of determining de minimis status for emission points, engineering assessment may... operating conditions expected to yield the highest flow rate and concentration. Engineering assessment...

  5. Analytical Chemistry and the Microchip.

    ERIC Educational Resources Information Center

    Lowry, Robert K.

    1986-01-01

    Analytical techniques used at various points in making microchips are described. They include: Fourier transform infrared spectrometry (silicon purity); optical emission spectroscopy (quantitative thin-film composition); X-ray photoelectron spectroscopy (chemical changes in thin films); wet chemistry, instrumental analysis (process chemicals);…

  6. Salmonella contamination risk points in broiler carcasses during slaughter line processing.

    PubMed

    Rivera-Pérez, Walter; Barquero-Calvo, Elías; Zamora-Sanabria, Rebeca

    2014-12-01

    Salmonella is one of the foodborne pathogens most commonly associated with poultry products. The aim of this work was to identify and analyze key sampling points creating risk of Salmonella contamination in a chicken processing plant in Costa Rica and perform a salmonellosis risk analysis. Accordingly, the following examinations were performed: (i) qualitative testing (presence or absence of Salmonella), (ii) quantitative testing (Salmonella CFU counts), and (iii) salmonellosis risk analysis, assuming consumption of contaminated meat from the processing plant selected. Salmonella was isolated in 26% of the carcasses selected, indicating 60% positive in the flocks sampled. The highest Salmonella counts were observed after bleeding (6.1 log CFU per carcass), followed by a gradual decrease during the subsequent control steps. An increase in the percentage of contamination (10 to 40%) was observed during evisceration and spray washing (after evisceration), with Salmonella counts increasing from 3.9 to 5.1 log CFU per carcass. According to the prevalence of Salmonella-contaminated carcasses released to trade (20%), we estimated a risk of 272 cases of salmonellosis per year as a result of the consumption of contaminated chicken. Our study suggests that the processes of evisceration and spray washing represent a risk of Salmonella cross-contamination and/or recontamination in broilers during slaughter line processing.

  7. Applying value stream mapping techniques to eliminate non-value-added waste for the procurement of endovascular stents.

    PubMed

    Teichgräber, Ulf K; de Bucourt, Maximilian

    2012-01-01

    OBJECTIVES: To eliminate non-value-adding (NVA) waste for the procurement of endovascular stents in interventional radiology services by applying value stream mapping (VSM). The Lean manufacturing technique was used to analyze the process of material and information flow currently required to direct endovascular stents from external suppliers to patients. Based on a decision point analysis for the procurement of stents in the hospital, a present state VSM was drawn. After assessment of the current state VSM and progressive elimination of unnecessary NVA waste, a future state VSM was drawn. The current state VSM demonstrated that out of 13 processes for the procurement of stents only 2 processes were value-adding. Of the NVA processes, 5 were unnecessary NVA activities, which could be eliminated. The decision point analysis demonstrated that the procurement of stents was mainly a forecast-driven push system. The future state VSM applies a pull inventory control system to trigger the movement of a unit after withdrawal by using a consignment stock. VSM is a visualization tool for the supply chain and value stream, based on the Toyota Production System, and greatly assists in successfully implementing a Lean system. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  8. Pin routability and pin access analysis on standard cells for layout optimization

    NASA Astrophysics Data System (ADS)

    Chen, Jian; Wang, Jun; Zhu, ChengYu; Xu, Wei; Li, Shuai; Lin, Eason; Ou, Odie; Lai, Ya-Chieh; Qu, Shengrui

    2018-03-01

    At advanced process nodes, especially at sub-28nm technology, pin accessibility and routability of standard cells has become one of the most challenging design issues due to the limited routing tracks and the increased pin density. If this issue is not found and resolved during the cell design stage, the pin access problem becomes very difficult to fix in the implementation stage and leads to low routing efficiency. In this paper, we introduce a holistic approach for pin accessibility scoring and routability analysis. For accessibility, a systematic calculator assigns a score to each pin by searching the available access points and considering the surrounding routing layers, basic design rules and the allowed via geometry. Based on the score, "bad" pins can be found and modified. For pin routability analysis, critical pin points (points where placing a via would lead to failed via insertion) are identified, either to guide layout optimization or to be set as OBS to block via insertion. By using this pin routability and pin access analysis flow, we are able to improve the library quality and performance.

  9. Single quantum dot analysis enables multiplexed point mutation detection by gap ligase chain reaction.

    PubMed

    Song, Yunke; Zhang, Yi; Wang, Tza-Huei

    2013-04-08

    Gene point mutations present important biomarkers for genetic diseases. However, existing point mutation detection methods suffer from low sensitivity, low specificity, and tedious assay processes. In this report, an assay technology is proposed which combines the outstanding specificity of gap ligase chain reaction (Gap-LCR), the high sensitivity of single-molecule coincidence detection, and the superior optical properties of quantum dots (QDs) for multiplexed detection of point mutations in genomic DNA. Mutant-specific ligation products are generated by Gap-LCR and subsequently captured by QDs to form DNA-QD nanocomplexes that are detected by single-molecule spectroscopy (SMS) through multi-color fluorescence burst coincidence analysis, allowing for multiplexed mutation detection in a separation-free format. The proposed assay is capable of detecting zeptomoles of KRAS codon 12 mutation variants with near 100% specificity. Its high sensitivity allows direct detection of KRAS mutations in crude genomic DNA without PCR pre-amplification. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Alternative Methods for Estimating Plane Parameters Based on a Point Cloud

    NASA Astrophysics Data System (ADS)

    Stryczek, Roman

    2017-12-01

    Non-contact measurement techniques carried out using triangulation optical sensors are increasingly popular in measurements with industrial robots directly on production lines. The result of such measurements is often a cloud of measurement points characterized by considerable measurement noise, the presence of a number of points that differ from the reference model, and excessive errors that must be eliminated from the analysis. To obtain vector information from the points contained in the cloud that describe the reference model, the data obtained during a measurement should be subjected to appropriate processing operations. This paper presents an analysis of the suitability of methods known as RANdom SAmple Consensus (RANSAC), the Monte Carlo Method (MCM), and Particle Swarm Optimization (PSO) for the extraction of the reference model. The effectiveness of the tested methods is illustrated by examples of measuring the height of an object and the angle of a plane, based on experiments carried out under workshop conditions.
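
    A hedged sketch of a basic RANSAC plane fit (illustrative only; not the implementation evaluated in the paper): repeatedly fit a plane to three random points and keep the candidate with the most inliers within a distance tolerance. The iteration count, tolerance, and toy data are assumptions.

```python
# RANSAC extraction of a plane model from a noisy point cloud with outliers.
import numpy as np

def ransac_plane(points: np.ndarray, n_iter: int = 500, tol: float = 0.05, rng=None):
    """points: (N, 3). Returns ((unit normal, d) of the plane n.x + d = 0, inlier mask)."""
    rng = rng or np.random.default_rng()
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(n_iter):
        p1, p2, p3 = points[rng.choice(len(points), size=3, replace=False)]
        normal = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:                      # degenerate (collinear) sample
            continue
        normal /= norm
        d = -np.dot(normal, p1)
        inliers = np.abs(points @ normal + d) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane, best_inliers

rng = np.random.default_rng(10)
plane_pts = np.column_stack([rng.uniform(-1, 1, 300), rng.uniform(-1, 1, 300),
                             rng.normal(0.0, 0.01, 300)])   # noisy z ~ 0 plane
outliers = rng.uniform(-1, 1, (60, 3))                       # gross errors
plane, inliers = ransac_plane(np.vstack([plane_pts, outliers]), rng=rng)
print("normal:", np.round(plane[0], 3), "inliers:", int(inliers.sum()))
```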

  11. [Recovering helpers in the addiction treatment system in Hungary: an interpretative phenomenological analysis].

    PubMed

    Kassai, Szilvia; Pintér, Judit Nóra; Rácz, József

    2015-01-01

    The work of recovering helpers in addiction rehabilitation centres was studied. The aim was to investigate the process of addicts becoming recovering helpers, and to study what peer help means to them. Following an interpretative phenomenological analysis (IPA) design, subjects were selected and data were collected and analysed. Six participants (5 males, 1 female) who had worked as recovering helpers for at least one year at addiction rehabilitation centres took part. Semi-structured life interviews were carried out and analysed according to IPA. Emerging themes from the interviews were identified, summarized and then interpreted as central themes: important periods and turning points of the life story interviews: the experience of psychoactive drug use and the development of the addiction (which became "Turning Point No 1"), then the "rock bottom" experience ("Turning Point No 2"). Then the experience of the helping process was examined; here four major themes were identified: the development of the recovering self and the helping self, the wounded helper and the skilled helper, and the experience of the helping process. IPA was found to be a useful method for idiographic exploration of the development and the work of recovering helpers. The work of the recovering helpers can be described as mentoring of addict clients. Our findings might be used in training programs for recovering helpers as well as in shaping their professional role in addiction services.

  12. Composite laminate failure parameter optimization through four-point flexure experimentation and analysis

    DOE PAGES

    Nelson, Stacy; English, Shawn; Briggs, Timothy

    2016-05-06

    Fiber-reinforced composite materials offer light-weight solutions to many structural challenges. In the development of high-performance composite structures, a thorough understanding is required of the composite materials themselves as well as methods for the analysis and failure prediction of the relevant composite structures. However, the mechanical properties required for the complete constitutive definition of a composite material can be difficult to determine through experimentation. Therefore, efficient methods are necessary that can be used to determine which properties are relevant to the analysis of a specific structure and to establish a structure's response to a material parameter that can only be defined through estimation. The objectives of this paper deal with demonstrating the potential value of sensitivity and uncertainty quantification techniques during the failure analysis of loaded composite structures, and the proposed methods are applied to the simulation of the four-point flexural characterization of a carbon fiber composite material. Utilizing a recently implemented, phenomenological orthotropic material model that is capable of predicting progressive composite damage and failure, a sensitivity analysis is completed to establish which material parameters are truly relevant to a simulation's outcome. Then, a parameter study is completed to determine the effect of the relevant material properties' expected variations on the simulated four-point flexural behavior as well as to determine the value of an unknown material property. This process demonstrates the ability to formulate accurate predictions in the absence of a rigorous material characterization effort. Finally, the presented results indicate that a sensitivity analysis and parameter study can be used to streamline the material definition process as the described flexural characterization was used for model validation.

  13. A two-phase Poisson process model and its application to analysis of cancer mortality among A-bomb survivors.

    PubMed

    Ohtaki, Megu; Tonda, Tetsuji; Aihara, Kazuyuki

    2015-10-01

    We consider a two-phase Poisson process model where only early successive transitions are assumed to be sensitive to exposure. In the case where the transition intensities are low, we derive analytically an approximate formula for the distribution of time to event for the excess hazard ratio (EHR) due to a single point exposure. The formula for the EHR is a polynomial in exposure dose. Since the formula for the EHR contains no unknown parameters except for the total number of stages, the number of exposure-sensitive stages, and a coefficient of exposure effect, it is easily applicable under a variety of situations where there exists a possible latency time from a single point exposure to the occurrence of an event. Based on the multistage hypothesis of cancer, we formulate a radiation carcinogenesis model in which only some early consecutive stages of the process are sensitive to exposure, whereas later stages are not affected. An illustrative analysis using the proposed model is given for cancer mortality among A-bomb survivors. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Analyzing linear spatial features in ecology.

    PubMed

    Buettel, Jessie C; Cole, Andrew; Dickey, John M; Brook, Barry W

    2018-06-01

    The spatial analysis of dimensionless points (e.g., tree locations on a plot map) is common in ecology, for instance using point-process statistics to detect and compare patterns. However, the treatment of one-dimensional linear features (fiber processes) is rarely attempted. Here we appropriate the methods of vector sums and dot products, used regularly in fields like astrophysics, to analyze a data set of mapped linear features (logs) measured in 12 × 1-ha forest plots. For this demonstrative case study, we ask two deceptively simple questions: do trees tend to fall downhill, and if so, does slope gradient matter? Despite noisy data and many potential confounders, we show clearly that the topography (slope direction and steepness) of forest plots does matter to treefall. More generally, these results underscore the value of the mathematical methods of physics for problems in the spatial analysis of linear features, and the opportunities that interdisciplinary collaboration provides. This work provides scope for a variety of future ecological analyses of fiber processes in space. © 2018 by the Ecological Society of America.
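
    A hedged sketch of the vector approach (not the authors' analysis script): treat each log as a unit vector from base to tip; its dot product with the downslope unit vector measures how "downhill" the fall was, and the plot-level mean can be compared against the value of zero expected under no directional preference. The toy azimuths below are assumptions.

```python
# Mean dot product between log fall directions and the plot's downslope direction.
import numpy as np

def downhill_alignment(log_azimuths_deg: np.ndarray, downslope_azimuth_deg: float) -> float:
    """Mean cosine of the angle between log fall directions and the downslope direction."""
    diff = np.radians(log_azimuths_deg - downslope_azimuth_deg)
    return float(np.mean(np.cos(diff)))          # cosine = dot product of unit vectors

# Toy plot: downslope at azimuth 120 degrees, logs scattered loosely around that direction.
rng = np.random.default_rng(11)
azimuths = (120.0 + rng.normal(0.0, 50.0, size=40)) % 360.0
score = downhill_alignment(azimuths, 120.0)
print(f"mean alignment with downslope: {score:.2f} (0 = no preference, 1 = all downhill)")
```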

  15. Analysis of Student Satisfaction in The Process of Teaching and Learning Using Importance Performance Analysis

    NASA Astrophysics Data System (ADS)

    Sembiring, P.; Sembiring, S.; Tarigan, G.; Sembiring, OD

    2017-12-01

    This study aims to determine the level of student satisfaction with the learning process at the University of Sumatera Utara, Indonesia. The study sample consisted of 1204 students. Students' responses were measured through questionnaires adapted to a 5-point Likert scale and through direct interviews with the respondents. The SERVQUAL method was used to measure the quality of service along five dimensions of service characteristics, namely physical evidence, reliability, responsiveness, assurance and concern. The results of the Importance Performance Analysis reveal that six service attributes must be improved by the policy makers of the University of Sumatera Utara. The quality of service is still considered low by the students.

  16. Developing an intelligence analysis process through social network analysis

    NASA Astrophysics Data System (ADS)

    Waskiewicz, Todd; LaMonica, Peter

    2008-04-01

    Intelligence analysts are tasked with making sense of enormous amounts of data and gaining an awareness of a situation that can be acted upon. This process can be extremely difficult and time consuming. Trying to differentiate between important pieces of information and extraneous data only complicates the problem. When dealing with data containing entities and relationships, social network analysis (SNA) techniques can be employed to make this job easier. Applying network measures to social network graphs can identify the most significant nodes (entities) and edges (relationships) and help the analyst further focus on key areas of concern. Strange developed a model that identifies high value targets such as centers of gravity and critical vulnerabilities. SNA lends itself to the discovery of these high value targets and the Air Force Research Laboratory (AFRL) has investigated several network measures such as centrality, betweenness, and grouping to identify centers of gravity and critical vulnerabilities. Using these network measures, a process for the intelligence analyst has been developed to aid analysts in identifying points of tactical emphasis. Organizational Risk Analyzer (ORA) and Terrorist Modus Operandi Discovery System (TMODS) are the two applications used to compute the network measures and identify the points to be acted upon. Therefore, the result of leveraging social network analysis techniques and applications will provide the analyst and the intelligence community with more focused and concentrated analysis results allowing them to more easily exploit key attributes of a network, thus saving time, money, and manpower.
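
    A hedged, illustrative sketch of the centrality and betweenness measures mentioned above, computed with networkx on a toy entity-relationship graph; ORA and TMODS themselves are not reproduced, and the node names are hypothetical.

```python
# Rank entities in a toy relationship graph by betweenness and degree centrality.
import networkx as nx

# Toy entity-relationship network (names are hypothetical).
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
         ("D", "E"), ("E", "F"), ("D", "G"), ("G", "H")]
g = nx.Graph(edges)

degree = nx.degree_centrality(g)
betweenness = nx.betweenness_centrality(g)

# High betweenness suggests brokers / candidate critical vulnerabilities.
for node, score in sorted(betweenness.items(), key=lambda kv: kv[1], reverse=True)[:3]:
    print(f"{node}: betweenness={score:.2f}, degree={degree[node]:.2f}")
```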

  17. Experimental research and numerical optimisation of multi-point sheet metal forming implementation using a solid elastic cushion system

    NASA Astrophysics Data System (ADS)

    Tolipov, A. A.; Elghawail, A.; Shushing, S.; Pham, D.; Essa, K.

    2017-09-01

    There is a growing demand for flexible manufacturing techniques that meet the rapid changes in customer needs. A finite element analysis numerical optimisation technique was used to optimise the multi-point sheet forming process. Multi-point forming (MPF) is a flexible sheet metal forming technique in which the same tool can be readily reconfigured to produce different parts. The process suffers from geometrical defects such as wrinkling and dimpling, which have been found to be the cause of the major surface quality problems. This study investigated the influence of parameters such as the elastic cushion hardness, blank holder force, coefficient of friction, cushion thickness and radius of curvature on the quality of parts formed in a flexible multi-point stamping die. To this end, a multi-point stamping process using a blank holder was carried out in order to study the effects on wrinkling, dimpling, thickness variation and forming force, with the aim of determining the optimum values of these parameters. Finite element modelling (FEM) was employed to simulate the multi-point forming of hemispherical shapes. Using the response surface method, the effects of the process parameters on wrinkling, maximum deviation from the target shape and thickness variation were investigated. The results show that the best part quality was obtained with an elastic cushion of suitable thickness and a polyurethane hardness of Shore A90. It was also found that the application of lubrication can improve the shape accuracy of the formed workpiece. These results were compared with the numerical simulation results for multi-point forming of hemispherical shapes using a blank holder, and it was found that a suitable cushion hardness helps to reduce wrinkling and the maximum deviation from the target shape.

  18. Noise characteristics of the Skylab S-193 altimeter altitude measurements

    NASA Technical Reports Server (NTRS)

    Hatch, W. E.

    1975-01-01

    The statistical characteristics of the SKYLAB S-193 altimeter altitude noise are considered. These results are reported in a concise format for use and analysis by the scientific community. In most instances the results have been grouped according to satellite pointing so that the effects of pointing on the statistical characteristics can be readily seen. The altimeter measurements and the processing techniques are described. The mathematical descriptions of the computer programs used for these results are included.

  19. Evaluation of Time Spent by Pharmacists and Nurses Based on the Location of Pharmacist Involvement in Medication History Collection.

    PubMed

    Chhabra, Anmol; Quinn, Andrea; Ries, Amanda

    2018-01-01

    Accurate history collection is integral to medication reconciliation. Studies support pharmacy involvement in the process, but assessment of global time spent is limited. The authors hypothesized the location of a medication-focused interview would impact time spent. The objective was to compare time spent by pharmacists and nurses based on the location of a medication-focused interview. Time spent by the interviewing pharmacist, admitting nurse, and centralized pharmacist verifying admission orders was collected. Patient groups were based on whether the interview was conducted in the emergency department (ED) or medical floor. The primary end point was a composite of the 3 time points. Secondary end points were individual time components and number and types of transcription discrepancies identified during medical floor interviews. Pharmacists and nurses spent an average of ten fewer minutes per ED patient versus a medical floor patient ( P = .028). Secondary end points were not statistically significant. Transcription discrepancies were identified at a rate of 1 in 4 medications. Post hoc analysis revealed the time spent by pharmacists and nurses was 2.4 minutes shorter per medication when interviewed in the ED ( P < .001). The primary outcome was statistically and clinically significant. Limitations included inability to blind and lack of cost-saving analysis. Pharmacist involvement in ED medication reconciliation leads to time savings during the admission process.

  20. Image monitoring of pharmaceutical blending processes and the determination of an end point by using a portable near-infrared imaging device based on a polychromator-type near-infrared spectrometer with a high-speed and high-resolution photo diode array detector.

    PubMed

    Murayama, Kodai; Ishikawa, Daitaro; Genkawa, Takuma; Sugino, Hiroyuki; Komiyama, Makoto; Ozaki, Yukihiro

    2015-03-03

    In the present study we have developed a new version (ND-NIRs) of a polychromator-type near-infrared (NIR) spectrometer with a high-resolution photo diode array detector, building on an earlier instrument (D-NIRs). The new version has four 5 W halogen lamps compared with three in the older version, and a condenser lens with a shorter focal length. The increase in the number of lamps and the shorter focal length of the condenser lens enable a high signal-to-noise ratio and high-speed NIR imaging measurements. Using the ND-NIRs, we carried out in-line monitoring of pharmaceutical blending and determined the end point of the blending process. Moreover, to determine a more accurate end point, an NIR image of the blending sample was acquired by means of a portable NIR imaging device based on the ND-NIRs. The imaging results demonstrate that a mixing time of 8 min is sufficient for homogeneous mixing. The present study thus demonstrates that the ND-NIRs and the imaging system based on it hold considerable promise for process analysis.

  1. Monte Carlo point process estimation of electromyographic envelopes from motor cortical spikes for brain-machine interfaces

    NASA Astrophysics Data System (ADS)

    Liao, Yuxi; She, Xiwei; Wang, Yiwen; Zhang, Shaomin; Zhang, Qiaosheng; Zheng, Xiaoxiang; Principe, Jose C.

    2015-12-01

    Objective. Representation of movement in the motor cortex (M1) has been widely studied in brain-machine interfaces (BMIs). The electromyogram (EMG) has greater bandwidth than the conventional kinematic variables (such as position, velocity), and is functionally related to the discharge of cortical neurons. As the stochastic information of EMG is derived from the explicit spike time structure, point process (PP) methods will be a good solution for decoding EMG directly from neural spike trains. Previous studies usually assume linear or exponential tuning curves between neural firing and EMG, which may not be true. Approach. In our analysis, we estimate the tuning curves in a data-driven way and find both the traditional functional-excitatory and functional-inhibitory neurons, which are widely found across a rat’s motor cortex. To accurately decode EMG envelopes from M1 neural spike trains, the Monte Carlo point process (MCPP) method is implemented based on such nonlinear tuning properties. Main results. Better reconstruction of EMG signals is shown on baseline and extreme high peaks, as our method can better preserve the nonlinearity of the neural tuning during decoding. The MCPP improves the prediction accuracy (the normalized mean squared error) 57% and 66% on average compared with the adaptive point process filter using linear and exponential tuning curves respectively, for all 112 data segments across six rats. Compared to a Wiener filter using spike rates with an optimal window size of 50 ms, MCPP decoding EMG from a point process improves the normalized mean square error (NMSE) by 59% on average. Significance. These results suggest that neural tuning is constantly changing during task execution and therefore, the use of spike timing methodologies and estimation of appropriate tuning curves needs to be undertaken for better EMG decoding in motor BMIs.
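
    The data-driven tuning-curve idea can be illustrated, in much simplified form, by binning an EMG envelope and computing a neuron's mean firing rate per bin; the simulated spike train and sigmoid tuning below are invented, and this is not the MCPP decoder itself:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated data: EMG envelope and spike counts of one neuron in 10 ms bins.
    n_bins = 5000
    emg = np.abs(np.sin(np.linspace(0, 40 * np.pi, n_bins))) + 0.1 * rng.random(n_bins)
    rate_true = 2.0 + 8.0 / (1.0 + np.exp(-6.0 * (emg - 0.6)))   # nonlinear (sigmoid) tuning
    spikes = rng.poisson(rate_true * 0.01)                        # counts per 10 ms bin

    # Nonparametric tuning-curve estimate: mean firing rate per EMG-amplitude bin.
    edges = np.quantile(emg, np.linspace(0, 1, 11))
    which = np.clip(np.digitize(emg, edges) - 1, 0, 9)
    tuning = np.array([spikes[which == k].mean() / 0.01 for k in range(10)])  # spikes/s

    for k in range(10):
        print(f"EMG bin {k}: mean rate {tuning[k]:.1f} Hz")
    ```

    An estimated curve of this kind (rather than an assumed linear or exponential form) is what a point-process decoder would then plug into its observation model.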

  2. Food Processing Control

    NASA Technical Reports Server (NTRS)

    1997-01-01

    When NASA started planning for manned space travel in 1959, the myriad challenges of sustaining life in space included a seemingly mundane but vitally important problem: How and what do you feed an astronaut? There were two main concerns: preventing food crumbs from contaminating the spacecraft's atmosphere or floating into sensitive instruments, and ensuring complete freedom from potentially catastrophic disease-producing bacteria, viruses, and toxins. To solve these concerns, NASA enlisted the help of the Pillsbury Company. Pillsbury quickly solved the first problem by coating bite-size foods to prevent crumbling. They developed the hazard analysis and critical control point (HACCP) concept to ensure against bacterial contamination. Hazard analysis is a systematic study of a product, its ingredients, processing conditions, handling, storage, packing, distribution, and directions for consumer use to identify sensitive areas that might prove hazardous. Hazard analysis provides a basis for blueprinting the Critical Control Points (CCPs) to be monitored. CCPs are points in the chain from raw materials to the finished product where loss of control could result in unacceptable food safety risks. In early 1970, Pillsbury plants were following HACCP in production of food for Earthbound consumers. Pillsbury's subsequent training courses for Food and Drug Administration (FDA) personnel led to the incorporation of HACCP in the FDA's Low Acid Canned Foods Regulations, set down in the mid-1970s to ensure the safety of all canned food products in the U.S.

  3. Dynamic performance of maximum power point tracking circuits using sinusoidal extremum seeking control for photovoltaic generation

    NASA Astrophysics Data System (ADS)

    Leyva, R.; Artillan, P.; Cabal, C.; Estibals, B.; Alonso, C.

    2011-04-01

    The article studies the dynamic performance of a family of maximum power point tracking circuits used for photovoltaic generation. It revisits the sinusoidal extremum seeking control (ESC) technique which can be considered as a particular subgroup of the Perturb and Observe algorithms. The sinusoidal ESC technique consists of adding a small sinusoidal disturbance to the input and processing the perturbed output to drive the operating point at its maximum. The output processing involves a synchronous multiplication and a filtering stage. The filter instance determines the dynamic performance of the MPPT based on sinusoidal ESC principle. The approach uses the well-known root-locus method to give insight about damping degree and settlement time of maximum-seeking waveforms. This article shows the transient waveforms in three different filter instances to illustrate the approach. Finally, an experimental prototype corroborates the dynamic analysis.
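
    A minimal discrete-time sketch of the sinusoidal ESC loop described above, using a toy quadratic P-V curve and arbitrary gains rather than a real converter model:

    ```python
    import numpy as np

    def pv_power(v):
        """Toy photovoltaic P-V curve with a maximum at v = 17.0 (arbitrary units)."""
        return 100.0 - (v - 17.0) ** 2

    dt = 1e-3                 # time step (s)
    f_dither = 50.0           # dither frequency (Hz)
    a = 0.2                   # dither amplitude
    k_int = 10.0              # integrator gain
    v_hat = 12.0              # initial operating-point estimate
    grad_lp = 0.0             # low-pass filtered gradient estimate
    p_avg = pv_power(v_hat)   # slow average of power (acts as the high-pass reference)

    for n in range(40000):
        t = n * dt
        dither = np.sin(2 * np.pi * f_dither * t)
        p = pv_power(v_hat + a * dither)        # perturbed measurement
        p_avg += 0.01 * (p - p_avg)             # slow average -> high-pass = p - p_avg
        demod = (p - p_avg) * dither            # synchronous demodulation
        grad_lp += 0.01 * (demod - grad_lp)     # low-pass filter (gradient estimate)
        v_hat += dt * k_int * grad_lp           # integrate toward the maximum

    print(f"Estimated maximum power point: v = {v_hat:.2f} (true optimum 17.0)")
    ```

    The two low-pass coefficients and the integrator gain play the role of the filter instance discussed in the article: changing them changes the damping and settling time of the maximum-seeking transient.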

  4. 40 CFR 63.526 - Monitoring requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... (D) Design analysis based on accepted chemical engineering principles, measurable process parameters... purpose of determining de minimis status for emission points, engineering assessment may be used to... expected to yield the highest flow rate and concentration. Engineering assessment includes, but is not...

  5. 40 CFR 63.526 - Monitoring requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... (D) Design analysis based on accepted chemical engineering principles, measurable process parameters... purpose of determining de minimis status for emission points, engineering assessment may be used to... expected to yield the highest flow rate and concentration. Engineering assessment includes, but is not...

  6. Reactive nitrogen oxides in the southeast United States national parks: source identification, origin, and process budget

    NASA Astrophysics Data System (ADS)

    Tong, Daniel Quansong; Kang, Daiwen; Aneja, Viney P.; Ray, John D.

    2005-01-01

    We present in this study both measurement-based and modeling analyses for elucidation of source attribution, influence areas, and process budget of reactive nitrogen oxides at two rural southeast United States sites (Great Smoky Mountains national park (GRSM) and Mammoth Cave national park (MACA)). Availability of nitrogen oxides is considered as the limiting factor to ozone production in these areas and the relative source contribution of reactive nitrogen oxides from point or mobile sources is important in understanding why these areas have high ozone. Using two independent observation-based techniques, multiple linear regression analysis and emission inventory analysis, we demonstrate that point sources contribute a minimum of 23% of total NOy at GRSM and 27% at MACA. The influence areas for these two sites, or origins of nitrogen oxides, are investigated using trajectory-cluster analysis. The result shows that air masses from the West and Southwest sweep over GRSM most frequently, while pollutants transported from the eastern half (i.e., East, Northeast, and Southeast) have limited influence (<10% out of all air masses) on air quality at GRSM. The processes responsible for formation and removal of reactive nitrogen oxides are investigated using a comprehensive 3-D air quality model (Multiscale Air Quality SImulation Platform (MAQSIP)). The NOy contribution associated with chemical transformations to NOz and O3, based on process budget analysis, is as follows: 32% and 84% for NOz, and 26% and 80% for O3 at GRSM and MACA, respectively. The similarity between NOz and O3 process budgets suggests a close association between nitrogen oxides and effective O3 production at these rural locations.

  7. Life cycle assessment as a tool for the environmental improvement of the tannery industry in developing countries.

    PubMed

    Rivela, B; Moreira, M T; Bornhardt, C; Méndez, R; Feijoo, G

    2004-03-15

    A representative leather tannery industry in a Latin American developing country has been studied from an environmental point of view, including both technical and economic analysis. Life Cycle Analysis (LCA) methodology has been used for the quantification and evaluation of the impacts of the chromium tanning process as a basis to propose further improvement actions. Four main subsystems were considered: beamhouse, tanyard, retanning, and wood furnace. Damages to human health, ecosystem quality, and resources are mainly produced by the tanyard subsystem. The control and reduction of chromium and ammonia emissions are the critical points to be considered to improve the environmental performance of the process. Technologies available for improved management of chromium tanning were studied in depth, and improvement actions related to optimized operational conditions and a high-exhaustion chrome-tanning process were selected. These actions, related to the implementation of internal procedures, affected the economics of the process, with savings ranging from US$8.63 to US$22.5 for the processing of 1 ton of wet-salted hides, while the global environmental impact was reduced to 44-50%. Moreover, the treatment of wastewaters was considered in two scenarios. Primary treatment presented the largest reduction of the environmental impact of the tanning process, while no significant improvement for the evaluated impact categories was achieved when combining primary and secondary treatments.

  8. Interdisciplinary evaluation of dysphagia: clinical swallowing evaluation and videoendoscopy of swallowing.

    PubMed

    Sordi, Marina de; Mourão, Lucia Figueiredo; Silva, Ariovaldo Armando da; Flosi, Luciana Claudia Leite

    2009-01-01

    Patients with dysphagia have impairments in many aspects, and an interdisciplinary approach is fundamental to define diagnosis and treatment. A joint approach combining clinical and videoendoscopic evaluation is paramount. The aim was to study the correlation between the clinical assessment (ACD) and the videoendoscopic (VED) assessment of swallowing by classifying the degree of severity and by qualitative/descriptive analyses of the procedures. The study was cross-sectional, descriptive and comparative, held from March to December of 2006 at the Otolaryngology/Dysphagia ward of a hospital in the countryside of São Paulo. Thirty dysphagic patients with different disorders were assessed by ACD and VED. The data were classified by means of severity scales and qualitative/descriptive analysis. The correlation between the ACD and VED severity scales pointed to a statistically significant low agreement (Kappa = 0.4) (p = 0.006). The correlation between the qualitative/descriptive analyses pointed to an excellent and statistically significant agreement (Kappa = 0.962) (p < 0.001) for the entire sample. The low agreement between the severity scales points to the need to perform both procedures, reinforcing VED as a feasible procedure. The descriptive qualitative analysis pointed to an excellent agreement, and these data reinforce the need to understand swallowing as a process.

  9. Influence of processing in mercury and selenium vapor on the electrical properties of CdxHg1-xSe, ZnxHg1-xSe solid solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gavaleshko, N.P.; Khomyak, V.V.; Makogonenko, V.N.

    1985-12-01

    In order to determine the predominant intrinsic point defects in CdxHg1-xSe and ZnxHg1-xSe solid solutions, the authors study the influence of annealing in mercury and selenium vapor on the carrier concentration and mobility. When the specimens are annealed in selenium vapor the electron concentration at first increases and then becomes constant. A theoretical analysis of the results obtained indicates that selenium vacancies are the predominant point defects in the solutions, and that the process of defect formation itself is quasiepitaxial.

  10. Melting processes of oligomeric α and β isotactic polypropylene crystals at ultrafast heating rates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ji, Xiaojing; He, Xuehao, E-mail: xhhe@tju.edu.cn; Jiang, Shichun, E-mail: scjiang@tju.edu.cn

    The melting behaviors of α (stable) and β (metastable) isotactic polypropylene (iPP) crystals at ultrafast heating rates are simulated with an atomistic molecular dynamics method. Quantitative information about the melting processes of α- and β-iPP crystals at the atomistic level is achieved. The result shows that the melting process starts from the interfaces of the lamellar crystal through random dislocation of iPP chains along the perpendicular direction of the lamellar crystal structure. In the melting process, the lamellar crystal gradually expands but the corresponding thickness decreases. The analysis shows that the system expansion lags behind the decrease in crystallinity, and the lagging extents for α- and β-iPP are significantly different. The apparent melting points of α- and β-iPP crystals rise with the increase of the heating rate and lamellar crystal thickness. The apparent melting point of the α-iPP crystal is always higher than that of β-iPP at different heating rates. Applying the Gibbs-Thomson rule and the scaling property of the melting kinetics, the equilibrium melting points of perfect α- and β-iPP crystals are finally predicted, showing good agreement with experimental results.

  11. Theoretical study of the accuracy of the pulse method, frontal analysis, and frontal analysis by characteristic points for the determination of single component adsorption isotherms.

    PubMed

    Andrzejewska, Anna; Kaczmarski, Krzysztof; Guiochon, Georges

    2009-02-13

    The adsorption isotherms of selected compounds are our main source of information on the mechanisms of adsorption processes. Thus, the selection of the methods used to determine adsorption isotherm data and to evaluate the errors made is critical. Three chromatographic methods were evaluated, frontal analysis (FA), frontal analysis by characteristic point (FACP), and the pulse or perturbation method (PM), and their accuracies were compared. Using the equilibrium-dispersive (ED) model of chromatography, breakthrough curves of single components were generated corresponding to three different adsorption isotherm models: the Langmuir, the bi-Langmuir, and the Moreau isotherms. For each breakthrough curve, the best conventional procedures of each method (FA, FACP, PM) were used to calculate the corresponding data point, using typical values of the parameters of each isotherm model, for four different values of the column efficiency (N=500, 1000, 2000, and 10,000). Then, the data points were fitted to each isotherm model and the corresponding isotherm parameters were compared to those of the initial isotherm model. When isotherm data are derived with a chromatographic method, they may suffer from two types of errors: (1) the errors made in deriving the experimental data points from the chromatographic records; (2) the errors made in selecting an incorrect isotherm model and fitting to it the experimental data. Both errors decrease significantly with increasing column efficiency with FA and FACP, but not with PM.
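
    For concreteness, fitting one of the isotherm models named above (the Langmuir model) to a set of data points can be sketched as follows; the synthetic data and initial guesses are placeholders:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(c, qs, b):
        """Single-component Langmuir isotherm: q = qs * b * c / (1 + b * c)."""
        return qs * b * c / (1.0 + b * c)

    # Synthetic adsorption data (concentration and loading in arbitrary units).
    c = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
    q = langmuir(c, qs=30.0, b=0.25) + np.random.default_rng(1).normal(0, 0.3, c.size)

    (qs_fit, b_fit), cov = curve_fit(langmuir, c, q, p0=(10.0, 0.1))
    perr = np.sqrt(np.diag(cov))
    print(f"qs = {qs_fit:.2f} ± {perr[0]:.2f}, b = {b_fit:.3f} ± {perr[1]:.3f}")
    ```

    In the study summarized above, such fits are repeated for data points generated by FA, FACP or PM at different column efficiencies, and the recovered parameters are compared with those of the generating model.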

  12. Construction of Gallium Point at NMIJ

    NASA Astrophysics Data System (ADS)

    Widiatmo, J. V.; Saito, I.; Yamazawa, K.

    2017-03-01

    Two open-type gallium point cells were fabricated using ingots whose nominal purities are 7N. Measurement systems for the realization of the melting point of gallium using these cells were built. The melting point of gallium is repeatedly realized by means of the measurement systems for evaluating the repeatability. Measurements for evaluating the effect of hydrostatic pressure coming from the molten gallium existing during the melting process and the effect of gas pressure that fills the cell were also performed. Direct cell comparisons between those cells were conducted. This comparison was aimed to evaluate the consistency of each cell, especially related to the nominal purity. Direct cell comparison between the open-type and the sealed-type gallium point cell was also conducted. Chemical analysis was conducted using samples extracted from ingots used in both the newly built open-type gallium point cells, from which the effect of impurities in the ingot was evaluated.

  13. Marked point process for modelling seismic activity (case study in Sumatra and Java)

    NASA Astrophysics Data System (ADS)

    Pratiwi, Hasih; Sulistya Rini, Lia; Wayan Mangku, I.

    2018-05-01

    An earthquake is a natural phenomenon that is random and irregular in space and time. The occurrence of earthquakes at a given location is still difficult to forecast, so the development of earthquake forecasting methodology continues from both the seismological and the stochastic points of view. To describe such random phenomena, in both space and time, a point process approach can be used. There are two types of point processes: temporal point processes and spatial point processes. A temporal point process relates to events observed over time as a sequence in time, whereas a spatial point process describes the locations of objects in two- or three-dimensional space. The points of a point process can be labelled with additional information called marks. A marked point process can be considered as a pair (x, m), where x is the location of a point and m is the mark attached to it. This study aims to model a marked point process indexed by time for earthquake data in Sumatra Island and Java Island. The model can be used to analyse seismic activity through its intensity function, conditioned on the history of the process up to time t. Based on data obtained from the U.S. Geological Survey from 1973 to 2017 with a magnitude threshold of 5, we obtained maximum likelihood estimates for the parameters of the intensity function. The estimated model parameters show that seismic activity in Sumatra Island is greater than in Java Island.
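
    The abstract does not state the exact form of the conditional intensity, but as an illustration of maximum-likelihood estimation for a history-dependent temporal point process, the sketch below fits a self-exciting (Hawkes) intensity λ(t) = μ + α Σ_{t_j < t} exp(−β(t − t_j)) to synthetic event times; marks (magnitudes) are omitted for brevity and the data are a crude clustered stand-in, not the USGS catalogue:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def hawkes_neg_loglik(params, times, T):
        """Negative log-likelihood of a Hawkes process with exponential kernel."""
        mu, alpha, beta = np.exp(params)      # log-parametrization keeps values positive
        r = 0.0                               # recursive sum of exp(-beta*(t_i - t_j))
        loglik = 0.0
        prev = None
        for t in times:
            if prev is not None:
                r = np.exp(-beta * (t - prev)) * (r + 1.0)
            loglik += np.log(mu + alpha * r)
            prev = t
        # Compensator: integral of the intensity over [0, T].
        loglik -= mu * T + (alpha / beta) * np.sum(1.0 - np.exp(-beta * (T - times)))
        return -loglik

    # Synthetic clustered event times on [0, T].
    rng = np.random.default_rng(3)
    T = 1000.0
    mains = np.sort(rng.uniform(0, T, 120))
    afters = np.concatenate([m + rng.exponential(2.0, rng.poisson(1.5)) for m in mains])
    times = np.sort(np.concatenate([mains, afters]))
    times = times[times < T]

    res = minimize(hawkes_neg_loglik, x0=np.log([0.2, 0.3, 1.0]), args=(times, T),
                   method="Nelder-Mead")
    mu, alpha, beta = np.exp(res.x)
    print(f"mu={mu:.3f}, alpha={alpha:.3f}, beta={beta:.3f}")
    ```

    A marked extension of this kind of model would additionally let the magnitude of each event scale its contribution to the intensity, which is the direction the abstract describes.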

  14. [Applications and prospects of on-line near infrared spectroscopy technology in manufacturing of Chinese materia medica].

    PubMed

    Li, Yang; Wu, Zhi-Sheng; Pan, Xiao-Ning; Shi, Xin-Yuan; Guo, Ming-Ye; Xu, Bing; Qiao, Yan-Jiang

    2014-10-01

    The quality of Chinese materia medica (CMM) is affected by every process in CMM manufacturing. Given the complex, multi-unit nature of CMM production, on-line near-infrared (NIR) spectroscopy is used as an evaluation technology because of its rapid, non-destructive and pollution-free character. Research on on-line NIR applied to process analysis and control of CMM is described systematically, and the building of an on-line NIR platform is used as an example to clarify the feasibility of on-line NIR technology in the CMM manufacturing process. Then, from the point of view of application by pharmaceutical companies, current on-line NIR research on CMM and its production is summarized relatively comprehensively. The types of CMM products are classified according to two formulations (liquid and solid dosage forms): the different production processes (extraction, concentration, alcohol precipitation, etc.) are used as the distinguishing points for liquid formulations, and the different dosage types (tablets, capsules, plasters, etc.) as the distinguishing points for solid formulations. The reliability of on-line NIR across the whole CMM production process is supported by a summary of the literature of the last 10 years, which could underpin the modernization of CMM production.

  15. The Use of Computer Vision Algorithms for Automatic Orientation of Terrestrial Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Markiewicz, Jakub Stefan

    2016-06-01

    The paper presents an analysis of the orientation of terrestrial laser scanning (TLS) data. In the proposed data processing methodology, point clouds are treated as panoramic images enriched by a depth map. Computer vision (CV) algorithms are used for the orientation; they are evaluated with respect to the correctness of tie-point detection, computation time, and the difficulties encountered in their implementation. The BRISK, FAST, MSER, SIFT, SURF, ASIFT and CenSurE algorithms are used to search for key points. The source data are point clouds acquired with a Z+F 5006h terrestrial laser scanner on the ruins of Iłża Castle, Poland. Algorithms allowing the photogrammetric and CV approaches to be combined are also presented.

  16. Factors affecting and affected by user acceptance of computer-based nursing documentation: results of a two-year study.

    PubMed

    Ammenwerth, Elske; Mansmann, Ulrich; Iller, Carola; Eichstädter, Ronald

    2003-01-01

    The documentation of the nursing process is an important but often neglected part of clinical documentation. Paper-based systems have been introduced to support nursing process documentation. Frequently, however, problems such as low quality of documentation are reported. It is unclear whether computer-based documentation systems can reduce these problems and which factors influence their acceptance by users. We introduced a computer-based nursing documentation system on four wards of the University Hospitals of Heidelberg and systematically evaluated its preconditions and its effects in a pretest-posttest intervention study. For the analysis of user acceptance, we concentrated on subjective data drawn from questionnaires and interviews. A questionnaire was developed using items from published questionnaires and items developed specifically for this study. The quantitative results point to two factors influencing the acceptance of a new computer-based documentation system: the previous acceptance of the nursing process and the previous amount of self-confidence when using computers. On one ward, the various acceptance scores declined markedly after the introduction of the nursing documentation system. Explorative qualitative analysis on this ward points to further success factors of computer-based nursing documentation systems. Our results can be used to assist the planning and introduction of computer-based nursing documentation systems. They demonstrate the importance of computer experience and acceptance of the nursing process on a ward but also point to other factors such as the fit between nursing workflow and the functionality of a nursing documentation system.

  17. Inference from clustering with application to gene-expression microarrays.

    PubMed

    Dougherty, Edward R; Barrera, Junior; Brun, Marcel; Kim, Seungchan; Cesar, Roberto M; Chen, Yidong; Bittner, Michael; Trent, Jeffrey M

    2002-01-01

    There are many algorithms to cluster sample data points based on nearness or a similarity measure. Often the implication is that points in different clusters come from different underlying classes, whereas those in the same cluster come from the same class. Stochastically, the underlying classes represent different random processes. The inference is that clusters represent a partition of the sample points according to which process they belong. This paper discusses a model-based clustering toolbox that evaluates cluster accuracy. Each random process is modeled as its mean plus independent noise, sample points are generated, the points are clustered, and the clustering error is the number of points clustered incorrectly according to the generating random processes. Various clustering algorithms are evaluated based on process variance and the key issue of the rate at which algorithmic performance improves with increasing numbers of experimental replications. The model means can be selected by hand to test the separability of expected types of biological expression patterns. Alternatively, the model can be seeded by real data to test the expected precision of that output or the extent of improvement in precision that replication could provide. In the latter case, a clustering algorithm is used to form clusters, and the model is seeded with the means and variances of these clusters. Other algorithms are then tested relative to the seeding algorithm. Results are averaged over various seeds. Output includes error tables and graphs, confusion matrices, principal-component plots, and validation measures. Five algorithms are studied in detail: K-means, fuzzy C-means, self-organizing maps, hierarchical Euclidean-distance-based and correlation-based clustering. The toolbox is applied to gene-expression clustering based on cDNA microarrays using real data. Expression profile graphics are generated and error analysis is displayed within the context of these profile graphics. A large amount of generated output is available over the web.
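
    A stripped-down version of the evaluation loop described above (class means plus independent noise, cluster, count misassigned points) might look like the following; the two "expression patterns" are invented rather than seeded from microarray data, and only K-means is shown:

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(7)

    # Two hypothetical "expression patterns": class means plus independent Gaussian noise.
    means = np.array([[0.0, 0.0, 1.0, 1.0], [1.0, 1.0, 0.0, 0.0]])
    sigma = 0.4
    labels = rng.integers(0, 2, size=200)
    points = means[labels] + rng.normal(0.0, sigma, size=(200, means.shape[1]))

    pred = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(points)

    # Match clusters to generating classes (Hungarian algorithm) and count errors.
    confusion = np.zeros((2, 2), dtype=int)
    for true, est in zip(labels, pred):
        confusion[true, est] += 1
    row, col = linear_sum_assignment(-confusion)          # maximize matched counts
    errors = len(labels) - confusion[row, col].sum()
    print(f"clustering error: {errors} of {len(labels)} points ({errors / len(labels):.1%})")
    ```

    Repeating this loop over noise levels, replication counts and algorithms, and averaging over seeds, gives the kind of error tables and graphs the toolbox described above produces.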

  18. Forest understory trees can be segmented accurately within sufficiently dense airborne laser scanning point clouds.

    PubMed

    Hamraz, Hamid; Contreras, Marco A; Zhang, Jun

    2017-07-28

    Airborne laser scanning (LiDAR) point clouds over large forested areas can be processed to segment individual trees and subsequently extract tree-level information. Existing segmentation procedures typically detect more than 90% of overstory trees, yet they barely detect 60% of understory trees because of the occlusion effect of higher canopy layers. Although understory trees provide limited financial value, they are an essential component of ecosystem functioning by offering habitat for numerous wildlife species and influencing stand development. Here we model the occlusion effect in terms of point density. We estimate the fractions of points representing different canopy layers (one overstory and multiple understory) and also pinpoint the required density for reasonable tree segmentation (where accuracy plateaus). We show that at a density of ~170 pt/m² understory trees can likely be segmented as accurately as overstory trees. Given the advancements of LiDAR sensor technology, point clouds will affordably reach this required density. Using modern computational approaches for big data, the denser point clouds can efficiently be processed to ultimately allow accurate remote quantification of forest resources. The methodology can also be adopted for other similar remote sensing or advanced imaging applications such as geological subsurface modelling or biomedical tissue analysis.

  19. Profitability Analysis of Soybean Oil Processes.

    PubMed

    Cheng, Ming-Hsun; Rosentrater, Kurt A

    2017-10-07

    Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is a key factor in estimating the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time indicate a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When the capacities of annual oil production are larger than 12 and 173 million kg, respectively, these two processes are profitable. The solvent-free approach, known as the enzyme assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.

  20. Profitability Analysis of Soybean Oil Processes

    PubMed Central

    2017-01-01

    Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is a key factor in estimating the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time indicate a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When the capacities of annual oil production are larger than 12 and 173 million kg, respectively, these two processes are profitable. The solvent-free approach, known as the enzyme assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production. PMID:28991168
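
    The benchmarks named in the abstract (NPV, discounted payback, break-even capacity) can be sketched in a few lines of code; the capital cost, margin and interest rate below are placeholders, not the paper's techno-economic data:

    ```python
    def cash_flow_metrics(capital, annual_net_cash, rate, years):
        """Return NPV and discounted payback time (years) for a flat annual net cash flow."""
        npv = -capital
        cumulative = -capital
        payback = None
        for y in range(1, years + 1):
            discounted = annual_net_cash / (1.0 + rate) ** y
            npv += discounted
            cumulative += discounted
            if payback is None and cumulative >= 0.0:
                payback = y
        return npv, payback

    # Placeholder figures for a hypothetical extraction plant (million USD).
    npv, payback = cash_flow_metrics(capital=25.0, annual_net_cash=4.5, rate=0.08, years=20)
    print(f"NPV = {npv:.1f} M$, discounted payback = {payback} years")

    # Break-even capacity: smallest annual oil output whose margin covers annualized costs.
    margin_per_kg = 0.05            # USD of net margin per kg of oil (assumed)
    annualized_fixed_cost = 0.6e6   # USD per year (assumed)
    breakeven_kg = annualized_fixed_cost / margin_per_kg
    print(f"Break-even capacity ≈ {breakeven_kg / 1e6:.0f} million kg of oil per year")
    ```

    Sweeping the capacity (and hence the annual net cash flow) over a range of plant sizes is what yields threshold capacities of the kind reported above for the extruding-expelling, hexane and EAEP routes.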

  1. Strengths and weaknesses of temporal stability analysis for monitoring and estimating grid-mean soil moisture in a high-intensity irrigated agricultural landscape

    NASA Astrophysics Data System (ADS)

    Ran, Youhua; Li, Xin; Jin, Rui; Kang, Jian; Cosh, Michael H.

    2017-01-01

    Monitoring and estimating grid-mean soil moisture is very important for assessing many hydrological, biological, and biogeochemical processes and for validating remotely sensed surface soil moisture products. Temporal stability analysis (TSA) is a valuable tool for identifying a small number of representative sampling points to estimate the grid-mean soil moisture content. This analysis was evaluated and improved using high-quality surface soil moisture data that were acquired by a wireless sensor network in a high-intensity irrigated agricultural landscape in an arid region of northwestern China. The performance of the TSA was limited in areas where the representative error was dominated by random events, such as irrigation events. This shortcoming can be effectively mitigated by using a stratified TSA (STSA) method, proposed in this paper. In addition, the following methods were proposed for rapidly and efficiently identifying representative sampling points when using TSA. (1) Instantaneous measurements can be used to identify representative sampling points to some extent; however, the error resulting from this method is significant when validating remotely sensed soil moisture products. Thus, additional representative sampling points should be considered to reduce this error. (2) The calibration period can be determined from the time span of the full range of the grid-mean soil moisture content during the monitoring period. (3) The representative error is sensitive to the number of calibration sampling points, especially when only a few representative sampling points are used. Multiple sampling points are recommended to reduce data loss and improve the likelihood of representativeness at two scales.
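
    The core computation of a (non-stratified) temporal stability analysis is the mean relative difference of each sampling point from the grid mean over time; a minimal sketch on synthetic soil-moisture data follows, and it does not implement the stratified STSA variant proposed in the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Synthetic surface soil moisture: 60 observation times x 20 sampling points.
    t, n = 60, 20
    grid_signal = 0.20 + 0.08 * np.sin(np.linspace(0, 4 * np.pi, t))[:, None]
    point_bias = rng.normal(0.0, 0.03, size=(1, n))            # persistent point offsets
    theta = grid_signal + point_bias + rng.normal(0.0, 0.01, size=(t, n))

    grid_mean = theta.mean(axis=1, keepdims=True)              # spatial mean at each time
    rel_diff = (theta - grid_mean) / grid_mean                 # relative differences
    mrd = rel_diff.mean(axis=0)                                # mean relative difference
    sdrd = rel_diff.std(axis=0)                                # its standard deviation

    # Representative point: smallest |MRD| (ties could be broken by SDRD).
    best = np.argmin(np.abs(mrd))
    print(f"representative point {best}: MRD={mrd[best]:+.3f}, SDRD={sdrd[best]:.3f}")
    ```

    Random events such as irrigation show up as large, transient relative differences; stratifying the points before applying this calculation is the mitigation the abstract proposes.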

  2. Direction of CRT waste glass processing: Electronics recycling industry communication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Julia R., E-mail: mueller.143@osu.edu; Boehm, Michael W.; Drummond, Charles

    2012-08-15

    Highlights: Given a large flow rate of CRT glass, ~10% of the panel glass stream will be leaded. The supply of CRT waste glass exceeded demand in 2009. Recyclers should use UV light to detect lead oxide during the separation process. Recycling market analysis techniques and results are given for CRT glass. Academic initiatives and the necessary expansion of novel product markets are discussed. - Abstract: Cathode Ray Tube (CRT) waste glass recycling has plagued glass manufacturers, electronics recyclers and electronics waste policy makers for decades because the total supply of waste glass exceeds demand, and the formulations of CRT glass are ill suited for most reuse options. The solutions are to separate the undesirable components (e.g. lead oxide) in the waste and create demand for new products. Achieving this is no simple feat, however, as there are many obstacles: limited knowledge of waste glass composition; limited automation in the recycling process; transportation of recycled material; and a weak and underdeveloped market. Thus one of the main goals of this paper is to advise electronic glass recyclers on how to best manage a diverse supply of glass waste and successfully market to end users. Further, this paper offers future directions for academic and industry research. To develop the recommendations offered here, a combination of approaches was used: (1) a thorough study of historic trends in CRT glass chemistry; (2) bulk glass collection and analysis of cullet from a large-scale glass recycler; (3) conversations with industry members and a review of potential applications; and (4) evaluation of the economic viability of specific uses for recycled CRT glass. If academia and industry can solve these problems (for example by creating a database of composition organized by manufacturer and glass source) then the reuse of CRT glass can be increased.

  3. Wavelet analysis of the impedance cardiogram waveforms

    NASA Astrophysics Data System (ADS)

    Podtaev, S.; Stepanov, R.; Dumler, A.; Chugainov, S.; Tziberkin, K.

    2012-12-01

    Impedance cardiography has been used for diagnosing atrial and ventricular dysfunctions, valve disorders, aortic stenosis, and vascular diseases. Almost all applications of impedance cardiography require determination of some of the characteristic points of the ICG waveform. The ICG waveform has a set of characteristic points known as A, B, E ((dZ/dt)max), X, Y, O and Z. These points are related to distinct physiological events in the cardiac cycle. The objective of this work is the validation of a new method for processing and interpreting impedance cardiogram waveforms using wavelet analysis. A method of computerized thoracic tetrapolar polyrheocardiography is used for hemodynamic registration. The use of an original wavelet differentiation algorithm allows filtering and calculation of the derivatives of the rheocardiogram to be combined. The proposed approach can be used in clinical practice for early diagnosis of cardiovascular system remodelling in the course of different pathologies.
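
    The idea of combining filtering with differentiation, as in the wavelet differentiation step above, can be illustrated by convolving the signal with the first derivative of a Gaussian; the synthetic waveform and smoothing scale below are invented, and this is not the authors' algorithm:

    ```python
    import numpy as np

    def gaussian_derivative_filter(signal, fs, scale_s):
        """Smoothed derivative: convolve with the first derivative of a Gaussian kernel."""
        sigma = scale_s * fs                              # scale in samples
        x = np.arange(-4 * sigma, 4 * sigma + 1)
        # Derivative of a unit-area Gaussian; convolution gives d/dt of the smoothed signal.
        kernel = -x / (sigma**3 * np.sqrt(2 * np.pi)) * np.exp(-x**2 / (2 * sigma**2))
        return np.convolve(signal, kernel, mode="same") * fs   # per-second units

    fs = 500.0                                            # sampling rate (Hz)
    t = np.arange(0, 2.0, 1.0 / fs)
    icg = (np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 5.0 * t)
           + 0.05 * np.random.default_rng(2).normal(size=t.size))

    dzdt = gaussian_derivative_filter(icg, fs, scale_s=0.02)
    peak_index = int(np.argmax(dzdt))                     # candidate (dZ/dt)max-like point
    print(f"(dZ/dt)max-like point at t = {t[peak_index]:.3f} s")
    ```

    Varying the smoothing scale corresponds to looking at the derivative at different wavelet scales, which is what makes characteristic-point detection robust to noise.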

  4. Preliminary Design and Analysis of the GIFTS Instrument Pointing System

    NASA Technical Reports Server (NTRS)

    Zomkowski, Paul P.

    2003-01-01

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Instrument is the next-generation spectrometer for remote sensing weather satellites. The GIFTS instrument will be used to perform scans of the Earth's atmosphere by assembling a series of fields of view (FOVs) into a larger pattern. Realization of this process is achieved by step scanning the instrument FOV in a contiguous fashion across any desired portion of the visible Earth. A 2.3 arc second pointing stability, with respect to the scanning instrument, must be maintained for the duration of the FOV scan. A star tracker producing attitude data at a 100 Hz rate will be used by the autonomous pointing algorithm to precisely track target FOVs on the surface of the Earth. The main objective is to validate the pointing algorithm in the presence of spacecraft disturbances and determine acceptable disturbance limits from expected noise sources. Proof-of-concept validation of the pointing system algorithm is carried out with a full system simulation developed using Matlab Simulink. Models for the following components are included in the full system simulation: inertial reference unit (IRU), attitude control system (ACS), reaction wheels, star tracker, and mirror controller. With the spacecraft orbital position and attitude maintained to within specified limits, the pointing algorithm receives quaternion, ephemeris, and initialization data that are used to construct the required mirror pointing commands at a 100 Hz rate. This comprehensive simulation will also aid in obtaining a thorough understanding of spacecraft disturbances and other sources of pointing system errors. Parameter sensitivity studies and disturbance analysis will be used to obtain limits of operability for the GIFTS instrument. The culmination of this simulation development and analysis will be used to validate the specified performance requirements outlined for this instrument.

  5. Organic Versus Contractor Logistics Support For Depot-Level Repair: Factors That Drive Sub-Optimal Decisions

    DTIC Science & Technology

    2016-02-16

    Considerations in Using CLS or Organic Support. Break-Even Analysis in the Decision Process. When a business decision is made in an ideal environment, all costs...(Line B). The break-even point (Point C) is the production quantity where the advantage moves to a different cost curve. For a business decision...the Services to provide regular reporting to them on contractor versus organic workload and money. In sum, there are laws that mandate 50/50

  6. The Laplace method for probability measures in Banach spaces

    NASA Astrophysics Data System (ADS)

    Piterbarg, V. I.; Fatalov, V. R.

    1995-12-01

    Contents
    §1. Introduction
    Chapter I. Asymptotic analysis of continual integrals in Banach space, depending on a large parameter
    §2. The large deviation principle and logarithmic asymptotics of continual integrals
    §3. Exact asymptotics of Gaussian integrals in Banach spaces: the Laplace method
      3.1. The Laplace method for Gaussian integrals taken over the whole Hilbert space: isolated minimum points ([167], I)
      3.2. The Laplace method for Gaussian integrals in Hilbert space: the manifold of minimum points ([167], II)
      3.3. The Laplace method for Gaussian integrals in Banach space ([90], [174], [176])
      3.4. Exact asymptotics of large deviations of Gaussian norms
    §4. The Laplace method for distributions of sums of independent random elements with values in Banach space
      4.1. The case of a non-degenerate minimum point ([137], I)
      4.2. A degenerate isolated minimum point and the manifold of minimum points ([137], II)
    §5. Further examples
      5.1. The Laplace method for the local time functional of a Markov symmetric process ([217])
      5.2. The Laplace method for diffusion processes, a finite number of non-degenerate minimum points ([116])
      5.3. Asymptotics of large deviations for Brownian motion in the Hölder norm
      5.4. Non-asymptotic expansion of a strong stable law in Hilbert space ([41])
    Chapter II. The double sum method - a version of the Laplace method in the space of continuous functions
    §6. Pickands' method of double sums
      6.1. General situations
      6.2. Asymptotics of the distribution of the maximum of a Gaussian stationary process
      6.3. Asymptotics of the probability of a large excursion of a Gaussian non-stationary process
    §7. Probabilities of large deviations of trajectories of Gaussian fields
      7.1. Homogeneous fields and fields with constant dispersion
      7.2. Finitely many maximum points of dispersion
      7.3. Manifold of maximum points of dispersion
      7.4. Asymptotics of distributions of maxima of Wiener fields
    §8. Exact asymptotics of large deviations of the norm of Gaussian vectors and processes with values in the spaces L_k^p and l^2. Gaussian fields with the set of parameters in Hilbert space
      8.1. Exact asymptotics of the distribution of the l_k^p-norm of a Gaussian finite-dimensional vector with dependent coordinates, p > 1
      8.2. Exact asymptotics of probabilities of high excursions of trajectories of processes of type χ^2
      8.3. Asymptotics of the probabilities of large deviations of Gaussian processes with a set of parameters in Hilbert space [74]
      8.4. Asymptotics of distributions of maxima of the norms of l^2-valued Gaussian processes
      8.5. Exact asymptotics of large deviations for the l^2-valued Ornstein-Uhlenbeck process
    Bibliography

  7. Viewpoints on Medical Image Processing: From Science to Application

    PubMed Central

    Deserno (né Lehmann), Thomas M.; Handels, Heinz; Maier-Hein (né Fritzsche), Klaus H.; Mersmann, Sven; Palm, Christoph; Tolxdorff, Thomas; Wagenknecht, Gudrun; Wittenberg, Thomas

    2013-01-01

    Medical image processing provides core innovation for medical imaging. This paper is focused on recent developments from science to applications, analyzing the past fifteen years of history of the proceedings of the German annual meeting on medical image processing (BVM). Furthermore, some members of the program committee present their personal points of view: (i) multi-modality for imaging and diagnosis, (ii) analysis of diffusion-weighted imaging, (iii) model-based image analysis, (iv) registration of section images, (v) from images to information in digital endoscopy, and (vi) virtual reality and robotics. Medical imaging and medical image computing is seen as a field of rapid development with clear trends to integrated applications in diagnostics, treatment planning and treatment. PMID:24078804

  8. Viewpoints on Medical Image Processing: From Science to Application.

    PubMed

    Deserno Né Lehmann, Thomas M; Handels, Heinz; Maier-Hein Né Fritzsche, Klaus H; Mersmann, Sven; Palm, Christoph; Tolxdorff, Thomas; Wagenknecht, Gudrun; Wittenberg, Thomas

    2013-05-01

    Medical image processing provides core innovation for medical imaging. This paper is focused on recent developments from science to applications, analyzing the past fifteen years of history of the proceedings of the German annual meeting on medical image processing (BVM). Furthermore, some members of the program committee present their personal points of view: (i) multi-modality for imaging and diagnosis, (ii) analysis of diffusion-weighted imaging, (iii) model-based image analysis, (iv) registration of section images, (v) from images to information in digital endoscopy, and (vi) virtual reality and robotics. Medical imaging and medical image computing is seen as a field of rapid development with clear trends to integrated applications in diagnostics, treatment planning and treatment.

  9. Fixed point theorems of GPS carrier phase ambiguity resolution and their application to massive network processing: Ambizap

    NASA Astrophysics Data System (ADS)

    Blewitt, Geoffrey

    2008-12-01

    Precise point positioning (PPP) has become popular for Global Positioning System (GPS) geodetic network analysis because for n stations, PPP has O(n) processing time, yet solutions closely approximate those of O(n3) full network analysis. Subsequent carrier phase ambiguity resolution (AR) further improves PPP precision and accuracy; however, full-network bootstrapping AR algorithms are O(n4), limiting single network solutions to n < 100. In this contribution, fixed point theorems of AR are derived and then used to develop "Ambizap," an O(n) algorithm designed to give results that closely approximate full network AR. Ambizap has been tested to n ≈ 2800 and proves to be O(n) in this range, adding only ˜50% to PPP processing time. Tests show that a 98-station network is resolved on a 3-GHz CPU in 7 min, versus 22 h using O(n4) AR methods. Ambizap features a novel network adjustment filter, producing solutions that precisely match O(n4) full network analysis. The resulting coordinates agree to ≪1 mm with current AR methods, much smaller than the ˜3-mm RMS precision of PPP alone. A 2000-station global network can be ambiguity resolved in ˜2.5 h. Together with PPP, Ambizap enables rapid, multiple reanalysis of large networks (e.g., ˜1000-station EarthScope Plate Boundary Observatory) and facilitates the addition of extra stations to an existing network solution without need to reprocess all data. To meet future needs, PPP plus Ambizap is designed to handle ˜10,000 stations per day on a 3-GHz dual-CPU desktop PC.

  10. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    NASA Astrophysics Data System (ADS)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. Then, discontinuity set orientation is calculated using Kernel Density Estimation and principal vector similarity criteria. Poles to points are assigned to individual discontinuity objects using easy custom vector clustering and Jaccard distance approaches, and each object is segmented into planar clusters using an improved version of the DBSCAN algorithm. Modal set orientations are then recomputed by cluster-based orientation statistics to avoid the effects of biases related to cluster size and density heterogeneity of the point cloud. Finally, spacing values are measured between individual discontinuity clusters along scanlines parallel to modal pole vectors, whereas individual feature size (persistence) is measured using 3D convex hull bounding boxes. Spacing and size are provided both as raw population data and as summary statistics. The tool is optimized for parallel computing on 64bit systems, and a Graphic User Interface (GUI) has been developed to manage data processing, provide several outputs, including reclassified point clouds, tables, plots, derived fracture intensity parameters, and export to modelling software tools. We present test applications performed both on synthetic 3D data (simple 3D solids) and real case studies, validating the results with existing geomechanical datasets.
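
    The first stages of such a pipeline (local plane fitting over k nearest neighbours, then a crude grouping of normals into sets) can be sketched as follows; the synthetic point cloud, neighbourhood size and angular threshold are placeholders, not the parameters of the Matlab tool described above:

    ```python
    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(5)

    # Synthetic cloud: two planar "discontinuity" patches plus a little noise.
    u, v = rng.uniform(0, 1, (2, 400))
    plane_a = np.c_[u[:200], v[:200], 0.02 * rng.normal(size=200)]   # ~horizontal
    plane_b = np.c_[u[200:], 0.02 * rng.normal(size=200), v[200:]]   # ~vertical
    cloud = np.vstack([plane_a, plane_b])

    # Local normals from PCA (via SVD) over the k nearest neighbours of each point.
    k = 20
    nbrs = NearestNeighbors(n_neighbors=k).fit(cloud)
    _, idx = nbrs.kneighbors(cloud)
    normals = np.empty_like(cloud)
    for i, neigh in enumerate(idx):
        pts = cloud[neigh] - cloud[neigh].mean(axis=0)
        _, _, vt = np.linalg.svd(pts, full_matrices=False)
        normals[i] = vt[-1]                   # direction of smallest variance = normal

    # Crude set separation: angle between each normal and the vertical axis.
    dips = np.degrees(np.arccos(np.clip(np.abs(normals[:, 2]), 0, 1)))
    print(f"sub-horizontal set: {(dips < 45).sum()} points, "
          f"sub-vertical set: {(dips >= 45).sum()} points")
    ```

    The full workflow described above replaces the fixed 45° threshold with kernel density estimation of pole orientations, DBSCAN segmentation of individual surfaces, and scanline-based spacing and persistence measurements.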

  11. Surface inspection: Research and development

    NASA Technical Reports Server (NTRS)

    Batchelder, J. S.

    1987-01-01

    Surface inspection techniques are used for process learning, quality verification, and postmortem analysis in manufacturing for a spectrum of disciplines. First, trends in surface analysis are summarized for integrated circuits, high density interconnection boards, and magnetic disks, emphasizing on-line applications as opposed to off-line or development techniques. Then, a closer look is taken at microcontamination detection from both a patterned defect and a particulate inspection point of view.

  12. Conducting Qualitative Data Analysis: Reading Line-by-Line, but Analyzing by Meaningful Qualitative Units

    ERIC Educational Resources Information Center

    Chenail, Ronald J.

    2012-01-01

    In the first of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail points out the challenges of determining units to analyze qualitatively when dealing with text. He acknowledges that although we may read a document word-by-word or line-by-line, we need to adjust our focus when processing the text for purposes of…

  13. [Demographic processes in the countries of Eastern Europe 1945-1990].

    PubMed

    Shchepin, O P; Vladimirova, L I

    1990-01-01

    An analysis is made of changes in the demographic processes of the countries of Eastern Europe over the period from 1945 to 1990, covering both general regularities and national peculiarities, in terms of the static and dynamic parameters of population movement. Positive tendencies in the demographic processes are pointed out, above all in infant mortality rates and mean life expectancy at birth in the Eastern European countries, by decade, reflecting the particular pattern of change as compared with developed countries.

  14. Digital signal processing and control and estimation theory -- Points of tangency, area of intersection, and parallel directions

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1976-01-01

    A number of current research directions in the fields of digital signal processing and modern control and estimation theory were studied. Topics such as stability theory, linear prediction and parameter identification, system analysis and implementation, two-dimensional filtering, decentralized control and estimation, image processing, and nonlinear system theory were examined in order to uncover some of the basic similarities and differences in the goals, techniques, and philosophy of the two disciplines. An extensive bibliography is included.

  15. Sensitivity analysis and multidisciplinary optimization for aircraft design: Recent advances and results

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1988-01-01

    Optimization by decomposition, complex system sensitivity analysis, and the rapid growth of disciplinary sensitivity analysis are some of the recent developments that hold promise of a quantum jump in the support engineers receive from computers in the quantitative aspects of design. A review of the salient points of these techniques is given and illustrated by examples from aircraft design, viewed as a process that combines the best of human intellect and computer power to manipulate data.

  16. Terrain Extraction by Integrating Terrestrial Laser Scanner Data and Spectral Information

    NASA Astrophysics Data System (ADS)

    Lau, C. L.; Halim, S.; Zulkepli, M.; Azwan, A. M.; Tang, W. L.; Chong, A. K.

    2015-10-01

    The extraction of true terrain points from unstructured laser point cloud data is an important process for producing an accurate digital terrain model (DTM). However, most spatial filtering methods utilize only the geometrical data to discriminate terrain points from non-terrain points. Point cloud filtering can be improved by using the spectral information available with some scanners. Therefore, the objective of this study is to investigate the effectiveness of using the three channels (red, green and blue) of the colour image captured by the built-in digital camera available in some terrestrial laser scanners (TLS) for terrain extraction. In this study, data acquisition was conducted on a miniature replica landscape at the Universiti Teknologi Malaysia (UTM) Skudai campus using a Leica ScanStation C10. The spectral information of the coloured point clouds from selected sample classes is extracted for spectral analysis. Coloured points whose values fall within the corresponding preset spectral thresholds are identified as belonging to that specific feature class, and the terrain extraction process is implemented in Matlab. The results demonstrate that a passive image of higher spectral resolution is required in order to improve the output, because the low quality of the colour images captured by the sensor leads to low separability in spectral reflectance. In conclusion, this study shows that spectral information can be used as a parameter for terrain extraction.
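
    A bare-bones version of the spectral thresholding step, applied to a hypothetical XYZRGB array, could look like the following; the thresholds are arbitrary placeholders chosen for illustration, not the values used in the study:

    ```python
    import numpy as np

    # Hypothetical coloured point cloud: columns = X, Y, Z, R, G, B (8-bit colours).
    rng = np.random.default_rng(4)
    xyz = rng.uniform(0, 10, size=(1000, 3))
    rgb = rng.integers(0, 256, size=(1000, 3))
    cloud = np.hstack([xyz, rgb.astype(float)])

    # Preset spectral thresholds for "bare ground" (assumed brownish: R > G > B).
    r, g, b = cloud[:, 3], cloud[:, 4], cloud[:, 5]
    ground_mask = (r > 90) & (r > g) & (g > b) & (b < 120)

    terrain_points = cloud[ground_mask]
    print(f"{terrain_points.shape[0]} of {cloud.shape[0]} points classified as terrain")
    ```

    In practice such spectral masks would be combined with the geometric filtering the abstract mentions, since colour alone cannot separate terrain from similarly coloured objects.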

  17. Automatic Detection and Classification of Pole-Like Objects for Urban Cartography Using Mobile Laser Scanning Data

    PubMed Central

    Ordóñez, Celestino; Cabo, Carlos; Sanz-Ablanedo, Enoc

    2017-01-01

    Mobile laser scanning (MLS) is a modern and powerful technology capable of obtaining massive point clouds of objects in a short period of time. Although this technology is nowadays being widely applied in urban cartography and 3D city modelling, it has some drawbacks that need to be addressed in order to strengthen it. One of the most important shortcomings of MLS data is that it provides an unstructured dataset whose processing is very time-consuming. Consequently, there is a growing interest in developing algorithms for the automatic extraction of useful information from MLS point clouds. This work is focused on establishing a methodology and developing an algorithm to detect pole-like objects and classify them into several categories using MLS datasets. The developed procedure starts with the discretization of the point cloud by means of a voxelization, in order to simplify and reduce the processing time in the segmentation process. In turn, a heuristic segmentation algorithm was developed to detect pole-like objects in the MLS point cloud. Finally, two supervised classification algorithms, linear discriminant analysis and support vector machines, were used to distinguish between the different types of poles in the point cloud. The predictors are the principal component eigenvalues obtained from the Cartesian coordinates of the laser points, the range of the Z coordinate, and some shape-related indexes. The performance of the method was tested in an urban area with 123 poles of different categories. Very encouraging results were obtained, since the accuracy rate was over 90%. PMID:28640189
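
    As a rough sketch of the kind of predictors listed above (principal component eigenvalues of the point coordinates, the range of the Z coordinate, and shape-related indexes), the snippet below computes covariance eigenvalues and common dimensionality ratios for one candidate segment; the exact feature definitions used by the authors may differ, so the names and ratios here are assumptions for illustration.

    ```python
    import numpy as np

    def pole_shape_features(segment_xyz):
        """Covariance eigenvalues and simple shape ratios for one candidate segment.

        segment_xyz : (N, 3) array of laser point coordinates, N >= 3.
        """
        cov = np.cov(segment_xyz, rowvar=False)
        l1, l2, l3 = np.sort(np.linalg.eigvalsh(cov))[::-1]   # l1 >= l2 >= l3 >= 0
        linearity = (l1 - l2) / l1                             # close to 1 for pole-like shapes
        planarity = (l2 - l3) / l1
        sphericity = l3 / l1
        z_range = segment_xyz[:, 2].max() - segment_xyz[:, 2].min()
        return np.array([l1, l2, l3, linearity, planarity, sphericity, z_range])
    ```

    Feature vectors of this kind would then be passed to the trained LDA or SVM classifier.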

  18. Nonlinear breakup of liquid sheets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jazayeri, S.A.; Li, X.

    1997-07-01

    Sprays formed from the disintegration of liquid sheets have extensive practical applications, ranging from chemical and pharmaceutical processes to power generation and propulsion systems. A knowledge of the liquid sheet breakup process is essential to the understanding of the fundamental mechanisms of liquid atomization and spray formation. The breakup of liquid sheets has been investigated in terms of hydrodynamic stability via linear analysis by Squire, Hagerty and Shea, Li, and others. Nonlinear effects have been studied by Clark and Dombrowski up to the second order, and by Rangel and Sirignano through numerical simulation employing a vortex discretization method. As shown by Taub for the breakup of circular liquid jets, the closer to the breakup region, the higher the order of nonlinear analysis has to be for an adequate description of the breakup behavior. As pointed out by Bogy, a nonlinear analysis up to the third order is generally sufficient to account for the inherent nonlinear nature of the breakup process. Therefore, a third-order nonlinear analysis has been carried out in this study to investigate the process of liquid sheet disruption preceding spray formation.

  19. Signal Processing in Periodically Forced Gradient Frequency Neural Networks

    PubMed Central

    Kim, Ji Chul; Large, Edward W.

    2015-01-01

    Oscillatory instability at the Hopf bifurcation is a dynamical phenomenon that has been suggested to characterize active non-linear processes observed in the auditory system. Networks of oscillators poised near Hopf bifurcation points and tuned to tonotopically distributed frequencies have been used as models of auditory processing at various levels, but systematic investigation of the dynamical properties of such oscillatory networks is still lacking. Here we provide a dynamical systems analysis of a canonical model for gradient frequency neural networks driven by a periodic signal. We use linear stability analysis to identify various driven behaviors of canonical oscillators for all possible ranges of model and forcing parameters. The analysis shows that canonical oscillators exhibit qualitatively different sets of driven states and transitions for different regimes of model parameters. We classify the parameter regimes into four main categories based on their distinct signal processing capabilities. This analysis will lead to deeper understanding of the diverse behaviors of neural systems under periodic forcing and can inform the design of oscillatory network models of auditory signal processing. PMID:26733858
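
    To make the driven-oscillator setting concrete, here is a minimal sketch of an oscillator poised near a Hopf point under periodic forcing, using a truncated form of the canonical model (the model analysed in the paper contains further higher-order terms); parameter names and values are illustrative assumptions only.

    ```python
    import numpy as np

    def driven_hopf_oscillator(alpha=-0.1, beta=-1.0, omega=2*np.pi,
                               F=0.2, omega0=2*np.pi, dt=1e-3, T=50.0):
        """Forward-Euler integration of
        dz/dt = z*(alpha + i*omega + beta*|z|^2) + F*exp(i*omega0*t).
        alpha sets the distance from the bifurcation; beta < 0 bounds the amplitude."""
        n = int(T / dt)
        t = np.arange(n) * dt
        z = np.empty(n, dtype=complex)
        z[0] = 1e-2
        for k in range(n - 1):
            dz = z[k] * (alpha + 1j*omega + beta*abs(z[k])**2) + F*np.exp(1j*omega0*t[k])
            z[k + 1] = z[k] + dt * dz
        return t, z

    # t, z = driven_hopf_oscillator()
    # The envelope np.abs(z) shows whether the driven state locks to the forcing.
    ```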

  20. Transcriptome profiling reveals regulatory mechanisms underlying Corolla Senescence in Petunia

    USDA-ARS?s Scientific Manuscript database

    Genetic regulatory mechanisms that govern natural petal senescence in petunia are complicated and unclear. To identify key genes and pathways that regulate the process, we initiated a transcriptome analysis in petunia petals at four developmental time points, including petal opening without anthesis ...

  1. Improving Food Safety in Meat and Poultry: Will New Regulations Benefit Consumers?

    ERIC Educational Resources Information Center

    Unnevehr, Laurian J.; Roberts, Tanya; Jensen, Helen H.

    1997-01-01

    The U.S. Department of Agriculture's Hazard Analysis and Critical Control Point System for meat and poultry processing will benefit consumers by reducing food-borne illnesses. The benefits are likely to exceed the additional costs from implementing the regulations. (SK)

  2. Research on Teaching in Physical Education: Questions and Comments.

    ERIC Educational Resources Information Center

    Lee, Amelia M.

    1991-01-01

    Reinforces some of the points made in Stephen Silverman's research review on teaching in physical education, examining the process-product paradigm, the measurement of learning and teaching, and the significance of student mediation. The article identifies issues that merit further discussion and analysis. (SM)

  3. Weighted gene co-expression network analysis reveals potential genes involved in early metamorphosis process in sea cucumber Apostichopus japonicus.

    PubMed

    Li, Yongxin; Kikuchi, Mani; Li, Xueyan; Gao, Qionghua; Xiong, Zijun; Ren, Yandong; Zhao, Ruoping; Mao, Bingyu; Kondo, Mariko; Irie, Naoki; Wang, Wen

    2018-01-01

    Sea cucumbers, one of the main classes of echinoderms, undergo a very fast and drastic metamorphosis during their development. However, the molecular basis underlying this process remains largely unknown. Here we systematically examined the gene expression profiles of the Japanese common sea cucumber (Apostichopus japonicus) for the first time by RNA sequencing across 16 developmental time points from fertilized egg to juvenile stage. Based on weighted gene co-expression network analysis (WGCNA), we identified 21 modules. Among them, MEdarkmagenta was highly expressed and correlated with the early metamorphosis process from late auricularia to doliolaria larva. Furthermore, gene enrichment and differentially expressed gene analysis identified several genes in the module that may play key roles in the metamorphosis process. Our results not only provide a molecular basis for experimentally studying the development and morphological complexity of the sea cucumber, but also lay a foundation for improving its emergence rate. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Diffusion tensor driven contour closing for cell microinjection targeting.

    PubMed

    Becattini, Gabriele; Mattos, Leonardo S; Caldwell, Darwin G

    2010-01-01

    This article introduces a novel approach to robust automatic detection of unstained living cells in bright-field (BF) microscope images, with the goal of producing a target list for an automated microinjection system. The overall image analysis process is described and includes: preprocessing, ridge enhancement, image segmentation, shape analysis and injection point definition. The developed algorithm implements a new version of anisotropic contour completion (ACC) based on the partial differential equation (PDE) for heat diffusion, which improves the cell segmentation process by elongating the edges only along their tangent direction. The developed ACC algorithm is equivalent to a dilation of the binary edge image with a continuous elliptic structural element that takes into account the local orientation of the contours, preventing extension in the normal direction. Experiments carried out on real images of 10 to 50 microm CHO-K1 adherent cells show remarkable reliability of the algorithm, with up to 85% success for cell detection and injection point definition.

  5. Quality Management Framework for Total Diet Study centres in Europe.

    PubMed

    Pité, Marina; Pinchen, Hannah; Castanheira, Isabel; Oliveira, Luisa; Roe, Mark; Ruprich, Jiri; Rehurkova, Irena; Sirot, Veronique; Papadopoulos, Alexandra; Gunnlaugsdóttir, Helga; Reykdal, Ólafur; Lindtner, Oliver; Ritvanen, Tiina; Finglas, Paul

    2018-02-01

    A Quality Management Framework to improve quality and harmonization of Total Diet Study practices in Europe was developed within the TDS-Exposure Project. Seventeen processes were identified and hazards, Critical Control Points and associated preventive and corrective measures described. The Total Diet Study process was summarized in a flowchart divided into planning and practical (sample collection, preparation and analysis; risk assessment analysis and publication) phases. Standard Operating Procedures were developed and implemented in pilot studies in five organizations. The flowchart was used to develop a quality framework for Total Diet Studies that could be included in formal quality management systems. Pilot studies operated by four project partners were visited by project assessors who reviewed implementation of the proposed framework and identified areas that could be improved. The quality framework developed can be the starting point for any Total Diet Study centre and can be used within existing formal quality management approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Determination Of Slitting Criterion Parameter During The Multi Slit Rolling Process

    NASA Astrophysics Data System (ADS)

    Stefanik, Andrzej; Mróz, Sebastian; Szota, Piotr; Dyja, Henryk

    2007-05-01

    The rolling of rods with slitting of the strip calls for the use of special mathematical models that allow for the separation of the metal. Within this study, a theoretical analysis was carried out of the effect of the slitting roller gap on band slitting during the rolling of 20 mm and 16 mm diameter ribbed rods produced with the two-strand technology. The Forge3® computer program was applied for the numerical modeling of strip slitting. Strip slitting is implemented in the simulation by an algorithm that removes elements in which the critical value of the normalized Cockcroft-Latham criterion has been exceeded. The inverse method was applied to determine the value of the criterion. The distance between the point where the crack begins and the point of contact between the metal and the slitting rollers was the parameter used for the analysis. The power and rolling torque during slit rolling, as well as the distribution and change of stress in the strand during slitting, are presented.
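
    The element-removal rule can be illustrated with a short sketch of the normalized Cockcroft-Latham damage accumulation; the integration scheme and normalization used inside Forge3® are not documented here, so this is only a schematic version with assumed inputs (largest principal stress, effective stress and equivalent strain increments per time step).

    ```python
    import numpy as np

    def cockcroft_latham_damage(sigma1, sigma_eff, d_eps, c_crit):
        """Accumulate C = sum( max(sigma1, 0) / sigma_eff * d_eps ) for one element
        and report the first increment at which the critical value is exceeded."""
        s1 = np.maximum(np.asarray(sigma1, dtype=float), 0.0)
        C = np.cumsum(s1 / np.asarray(sigma_eff, dtype=float)
                      * np.asarray(d_eps, dtype=float))
        exceeded = np.nonzero(C >= c_crit)[0]
        failed_step = int(exceeded[0]) if exceeded.size else None  # element would be removed here
        return C, failed_step
    ```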

  7. Martingales, detrending data, and the efficient market hypothesis

    NASA Astrophysics Data System (ADS)

    McCauley, Joseph L.; Bassler, Kevin E.; Gunaratne, Gemunu H.

    2008-01-01

    We discuss martingales, detrending data, and the efficient market hypothesis (EMH) for stochastic processes x(t) with arbitrary diffusion coefficients D(x,t). Beginning with x-independent drift coefficients R(t) we show that martingale stochastic processes generate uncorrelated, generally non-stationary increments. Generally, a test for a martingale is therefore a test for uncorrelated increments. A detrended process with an x-dependent drift coefficient is generally not a martingale, and so we extend our analysis to include the class of (x,t)-dependent drift coefficients of interest in finance. We explain why martingales look Markovian at the level of both simple averages and 2-point correlations. And while a Markovian market has no memory to exploit and presumably cannot be beaten systematically, it has never been shown that martingale memory cannot be exploited in 3-point or higher correlations to beat the market. We generalize our Markov scaling solutions presented earlier, and also generalize the martingale formulation of the EMH to include (x,t)-dependent drift in log returns. We also use the analysis of this paper to correct a misstatement of the ‘fair game’ condition in terms of serial correlations in Fama's paper on the EMH. We end with a discussion of Levy's characterization of Brownian motion and prove that an arbitrary martingale is topologically inequivalent to a Wiener process.
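
    For reference, a minimal statement of the martingale condition discussed above, written for the detrended process x(t); this is the standard textbook formulation, not a transcription of the paper's equations.

    ```latex
    % Martingale condition for the detrended process x(t):
    E\!\left[\, x(t) \mid x(s),\ s \le t_0 \,\right] \;=\; x(t_0), \qquad t > t_0 ,
    % which implies that non-overlapping increments are uncorrelated:
    \big\langle\, [\,x(t_2)-x(t_1)\,]\,[\,x(t_4)-x(t_3)\,] \,\big\rangle \;=\; 0,
    \qquad t_1 < t_2 \le t_3 < t_4 ,
    % while 3-point and higher correlations of the increments need not vanish.
    ```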

  8. A required course in the development, implementation, and evaluation of clinical pharmacy services.

    PubMed

    Skomo, Monica L; Kamal, Khalid M; Berdine, Hildegarde J

    2008-10-15

    To develop, implement, and assess a required pharmacy practice course to prepare pharmacy students to develop, implement, and evaluate clinical pharmacy services using a business plan model. Course content centered around the process of business planning and pharmacoeconomic evaluations. Selected business planning topics included literature evaluation, mission statement development, market evaluation, policy and procedure development, and marketing strategy. Selected pharmacoeconomic topics included cost-minimization analysis, cost-benefit analysis, cost-effectiveness analysis, cost-utility analysis, and health-related quality of life (HRQoL). Assessment methods included objective examinations, student participation, performance on a group project, and peer evaluation. One hundred fifty-three students were enrolled in the course. The mean scores on the objective examinations (100 points per examination) ranged from 82 to 85 points, with 25%-35% of students in the class scoring over 90, and 40%-50% of students scoring from 80 to 89. The mean scores on the group project (200 points) and classroom participation (50 points) were 183.5 and 46.1, respectively. The mean score on the peer evaluation was 30.8, with scores ranging from 27.5 to 31.7. The course provided pharmacy students with the framework necessary to develop and implement evidence-based disease management programs and to assure efficient, cost-effective utilization of pertinent resources in the provision of patient care.

  9. A Required Course in the Development, Implementation, and Evaluation of Clinical Pharmacy Services

    PubMed Central

    Kamal, Khalid M.; Berdine, Hildegarde J.

    2008-01-01

    Objective To develop, implement, and assess a required pharmacy practice course to prepare pharmacy students to develop, implement, and evaluate clinical pharmacy services using a business plan model. Design Course content centered around the process of business planning and pharmacoeconomic evaluations. Selected business planning topics included literature evaluation, mission statement development, market evaluation, policy and procedure development, and marketing strategy. Selected pharmacoeconomic topics included cost-minimization analysis, cost-benefit analysis, cost-effectiveness analysis, cost-utility analysis, and health-related quality of life (HRQoL). Assessment methods included objective examinations, student participation, performance on a group project, and peer evaluation. Assessment One hundred fifty-three students were enrolled in the course. The mean scores on the objective examinations (100 points per examination) ranged from 82 to 85 points, with 25%-35% of students in the class scoring over 90, and 40%-50% of students scoring from 80 to 89. The mean scores on the group project (200 points) and classroom participation (50 points) were 183.5 and 46.1, respectively. The mean score on the peer evaluation was 30.8, with scores ranging from 27.5 to 31.7. Conclusion The course provided pharmacy students with the framework necessary to develop and implement evidence-based disease management programs and to assure efficient, cost-effective utilization of pertinent resources in the provision of patient care. PMID:19214263

  10. Performance Analysis of Entropy Methods on K Means in Clustering Process

    NASA Astrophysics Data System (ADS)

    Dicky Syahputra Lubis, Mhd.; Mawengkang, Herman; Suwilo, Saib

    2017-12-01

    K Means is a non-hierarchical data clustering method that attempts to partition data into one or more clusters/groups, so that data with the same characteristics are grouped into the same cluster and data with different characteristics are grouped into other clusters. The purpose of this clustering is to minimize the objective function set for the clustering process, which generally means minimizing the variation within each cluster and maximizing the variation between clusters. However, a main disadvantage of the method is that the number k is often not known in advance. Furthermore, randomly chosen starting points may place two initial centroids close to each other. Therefore, the entropy method is used in this work to determine the starting points for K Means; entropy provides a way to assign weights and support decisions among a set of alternatives by measuring the degree of discrimination among the data, with the criteria showing the greatest variation receiving the highest weights. The entropy method can thus assist K Means in determining the starting points, which are usually chosen at random, so that the clustering result is reached more quickly: the iteration process is faster than with standard K Means. Using the postoperative patient dataset from the UCI Machine Learning Repository, with only 12 records taken as a worked example, the entropy-assisted method reached the desired final result in just 2 iterations.
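
    The abstract does not spell out the seeding rule in full, so the sketch below shows one plausible reading: classical entropy weights score the features, points are ranked by their weighted score, and the k highest-scoring points seed the centroids before standard K Means iterations. The function names, the scoring rule and the assumption of non-negative data are illustrative, not the authors' exact procedure.

    ```python
    import numpy as np

    def entropy_weights(X):
        """Entropy weight method: features with more variation get higher weight.
        Assumes X is non-negative with no all-zero column."""
        P = X / X.sum(axis=0, keepdims=True)
        P = np.where(P > 0, P, 1e-12)
        e = -(P * np.log(P)).sum(axis=0) / np.log(len(X))
        return (1.0 - e) / (1.0 - e).sum()

    def entropy_seeded_kmeans(X, k, n_iter=100):
        score = X @ entropy_weights(X)                 # weighted score per point
        centroids = X[np.argsort(score)[-k:]].copy()   # k highest-scoring points as seeds
        for _ in range(n_iter):
            d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centroids[j] for j in range(k)])
            if np.allclose(new, centroids):
                break
            centroids = new
        return labels, centroids
    ```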

  11. Influence of different base thicknesses on maxillary complete denture processing: linear and angular graphic analysis on the movement of artificial teeth.

    PubMed

    Mazaro, José Vitor Quinelli; Gennari Filho, Humberto; Vedovatto, Eduardo; Amoroso, Andressa Paschoal; Pellizzer, Eduardo Piza; Zavanelli, Adriana Cristina

    2011-09-01

    The purpose of this study was to compare the dental movement that occurs during the processing of maxillary complete dentures with 3 different base thicknesses, using 2 investment methods and microwave polymerization. A sample of 42 denture models was randomly divided into 6 groups (n = 7), with base thicknesses of 1.25, 2.50, and 3.75 mm and gypsum or silicone flask investment. Points were demarcated on the distal surface of the second molars and on the back of the gypsum cast at the alveolar ridge level to allow linear and angular measurement using AutoCAD software. The data were subjected to two-factor analysis of variance with Tukey and Fisher post hoc tests. Angular analysis of the methods and their interactions showed a statistical difference (P = 0.023) when the magnitudes of molar inclination were compared. Tooth movement was greater for thin-based prostheses, 1.25 mm (-0.234), versus thick, 3.75 mm (0.2395), with antagonistic behavior. Prosthesis investment with silicone (0.053) showed greater vertical change compared with the gypsum investment (0.032). There was a difference between the points of analysis, demonstrating that the changes were not symmetric. All groups evaluated showed a change in the position of the artificial teeth after processing. The complete denture with a thin base (1.25 mm) and silicone investment showed the worst results, whereas the intermediate thickness (2.50 mm) was demonstrated to be ideal for the denture base.

  12. Point process analysis of noise in early invertebrate vision

    PubMed Central

    Vinnicombe, Glenn

    2017-01-01

    Noise is a prevalent and sometimes even dominant aspect of many biological processes. While many natural systems have adapted to attenuate or even usefully integrate noise, the variability it introduces often still delimits the achievable precision across biological functions. This is particularly so for visual phototransduction, the process responsible for converting photons of light into usable electrical signals (quantum bumps). Here, randomness of both the photon inputs (regarded as extrinsic noise) and the conversion process (intrinsic noise) are seen as two distinct, independent and significant limitations on visual reliability. Past research has attempted to quantify the relative effects of these noise sources by using approximate methods that do not fully account for the discrete, point process and time ordered nature of the problem. As a result the conclusions drawn from these different approaches have led to inconsistent expositions of phototransduction noise performance. This paper provides a fresh and complete analysis of the relative impact of intrinsic and extrinsic noise in invertebrate phototransduction using minimum mean squared error reconstruction techniques based on Bayesian point process (Snyder) filters. An integrate-fire based algorithm is developed to reliably estimate photon times from quantum bumps and Snyder filters are then used to causally estimate random light intensities both at the front and back end of the phototransduction cascade. Comparison of these estimates reveals that the dominant noise source transitions from extrinsic to intrinsic as light intensity increases. By extending the filtering techniques to account for delays, it is further found that among the intrinsic noise components, which include bump latency (mean delay and jitter) and shape (amplitude and width) variance, it is the mean delay that is critical to noise performance. As the timeliness of visual information is important for real-time action, this delay could potentially limit the speed at which invertebrates can respond to stimuli. Consequently, if one wants to increase visual fidelity, reducing the photoconversion lag is much more important than improving the regularity of the electrical signal. PMID:29077703
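
    The integrate-and-fire idea used to recover photon times from the bump train can be sketched very simply: accumulate the area under the measured signal and emit an estimated photon time whenever one mean bump's worth of area has been collected. This is only a schematic reading of the approach, with the mean bump area supplied as an assumed, pre-estimated constant; the estimator in the paper is more elaborate.

    ```python
    import numpy as np

    def estimate_photon_times(bump_signal, dt, mean_bump_area):
        """Integrate the quantum-bump train and 'fire' an estimated photon time
        each time the accumulated area reaches one mean bump area."""
        acc = 0.0
        times = []
        for k, v in enumerate(bump_signal):
            acc += v * dt
            if acc >= mean_bump_area:
                times.append(k * dt)
                acc -= mean_bump_area      # reset, keeping any overshoot
        return np.array(times)
    ```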

  13. Network analysis reveals stage-specific changes in zebrafish embryo development using time course whole transcriptome profiling and prior biological knowledge.

    PubMed

    Zhang, Yuji

    2015-01-01

    Molecular networks act as the backbone of molecular activities within cells, offering a unique opportunity to better understand the mechanism of diseases. While network data usually constitute only static network maps, integrating them with time course gene expression information can provide clues to the dynamic features of these networks and unravel the mechanistic driver genes characterizing cellular responses. Time course gene expression data allow us to broadly "watch" the dynamics of the system. However, one challenge in the analysis of such data is to establish and characterize the interplay among genes that are altered at different time points in the context of a biological process or functional category. Integrative analysis of these data sources will lead us to a more complete understanding of how biological entities (e.g., genes and proteins) coordinately perform their biological functions in biological systems. In this paper, we introduced a novel network-based approach to extract functional knowledge from time-dependent biological processes at a system level using time course mRNA sequencing data in zebrafish embryo development. The proposed method was applied to investigate 1α, 25(OH)2D3-altered mechanisms in zebrafish embryo development. We applied the proposed method to a public zebrafish time course mRNA-Seq dataset, containing two different treatments along four time points. We constructed networks between gene ontology biological process categories, which were enriched in differentially expressed genes between consecutive time points and different conditions. The temporal propagation of 1α, 25-Dihydroxyvitamin D3-altered transcriptional changes started from a few genes altered at earlier stages and spread to large groups of biologically coherent genes at later stages. The most notable biological processes included neuronal and retinal development and generalized stress response. In addition, we also investigated the relationship among biological processes enriched in co-expressed genes under different conditions. The enriched biological processes include translation elongation, nucleosome assembly, and retina development. These network dynamics provide new insights into the impact of 1α, 25-Dihydroxyvitamin D3 treatment in bone and cartilage development. We developed a network-based approach to analyzing the DEGs at different time points by integrating molecular interactions and gene ontology information. These results demonstrate that the proposed approach can provide insight into the molecular mechanisms taking place in vertebrate embryo development upon treatment with 1α, 25(OH)2D3. Our approach enables the monitoring of biological processes that can serve as a basis for generating new testable hypotheses. Such a network-based integration approach can be easily extended to any temporal- or condition-dependent genomic data analyses.

  14. Geospatial Field Methods: An Undergraduate Course Built Around Point Cloud Construction and Analysis to Promote Spatial Learning and Use of Emerging Technology in Geoscience

    NASA Astrophysics Data System (ADS)

    Bunds, M. P.

    2017-12-01

    Point clouds are a powerful data source in the geosciences, and the emergence of structure-from-motion (SfM) photogrammetric techniques has allowed them to be generated quickly and inexpensively. Consequently, applications of them as well as methods to generate, manipulate, and analyze them warrant inclusion in undergraduate curriculum. In a new course called Geospatial Field Methods at Utah Valley University, students in small groups use SfM to generate a point cloud from imagery collected with a small unmanned aerial system (sUAS) and use it as a primary data source for a research project. Before creating their point clouds, students develop needed technical skills in laboratory and class activities. The students then apply the skills to construct the point clouds, and the research projects and point cloud construction serve as a central theme for the class. Intended student outcomes for the class include: technical skills related to acquiring, processing, and analyzing geospatial data; improved ability to carry out a research project; and increased knowledge related to their specific project. To construct the point clouds, students first plan their field work by outlining the field site, identifying locations for ground control points (GCPs), and loading them onto a handheld GPS for use in the field. They also estimate sUAS flight elevation, speed, and the flight path grid spacing required to produce a point cloud with the resolution required for their project goals. In the field, the students place the GCPs using handheld GPS, and survey the GCP locations using post-processed-kinematic (PPK) or real-time-kinematic (RTK) methods. The students pilot the sUAS and operate its camera according to the parameters that they estimated in planning their field work. Data processing includes obtaining accurate locations for the PPK/RTK base station and GCPs, and SfM processing with Agisoft Photoscan. The resulting point clouds are rasterized into digital surface models, assessed for accuracy, and analyzed in Geographic Information System software. Student projects have included mapping and analyzing landslide morphology, fault scarps, and earthquake ground surface rupture. Students have praised the geospatial skills they learn, whereas helping them stay on schedule to finish their projects is a challenge.
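
    The rasterization step, turning the georeferenced point cloud into a digital surface model, can be illustrated with a short gridding sketch that keeps the highest return in each cell; the cell size, array layout and function name are assumptions for illustration, not the course's prescribed workflow.

    ```python
    import numpy as np

    def points_to_dsm(xyz, cell=0.05):
        """Rasterize an (N, 3) point cloud into a DSM grid of the maximum
        elevation per cell; cells that receive no points stay NaN."""
        x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
        ix = ((x - x.min()) / cell).astype(int)
        iy = ((y - y.min()) / cell).astype(int)
        dsm = np.full((iy.max() + 1, ix.max() + 1), np.nan)
        np.fmax.at(dsm, (iy, ix), z)     # fmax ignores the NaN placeholder
        return dsm
    ```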

  15. Miniaturization for Point-of-Care Analysis: Platform Technology for Almost Every Biomedical Assay.

    PubMed

    Schumacher, Soeren; Sartorius, Dorian; Ehrentreich-Förster, Eva; Bier, Frank F

    2012-10-01

    Platform technologies for the changing needs of diagnostics are one of the main challenges in medical device technology. From one point of view, the demand for new and more versatile diagnostics is increasing due to deeper knowledge of biomarkers and their combination with diseases. From another point of view, diagnostics will become decentralized, since decisions can be made faster, resulting in higher success of therapy. Hence, new types of technologies have to be established that enable multiparameter analysis at the point of care. Within this review-like article a system called the Fraunhofer ivD-platform is introduced. It consists of a credit-card sized cartridge with integrated reagents, sensors and pumps, and a read-out/processing unit. Within the cartridge the assay runs fully automated within 15-20 minutes. Due to the open design of the platform, different analyses such as antibody, serological or DNA assays can be performed. Specific examples of these three different assay types are given to show the broad applicability of the system.

  16. Identifying and counting point defects in carbon nanotubes.

    PubMed

    Fan, Yuwei; Goldsmith, Brett R; Collins, Philip G

    2005-12-01

    The prevailing conception of carbon nanotubes and particularly single-walled carbon nanotubes (SWNTs) continues to be one of perfectly crystalline wires. Here, we demonstrate a selective electrochemical method that labels point defects and makes them easily visible for quantitative analysis. High-quality SWNTs are confirmed to contain one defect per 4 microm on average, with a distribution weighted towards areas of SWNT curvature. Although this defect density compares favourably to high-quality, silicon single-crystals, the presence of a single defect can have tremendous electronic effects in one-dimensional conductors such as SWNTs. We demonstrate a one-to-one correspondence between chemically active point defects and sites of local electronic sensitivity in SWNT circuits, confirming the expectation that individual defects may be critical to understanding and controlling variability, noise and chemical sensitivity in SWNT electronic devices. By varying the SWNT synthesis technique, we further show that the defect spacing can be varied over orders of magnitude. The ability to detect and analyse point defects, especially at very low concentrations, indicates the promise of this technique for quantitative process analysis, especially in nanoelectronics development.

  17. Analysis of the statistical properties of pulses in atmospheric corona discharge

    NASA Astrophysics Data System (ADS)

    Aubrecht, L.; Koller, J.; Plocek, J.; Stanék, Z.

    2000-03-01

    The properties of the negative corona current pulses in a single point-to-plane configuration have been extensively studied by many investigators. The amplitude and the interval of these pulses are not generally constant and depend on many variables; the repetition rate and the amplitude of the pulses fluctuate in time. Since these fluctuations follow a certain probability distribution, statistical processing was used for the analysis of the pulse fluctuations. The behavior of the pulses has also been investigated in a multipoint geometry configuration. The dependence of the corona pulse behavior on the gap length, the material and shape of the point electrode, and the number and separation of the electrodes (in the multiple-point mode) has been investigated as well; no detailed study of this case had been carried out before. Attention has also been devoted to the study of the pulses on the points of live materials (needles of coniferous trees). This contribution describes recent studies of the statistical properties of the pulses under various conditions.

  18. Spatial pattern analysis of Cu, Zn and Ni and their interpretation in the Campania region (Italy)

    NASA Astrophysics Data System (ADS)

    Petrik, Attila; Albanese, Stefano; Jordan, Gyozo; Rolandi, Roberto; De Vivo, Benedetto

    2017-04-01

    The uniquely abundant Campanian topsoil dataset enabled us to perform a spatial pattern analysis of 3 potentially toxic elements, Cu, Zn and Ni. This study focuses on revealing the spatial texture and distribution of these elements by spatial point pattern and image processing analysis, such as lineament density and spatial variability index calculation. The application of these methods to geochemical data provides a new and efficient tool to understand the spatial variation of concentrations and their background/baseline values. The determination and quantification of spatial variability is crucial to understanding how fast the concentration changes in a certain area and what processes might govern the variation. The spatial variability index calculation and image processing analysis, including lineament density, enable us to delineate homogeneous areas and analyse them with respect to lithology and land use. Spatial outliers and their patterns were also investigated by local spatial autocorrelation and image processing analysis, including the determination of local minima and maxima points and singularity index analysis. The spatial variability of Cu and Zn reveals the highest zone (Cu: 0.5 MAD, Zn: 0.8-0.9 MAD, Median Deviation Index) along the coast between Campi Flegrei and the Sorrento Peninsula, with the vast majority of statistically identified outliers and high-high spatially clustered points. The background/baseline maps of Cu and Zn reveal a moderate to high variability (Cu: 0.3 MAD, Zn: 0.4-0.5 MAD) NW-SE oriented zone, including disrupted patches from Bisaccia to Mignano, following the alluvial plains of the Apennine rivers. This zone has a high abundance of anomalous concentrations identified using singularity analysis, and it also has a high density of lineaments. The spatial variability of Ni shows the highest variability zone (0.6-0.7 MAD) around Campi Flegrei, where the majority of low outliers are concentrated. The variability of the background/baseline map of Ni shows the highest variability zones shifted to the east, coinciding with limestone outcrops. The highly segmented area between Mignano and Bisaccia partially follows the alluvial plains of the Apennine rivers, which seem to play a crucial role in the distribution and redistribution pattern of Cu, Zn and Ni in Campania. The high spatial variability zones of these elements are located in topsoils on volcanoclastic rocks and are mostly related to cultivated and urbanised areas.
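
    One simple reading of the spatial variability index used above is a moving-window median absolute deviation computed on an interpolated element raster; the sketch below shows that reading only, and the window size and normalization are assumptions rather than the authors' published recipe.

    ```python
    import numpy as np
    from scipy.ndimage import generic_filter

    def local_mad(grid, size=5):
        """Moving-window median absolute deviation over a 2-D geochemical raster,
        used here as a simple spatial variability index."""
        def mad(window_values):
            med = np.median(window_values)
            return np.median(np.abs(window_values - med))
        return generic_filter(grid, mad, size=size, mode='nearest')
    ```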

  19. The Complete, Temperature Resolved Spectrum of Methyl Formate Between 214 and 265 GHZ

    NASA Astrophysics Data System (ADS)

    McMillan, James P.; Fortman, Sarah; Neese, Christopher F.; De Lucia, Frank C.

    2015-06-01

    We have studied methyl formate, one of the so-called 'astronomical weeds', in the 214-265 GHz band. We have experimentally gathered a set of intensity calibrated, complete, and temperature resolved spectra across the astronomically significant temperature range of 248-406 K. Using our previously reported method of analysis, the point by point method, we are capable of generating the complete spectrum at an arbitrary temperature. Thousands of lines of nontrivial intensity, which were previously not included in the available astrophysical catalogs, have been found. The sensitivity of the point by point analysis is such that we are able to identify lines which would not have been manifest in a single scan across the band. The consequence has been to reveal not only a number of new methyl formate lines, but also trace amounts of contaminants. We show how the intensities from the contaminants can be removed with indiscernible impact on the signal from methyl formate. To do this we use the point by point results from our previous studies of these contaminants. The efficacy of this process serves as strong proof of concept for the use of our point by point results on the problem of the 'weeds'. The success of this approach for dealing with the weeds has also previously been reported. J. McMillan, S. Fortman, C. Neese, F. DeLucia, ApJ 795, 56 (2014); S. Fortman, J. McMillan, C. Neese, S. Randall, and A. Remijan, J. Mol. Spectrosc. 280, 11 (2012).

  20. When a new technological product launching fails: A multi-method approach of facial recognition and E-WOM sentiment analysis.

    PubMed

    Hernández-Fernández, Dra Asunción; Mora, Elísabet; Vizcaíno Hernández, María Isabel

    2018-04-17

    The dual aim of this research is, firstly, to analyze the physiological and unconscious emotional response of consumers to a new technological product and, secondly, to link this emotional response to consumers' conscious verbal reports of positive and negative product perceptions. In order to do this, biometric and self-reported measures of emotional response are combined. On the one hand, a neuromarketing experiment based on facial recognition of the emotions of 10 subjects, exposed to the physical attributes and economic information of a technological product, shows the prevalence of the ambivalent emotion of surprise. On the other hand, a netnographic qualitative approach based on sentiment analysis of online comments from 67 users characterises the valence of this emotion as mainly negative in the case and context studied. Theoretical, practical and methodological contributions are anticipated from this paper. From a theoretical point of view, this proposal contributes valuable information to the product design process, to an effective development of the marketing mix variables of price and promotion, and to a successful selection of the target market. From a practical point of view, the approach employed in the case study of the product Google Glass provides empirical evidence useful in the decision making process for this and other technological enterprises launching a new product. And from a methodological point of view, the usefulness of integrated neuromarketing-eWOM analysis could contribute to the proliferation of this tandem in marketing research. Copyright © 2018 Elsevier Inc. All rights reserved.

  1. [Comparison of acetonitrile, ethanol and chromatographic column to eliminate high-abundance proteins in human serum].

    PubMed

    Li, Yin; Liao, Ming; He, Xiao; Zhou, Yi; Luo, Rong; Li, Hongtao; Wang, Yun; He, Min

    2012-11-01

    To compare the effects of acetonitrile precipitation, ethanol precipitation and a multiple affinity removal chromatography column (Human 14) in eliminating high-abundance proteins from human serum. Elimination of serum high-abundance proteins was performed with acetonitrile precipitation, ethanol precipitation and the multiple affinity chromatography column Human 14, and the effect was assessed by Bis-Tris Mini Gel electrophoresis and two-dimensional gel electrophoresis. Grey value analysis of the 1-DE gels showed that after the serum was processed by the acetonitrile method, the multiple affinity chromatography column Human 14 method and the ethanol method, the grey value of albumin changed from the original value of 19 to 157.2, 40.8 and 8.2, respectively. 2-DE analysis indicated that with the multiple affinity chromatography column Human 14 method, the number of protein spots increased noticeably, by 137, compared to the original serum. After processing by the acetonitrile and ethanol methods the number of protein spots decreased, but low-abundance protein spots emerged. Acetonitrile precipitation could eliminate the vast majority of high-abundance proteins in serum and recover more proteins of low molecular weight; ethanol precipitation could eliminate part of the high-abundance proteins, but fewer low-abundance proteins were harvested; and the multiple affinity chromatography column Human 14 method could effectively remove the high-abundance proteins while keeping a large number of low-abundance proteins.

  2. Processing and damage recovery of intrinsic self-healing glass fiber reinforced composites

    NASA Astrophysics Data System (ADS)

    Sordo, Federica; Michaud, Véronique

    2016-08-01

    Glass fiber reinforced composites with a self-healing, supramolecular hybrid network matrix were produced using a modified vacuum assisted resin infusion moulding process adapted to high temperature processing. The quality and fiber volume fraction (50%) of the obtained materials were assessed through microscopy and matrix burn-off methods. The thermo-mechanical properties were quantified by means of dynamic mechanical analysis, revealing very high damping properties compared to traditional epoxy-based glass fiber reinforced composites. Self-healing properties were assessed by three-point bending tests. A high recovery of the flexural properties, around 72% for the elastic modulus and 65% of the maximum flexural stress, was achieved after a resting period of 24 h at room temperature. Recovery after low velocity impact events was also visually observed. This intrinsic and autonomic self-healing, highly reinforced composite material points towards semi-structural applications where high damping and/or integrity recovery after impact are required.

  3. Atmospheric Modeling And Sensor Simulation (AMASS) study

    NASA Technical Reports Server (NTRS)

    Parker, K. G.

    1984-01-01

    The capabilities of the atmospheric modeling and sensor simulation (AMASS) system were studied in order to enhance them. This system is used in processing atmospheric measurements which are utilized in the evaluation of sensor performance, conducting design-concept simulation studies, and also in the modeling of the physical and dynamical nature of atmospheric processes. The study tasks proposed in order to both enhance the AMASS system utilization and to integrate the AMASS system with other existing equipment to facilitate the analysis of data for modeling and image processing are enumerated. The following array processors were evaluated for anticipated effectiveness and/or improvements in throughput by attachment of the device to the P-e: (1) Floating Point Systems AP-120B; (2) Floating Point Systems 5000; (3) CSP, Inc. MAP-400; (4) Analogic AP500; (5) Numerix MARS-432; and (6) Star Technologies, Inc. ST-100.

  4. Migration and HIV risk: Life histories of Mexican-born men living with HIV in North Carolina

    PubMed Central

    Mann, Lilli; Valera, Erik; Hightow-Weidman, Lisa B.; Barrington, Clare

    2015-01-01

    Latino men in the Southeastern USA are disproportionately affected by HIV, but little is known about how the migration process influences HIV-related risk. In North Carolina (NC), a relatively new immigrant destination, Latino men are predominantly young and from Mexico. We conducted 31 iterative life history interviews with 15 Mexican-born men living with HIV. We used holistic content narrative analysis methods to examine HIV vulnerability in the context of migration and to identify important turning points. Major themes included the prominence of traumatic early life experiences, migration as an ongoing process rather than a finite event, and HIV diagnosis as a final turning point in migration trajectories. Findings provide a nuanced understanding of HIV vulnerability throughout the migration process and have implications including the need for bi-national HIV prevention approaches, improved outreach around early testing and linkage to care, and attention to mental health. PMID:24866206

  5. Mesoscale analysis of failure in quasi-brittle materials: comparison between lattice model and acoustic emission data.

    PubMed

    Grégoire, David; Verdon, Laura; Lefort, Vincent; Grassl, Peter; Saliba, Jacqueline; Regoin, Jean-Pierre; Loukili, Ahmed; Pijaudier-Cabot, Gilles

    2015-10-25

    The purpose of this paper is to analyse the development and the evolution of the fracture process zone during fracture and damage in quasi-brittle materials. A model taking into account the material details at the mesoscale is used to describe the failure process at the scale of the heterogeneities. This model is used to compute histograms of the relative distances between damaged points. These numerical results are compared with experimental data, where the damage evolution is monitored using acoustic emissions. Histograms of the relative distances between damage events in the numerical calculations and acoustic events in the experiments exhibit good agreement. It is shown that the mesoscale model provides relevant information from the point of view of both global responses and the local failure process. © 2015 The Authors. International Journal for Numerical and Analytical Methods in Geomechanics published by John Wiley & Sons Ltd.
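
    The comparison quantity, a histogram of relative distances between damage (or acoustic-emission) event locations, is straightforward to compute; the short sketch below assumes the event coordinates are available as an N x 3 array and uses an arbitrary bin width.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist

    def relative_distance_histogram(event_xyz, bin_width):
        """Histogram of all pairwise distances between event locations."""
        d = pdist(event_xyz)                                   # N*(N-1)/2 distances
        bins = np.arange(0.0, d.max() + bin_width, bin_width)
        counts, edges = np.histogram(d, bins=bins)
        return counts, edges
    ```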

  6. Global Situational Awareness with Free Tools

    DTIC Science & Technology

    2015-01-15

    Topics from the briefing slides include aggregating multiple data sources - Snort (Snorby on Security Onion), Nagios, SharePoint RSS, network flow data, and others - and leveraging standard data formats such as Keyhole Markup Language.

  7. A Model of Small Group Facilitator Competencies

    ERIC Educational Resources Information Center

    Kolb, Judith A.; Jin, Sungmi; Song, Ji Hoon

    2008-01-01

    This study used small group theory, quantitative and qualitative data collected from experienced practicing facilitators at three points of time, and a building block process of collection, analysis, further collection, and consolidation to develop a model of small group facilitator competencies. The proposed model has five components:…

  8. A method for analyzing temporal patterns of variability of a time series from Poincare plots.

    PubMed

    Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E

    2012-07-01

    The Poincaré plot is a popular two-dimensional, time series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding a richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis and can be applied more generally, including Poincaré plots with multiple clusters, and more consistently than the conventional measures and can address questions regarding potential structure underlying the variability of a data set.
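
    A lagged return map of the kind extended here is simple to construct: pair each sample with the sample tau steps later and summarise the spread across and along the identity line. The sketch below computes the standard SD1/SD2 descriptors over a range of delays; it illustrates the multi-delay idea only and is not the TPV measure proposed in the paper.

    ```python
    import numpy as np

    def lagged_poincare_descriptors(rr, max_lag=10):
        """SD1/SD2 of the return map (rr[n], rr[n+tau]) for tau = 1..max_lag."""
        rr = np.asarray(rr, dtype=float)
        out = []
        for tau in range(1, max_lag + 1):
            a, b = rr[:-tau], rr[tau:]
            sd1 = np.std((b - a) / np.sqrt(2), ddof=1)   # spread across the identity line
            sd2 = np.std((b + a) / np.sqrt(2), ddof=1)   # spread along the identity line
            out.append((tau, sd1, sd2))
        return out
    ```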

  9. A novel mesh processing based technique for 3D plant analysis

    PubMed Central

    2012-01-01

    Background In recent years, imaging based, automated, non-invasive, and non-destructive high-throughput plant phenotyping platforms have become popular tools for plant biology, underpinning the field of plant phenomics. Such platforms acquire and record large amounts of raw data that must be accurately and robustly calibrated, reconstructed, and analysed, requiring the development of sophisticated image understanding and quantification algorithms. The raw data can be processed in different ways, and the past few years have seen the emergence of two main approaches: 2D image processing and 3D mesh processing algorithms. Direct image quantification methods (usually 2D) dominate the current literature due to comparative simplicity. However, 3D mesh analysis provides the tremendous potential to accurately estimate specific morphological features cross-sectionally and monitor them over-time. Result In this paper, we present a novel 3D mesh based technique developed for temporal high-throughput plant phenomics and perform initial tests for the analysis of Gossypium hirsutum vegetative growth. Based on plant meshes previously reconstructed from multi-view images, the methodology involves several stages, including morphological mesh segmentation, phenotypic parameters estimation, and plant organs tracking over time. The initial study focuses on presenting and validating the accuracy of the methodology on dicotyledons such as cotton but we believe the approach will be more broadly applicable. This study involved applying our technique to a set of six Gossypium hirsutum (cotton) plants studied over four time-points. Manual measurements, performed for each plant at every time-point, were used to assess the accuracy of our pipeline and quantify the error on the morphological parameters estimated. Conclusion By directly comparing our automated mesh based quantitative data with manual measurements of individual stem height, leaf width and leaf length, we obtained the mean absolute errors of 9.34%, 5.75%, 8.78%, and correlation coefficients 0.88, 0.96, and 0.95 respectively. The temporal matching of leaves was accurate in 95% of the cases and the average execution time required to analyse a plant over four time-points was 4.9 minutes. The mesh processing based methodology is thus considered suitable for quantitative 4D monitoring of plant phenotypic features. PMID:22553969

  10. Memory persistency and nonlinearity in daily mean dew point across India

    NASA Astrophysics Data System (ADS)

    Ray, Rajdeep; Khondekar, Mofazzal Hossain; Ghosh, Koushik; Bhattacharjee, Anup Kumar

    2016-04-01

    An effort has been made in this work to estimate the persistence of memory in the daily mean dew point time series obtained from seven weather stations, viz. Kolkata, Chennai (Madras), New Delhi, Mumbai (Bombay), Bhopal, Agartala and Ahmedabad, representing different geographical zones of India. Hurst exponent values reveal an anti-persistent behaviour of these dew point series. To corroborate the Hurst exponent values, five different scaling methods have been used and the corresponding results compared in order to reach a more refined and reliable conclusion. The present analysis also indicates that the variation in daily mean dew point is governed by a non-stationary process with stationary increments. The delay vector variance (DVV) method has been exploited to investigate nonlinearity, and the present calculation confirms the presence of a deterministic nonlinear profile in the daily mean dew point time series of the seven stations.
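
    Of the several scaling methods mentioned, the classical rescaled-range (R/S) estimate of the Hurst exponent is the easiest to sketch; the window sizes and fitting range below are illustrative choices, not those used in the paper.

    ```python
    import numpy as np

    def hurst_rs(x, min_win=8, n_scales=12):
        """Rescaled-range estimate of the Hurst exponent (H < 0.5 suggests
        anti-persistence, H > 0.5 persistence)."""
        x = np.asarray(x, dtype=float)
        N = len(x)
        wins = np.unique(np.floor(np.logspace(np.log10(min_win),
                                              np.log10(N // 2), n_scales)).astype(int))
        rs_mean = []
        for w in wins:
            rs = []
            for start in range(0, N - w + 1, w):           # non-overlapping segments
                seg = x[start:start + w]
                z = np.cumsum(seg - seg.mean())             # cumulative deviations
                s = seg.std(ddof=1)
                if s > 0:
                    rs.append((z.max() - z.min()) / s)
            rs_mean.append(np.mean(rs))
        H = np.polyfit(np.log(wins), np.log(rs_mean), 1)[0]  # slope of the log-log fit
        return H
    ```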

  11. Validation of material point method for soil fluidisation analysis

    NASA Astrophysics Data System (ADS)

    Bolognin, Marco; Martinelli, Mario; Bakker, Klaas J.; Jonkman, Sebastiaan N.

    2017-06-01

    The main aim of this paper is to describe and analyse the modelling of vertical column tests that undergo fluidisation through the application of a hydraulic gradient. A recent advancement of the material point method (MPM) allows studying both stationary and non-stationary fluid flow while interacting with the solid phase. The fluidisation initiation and post-fluidisation processes of the soil are investigated with an advanced MPM formulation (Double Point), in which the behaviour of the solid and the liquid phase is evaluated separately, assigning to each of them a set of material points (MPs). The results of these simulations are compared to analytic solutions and to measurements from laboratory experiments. This work serves as a benchmark test for the MPM double point formulation in the Anura3D software and verifies the feasibility of the software for possible future engineering applications.

  12. Earth Observatory Satellite system definition study. Report no. 7: EOS system definition report. Appendixes A through D

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An analysis of the systems involved in the operation and support of the Earth Observatory Satellite (EOS) is presented. Among the systems considered are the following: (1) the data management system, (2) observatory to primary ground station communications links, (3) local user system, (4) techniques for recognizing ground control points, (5) the central data processing-implementation concept, and (6) program effectiveness analysis.

  13. AN OPTIMIZED 64X64 POINT TWO-DIMENSIONAL FAST FOURIER TRANSFORM

    NASA Technical Reports Server (NTRS)

    Miko, J.

    1994-01-01

    Scientists at Goddard have developed an efficient and powerful program-- An Optimized 64x64 Point Two-Dimensional Fast Fourier Transform-- which combines the performance of real and complex valued one-dimensional Fast Fourier Transforms (FFT's) to execute a two-dimensional FFT and its power spectrum coefficients. These coefficients can be used in many applications, including spectrum analysis, convolution, digital filtering, image processing, and data compression. The program's efficiency results from its technique of expanding all arithmetic operations within one 64-point FFT; its high processing rate results from its operation on a high-speed digital signal processor. For non-real-time analysis, the program requires as input an ASCII data file of 64x64 (4096) real valued data points. As output, this analysis produces an ASCII data file of 64x64 power spectrum coefficients. To generate these coefficients, the program employs a row-column decomposition technique. First, it performs a radix-4 one-dimensional FFT on each row of input, producing complex valued results. Then, it performs a one-dimensional FFT on each column of these results to produce complex valued two-dimensional FFT results. Finally, the program sums the squares of the real and imaginary values to generate the power spectrum coefficients. The program requires a Banshee accelerator board with 128K bytes of memory from Atlanta Signal Processors (404/892-7265) installed on an IBM PC/AT compatible computer (DOS ver. 3.0 or higher) with at least one 16-bit expansion slot. For real-time operation, an ASPI daughter board is also needed. The real-time configuration reads 16-bit integer input data directly into the accelerator board, operating on 64x64 point frames of data. The program's memory management also allows accumulation of the coefficient results. The real-time processing rate to calculate and accumulate the 64x64 power spectrum output coefficients is less than 17.0 mSec. Documentation is included in the price of the program. Source code is written in C, 8086 Assembly, and Texas Instruments TMS320C30 Assembly Languages. This program is available on a 5.25 inch 360K MS-DOS format diskette. IBM and IBM PC are registered trademarks of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation.
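
    The row-column decomposition and power-spectrum step described above map directly onto array operations; the sketch below reproduces the data flow with NumPy's generic FFT, ignoring the radix-4 and real-valued-input optimizations that give the original program its speed.

    ```python
    import numpy as np

    def power_spectrum_64x64(frame):
        """1-D FFT of every row, then of every column of that result, then the
        sum of squares of the real and imaginary parts (the power spectrum)."""
        frame = np.asarray(frame, dtype=float).reshape(64, 64)
        rows = np.fft.fft(frame, axis=1)      # row transforms (complex)
        full = np.fft.fft(rows, axis=0)       # column transforms of the row results
        return full.real**2 + full.imag**2    # 64x64 power spectrum coefficients

    # coeffs = power_spectrum_64x64(np.loadtxt("input_4096_points.txt"))
    ```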

  14. Techniques of EMG signal analysis: detection, processing, classification and applications

    PubMed Central

    Hussain, M.S.; Mohd-Yasin, F.

    2006-01-01

    Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis to provide efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human computer interaction. A comparison study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers with a good understanding of EMG signals and their analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694

  15. Using a Remotely Piloted Aircraft System (RPAS) to analyze the stability of a natural rock slope

    NASA Astrophysics Data System (ADS)

    Salvini, Riccardo; Esposito, Giuseppe; Mastrorocco, Giovanni; Seddaiu, Marcello

    2016-04-01

    This paper describes the application of a rotary wing RPAS for monitoring the stability of a natural rock slope in the municipality of Vecchiano (Pisa, Italy). The slope under investigation is approximately oriented NNW-SSE and has a length of about 320 m; elevation ranges from about 7 to 80 m a.s.l.. The hill consists of stratified limestone, locally densely fractured, with dip direction predominantly oriented normal to the slope. Fracture traces of variable length, from decimetres to metres, penetrate into the rock face to depths that are difficult to estimate and often exceed one metre. The intersection between different fracture systems and the slope surface generates rocky blocks and wedges of variable size that may be subject to gravitational instability (with reference to variations in hydraulic and dynamic conditions). Geometrical and structural information about the rock mass, necessary to perform the slope stability analysis, was obtained in this work from geo-referenced 3D point clouds acquired using photogrammetric and laser scanning techniques. In particular, a terrestrial laser scan was carried out from two different points of view using a Leica Scanstation2. The laser survey created many shadows in the data, due to the presence of vegetation in the lower parts of the slope, limiting the feasibility of the geo-structural survey. To overcome this limitation, we utilized a rotary wing Aibotix Aibot X6 RPAS equipped with a Nikon D3200 camera. The drone flights were executed in manual mode and the images were acquired, according to the characteristics of the outcrops, under different acquisition angles. Furthermore, photos were captured very close to the slope face (a few metres), allowing a dense 3D point cloud (about 80 million points) to be produced by image processing. A topographic survey was carried out in order to guarantee the necessary spatial accuracy for the exterior orientation of the images. The coordinates of the GCPs were calculated through the post-processing of data collected using two GPS receivers, operating in static mode, and a Total Station. The photogrammetric processing of the image blocks allowed us to create a 3D point cloud, DTM, orthophoto, and 3D textured model with a high level of cartographic detail. Discontinuities were deterministically characterized in terms of attitude, persistence, and spacing. Moreover, the main discontinuity sets were identified through a density analysis of attitudes in stereographic projection. In addition, the size and shape of potentially unstable blocks identified along the rock slope were measured. Finally, using additional data from traditional engineering-geological surveys carried out on accessible outcrops, the kinematic and dynamic stability analysis of the rock slope was performed. Results from this step have provided deterministic safety factors for rock blocks and wedges, and will be used by local Authorities to plan protection works. Results from this application show the great advantage of modern RPASs, which can be successfully applied to the analysis of sub-vertical rock slopes, especially in areas that are either difficult to access with traditional techniques or masked by vegetation. KEY WORDS: 3D point cloud, RPAS photogrammetry, Terrestrial laser scanning, Rock slope, Fracture mapping, Stability analysis

  16. A new method for mapping multidimensional data to lower dimensions

    NASA Technical Reports Server (NTRS)

    Gowda, K. C.

    1983-01-01

    A multispectral mapping method is proposed which is based on the new concept of BEND (Bidimensional Effective Normalised Difference). The method, which involves taking one sample point at a time and finding the interrelationships between its features, is found to be very economical in terms of storage and processing time. It has good dimensionality reduction and clustering properties, and is highly suitable for computer analysis of large amounts of data. The transformed values obtained by this procedure are suitable either for a planar 2-space mapping of geological sample points or for making grayscale and color images of geo-terrains. A few examples are given to demonstrate the efficacy of the proposed procedure.

  17. Seeking a fingerprint: analysis of point processes in actigraphy recording

    NASA Astrophysics Data System (ADS)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
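
    One of the measures mentioned above, the survival function of resting-bout durations and its log-log slope, is straightforward to compute from a binarized activity series. The sketch below uses synthetic data and an arbitrary activity threshold purely for illustration:

    ```python
    import numpy as np

    def rest_bout_durations(activity, threshold):
        """Durations (in samples) of consecutive runs below an activity threshold."""
        resting = activity < threshold
        durations, run = [], 0
        for r in resting:
            if r:
                run += 1
            elif run > 0:
                durations.append(run)
                run = 0
        if run > 0:
            durations.append(run)
        return np.array(durations)

    # Hypothetical actigraphy counts (heavy-tailed synthetic series, one sample per epoch)
    activity = np.abs(np.random.standard_cauchy(20000))
    durations = rest_bout_durations(activity, threshold=1.0)

    # Empirical survival function S(t) = P(duration > t) and its slope in log-log coordinates
    t = np.sort(durations)
    surv = 1.0 - np.arange(1, len(t) + 1) / len(t)
    mask = (t > 0) & (surv > 0)
    slope, _ = np.polyfit(np.log(t[mask]), np.log(surv[mask]), 1)
    print("survival-function exponent:", slope)
    ```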

  18. A triangular thin shell finite element: Nonlinear analysis. [structural analysis

    NASA Technical Reports Server (NTRS)

    Thomas, G. R.; Gallagher, R. H.

    1975-01-01

    Aspects of the formulation of a triangular thin shell finite element which pertain to geometrically nonlinear (small strain, finite displacement) behavior are described. The procedure for solution of the resulting nonlinear algebraic equations combines a one-step incremental (tangent stiffness) approach with one iteration in the Newton-Raphson mode. A method is presented which permits a rational estimation of step size in this procedure. Limit points are calculated by means of a superposition scheme coupled to the incremental side of the solution procedure while bifurcation points are calculated through a process of interpolation of the determinants of the tangent-stiffness matrix. Numerical results are obtained for a flat plate and two curved shell problems and are compared with alternative solutions.

  19. Parameter estimation for terrain modeling from gradient data. [navigation system for Martian rover

    NASA Technical Reports Server (NTRS)

    Dangelo, K. R.

    1974-01-01

    A method is developed for modeling terrain surfaces for use on an unmanned Martian roving vehicle. The modeling procedure employs a two-step process which uses gradient as well as height data in order to improve the accuracy of the model's gradient. Least-squares approximation is used to stochastically determine the parameters which describe the modeled surface. A complete error analysis of the modeling procedure is included, which determines the effect of instrumental measurement errors on the model's accuracy. Computer simulation is used as a means of testing the entire modeling process, which includes the acquisition of data points, the two-step modeling process, and the error analysis. Finally, to illustrate the procedure, a numerical example is included.
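
    The idea of combining height and gradient measurements in a single least-squares fit can be sketched as follows for a quadratic surface model (the surface form, noise levels, and sample layout are illustrative assumptions, not those of the report):

    ```python
    import numpy as np

    def fit_quadratic_surface(xy, z, grads):
        """Least-squares fit of z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2
        using both height and gradient observations (illustrative sketch only).

        xy    : (N, 2) sample locations
        z     : (N,) measured heights
        grads : (N, 2) measured gradients (dz/dx, dz/dy)
        """
        x, y = xy[:, 0], xy[:, 1]
        ones, zeros = np.ones_like(x), np.zeros_like(x)
        A_h = np.column_stack([ones, x, y, x**2, x*y, y**2])          # height rows
        A_gx = np.column_stack([zeros, ones, zeros, 2*x, y, zeros])   # dz/dx rows
        A_gy = np.column_stack([zeros, zeros, ones, zeros, x, 2*y])   # dz/dy rows
        A = np.vstack([A_h, A_gx, A_gy])
        b = np.concatenate([z, grads[:, 0], grads[:, 1]])
        params, *_ = np.linalg.lstsq(A, b, rcond=None)
        return params

    # Hypothetical noisy samples of a known surface z = 1 + 0.5x - 0.2y + 0.1x^2
    rng = np.random.default_rng(3)
    xy = rng.uniform(-5, 5, size=(30, 2))
    true = np.array([1.0, 0.5, -0.2, 0.1, 0.0, 0.0])
    design = np.column_stack([np.ones(30), xy[:, 0], xy[:, 1], xy[:, 0]**2,
                              xy[:, 0]*xy[:, 1], xy[:, 1]**2])
    z = design @ true + 0.05 * rng.standard_normal(30)
    grads = np.column_stack([0.5 + 0.2 * xy[:, 0], -0.2 * np.ones(30)]) \
            + 0.05 * rng.standard_normal((30, 2))
    print(np.round(fit_quadratic_surface(xy, z, grads), 3))
    ```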

  20. Fractal Point Process and Queueing Theory and Application to Communication Networks

    DTIC Science & Technology

    1999-12-31

    use of nonlinear dynamics and chaos in the design of innovative analog error-protection codes for communications applications. In the chaos...the following theses, patent, and papers. 1. A. Narula, M. D. Trott, and G. W. Wornell, "Information-Theoretic Analysis of Multiple-Antenna...Bounds," in Proc. Int. Conf. Dec. Control, (Japan), Dec. 1996. 5. G. W. Wornell and M. D. Trott, "Efficient Signal Processing Techniques for

  1. Safe driving and executive functions in healthy middle-aged drivers.

    PubMed

    León-Domínguez, Umberto; Solís-Marcos, Ignacio; Barrio-Álvarez, Elena; Barroso Y Martín, Juan Manuel; León-Carrión, José

    2017-01-01

    The introduction of the point system driver's license in several European countries could offer a valid framework for evaluating driving skills. This is the first study to use this framework to assess the functional integrity of executive functions in middle-aged drivers with full points, partial points or no points on their driver's license (N = 270). The purpose of this study is to find differences in executive functions that could be determinants of safe driving. Cognitive tests were used to assess attention processes, processing speed, planning, cognitive flexibility, and inhibitory control. Analyses of covariance (ANCOVAs) were used for group comparisons while adjusting for education level. The Bonferroni method was used to correct for multiple comparisons. Overall, drivers with full points on their license showed better scores than the other two groups. In particular, significant differences were found in reaction times on Simple and Conditioned Attention tasks (both p-values < 0.001) and in the number of type-III errors on the Tower of Hanoi task (p = 0.026). Differences in reaction time on attention tasks could serve as neuropsychological markers for safe driving. Further analysis should be conducted in order to determine the behavioral impact of impaired executive functioning on driving ability.

  2. Film characteristics pertinent to coherent optical data processing systems.

    PubMed

    Thomas, C E

    1972-08-01

    Photographic film is studied quantitatively as the input mechanism for coherent optical data recording and processing systems. The two important film characteristics are the amplitude transmission vs exposure (T(A) - E) curve and the film noise power spectral density. Both functions are measured as a function of the type of film, the type of developer, developer time and temperature, and the exposing and readout light wavelengths. A detailed analysis of a coherent optical spatial frequency analyzer reveals that the optimum dc bias point for 649-F film is an amplitude transmission of about 70%. This operating point yields minimum harmonic and intermodulation distortion, whereas the 50% amplitude transmission bias point recommended by holographers yields maximum diffraction efficiency. It is also shown that the effective ac gain or contrast of the film is nearly independent of the development conditions for a given film. Finally, the linear dynamic range of one particular coherent optical spatial frequency analyzer is shown to be about 40-50 dB.

  3. A Machine Learning Approach to Pedestrian Detection for Autonomous Vehicles Using High-Definition 3D Range Data

    PubMed Central

    Navarro, Pedro J.; Fernández, Carlos; Borraz, Raúl; Alonso, Diego

    2016-01-01

    This article describes an automated sensor-based system to detect pedestrians in an autonomous vehicle application. Although the vehicle is equipped with a broad set of sensors, the article focuses on the processing of the information generated by a Velodyne HDL-64E LIDAR sensor. The cloud of points generated by the sensor (more than 1 million points per revolution) is processed to detect pedestrians, by selecting cubic shapes and applying machine vision and machine learning algorithms to the XY, XZ, and YZ projections of the points contained in the cube. The work reports an exhaustive analysis of the performance of three different machine learning algorithms: k-Nearest Neighbours (kNN), Naïve Bayes classifier (NBC), and Support Vector Machine (SVM). These algorithms have been trained with 1931 samples. The final performance of the method, measured in a real traffic scenario containing 16 pedestrians and 469 samples of non-pedestrians, shows sensitivity (81.2%), accuracy (96.2%) and specificity (96.8%). PMID:28025565

  4. Message survival and decision dynamics in a class of reactive complex systems subject to external fields

    NASA Astrophysics Data System (ADS)

    Rodriguez Lucatero, C.; Schaum, A.; Alarcon Ramos, L.; Bernal-Jaquez, R.

    2014-07-01

    In this study, the dynamics of decisions in complex networks subject to external fields are studied within a Markov process framework using nonlinear dynamical systems theory. A mathematical discrete-time model is derived using a set of basic assumptions regarding the convincement mechanisms associated with two competing opinions. The model is analyzed with respect to the multiplicity of critical points and the stability of extinction states. Sufficient conditions for extinction are derived in terms of the convincement probabilities and the maximum eigenvalues of the associated connectivity matrices. The influences of exogenous (e.g., mass media-based) effects on decision behavior are analyzed qualitatively. The current analysis predicts: (i) the presence of fixed-point multiplicity (with a maximum number of four different fixed points), multi-stability, and sensitivity with respect to the process parameters; and (ii) the bounded but significant impact of exogenous perturbations on the decision behavior. These predictions were verified using a set of numerical simulations based on a scale-free network topology.

  5. Obesity Energetics: Body Weight Regulation and the Effects of Diet Composition

    PubMed Central

    Hall, Kevin D.; Guo, Juen

    2017-01-01

    Weight changes are accompanied by imbalances between calorie intake and expenditure. This fact is often misinterpreted to suggest that obesity is caused by gluttony and sloth and can be treated by simply advising people to eat less and move more. However, various components of energy balance are dynamically interrelated and weight loss is resisted by counterbalancing physiological processes. While low-carbohydrate diets have been suggested to partially subvert these processes by increasing energy expenditure and promoting fat loss, our meta-analysis of 32 controlled feeding studies with isocaloric substitution of carbohydrate for fat found that both energy expenditure (26 kcal/d; P <.0001) and fat loss (16 g/d; P <.0001) were greater with lower fat diets. We review the components of energy balance and the mechanisms acting to resist weight loss in the context of static, settling point, and set-point models of body weight regulation, with the set-point model being most commensurate with current data. PMID:28193517

  6. A Machine Learning Approach to Pedestrian Detection for Autonomous Vehicles Using High-Definition 3D Range Data.

    PubMed

    Navarro, Pedro J; Fernández, Carlos; Borraz, Raúl; Alonso, Diego

    2016-12-23

    This article describes an automated sensor-based system to detect pedestrians in an autonomous vehicle application. Although the vehicle is equipped with a broad set of sensors, the article focuses on the processing of the information generated by a Velodyne HDL-64E LIDAR sensor. The cloud of points generated by the sensor (more than 1 million points per revolution) is processed to detect pedestrians, by selecting cubic shapes and applying machine vision and machine learning algorithms to the XY, XZ, and YZ projections of the points contained in the cube. The work reports an exhaustive analysis of the performance of three different machine learning algorithms: k-Nearest Neighbours (kNN), Naïve Bayes classifier (NBC), and Support Vector Machine (SVM). These algorithms have been trained with 1931 samples. The final performance of the method, measured in a real traffic scenario containing 16 pedestrians and 469 samples of non-pedestrians, shows sensitivity (81.2%), accuracy (96.2%) and specificity (96.8%).

  7. A Comparative Study Using Geometric and Vertical Profile Features Derived from Airborne LIDAR for Classifying Tree Genera

    NASA Astrophysics Data System (ADS)

    Ko, C.; Sohn, G.; Remmel, T. K.

    2012-07-01

    We present a comparative study between two different approaches for tree genera classification using descriptors derived from tree geometry and those derived from the vertical profile analysis of LiDAR point data. The different methods provide two perspectives for processing LiDAR point clouds for tree genera identification. The geometric perspective analyzes individual tree crowns using characteristics of clusters and line segments derived within crowns, as well as overall tree shapes, to highlight the spatial distribution of LiDAR points within the crown. Conversely, analyzing vertical profiles retrieves information about the point distributions with respect to height percentiles; this perspective emphasizes the importance of point distributions at specific heights, accommodating the decrease in point density with depth of canopy penetration by LiDAR pulses. The targeted species include white birch, maple, oak, poplar, white pine and jack pine at a study site northeast of Sault Ste. Marie, Ontario, Canada.

  8. Hazard Analysis of Japanese Boxed Lunches (Bento).

    ERIC Educational Resources Information Center

    Bryan, Frank L.; And Others

    1991-01-01

    For the purposes of identifying contaminants, of assessing risks, and of determining critical food processing control points, hazard analyses were conducted at two "bento" (oriental boxed meals) catering operations. Time and temperature abuses during the holding period, after cooking and prior to consumption, were found to be the primary…

  9. Moving Average Models with Bivariate Exponential and Geometric Distributions.

    DTIC Science & Technology

    1985-03-01

    ordinary time series and of point processes. Developments in Statistics, Vol. 1, P.R. Krishnaiah, ed. Academic Press, New York. [9] Esary, J.D. and...valued and discrete-valued time series with ARMA correlation structure. Multivariate Analysis V, P.R. Krishnaiah, ed. North-Holland. 151-166. [28

  10. 40 CFR 406.11 - Specialized definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... STANDARDS GRAIN MILLS POINT SOURCE CATEGORY Corn Wet Milling Subcategory § 406.11 Specialized definitions... and methods of analysis set forth in 40 CFR part 401 shall apply to this subpart. (b) The term corn shall mean the shelled corn delivered to a plant before processing. (c) The term standard bushel shall...

  11. 78 FR 72626 - Notice of Request for Renewal of a Currently Approved Information Collection (Pathogen Reduction...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-03

    ... reduction and Hazard Analysis and Critical Control Point (HACCP) Systems requirements because OMB approval... February 28, 2014. FSIS has established requirements applicable to meat and poultry establishments designed.... coli by slaughter establishments to verify the adequacy of the establishment's process controls for the...

  12. Building Regression Models: The Importance of Graphics.

    ERIC Educational Resources Information Center

    Dunn, Richard

    1989-01-01

    Points out reasons for using graphical methods to teach simple and multiple regression analysis. Argues that a graphically oriented approach has considerable pedagogic advantages in the exposition of simple and multiple regression. Shows that graphical methods may play a central role in the process of building regression models. (Author/LS)

  13. 78 FR 69689 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2013-N-1427... Critical Control Point Procedures for the Safe and Sanitary Processing and Importing of Juice AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food and Drug Administration (FDA) is...

  14. Analysis of I Marine Expeditionary Force Support Team Reset Operations

    DTIC Science & Technology

    2013-06-01

    in the process. For the hands that have assisted in providing information and that have truly been a catalyst and a crutch for the completion...simulation-Marine Corps. Point paper. Retrieved February 20, 2013 from http://www.ehqmc.usmc.mil/org/IL/ Burton, L.D. (2005). Strategic Inventory

  15. Improving the Analysis of Anthocyanidins from Blueberries Using Response Surface Methodology

    USDA-ARS?s Scientific Manuscript database

    Background: Recent interest in the health promoting potential of anthocyanins points to the need for robust and reliable analytical methods. It is essential to know that the health promoting chemicals are present in juices and other products processed from whole fruit. Many different methods have be...

  16. A Behavioral Analysis of the Laboratory Learning Process: Redesigning a Teaching Unit on Recrystallization.

    ERIC Educational Resources Information Center

    Mulder, T.; Verdonk, A. H.

    1984-01-01

    Reports on a project in which observations of student and teaching assistant behavior were used to redesign a teaching unit on recrystallization. Comments on the instruction manual, starting points for teaching the unit, and list of objectives with related tasks are included. (JN)

  17. Bayesian Computation for Log-Gaussian Cox Processes: A Comparative Analysis of Methods

    PubMed Central

    Teng, Ming; Nathoo, Farouk S.; Johnson, Timothy D.

    2017-01-01

    The Log-Gaussian Cox Process is a commonly used model for the analysis of spatial point pattern data. Fitting this model is difficult because of its doubly-stochastic property, i.e., it is a hierarchical combination of a Poisson process at the first level and a Gaussian process at the second level. Various methods have been proposed to estimate such a process, including traditional likelihood-based approaches as well as Bayesian methods. We focus here on Bayesian methods and several approaches that have been considered for model fitting within this framework, including Hamiltonian Monte Carlo, the integrated nested Laplace approximation, and variational Bayes. We consider these approaches and make comparisons with respect to statistical and computational efficiency. These comparisons are made through several simulation studies as well as through two applications, the first examining ecological data and the second involving neuroimaging data. PMID:29200537
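
    The doubly-stochastic structure is easy to make concrete with a small simulation sketch: a Gaussian random field supplies the log-intensity, and Poisson counts are then drawn cell by cell given that realization (grid size and covariance parameters below are arbitrary illustrative choices):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Discretise the unit square into an n x n grid of cell centres
    n = 40
    xs = np.linspace(0.0, 1.0, n)
    xx, yy = np.meshgrid(xs, xs)
    cells = np.column_stack([xx.ravel(), yy.ravel()])

    # Second level: Gaussian random field for the log-intensity (squared-exponential covariance)
    d = np.linalg.norm(cells[:, None, :] - cells[None, :, :], axis=-1)
    cov = np.exp(-(d / 0.2) ** 2) + 1e-6 * np.eye(len(cells))
    log_intensity = 4.0 + np.linalg.cholesky(cov) @ rng.standard_normal(len(cells))

    # First level: Poisson counts in each cell, given the realised intensity
    cell_area = (1.0 / n) ** 2
    counts = rng.poisson(np.exp(log_intensity) * cell_area)
    print("total simulated points:", int(counts.sum()))
    ```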

  18. Analysis of energy recovery potential using innovative technologies of waste gasification.

    PubMed

    Lombardi, Lidia; Carnevale, Ennio; Corti, Andrea

    2012-04-01

    In this paper, two alternative thermo-chemical processes for waste treatment were analysed: high-temperature gasification and gasification combined with a plasma process. The two processes were analysed from the thermodynamic point of view by constructing two simplified models, using appropriate simulation tools and some supporting data from existing/planned plants, able to predict the energy recovery performance of each process. In order to carry out a comparative analysis, the same waste stream was considered as input to the two models and the generated results were compared. The performance was compared with that obtained from a conventional combustion with energy recovery process by means of a steam turbine cycle. Results are reported in terms of energy recovery performance indicators such as overall energy efficiency, specific energy production per unit mass of entering waste, primary energy source savings, and specific carbon dioxide production. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Development and evaluation of spatial point process models for epidermal nerve fibers.

    PubMed

    Olsbo, Viktor; Myllymäki, Mari; Waller, Lance A; Särkkä, Aila

    2013-06-01

    We propose two spatial point process models for the spatial structure of epidermal nerve fibers (ENFs) across human skin. The models derive from two point processes, Φb and Φe, describing the locations of the base and end points of the fibers. Each point of Φe (the end point process) is connected to a unique point in Φb (the base point process). In the first model, both Φe and Φb are Poisson processes, yielding a null model of uniform coverage of the skin by end points and general baseline results and reference values for moments of key physiologic indicators. The second model provides a mechanistic model to generate end points for each base, and we model the branching structure more directly by defining Φe as a cluster process conditioned on the realization of Φb as its parent points. In both cases, we derive distributional properties for observable quantities of direct interest to neurologists such as the number of fibers per base, and the direction and range of fibers on the skin. We contrast both models by fitting them to data from skin blister biopsy images of ENFs and provide inference regarding physiological properties of ENFs. Copyright © 2013 Elsevier Inc. All rights reserved.
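
    A hedged simulation sketch of the second (cluster) model described above: base points come from a homogeneous Poisson process, and each base generates a Poisson number of end points displaced around it. All rates and displacement parameters below are illustrative, not fitted values from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Base point process: homogeneous Poisson on a W x H observation window (units arbitrary)
    W, H = 1.0, 1.0
    lambda_base = 50.0                       # expected base points per unit area
    n_base = rng.poisson(lambda_base * W * H)
    base = rng.uniform([0.0, 0.0], [W, H], size=(n_base, 2))

    # End point process: cluster process conditioned on the base points.
    # Each base gets a Poisson number of fibers; end points are offset by a
    # random direction and an exponential range (illustrative parameters).
    mean_fibers = 3.0
    mean_range = 0.03
    ends, parent = [], []
    for i, b in enumerate(base):
        for _ in range(rng.poisson(mean_fibers)):
            theta = rng.uniform(0.0, 2.0 * np.pi)
            r = rng.exponential(mean_range)
            ends.append(b + r * np.array([np.cos(theta), np.sin(theta)]))
            parent.append(i)
    ends = np.array(ends)

    print("bases:", n_base, "end points:", len(ends),
          "mean fibers per base:", len(ends) / n_base)
    ```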

  20. Extension of the tridiagonal reduction (FEER) method for complex eigenvalue problems in NASTRAN

    NASA Technical Reports Server (NTRS)

    Newman, M.; Mann, F. I.

    1978-01-01

    As in the case of real eigenvalue analysis, the eigensolutions closest to a selected point in the eigenspectrum were extracted from a reduced, symmetric, tridiagonal eigenmatrix whose order was much lower than that of the full size problem. The reduction process was effected automatically, and thus avoided the arbitrary lumping of masses and other physical quantities at selected grid points. The statement of the algebraic eigenvalue problem admitted mass, damping, and stiffness matrices which were unrestricted in character, i.e., they might be real, symmetric or nonsymmetric, singular or nonsingular.

  1. A generalized adaptive mathematical morphological filter for LIDAR data

    NASA Astrophysics Data System (ADS)

    Cui, Zheng

    Airborne Light Detection and Ranging (LIDAR) technology has become the primary method to derive high-resolution Digital Terrain Models (DTMs), which are essential for studying Earth's surface processes, such as flooding and landslides. The critical step in generating a DTM is to separate ground and non-ground measurements in a voluminous LIDAR point dataset, using a filter, because the DTM is created by interpolating ground points. As one of the widely used filtering methods, the progressive morphological (PM) filter has the advantages of classifying the LIDAR data at the point level, linear computational complexity, and preservation of the geometric shapes of terrain features. The filter works well in an urban setting with a gentle slope and a mixture of vegetation and buildings. However, the PM filter often removes ground measurements incorrectly in topographically high areas, along with large non-ground objects, because it uses a constant threshold slope, resulting in "cut-off" errors. A novel cluster analysis method was developed in this study and incorporated into the PM filter to prevent the removal of ground measurements at topographic highs. Furthermore, to obtain optimal filtering results for an area with undulating terrain, a trend analysis method was developed to adaptively estimate the slope-related thresholds of the PM filter based on changes in topographic slope and the characteristics of non-terrain objects. The comparison of the PM and generalized adaptive PM (GAPM) filters for selected study areas indicates that the GAPM filter preserves most of the "cut-off" points removed incorrectly by the PM filter. The application of the GAPM filter to seven ISPRS benchmark datasets shows that the GAPM filter reduces the filtering error by 20% on average, compared with the method used by the popular commercial software TerraScan. The combination of the cluster method, adaptive trend analysis, and the PM filter allows users without much experience in processing LIDAR data to effectively and efficiently identify ground measurements for complex terrains in a large LIDAR dataset. The GAPM filter is highly automatic and requires little human input. Therefore, it can significantly reduce the effort of manually processing voluminous LIDAR measurements.
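
    A simplified, grid-based sketch of the progressive morphological idea underlying the PM filter (not the GAPM filter itself; the window sizes, slope and elevation-difference thresholds below are illustrative assumptions):

    ```python
    import numpy as np
    from scipy.ndimage import grey_opening

    def pm_ground_mask(dem, cell_size, slope=0.3, dh0=0.2, dh_max=2.5, windows=(3, 5, 9, 17)):
        """Simplified grid-based progressive morphological filter (illustrative only).

        dem       : 2-D array of minimum elevations per grid cell
        cell_size : grid cell size in metres
        slope     : assumed terrain slope used to grow the elevation-difference threshold
        Returns a boolean mask of cells kept as ground.
        """
        ground = np.ones(dem.shape, dtype=bool)
        surface = dem.astype(float).copy()
        prev_w = 1
        for w in windows:
            opened = grey_opening(surface, size=(w, w))           # morphological opening
            dh = min(dh0 + slope * (w - prev_w) * cell_size, dh_max)
            ground &= (surface - opened) <= dh                    # large residuals -> non-ground
            surface, prev_w = opened, w
        return ground

    # Hypothetical 1 m DEM: a gentle ramp with a 10 m wide, 8 m high building-like block
    dem = np.fromfunction(lambda i, j: 0.05 * j, (100, 100))
    dem[45:55, 45:55] += 8.0
    print("ground cells kept:", int(pm_ground_mask(dem, cell_size=1.0).sum()))
    ```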

  2. Multi-channel Analysis of Passive Surface Waves (MAPS)

    NASA Astrophysics Data System (ADS)

    Xia, J.; Cheng, F. Mr; Xu, Z.; Wang, L.; Shen, C.; Liu, R.; Pan, Y.; Mi, B.; Hu, Y.

    2017-12-01

    Urbanization is an inevitable trend in the modernization of human society. At the end of 2013 the Chinese Central Government launched a national urbanization plan—"Three 100 Million People", which aggressively and steadily pushes forward urbanization. Based on the plan, by 2020, approximately 100 million people from rural areas will permanently settle in towns, dwelling conditions of about 100 million people in towns and villages will be improved, and about 100 million people in central and western China will permanently settle in towns. China's urbanization process will run at the highest speed in the urbanization history of China. Environmentally friendly, non-destructive and non-invasive geophysical assessment methods have played an important role in the urbanization process in China. Because of human noise and electromagnetic fields due to industrial activity, the geophysical methods already used in urban environments (gravity, magnetics, electricity, seismic) face great challenges. However, human activity provides an effective source for passive seismic methods. Claerbout pointed out that the wavefield that would be received at one point with excitation at the other point can be reconstructed by calculating the cross-correlation of noise records at two surface points. Based on this idea (cross-correlation of two noise records) and the virtual source method, we proposed Multi-channel Analysis of Passive Surface Waves (MAPS). MAPS mainly uses traffic noise recorded with a linear receiver array. Because Multi-channel Analysis of Surface Waves produces a high-resolution shear (S) wave velocity model in the shallow part of the subsurface, MAPS combines acquisition and processing of active-source and passive-source data in the same flow, without requiring them to be distinguished. MAPS also allows real-time quality control of the noise recording, which is important for near-surface applications in urban environments. Numerical and real-world examples demonstrated that MAPS can be used for accurate and fast imaging of high-frequency surface wave energy, and some examples also show that high-quality images similar to those from active sources can be generated using only a few minutes of noise. Using cultural noise in towns, MAPS can image the S-wave velocity structure from the ground surface to depths of hundreds of metres.
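
    The core operation quoted from Claerbout (cross-correlating noise recorded at two surface points) can be sketched in a few lines of NumPy; with enough windows, the stacked cross-correlation approximates the inter-receiver (virtual-source) response. Here it is verified on synthetic noise with a known 25-sample delay:

    ```python
    import numpy as np

    def virtual_source_trace(rec_a, rec_b, win, max_lag):
        """Stack windowed cross-correlations of two noise records.

        rec_a, rec_b : equal-length 1-D noise records from two receivers
        win          : window length in samples
        max_lag      : maximum lag (samples) kept on each side of zero
        """
        lags = np.arange(-max_lag, max_lag + 1)
        stack = np.zeros(len(lags))
        for start in range(0, len(rec_a) - win, win):
            a = rec_a[start:start + win] - rec_a[start:start + win].mean()
            b = rec_b[start:start + win] - rec_b[start:start + win].mean()
            xc = np.correlate(a, b, mode="full")        # length 2*win - 1, zero lag at win - 1
            stack += xc[win - 1 - max_lag: win - 1 + max_lag + 1]
        return lags, stack / np.abs(stack).max()

    # Synthetic test: the same noise arrives at receiver B 25 samples after receiver A
    noise = np.random.randn(60000)
    lags, trace = virtual_source_trace(noise[:-25], noise[25:], win=2000, max_lag=200)
    print("peak lag (samples):", lags[np.argmax(np.abs(trace))])   # expected: 25
    ```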

  3. A Case Study on Improving Intensive Care Unit (ICU) Services Reliability: By Using Process Failure Mode and Effects Analysis (PFMEA)

    PubMed Central

    Yousefinezhadi, Taraneh; Jannesar Nobari, Farnaz Attar; Goodari, Faranak Behzadi; Arab, Mohammad

    2016-01-01

    Introduction: In any complex human system, human error is inevitable and cannot be eliminated by blaming wrongdoers. So, with the aim of improving Intensive Care Unit (ICU) reliability in hospitals, this research tries to identify and analyze ICU process failure modes from the standpoint of a systematic approach to errors. Methods: In this descriptive research, data were gathered qualitatively by observations, document reviews, and Focus Group Discussions (FGDs) with the process owners in two selected ICUs in Tehran in 2014. Data analysis, however, was quantitative, based on the failures' Risk Priority Numbers (RPN) according to the Failure Modes and Effects Analysis (FMEA) method used. In addition, some causes of failures were analyzed with the qualitative Eindhoven Classification Model (ECM). Results: Through the FMEA methodology, 378 potential failure modes from 180 ICU activities in hospital A and 184 potential failures from 99 ICU activities in hospital B were identified and evaluated. Then, with 90% reliability (RPN ≥ 100), a total of 18 failures in hospital A and 42 in hospital B were identified as non-acceptable risks, and their causes were analyzed by ECM. Conclusions: Applying modified PFMEA to improve the process reliability of two selected ICUs in two different kinds of hospitals shows that this method empowers staff to identify, evaluate, prioritize and analyze all potential failure modes, and also makes them eager to identify causes, recommend corrective actions and even participate in improving the process without feeling blamed by top management. Moreover, by combining FMEA and ECM, team members can easily identify failure causes from a health care perspective. PMID:27157162
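
    The risk-prioritisation step is simple to reproduce: the Risk Priority Number is the product of severity, occurrence and detectability scores, and failures at or above the chosen cut-off (RPN ≥ 100 in this study) are treated as non-acceptable. A minimal sketch with hypothetical failure modes and scores:

    ```python
    # Minimal FMEA risk-prioritisation sketch with hypothetical ICU failure modes.
    # RPN = severity x occurrence x detectability; cut-off follows the study (RPN >= 100).
    failures = [
        {"mode": "medication dose transcription error", "S": 8, "O": 4, "D": 4},
        {"mode": "ventilator alarm not audible",         "S": 9, "O": 2, "D": 3},
        {"mode": "delayed lab result hand-off",          "S": 5, "O": 5, "D": 3},
    ]

    for f in failures:
        f["RPN"] = f["S"] * f["O"] * f["D"]

    non_acceptable = [f for f in failures if f["RPN"] >= 100]
    for f in sorted(non_acceptable, key=lambda f: -f["RPN"]):
        print(f["RPN"], f["mode"])
    ```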

  4. A graph signal filtering-based approach for detection of different edge types on airborne lidar data

    NASA Astrophysics Data System (ADS)

    Bayram, Eda; Vural, Elif; Alatan, Aydin

    2017-10-01

    Airborne Laser Scanning is a well-known remote sensing technology, which provides a dense and highly accurate, yet unorganized, point cloud of the earth's surface. During the last decade, extracting information from the data generated by airborne LiDAR systems has been addressed by many studies in geo-spatial analysis and urban monitoring applications. However, the processing of LiDAR point clouds is challenging due to their irregular structure and 3D geometry. In this study, we propose a novel framework for the detection of the boundaries of an object or scene captured by LiDAR. Our approach is motivated by edge detection techniques in vision research and it is built on graph signal filtering, an exciting and promising field of signal processing for irregular data types. Owing to the convenient applicability of graph signal processing tools to unstructured point clouds, we achieve the detection of edge points directly on 3D data by using a graph representation that is constructed exclusively to meet the requirements of the application. Moreover, considering the elevation data as the (graph) signal, we leverage the aerial character of the airborne LiDAR data. The proposed method can be employed both for discovering the jump edges in a segmentation problem and for exploring the crease edges of a LiDAR object in a reconstruction/modeling problem, by only adjusting the filter characteristics.
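
    The general mechanism, though not the authors' exact filter design, can be sketched as follows: build a k-nearest-neighbour graph over the points, treat elevation as the graph signal, and apply a Laplacian-type high-pass operator so that points whose elevation departs sharply from their neighbourhood (candidate edge points) receive a large response. The scene, weights and parameters below are illustrative assumptions:

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def graph_highpass_response(points, k=8):
        """High-pass graph-filter response of the elevation signal on a kNN graph.

        points : (N, 3) array of LiDAR points (x, y, z); z is treated as the graph signal.
        Returns |L z| per point, where L is a row-normalised graph Laplacian.
        Large values mark candidate edge points. Illustrative sketch only.
        """
        xy, z = points[:, :2], points[:, 2]
        tree = cKDTree(xy)
        dist, idx = tree.query(xy, k=k + 1)                # first neighbour is the point itself
        dist, idx = dist[:, 1:], idx[:, 1:]
        w = np.exp(-(dist / (dist.mean() + 1e-12)) ** 2)   # Gaussian edge weights
        w /= w.sum(axis=1, keepdims=True)
        # (L z)_i = z_i - sum_j w_ij z_j  (signal minus its local weighted average)
        return np.abs(z - (w * z[idx]).sum(axis=1))

    # Hypothetical scene: flat ground with a 3 m elevation jump (e.g. a building edge) at x = 25
    pts = np.random.uniform(0, 50, size=(5000, 2))
    zz = np.where(pts[:, 0] > 25, 3.0, 0.0) + 0.02 * np.random.randn(5000)
    response = graph_highpass_response(np.column_stack([pts, zz]))
    print("x of strongest responses (should cluster near 25):",
          np.round(pts[np.argsort(response)[-10:], 0], 1))
    ```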

  5. Analysis of thermal processing of table olives using computational fluid dynamics.

    PubMed

    Dimou, A; Panagou, E; Stoforos, N G; Yanniotis, S

    2013-11-01

    In the present work, the thermal processing of table olives in brine in a stationary metal can was studied through computational fluid dynamics (CFD). The flow patterns of the brine and the temperature evolution in the olives and brine during the heating and the cooling cycles of the process were calculated using the CFD code. Experimental temperature measurements at 3 points (2 inside model olive particles and 1 at a point in the brine) in a can (with dimensions of 75 mm × 105 mm) filled with 48 olives in 4% (w/v) brine, initially held at 20 °C, heated in water at 100 °C for 10 min, and thereafter cooled in water at about 20 °C for 10 min, validated model predictions. The distribution of temperature and F-values and the location of the slowest heating zone and the critical point within the product, as far as microbial destruction is concerned, were assessed for several cases. For the cases studied, the critical point was located at the interior of the olives at the 2nd, or between the 1st and the 2nd olive row from the bottom of the container, the exact location being affected by olive size, olive arrangement, and geometry of the container. © 2013 Institute of Food Technologists®
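
    The F-value mentioned above is the standard lethality measure accumulated along a temperature history, F = ∫ 10^((T - Tref)/z) dt. A hedged sketch of its computation at a single point follows; the temperature history, reference temperature and z-value are illustrative choices, not the study's values:

    ```python
    import numpy as np

    def f_value(time_min, temp_c, t_ref=121.1, z=10.0):
        """Lethality F = integral of 10**((T - Tref)/z) dt over the process time.

        time_min : process times in minutes (monotonically increasing)
        temp_c   : temperature history (deg C) at the point of interest
        t_ref, z : reference temperature and z-value -- illustrative defaults only
        """
        t = np.asarray(time_min, dtype=float)
        lethal_rate = 10.0 ** ((np.asarray(temp_c, dtype=float) - t_ref) / z)
        # trapezoidal integration of the lethal rate over time
        return float(np.sum(0.5 * (lethal_rate[1:] + lethal_rate[:-1]) * np.diff(t)))

    # Hypothetical temperature history at the slowest-heating point of the can:
    # 10 min of heating toward ~95 deg C followed by 10 min of cooling
    t = np.linspace(0.0, 20.0, 201)
    temp = np.where(t <= 10.0, 20.0 + 7.5 * t, 95.0 - 7.0 * (t - 10.0))
    print("F-value (min):", round(f_value(t, temp), 4))
    ```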

  6. Combat Ration Advanced Manufacturing Technology Demonstration (CRAMTD). ’Generic Inspection-Statistical Process Control System for a Combat Ration Manufacturing Facility’. Short Term Project (STP) Number 3.

    DTIC Science & Technology

    1996-01-01

    failure as due to an adhesive layer between the foil and inner polypropylene layers. Under subcontract, NFPA provided HACCP draft manuals for the...parameters of the production process and to ensure that they are within their target values. In addition, a HACCP program was used to assure product...played an important part in implementing Hazard Analysis Critical Control Points (HACCP) as part of the Process and Quality Control manual. The National

  7. Standardization of Shodhita Naga with special reference to thermogravimetry and infra-red spectroscopy.

    PubMed

    Rajput, Dhirajsingh S; Patgiri, Biswajyoti; Shukla, Vinay J

    2014-01-01

    Standardization of Ayurvedic medicine is the need of the hour to obtain the desired quality of the final product. Shodhana, which literally means purification, is the initial step to make drugs such as metals, minerals and poisonous herbs suitable for further processing. Shodhana of metals/minerals helps to expose the maximum surface area of the drug for chemical reactions and also aids the impregnation of organic materials and their properties into the drug. Thermo-gravimetric analysis (TGA) facilitates identifying the presence of organic matter and changes in the melting point of the metal, whereas Fourier transform infra-red spectroscopy (FTIR) assists in identifying the presence of various functional groups. To standardize the process of Naga Shodhana and to study the change in chemical nature of Shodhita Naga in each medium through TGA and FTIR. Samanya and Vishesha Shodhana of Naga were carried out. The time taken for melting of Naga, physico-chemical changes in the media used for Shodhana and weight changes after Shodhana were recorded. Samples of Naga were collected after Shodhana in each medium for TGA and FTIR analysis. The average loss during Shodhana was 6.26%. The melting point of Ashuddha Naga was 327.46°C, and it was 328.42°C after Shodhana. The percentage purity of Naga (percentage of lead in Naga) decreased after Shodhana from 99.80% to 99.40%. FTIR analysis of Shodhita Naga in each sample showed stretching vibrations, particularly of C-H and C-N bonds, indicating the presence of various organic compounds. According to the TGA and FTIR analysis, the Shodhana process increases the melting point of Naga and initiates new physico-chemical properties, indicated by the detection of a large number of functional groups and by the organo-metallic nature of Shodhita Naga.

  8. Standardization of Shodhita Naga with special reference to thermogravimetry and infra-red spectroscopy

    PubMed Central

    Rajput, Dhirajsingh S.; Patgiri, Biswajyoti; Shukla, Vinay J.

    2014-01-01

    Background: Standardization of Ayurvedic medicine is the need of the hour to obtain the desired quality of the final product. Shodhana, which literally means purification, is the initial step to make drugs such as metals, minerals and poisonous herbs suitable for further processing. Shodhana of metals/minerals helps to expose the maximum surface area of the drug for chemical reactions and also aids the impregnation of organic materials and their properties into the drug. Thermo-gravimetric analysis (TGA) facilitates identifying the presence of organic matter and changes in the melting point of the metal, whereas Fourier transform infra-red spectroscopy (FTIR) assists in identifying the presence of various functional groups. Aim: To standardize the process of Naga Shodhana and to study the change in chemical nature of Shodhita Naga in each medium through TGA and FTIR. Material and Methods: Samanya and Vishesha Shodhana of Naga were carried out. The time taken for melting of Naga, physico-chemical changes in the media used for Shodhana and weight changes after Shodhana were recorded. Samples of Naga were collected after Shodhana in each medium for TGA and FTIR analysis. Results: The average loss during Shodhana was 6.26%. The melting point of Ashuddha Naga was 327.46°C, and it was 328.42°C after Shodhana. The percentage purity of Naga (percentage of lead in Naga) decreased after Shodhana from 99.80% to 99.40%. FTIR analysis of Shodhita Naga in each sample showed stretching vibrations, particularly of C-H and C-N bonds, indicating the presence of various organic compounds. Conclusion: According to the TGA and FTIR analysis, the Shodhana process increases the melting point of Naga and initiates new physico-chemical properties, indicated by the detection of a large number of functional groups and by the organo-metallic nature of Shodhita Naga. PMID:26664241

  9. Position and volume estimation of atmospheric nuclear detonations from video reconstruction

    NASA Astrophysics Data System (ADS)

    Schmitt, Daniel T.

    Recent work in digitizing films of foundational atmospheric nuclear detonations from the 1950s provides an opportunity to perform deeper analysis of these historical tests. This work leverages multi-view geometry and computer vision techniques to provide an automated means of performing three-dimensional analysis of the blasts at several points in time. Accomplishing this requires careful alignment of the films in time, detection of features in the images, matching of features, and multi-view reconstruction. Sub-explosion features can be detected with a 67% hit rate and a 22% false alarm rate. Hotspot features can be detected with a 71.95% hit rate, 86.03% precision and a 0.015% false positive rate. Detected hotspots are matched across 57-109 degree viewpoints with 76.63% average correct matching by defining their location relative to the center of the explosion, rotating them to the alternative viewpoint, and matching them collectively. When 3D reconstruction is applied to the hotspot matching, it completes an automated process that has been used to create 168 3D point clouds with an average of 31.6 points per reconstruction, each point having an accuracy of 0.62 meters (0.35, 0.24, and 0.34 meters in the x-, y- and z-directions, respectively). As a demonstration of using the point clouds for analysis, volumes are estimated and shown to be consistent with radius-based models, and in some cases they improve on the level of uncertainty in the yield calculation.

  10. Conversion of paper sludge to ethanol, II: process design and economic analysis.

    PubMed

    Fan, Zhiliang; Lynd, Lee R

    2007-01-01

    Process design and economics are considered for conversion of paper sludge to ethanol. A particular site, a bleached kraft mill operated in Gorham, NH by Fraser Papers (15 tons dry sludge processed per day), is considered. In addition, profitability is examined for a larger plant (50 dry tons per day) and sensitivity analysis is carried out with respect to capacity, tipping fee, and ethanol price. Conversion based on simultaneous saccharification and fermentation with intermittent feeding is examined, with ethanol recovery provided by distillation and molecular sieve adsorption. It was found that the Fraser plant achieves positive cash flow with or without xylose conversion and mineral recovery. Sensitivity analysis indicates economics are very sensitive to ethanol selling price and scale; significant but less sensitive to the tipping fee, and rather insensitive to the prices of cellulase and power. Internal rates of return exceeding 15% are projected for larger plants at most combinations of scale, tipping fee, and ethanol price. Our analysis lends support to the proposition that paper sludge is a leading point-of-entry and proving ground for emergent industrial processes featuring enzymatic hydrolysis of cellulosic biomass.

  11. Performance bounds on parallel self-initiating discrete-event

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1990-01-01

    The use of massively parallel architectures to execute discrete-event simulations of what are termed self-initiating models is considered. A logical process in a self-initiating model schedules its own state re-evaluation times, independently of any other logical process, and sends its new state to other logical processes following the re-evaluation. The interest is in the effects of that communication on synchronization. The performance of various synchronization protocols is considered by deriving upper and lower bounds on optimal performance, upper bounds on Time Warp's performance, and lower bounds on the performance of a new conservative protocol. The analysis of Time Warp includes the overhead costs of state-saving and rollback. The analysis points out sufficient conditions for the conservative protocol to outperform Time Warp. The analysis also quantifies the sensitivity of performance to message fan-out, lookahead ability, and the probability distributions underlying the simulation.

  12. [Psychoanalysis and Psychiatrie-Enquete: expert interviews and document analysis].

    PubMed

    Söhner, Felicitas Petra; Fangerau, Heiner; Becker, Thomas

    2017-12-01

    Background The purpose of this paper is to analyse the perception of the role of psychoanalysis and psychoanalysts in the coming about of the Psychiatrie-Enquete in the Federal Republic of Germany (West Germany). Methods We performed a qualitative content analysis of expert interviews with persons involved in the Enquete (or witnessing the events as mental health professionals active at the time), a selective literature review and an analysis of documents on the Enquete process. Results Expert interviews, relevant literature and documents point to a role of psychoanalysis in the Enquete process. Psychoanalysts were considered to have been effective in the run-up to the Enquete and the work of the commission. Conclusion Psychoanalysis and a small number of psychoanalysts were perceived as being relevant in the overall process of the Psychiatrie-Enquete in West Germany. Georg Thieme Verlag KG Stuttgart · New York.

  13. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment

    NASA Astrophysics Data System (ADS)

    James, Mike R.; Robson, Stuart; d'Oleire-Oltmanns, Sebastian; Niethammer, Uwe

    2016-04-01

    Structure-from-motion (SfM) algorithms are greatly facilitating the production of detailed topographic models based on images collected by unmanned aerial vehicles (UAVs). However, SfM-based software does not generally provide the rigorous photogrammetric analysis required to fully understand survey quality. Consequently, error related to problems in control point data or the distribution of control points can remain undiscovered. Even if these errors are not large in magnitude, they can be systematic, and thus have strong implications for the use of products such as digital elevation models (DEMs) and orthophotos. Here, we develop a Monte Carlo approach to (1) improve the accuracy of products when SfM-based processing is used and (2) reduce the associated field effort by identifying suitable lower density deployments of ground control points. The method highlights over-parameterisation during camera self-calibration and provides enhanced insight into control point performance when rigorous error metrics are not available. Processing was implemented using commonly-used SfM-based software (Agisoft PhotoScan), which we augment with semi-automated and automated GCP image measurement. We apply the Monte Carlo method to two contrasting case studies - an erosion gully survey (Taurodont, Morocco) carried out with a fixed-wing UAV, and an active landslide survey (Super-Sauze, France), acquired using a manually controlled quadcopter. The results highlight the differences in the control requirements for the two sites, and we explore the implications for future surveys. We illustrate DEM sensitivity to critical processing parameters and show how the use of appropriate parameter values increases DEM repeatability and reduces the spatial variability of error due to processing artefacts.

  14. Using Pattern Recognition and Discriminance Analysis to Predict Critical Events in Large Signal Databases

    NASA Astrophysics Data System (ADS)

    Feller, Jens; Feller, Sebastian; Mauersberg, Bernhard; Mergenthaler, Wolfgang

    2009-09-01

    Many applications in plant management require close monitoring of equipment performance, in particular with the objective to prevent certain critical events. At each point in time, the information available to classify the criticality of the process, is represented through the historic signal database as well as the actual measurement. This paper presents an approach to detect and predict critical events, based on pattern recognition and discriminance analysis.

  15. Automated Analysis of Renewable Energy Datasets ('EE/RE Data Mining')

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian; Elmore, Ryan; Getman, Dan

    This poster illustrates methods to substantially improve the understanding of renewable energy data sets and the depth and efficiency of their analysis through the application of statistical learning methods ('data mining') in the intelligent processing of these often large and messy information sources. The six examples apply methods for anomaly detection, data cleansing, and pattern mining to time-series data (measurements from metering points in buildings) and spatiotemporal data (renewable energy resource datasets).

  16. Promises of Coherence, Weak Content, and Strong Organization: An Analysis of the Student Texts (Reading-to-Write Report No. 3). Technical Report No. 22.

    ERIC Educational Resources Information Center

    Kantz, Margaret J.

    This study is the third in a series of reports of the Reading-to-Write Project, a collaborative study of students' cognitive processes at one critical point of entry into academic performance. This part of the study examines the problem that teachers have in judging whether textual signals that students use to indicate a persuasive analysis of…

  17. Order Under Uncertainty: Robust Differential Expression Analysis Using Probabilistic Models for Pseudotime Inference

    PubMed Central

    Campbell, Kieran R.

    2016-01-01

    Single cell gene expression profiling can be used to quantify transcriptional dynamics in temporal processes, such as cell differentiation, using computational methods to label each cell with a 'pseudotime' where true time series experimentation is too difficult to perform. However, owing to the high variability in gene expression between individual cells, there is an inherent uncertainty in the precise temporal ordering of the cells. Pre-existing methods for pseudotime estimation have predominantly given point estimates, precluding a rigorous analysis of the implications of uncertainty. We use probabilistic modelling techniques to quantify pseudotime uncertainty and propagate this into downstream differential expression analysis. We demonstrate that reliance on a point estimate of pseudotime can lead to inflated false discovery rates and that probabilistic approaches provide greater robustness and measures of the temporal resolution that can be obtained from pseudotime inference. PMID:27870852

  18. Key Microbiota Identification Using Functional Gene Analysis during Pepper (Piper nigrum L.) Peeling.

    PubMed

    Zhang, Jiachao; Hu, Qisong; Xu, Chuanbiao; Liu, Sixin; Li, Congfa

    2016-01-01

    Pepper pericarp microbiota plays an important role in the pepper peeling process for the production of white pepper. We collected pepper samples at different peeling time points from Hainan Province, China, and used a metagenomic approach to identify changes in the pericarp microbiota based on functional gene analysis. UniFrac distance-based principal coordinates analysis revealed significant changes in the pericarp microbiota structure during peeling, which were attributed to increases in bacteria from the genera Selenomonas and Prevotella. We identified 28 core operational taxonomic units at each time point, mainly belonging to Selenomonas, Prevotella, Megasphaera, Anaerovibrio, and Clostridium genera. The results were confirmed by quantitative polymerase chain reaction. At the functional level, we observed significant increases in microbial features related to acetyl xylan esterase and pectinesterase for pericarp degradation during peeling. These findings offer a new insight into biodegradation for pepper peeling and will promote the development of the white pepper industry.

  19. Key Microbiota Identification Using Functional Gene Analysis during Pepper (Piper nigrum L.) Peeling

    PubMed Central

    Xu, Chuanbiao; Liu, Sixin; Li, Congfa

    2016-01-01

    Pepper pericarp microbiota plays an important role in the pepper peeling process for the production of white pepper. We collected pepper samples at different peeling time points from Hainan Province, China, and used a metagenomic approach to identify changes in the pericarp microbiota based on functional gene analysis. UniFrac distance-based principal coordinates analysis revealed significant changes in the pericarp microbiota structure during peeling, which were attributed to increases in bacteria from the genera Selenomonas and Prevotella. We identified 28 core operational taxonomic units at each time point, mainly belonging to Selenomonas, Prevotella, Megasphaera, Anaerovibrio, and Clostridium genera. The results were confirmed by quantitative polymerase chain reaction. At the functional level, we observed significant increases in microbial features related to acetyl xylan esterase and pectinesterase for pericarp degradation during peeling. These findings offer a new insight into biodegradation for pepper peeling and will promote the development of the white pepper industry. PMID:27768750

  20. Extracting topographic structure from digital elevation data for geographic information-system analysis

    USGS Publications Warehouse

    Jenson, Susan K.; Domingue, Julia O.

    1988-01-01

    The first phase of analysis is a conditioning phase that generates three data sets: the original DEM with depressions filled, a data set indicating the flow direction for each cell, and a flow accumulation data set in which each cell receives a value equal to the total number of cells that drain to it. The original DEM and these three derivative data sets can then be processed in a variety of ways to optionally delineate drainage networks, overland paths, watersheds for user-specified locations, sub-watersheds for the major tributaries of a drainage network, or pour point linkages between watersheds. The computer-generated drainage lines and watershed polygons and the pour point linkage information can be transferred to vector-based geographic information systems for further analysis. Comparisons between these computer-generated features and their manually delineated counterparts generally show close agreement, indicating that these software tools will save analyst time spent in manual interpretation and digitizing.
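
    A minimal sketch of the flow-direction step described above, using the D8 rule (each cell drains to the steepest-descent neighbour among its eight neighbours) on a small depression-filled DEM; the flow accumulation and watershed steps build on this grid in the same way. The tiny DEM here is synthetic:

    ```python
    import numpy as np

    # Eight neighbour offsets (D8) and their distances
    OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    DISTS = [np.sqrt(2), 1, np.sqrt(2), 1, 1, np.sqrt(2), 1, np.sqrt(2)]

    def d8_flow_direction(dem):
        """Index (0-7) of the steepest-descent neighbour for each interior cell, -1 if none.

        Assumes a depression-filled DEM, as produced by the conditioning phase described above.
        """
        nrow, ncol = dem.shape
        fdir = -np.ones((nrow, ncol), dtype=int)
        for i in range(1, nrow - 1):
            for j in range(1, ncol - 1):
                drops = [(dem[i, j] - dem[i + di, j + dj]) / d
                         for (di, dj), d in zip(OFFSETS, DISTS)]
                k = int(np.argmax(drops))
                if drops[k] > 0:
                    fdir[i, j] = k
        return fdir

    # Tiny synthetic DEM sloping toward the lower-right corner
    dem = np.add.outer(np.arange(6, 0, -1), np.arange(6, 0, -1)).astype(float)
    print(d8_flow_direction(dem))   # interior cells drain to neighbour 7, i.e. offset (1, 1)
    ```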

  1. Experimental and numerical study on optimization of the single point incremental forming of AINSI 304L stainless steel sheet

    NASA Astrophysics Data System (ADS)

    Saidi, B.; Giraud-Moreau, L.; Cherouat, A.; Nasri, R.

    2017-09-01

    AINSI 304L stainless steel sheets are commonly formed into a variety of shapes for applications in the industrial, architectural, transportation and automobile fields; they are also used for the manufacturing of denture bases. In the field of dentistry, there is a need for personalized devices that are custom made for the patient. The single point incremental forming process is highly promising in this area for the manufacturing of denture bases. Single point incremental forming (ISF) is an emerging process based on the use of a spherical tool, which is moved along a CNC-controlled tool path. One of the major advantages of this process is the ability to program several punch trajectories on the same machine in order to obtain different shapes. Several applications of this process exist in the medical field for the manufacturing of personalized titanium prostheses (cranial plates, knee prostheses...) due to the need to customize the product to each patient. The objective of this paper is to study the incremental forming of AISI 304L stainless steel sheets for future applications in the dentistry field. During the incremental forming process, considerable forces can occur. Control of the forming force is particularly important to ensure the safe use of the CNC milling machine and to preserve the tooling and machinery. In this paper, the effect of four different process parameters on the maximum force is studied. The proposed approach consists in using an experimental design based on experimental results. An analysis of variance (ANOVA) was conducted to find the input parameters that minimize the maximum forming force. A numerical simulation of the incremental forming process is performed with the optimal input process parameters. Numerical results are compared with the experimental ones.

  2. Detection of blunt, sharp force and gunshot lesions on burnt remains: a cautionary note.

    PubMed

    Poppa, Pasquale; Porta, Davide; Gibelli, Daniele; Mazzucchi, Alessandra; Brandone, Alberto; Grandi, Marco; Cattaneo, Cristina

    2011-09-01

    The study of skin and bone lesions may give information concerning type and manner of production, but in burnt material the modification of tissues by high temperatures may considerably change the morphological characteristics of the lesions. This study aims at pointing out the effects of burning pig heads bearing several types of lesions (blunt trauma, sharp force, and gunshot lesions) on soft tissues and bones, from both a morphological and a chemical point of view. Results show that the charring process does not completely destroy signs of lesions on bones, which can often be recovered by cleaning the bone surface of charred soft-tissue residues. Furthermore, neutron activation analysis showed that antimony may still be detectable on gunshot entry wounds at the final stages of charring.

  3. Point-to-Point! Validation of the Small Aircraft Transportation System Higher Volume Operations Concept

    NASA Technical Reports Server (NTRS)

    Williams, Daniel M.

    2006-01-01

    Described is the research process that NASA researchers used to validate the Small Aircraft Transportation System (SATS) Higher Volume Operations (HVO) concept. The four phase building-block validation and verification process included multiple elements ranging from formal analysis of HVO procedures to flight test, to full-system architecture prototype that was successfully shown to the public at the June 2005 SATS Technical Demonstration in Danville, VA. Presented are significant results of each of the four research phases that extend early results presented at ICAS 2004. HVO study results have been incorporated into the development of the Next Generation Air Transportation System (NGATS) vision and offer a validated concept to provide a significant portion of the 3X capacity improvement sought after in the United States National Airspace System (NAS).

  4. Point process statistics in atom probe tomography.

    PubMed

    Philippe, T; Duguay, S; Grancher, G; Blavette, D

    2013-09-01

    We present a review of spatial point processes as statistical models that we have designed for the analysis and treatment of atom probe tomography (APT) data. As a major advantage, these methods do not require sampling. The mean distance to the nearest neighbour is an attractive approach for exhibiting a non-random atomic distribution. A χ² test based on distance distributions to the nearest neighbour has been developed to detect deviation from randomness. Best-fit methods based on the first nearest neighbour distance (1NN method) and the pair correlation function are presented and compared to assess the chemical composition of tiny clusters. Delaunay tessellation for cluster selection is also illustrated. These statistical tools have been applied to APT experiments on microelectronics materials.
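    A minimal sketch of the nearest-neighbour idea is given below: it computes the mean first-nearest-neighbour (1NN) distance of a synthetic atom set and compares it with the analytical expectation for a homogeneous Poisson (random) distribution of the same density. The synthetic data and the exact form of the comparison are assumptions for illustration, not the authors' implementation.

```python
from math import gamma, pi

import numpy as np
from scipy.spatial import cKDTree

def nn_distances(points):
    """First nearest-neighbour distance for every point (k=2: self + neighbour)."""
    tree = cKDTree(points)
    d, _ = tree.query(points, k=2)
    return d[:, 1]

rng = np.random.default_rng(0)
box = 20.0                                  # nm, edge of the analysed volume (illustrative)
solute = rng.uniform(0, box, size=(2000, 3))

d1 = nn_distances(solute)

# Expected mean 1NN distance for a homogeneous Poisson process of the same density:
# E[d] = Gamma(4/3) / (4/3 * pi * rho)**(1/3)
rho = len(solute) / box**3
expected = gamma(4 / 3) / (4 / 3 * pi * rho) ** (1 / 3)

print(f"observed mean 1NN distance: {d1.mean():.3f} nm")
print(f"Poisson expectation:        {expected:.3f} nm")
# A markedly shorter observed distance would point to clustering of the solute atoms.
```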

  5. Online machining error estimation method of numerical control gear grinding machine tool based on data analysis of internal sensors

    NASA Astrophysics Data System (ADS)

    Zhao, Fei; Zhang, Chi; Yang, Guilin; Chen, Chinyin

    2016-12-01

    This paper presents an online method for estimating cutting error by analyzing internal sensor readings. The internal sensors of a numerical control (NC) machine tool are selected to avoid installation problems. A mathematical model for cutting-error estimation is proposed to compute the relative position of the cutting point and the tool center point (TCP) from internal sensor readings, based on the cutting theory of gears. To verify the effectiveness of the proposed model, it was evaluated through simulation and experiments in a gear generating grinding process. The cutting error of the gear was estimated and the factors that induce cutting error were analyzed. The simulation and experiments verify that the proposed approach is an efficient way to estimate the cutting error of the workpiece during the machining process.

  6. Effect of the Various Steps in the Processing of Human Milk in the Concentrations of IgA, IgM, and Lactoferrin.

    PubMed

    Arroyo, Gerardo; Ortiz Barrientos, Kevin Alexander; Lange, Karla; Nave, Federico; Miss Mas, Gabriela; Lam Aguilar, Pamela; Soto Galindo, Miguel Angel

    2017-09-01

    Human milk immune components are unique and important for the development of the newborn. Milk processing at Human Milk Banks (HMB), however, causes partial destruction of immune proteins. The objective of this study was to determine the effects of heating during the milk processing procedure at the HMB on the concentrations of IgA, IgM, and lactoferrin at three critical points in time. Fifty milk samples (150 mL) were collected from voluntary donors at the HMB at the Hospital Nacional Pedro de Bethancourt, located in Antigua Guatemala. Samples from three critical points in time during the milk processing procedure were selected for analysis: freezing/thawing I, freezing/thawing II, and pasteurization. IgA, IgM, and lactoferrin concentrations were determined at each critical point and compared with a baseline concentration. After milk processing, mean IgA, IgM, and lactoferrin concentrations were reduced by 30.0%, 36.0%, and 70.0%, respectively (p < 0.001). The reduction of biological activity was mainly attributed to pasteurization for IgA and lactoferrin (p < 0.001); the first freezing/thawing process before pasteurization showed no significant reduction in the mean concentrations of IgA (p = 0.160) or lactoferrin (p = 0.345) but had a significant effect on IgM concentration (p = 0.016), and the second freezing/thawing procedure only showed a significant effect on IgA (p < 0.001). Milk processing thus significantly reduced the concentrations of the immune proteins evaluated in this study.

  7. A Voxel-Based Approach for Imaging Voids in Three-Dimensional Point Clouds

    NASA Astrophysics Data System (ADS)

    Salvaggio, Katie N.

    Geographically accurate scene models have enormous potential beyond simple visualization with regard to automated scene generation. In recent years, thanks to ever-increasing computational efficiency, there has been significant growth in both the computer vision and photogrammetry communities pertaining to automatic scene reconstruction from multiple-view imagery. The result of these algorithms is a three-dimensional (3D) point cloud which can be used to derive a final model using surface reconstruction techniques. However, the fidelity of these point clouds has not been well studied, and voids often exist within the point cloud. Voids exist in texturally difficult areas, as well as in areas where multiple views were not obtained during collection, where constant occlusion existed due to collection angles or overlapping scene geometry, or in regions that failed to triangulate accurately. It may be possible to fill in small voids in the scene using surface reconstruction or hole-filling techniques, but this is not the case for larger, more complex voids, and attempting to reconstruct them using only the knowledge of the incomplete point cloud is neither accurate nor aesthetically pleasing. A method is presented for identifying voids in point clouds by using a voxel-based approach to partition the 3D space. By using collection geometry and information derived from the point cloud, it is possible to detect unsampled voxels so that voids can be identified. The analysis takes into account the locations of the camera and of the 3D points themselves to capitalize on the idea of free space: voxels that lie on the ray between the camera and a point must be devoid of obstruction, as a clear line of sight is a necessary requirement for reconstruction. Using this approach, voxels are classified into three categories: occupied (contains points from the point cloud), free (rays from the camera to a point passed through the voxel), and unsampled (contains no points and no rays passed through it). Voids in the voxel space are manifested as unsampled voxels. A similar line-of-sight analysis can then be used to pinpoint locations at aircraft altitude from which the voids in the point clouds could theoretically be imaged. This work is based on the assumption that including more images of the void areas in the 3D reconstruction process will reduce the number of voids in the point cloud that resulted from lack of coverage. Voids resulting from texturally difficult areas will not benefit from more imagery in the reconstruction process, and thus are identified and removed prior to the determination of future potential imaging locations.
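    The occupied/free/unsampled classification can be sketched as follows. The snippet assumes a single camera position and replaces exact voxel traversal with coarse sampling along each camera-to-point ray; it is only meant to illustrate the labelling logic, not the thesis implementation.

```python
import numpy as np

def classify_voxels(points, camera, origin, voxel_size, dims, n_steps=200):
    """Label voxels as 2 = occupied, 1 = free (a camera ray passed through), 0 = unsampled."""
    labels = np.zeros(dims, dtype=np.uint8)

    def voxel_of(p):
        idx = np.floor((p - origin) / voxel_size).astype(int)
        return tuple(idx) if np.all((idx >= 0) & (idx < dims)) else None

    for p in points:
        # march along the camera-to-point ray and mark traversed voxels as free
        for t in np.linspace(0.0, 1.0, n_steps, endpoint=False):
            v = voxel_of(camera + t * (p - camera))
            if v is not None and labels[v] == 0:
                labels[v] = 1
        v = voxel_of(p)
        if v is not None:
            labels[v] = 2          # the point itself occupies its voxel

    return labels                  # remaining zeros are the unsampled (potential void) voxels

rng = np.random.default_rng(1)
pts = rng.uniform(2.0, 8.0, size=(500, 3))
lab = classify_voxels(pts, camera=np.array([0.0, 0.0, 0.0]),
                      origin=np.zeros(3), voxel_size=1.0, dims=(10, 10, 10))
print({name: int((lab == v).sum()) for name, v in [("unsampled", 0), ("free", 1), ("occupied", 2)]})
```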

  8. A Typology for Modeling Processes in Clinical Guidelines and Protocols

    NASA Astrophysics Data System (ADS)

    Tu, Samson W.; Musen, Mark A.

    We analyzed the graphical representations that are used by various guideline-modeling methods to express process information embodied in clinical guidelines and protocols. From this analysis, we distilled four modeling formalisms and the processes they typically model: (1) flowcharts for capturing problem-solving processes, (2) disease-state maps that link decision points in managing patient problems over time, (3) plans that specify sequences of activities that contribute toward a goal, (4) workflow specifications that model care processes in an organization. We characterized the four approaches and showed that each captures some aspect of what a guideline may specify. We believe that a general guideline-modeling system must provide explicit representation for each type of process.

  9. Reconstruction and analysis of hybrid composite shells using meshless methods

    NASA Astrophysics Data System (ADS)

    Bernardo, G. M. S.; Loja, M. A. R.

    2017-06-01

    The importance of focusing research on viable models to predict the behaviour of structures that may in some cases possess complex geometries is an issue of growing relevance in different scientific areas, ranging from civil and mechanical engineering to architecture and biomedical devices. In these cases, the research effort to find an efficient approach to fit laser scanning point clouds to a desired surface has been increasing, leading to the possibility of modelling the features of as-built/as-is structures and components. However, combining the task of surface reconstruction with the implementation of a structural analysis model is not trivial. Although there are works focusing on those different phases separately, there is still a need for approaches able to interconnect them efficiently. Therefore, achieving a representative geometric model able to be subsequently submitted to a structural analysis on a similar platform is a fundamental step in establishing an expeditious processing workflow. In the present work, an integrated methodology based on meshless approaches is presented, to reconstruct shells described by point clouds and to subsequently predict their static behaviour. These methods are highly appropriate for dealing with unstructured point clouds, as they do not impose any specific spatial or geometric requirement when implemented, depending only on the distance between the points. Details of the formulation, and a set of illustrative examples focusing on the reconstruction of cylindrical and double-curvature shells and their further analysis, are presented.

  10. Qumquad: a UML-based approach for remodeling of legacy systems in health care.

    PubMed

    Garde, Sebastian; Knaup, Petra; Herold, Ralf

    2003-07-01

    Health care information systems still comprise legacy systems to a certain extent. For reengineering legacy systems, thorough remodeling is indispensable. Current modeling techniques like the Unified Modeling Language (UML) do not offer a systematic and comprehensive process-oriented method for remodeling activities. We developed a systematic method for remodeling legacy systems in health care called Qumquad. Qumquad consists of three major steps: (i) modeling the actual state of the application system, (ii) systematic identification of weak points in this model and (iii) development of a target concept for the reimplementation considering the identified weak points. We applied Qumquad to the remodeling of a documentation and therapy planning system for pediatric oncology (DOSPO). As a result of our remodeling activities we regained an abstract model of the system, an analysis of the current weak points of DOSPO and possible (partly alternative) solutions to overcome the weak points. Qumquad proved to be very helpful in the reengineering process of DOSPO, since we now have at our disposal a comprehensive model for the reimplementation of DOSPO that current users of the system agree on. Qumquad can easily be applied to other reengineering projects in health care.

  11. Automatic detection of zebra crossings from mobile LiDAR data

    NASA Astrophysics Data System (ADS)

    Riveiro, B.; González-Jorge, H.; Martínez-Sánchez, J.; Díaz-Vilariño, L.; Arias, P.

    2015-07-01

    An algorithm for the automatic detection of zebra crossings from mobile LiDAR data is developed and tested for application to road management. The algorithm consists of several subsequent processes, starting with road segmentation by performing a curvature analysis for each laser cycle. Then, intensity images are created from the point cloud using rasterization techniques in order to detect zebra crossings using the Standard Hough Transform and logical constraints. To optimize the results, image processing algorithms are applied to the intensity images derived from the point cloud. These algorithms include binarization to separate the painted area from the rest of the pavement, median filtering to remove noisy points, and mathematical morphology to fill the gaps between the pixels at the border of the white marks. Once a road marking is detected, its position is calculated. This information is valuable for the inventorying purposes of road managers that use Geographic Information Systems. The performance of the algorithm has been evaluated over several mobile LiDAR strips accounting for a total of 30 zebra crossings. That test showed a completeness of 83%. Non-detected marks mainly result from paint deterioration of the zebra crossing or from occlusions in the point cloud produced by other vehicles on the road.
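    Assuming the point cloud has already been rasterised into an 8-bit intensity image, the image-processing chain described above (binarisation, median filtering, morphological closing, Hough transform) could look roughly like the OpenCV sketch below; thresholds and kernel sizes are illustrative guesses, not the values used by the authors.

```python
import cv2
import numpy as np

def detect_zebra_marks(intensity_img):
    """intensity_img: 8-bit grayscale raster built from LiDAR intensity values."""
    # 1. binarise to separate bright paint from darker asphalt (Otsu threshold)
    _, binary = cv2.threshold(intensity_img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # 2. median filter to suppress isolated noisy pixels
    denoised = cv2.medianBlur(binary, 5)
    # 3. morphological closing to fill gaps along the borders of the white bars
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (7, 7))
    closed = cv2.morphologyEx(denoised, cv2.MORPH_CLOSE, kernel)
    # 4. probabilistic Hough transform to recover the straight edges of the bars
    edges = cv2.Canny(closed, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                            minLineLength=40, maxLineGap=10)
    return closed, (lines if lines is not None else [])

img = (np.random.rand(200, 200) * 255).astype(np.uint8)   # stand-in for a real intensity raster
mask, lines = detect_zebra_marks(img)
print(f"{len(lines)} candidate line segments")
```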

  12. Rheological properties of a chocolate-flavored, reduced-calorie coating as a function of the conching process.

    PubMed

    Medina-Torres, Luis; Sanchez-Olivares, Guadalupe; Nuñez-Ramirez, Diola Marina; Moreno, Leonardo; Calderas, Fausto

    2014-07-01

    The continuous flow and linear viscoelastic rheology of a chocolate coating formulated with fat-substitute gums (xanthan gum, XG) is studied in this work. An alternative conching process, using a rotor-stator type impeller, is proposed. The objective is to obtain a chocolate coating material with improved flow properties. Characterization of the final material through particle size distribution (PSD), differential scanning calorimetry (DSC) and proximal analysis is reported. The particle size distribution of the final material showed less polydispersity and therefore greater homogeneity; fusion points were also generated at around 20 °C, suggesting crystal types I (β'2) and II (α). Moreover, the final material exhibited crossover points (a more structured material), whereas the commercial brand chocolate used for comparison did not. The best conditions to produce the coating were maturing for 36 h at 35 °C, showing crossover points around 76 Pa and a 0.505 solids particle dispersion (average particle diameter of 0.364 μm), and a fusion point at 20.04 °C with a ΔHf of 1.40 J/g. The results indicate that xanthan gum is a good substitute for cocoa butter and provides stability to the final product.

  13. To determine the end point of wet granulation by measuring powder energies and thermal properties.

    PubMed

    Dave, Rutesh H; Wu, Stephen H; Contractor, Labdhi D

    2012-04-01

    Wet granulation is widely used in the pharmaceutical industry as a tablet manufacturing process. However, determining the end point of the wet granulation process has always remained a challenge. Many traditional methods are available for end-point determination, yet accuracy and reproducibility still remain a challenge. Microcrystalline cellulose, widely used as an excipient in the pharmaceutical industry, was granulated using water. The wet mass was passed through a #12 sieve and dried until a constant percentage loss on drying was obtained, yielding dried granules. The wet and dried granules collected were subjected to basic flow energy, specific energy, bulk density, pressure drop, differential scanning calorimetry and effusivity measurements. Analysis of the data revealed various stages of granule growth: initial seed formation on adding 200-400 g of water, granule growth on adding 600-800 g of water, and over-wetting at 1155 g of water. In this work, we show how these powder energy and thermal measurements can be used in practice to correctly determine the end point of wet granulation of microcrystalline cellulose, and we explain the principles underlying the associated powder and thermal energies.

  14. Exposure assessment and process sensitivity analysis of the contamination of Campylobacter in poultry products.

    PubMed

    Osiriphun, S; Iamtaweejaloen, P; Kooprasertying, P; Koetsinchai, W; Tuitemwong, K; Erickson, L E; Tuitemwong, P

    2011-07-01

    Studies were conducted in a Thai poultry plant to identify the factors that affected numbers of Campylobacter jejuni in chicken carcasses. The concentrations of Campylobacter were determined using the SimPlate most probable number and modified charcoal cefoperazone deoxycholate plating methods. Results indicated that the mean concentrations of C. jejuni in carcasses after scalding, plucking, and chilling were 2.93 ± 0.31, 2.98 ± 0.38, 2.88 ± 0.31, and 0.85 ± 0.95 log cfu, whereas the concentrations of C. jejuni in the scalding tank water, plucked feathers, and chicken breast portion were 1.39 ± 0.70, 3.28 ± 0.52, and 0.50 ± 1.22 log cfu, respectively. Sensitivity analysis using tornado order correlation analysis showed that risk parameters affecting the contamination of C. jejuni in the chicken slaughter and processing plant could be ranked as chilling water pH, number of pathogens in the scald tank water, scalding water temperature, number of C. jejuni on plucked feathers, and residual chlorine in the chill water, respectively. The exposure assessment and analysis of process parameters indicated that some of the current critical control points were not effective. The suggested interventions included preventing fecal contamination during transportation; increasing the scalding temperature, giving the scalding water a higher countercurrent flow rate; reducing contamination of feathers in the scalding tank to decrease C. jejuni in the scalding water; spraying water to reduce contamination at the plucking step; monitoring and maintaining the chill water pH at 6.0 to 6.5; and increasing the residual chlorine in the chill water. These interventions were recommended for inclusion in the hazard analysis and critical control point plan of the plant.

  15. Site Suitability Analysis for Beekeeping via Analytical Hierarchy Process, Konya Example

    NASA Astrophysics Data System (ADS)

    Sarı, F.; Ceylan, D. A.

    2017-11-01

    Over the past decade, the importance of beekeeping activities has been emphasized in the fields of biodiversity, ecosystems, agriculture and human health. Thus, efficient management and correct siting of beekeeping activities seem essential to maintain and improve productivity and efficiency. Given this importance, and considering the economic contribution to rural areas, the need for a suitability analysis becomes apparent. At this point, the integration of Multi Criteria Decision Analysis (MCDA) and Geographical Information Systems (GIS) provides efficient solutions to the complex structure of the decision-making process for beekeeping activities. In this study, a site suitability analysis via the Analytical Hierarchy Process (AHP) was carried out for the Konya province in Turkey. Slope, elevation, aspect, distance to water resources, roads and settlements, precipitation and flora criteria were included to determine suitability. The requirements, expectations and limitations of beekeeping activities were specified with the participation of experts and stakeholders. The final suitability map was validated with 117 existing beekeeping locations and the Turkish Statistical Institute 2016 beekeeping statistics for Konya province.
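    The core AHP computation, deriving criterion weights from a pairwise comparison matrix via its principal eigenvector and checking Saaty's consistency ratio, can be sketched as follows. The comparison values and the reduced set of four criteria are invented for illustration and do not reproduce the weights used in the study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for four criteria
# (slope, distance to water, distance to roads, flora); values are illustrative only.
A = np.array([[1,   3,   5,   2],
              [1/3, 1,   3,   1/2],
              [1/5, 1/3, 1,   1/4],
              [1/2, 2,   4,   1]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # criterion weights (sum to 1)

# Saaty's consistency ratio: CI / RI, with RI = 0.90 for a 4x4 matrix
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.90
print("weights:", np.round(weights, 3), " consistency ratio:", round(CR, 3))
```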

  16. An advanced software suite for the processing and analysis of silicon luminescence images

    NASA Astrophysics Data System (ADS)

    Payne, D. N. R.; Vargas, C.; Hameiri, Z.; Wenham, S. R.; Bagnall, D. M.

    2017-06-01

    Luminescence imaging is a versatile characterisation technique used for a broad range of research and industrial applications, particularly for the field of photovoltaics where photoluminescence and electroluminescence imaging is routinely carried out for materials analysis and quality control. Luminescence imaging can reveal a wealth of material information, as detailed in extensive literature, yet these techniques are often only used qualitatively instead of being utilised to their full potential. Part of the reason for this is the time and effort required for image processing and analysis in order to convert image data to more meaningful results. In this work, a custom built, Matlab based software suite is presented which aims to dramatically simplify luminescence image processing and analysis. The suite includes four individual programs which can be used in isolation or in conjunction to achieve a broad array of functionality, including but not limited to, point spread function determination and deconvolution, automated sample extraction, image alignment and comparison, minority carrier lifetime calibration and iron impurity concentration mapping.

  17. Investigating selective transport and abrasion on an alluvial fan using quantitative grain size and shape analysis

    NASA Astrophysics Data System (ADS)

    Litwin, K. L.; Jerolmack, D. J.

    2011-12-01

    Selective sorting and abrasion are the two major fluvial processes that are attributed to the downstream fining of sediments in rivers and alluvial fans. Selective transport is the process by which smaller grains are preferentially transported downstream while larger grains are deposited closer to the source. Abrasion is defined by the production of fine sediments and sand that occurs by saltation of gravel, where particle-to-particle collisions supply the energy required to break apart grains. We hypothesize that abrasion results in the gradual fining of large grains and the production of fine sands and silts, while sorting accounts for the differences in transport of these two grain-size fractions produced from abrasion, thereby creating the abrupt gravel-sand transition observed in many channel systems. In this research, we explore both selective transport and abrasion processes on the Dog Canyon alluvial fan near Alamogordo, New Mexico. We complete an extensive grain size analysis down the main channel of the fan employing an image-based technique that utilizes an autocorrelation process. We also characterize changes in grain shape using standard shape parameters, as well as Fourier analysis, which allows the study of contributions of grain roughness on a variety of length scales. Sorting appears to dominate the upper portion of the fan; the grain-size distribution narrows moving downstream until reaching a point of equal mobility, at which point sorting ceases. Abrasion exerts a subtle but persistent effect on grains during transport down the fan. Shape analysis reveals that particles become more rounded by the removal of small-scale textural features, a process that is expected to only modestly influence grain size of gravel, but should produce significant quantities of sand. This study provides a better understanding of the importance of grain abrasion and sorting on the downstream fining of channel grains in an alluvial fan, as well as an improved knowledge about the abrupt gravel-sand transition observed in a majority of alluvial fans.

  18. Uncertainty Propagation for Terrestrial Mobile Laser Scanner

    NASA Astrophysics Data System (ADS)

    Mezian, C.; Vallet, Bruno; Soheilian, Bahman; Paparoditis, Nicolas

    2016-06-01

    Laser scanners are used more and more in mobile mapping systems. They provide 3D point clouds that are used for object reconstruction and for registration of the system. For both of these applications, uncertainty analysis of the 3D points is of great interest but rarely investigated in the literature. In this paper we present a complete pipeline that takes into account all the sources of uncertainty and allows a covariance matrix to be computed for each 3D point. The sources of uncertainty are the laser scanner, the calibration of the scanner with respect to the vehicle, and the direct georeferencing system. We assume that all the uncertainties follow Gaussian distributions. The variances of the laser scanner measurements (two angles and one distance) are usually provided by the manufacturers, as is also the case for integrated direct georeferencing devices. Residuals of the calibration process were used to estimate the covariance matrix of the 6D transformation between the laser scanner and the vehicle frame. Knowing the variances of all sources of uncertainty, we applied uncertainty propagation to compute the variance-covariance matrix of every 3D point. Such an uncertainty analysis makes it possible to estimate the impact of different laser scanners and georeferencing devices on the quality of the obtained 3D points. The resulting uncertainty values were illustrated using error ellipsoids on different datasets.
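    A simplified sketch of the propagation step is shown below for the scanner measurements alone (range and two angles), ignoring the calibration and georeferencing terms that the full pipeline also includes: the measurement covariance is pushed through the Jacobian of the spherical-to-Cartesian conversion, Σ_xyz = J Σ_meas Jᵀ. The numerical standard deviations are illustrative.

```python
import numpy as np

def point_and_covariance(r, theta, phi, sigma_r, sigma_theta, sigma_phi):
    """Spherical (range, horizontal angle, vertical angle) -> Cartesian point + 3x3 covariance."""
    x = r * np.cos(phi) * np.cos(theta)
    y = r * np.cos(phi) * np.sin(theta)
    z = r * np.sin(phi)

    # Jacobian of (x, y, z) with respect to (r, theta, phi)
    J = np.array([
        [np.cos(phi) * np.cos(theta), -r * np.cos(phi) * np.sin(theta), -r * np.sin(phi) * np.cos(theta)],
        [np.cos(phi) * np.sin(theta),  r * np.cos(phi) * np.cos(theta), -r * np.sin(phi) * np.sin(theta)],
        [np.sin(phi),                  0.0,                              r * np.cos(phi)],
    ])
    cov_meas = np.diag([sigma_r**2, sigma_theta**2, sigma_phi**2])
    cov_xyz = J @ cov_meas @ J.T          # first-order (Gaussian) propagation
    return np.array([x, y, z]), cov_xyz

p, C = point_and_covariance(r=25.0, theta=np.radians(30), phi=np.radians(10),
                            sigma_r=0.02, sigma_theta=np.radians(0.01), sigma_phi=np.radians(0.01))
print("point:", p, " per-axis sigma:", np.sqrt(np.diag(C)))
```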

  19. Singularity Analysis: a powerful image processing tool in remote sensing of the oceans

    NASA Astrophysics Data System (ADS)

    Turiel, A.; Umbert, M.; Hoareau, N.; Ballabrera-Poy, J.; Portabella, M.

    2012-04-01

    The study of fully developed turbulence has given rise to new methods for describing real data of scalars subjected to the action of a turbulent flow. The application of this family of methodologies (known as the Microcanonical Multifractal Formalism, MMF) to remote sensing ocean maps opens new ways to exploit those data for oceanographic purposes. The main technique in MMF is Singularity Analysis (SA). By means of SA, a singularity exponent is assigned to each point of a given image. The singularity exponent of a given point is a dimensionless measure of the regularity or irregularity of the scalar at that point. Singularity exponents arrange themselves in singularity lines, which accurately track the flow streamlines from any scalar, as we have verified with remote sensing and simulated data. Applications of SA include quality assessment of different products, the estimation of surface velocities, the development of fusion techniques for different types of scalars, comparison with measures of ocean mixing, and improvement of assimilation schemes.

  20. Four-point functions and the permutation group S4

    NASA Astrophysics Data System (ADS)

    Eichmann, Gernot; Fischer, Christian S.; Heupel, Walter

    2015-09-01

    Four-point functions are at the heart of many interesting physical processes. A prime example is the light-by-light scattering amplitude, which plays an important role in the calculation of hadronic contributions to the anomalous magnetic moment of the muon. In the calculation of such quantities one faces the challenge of finding a suitable and well-behaved basis of tensor structures in coordinate and/or momentum space. Provided all (or many) of the external legs represent similar particle content, a powerful tool to construct and organize such bases is the permutation group S4. We introduce an efficient notation for dealing with the irreducible multiplets of S4, and we highlight the merits of this treatment by exemplifying four-point functions with gauge-boson legs such as the four-gluon vertex and the light-by-light scattering amplitude. The multiplet analysis is also useful for isolating the important kinematic regions and the dynamical singularity content of such amplitudes. Our analysis serves as a basis for future efficient calculations of these and similar objects.

  1. Preparation and Properties of a Novel Microcrystalline Cellulose-Filled Composites Based on Polyamide 6/High-Density Polyethylene

    PubMed Central

    Xu, Shihua; Yi, Shunmin; He, Jun; Wang, Haigang; Fang, Yiqun; Wang, Qingwen

    2017-01-01

    In the present study, lithium chloride (LiCl) was utilized as a modifier to reduce the melting point of polyamide 6 (PA6), and then 15 wt % microcrystalline cellulose (MCC) was compounded with low melting point PA6/high-density polyethylene (HDPE) by hot pressing. Crystallization analysis revealed that as little as 3 wt % LiCl transformed the crystallographic forms of PA6 from semi-crystalline to an amorphous state (melting point: 220 °C to none), which sharply reduced the processing temperature of the composites. LiCl improved the mechanical properties of the composites, as evidenced by the fact that the impact strength of the composites was increased by 90%. HDPE increased the impact strength of PA6/MCC composites. In addition, morphological analysis revealed that incorporation of LiCl and maleic anhydride grafted high-density polyethylene (MAPE) improved the interfacial adhesion. LiCl increased the glass transition temperature of the composites (the maximum is 72.6 °C). PMID:28773169

  2. Space Subdivision in Indoor Mobile Laser Scanning Point Clouds Based on Scanline Analysis.

    PubMed

    Zheng, Yi; Peter, Michael; Zhong, Ruofei; Oude Elberink, Sander; Zhou, Quan

    2018-06-05

    Indoor space subdivision is an important aspect of scene analysis that provides essential information for many applications, such as indoor navigation and evacuation route planning. Until now, most proposed scene understanding algorithms have been based on whole point clouds, which has led to complicated operations, high computational loads and low processing speed. This paper presents novel methods to efficiently extract the location of openings (e.g., doors and windows) and to subdivide space by analyzing scanlines. An opening detection method is demonstrated that analyses the local geometric regularity in scanlines to refine the extracted openings. Moreover, a space subdivision method based on the extracted openings and the scanning system trajectory is described. Finally, the opening detection and space subdivision results are saved as point cloud labels, which will be used for further investigations. The method has been tested on a real dataset collected by ZEB-REVO. The experimental results validate the completeness and correctness of the proposed method for different indoor environments and scanning paths.

  3. Application of Hazard Analysis and Critical Control Points (HACCP) to the Cultivation Line of Mushroom and Other Cultivated Edible Fungi.

    PubMed

    Pardo, José E; de Figueirêdo, Vinícius Reis; Alvarez-Ortí, Manuel; Zied, Diego C; Peñaranda, Jesús A; Dias, Eustáquio Souza; Pardo-Giménez, Arturo

    2013-09-01

    Hazard analysis and critical control points (HACCP) is a preventive system that seeks to ensure food safety and security. It allows product protection and the correction of errors, reduces the costs derived from quality defects and reduces final overcontrol. In this paper, the system is applied to the cultivation line of mushrooms and other edible cultivated fungi. Of all the stages of the process, only the reception of covering materials (stage 1) and compost (stage 3), pre-fruiting and induction (stage 6) and the harvest (stage 7) have been considered critical control points (CCPs). The main hazards found were the presence of unauthorized phytosanitary products or products above the permitted dose (stages 6 and 7), and the presence of pathogenic bacteria (stages 1 and 3) and/or heavy metals (stage 3). The implementation of this knowledge will allow any plant dedicated to mushroom or other edible fungi cultivation to self-control its production based on the HACCP system.

  4. Data analysis using scale-space filtering and Bayesian probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Kutulakos, Kiriakos; Robinson, Peter

    1991-01-01

    This paper describes a program for the analysis of output curves from a Differential Thermal Analyzer (DTA). The program first extracts probabilistic qualitative features from the DTA curve of a soil sample, and then uses Bayesian probabilistic reasoning to infer the minerals in the soil. The qualifier module employs a simple and efficient extension of scale-space filtering suitable for handling DTA data. We have observed that points can vanish from contours in the scale-space image when filtering operations are not highly accurate. To handle the problem of vanishing points, perceptual organization heuristics are used to group the points into lines. Next, these lines are grouped into contours by using additional heuristics. Probabilities are associated with these contours using domain-specific correlations. A Bayes tree classifier processes the probabilistic features to infer the presence of different minerals in the soil. Experiments show that the algorithm that uses domain-specific correlations to infer qualitative features outperforms a domain-independent algorithm that does not.
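    The scale-space idea, smoothing the curve with Gaussians of increasing width and keeping only the features that survive at coarse scales, can be sketched as below on a synthetic DTA-like curve. The curve and scales are invented for illustration; the program described in the paper adds further heuristics for grouping the surviving points into lines and contours.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def scale_space_extrema(signal, scales):
    """Locations of zero crossings of the smoothed second derivative at each scale."""
    contours = {}
    for s in scales:
        d2 = gaussian_filter1d(signal, sigma=s, order=2)
        contours[s] = np.where(np.diff(np.sign(d2)) != 0)[0]
    return contours

# Toy DTA-like curve: two thermal events (peaks) plus noise
t = np.linspace(0, 1000, 2000)
curve = (np.exp(-((t - 350) / 30) ** 2) + 0.5 * np.exp(-((t - 620) / 20) ** 2)
         + 0.02 * np.random.default_rng(5).standard_normal(t.size))

for sigma, zcs in scale_space_extrema(curve, scales=[5, 20, 80]).items():
    print(f"sigma={sigma:>3}: {len(zcs)} zero crossings of the 2nd derivative")
# Features that persist across coarse scales are kept as qualitative peaks;
# crossings present only at fine scales are treated as noise.
```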

  5. Statistical aspects of point count sampling

    USGS Publications Warehouse

    Barker, R.J.; Sauer, J.R.; Ralph, C.J.; Sauer, J.R.; Droege, S.

    1995-01-01

    The dominant feature of point counts is that they do not census birds, but instead provide incomplete counts of individuals present within a survey plot. Considering a simple model for point count sampling, we demonstrate that use of these incomplete counts can bias estimators and testing procedures, leading to inappropriate conclusions. A large portion of the variability in point counts is caused by the incomplete counting, and this within-count variation can be confounded with ecologically meaningful variation. We recommend caution in the analysis of estimates obtained from point counts. Using our model, we also consider optimal allocation of sampling effort. The critical step in the optimization process is in determining the goals of the study and methods that will be used to meet these goals. By explicitly defining the constraints on sampling and by estimating the relationship between precision and bias of estimators and time spent counting, we can predict the optimal time at a point for each of several monitoring goals. In general, time spent at a point will differ depending on the goals of the study.

  6. Explore Stochastic Instabilities of Periodic Points by Transition Path Theory

    NASA Astrophysics Data System (ADS)

    Cao, Yu; Lin, Ling; Zhou, Xiang

    2016-06-01

    We consider the noise-induced transitions from a linearly stable periodic orbit consisting of T periodic points in the randomly perturbed discrete logistic map. Traditional large deviation theory and asymptotic analysis in the small-noise limit cannot distinguish the quantitative difference in noise-induced stochastic instabilities among the T periodic points. To attack this problem, we generalize transition path theory to discrete-time, continuous-space stochastic processes. In our first criterion to quantify the relative instability among the T periodic points, we use the distribution of the last passage location related to the transitions from the whole periodic orbit to a prescribed disjoint set. This distribution is related to the individual contributions to the transition rate from each periodic point. The second criterion is based on the competency of the transition paths associated with each periodic point. Both criteria utilize the reactive probability current of transition path theory. Our numerical results for the logistic map reveal the transition mechanism of escaping from the stable periodic orbit and identify which periodic point is more prone to lose stability so as to make successful transitions under random perturbations.

  7. Do two and three year old children use an incremental first-NP-as-agent bias to process active transitive and passive sentences?: A permutation analysis

    PubMed Central

    Chang, Franklin; Rowland, Caroline; Ferguson, Heather; Pine, Julian

    2017-01-01

    We used eye-tracking to investigate if and when children show an incremental bias to assume that the first noun phrase in a sentence is the agent (first-NP-as-agent bias) while processing the meaning of English active and passive transitive sentences. We also investigated whether children can override this bias to successfully distinguish active from passive sentences, after processing the remainder of the sentence frame. For this second question we used eye-tracking (Study 1) and forced-choice pointing (Study 2). For both studies, we used a paradigm in which participants simultaneously saw two novel actions with reversed agent-patient relations while listening to active and passive sentences. We compared English-speaking 25-month-olds and 41-month-olds in between-subjects sentence structure conditions (Active Transitive Condition vs. Passive Condition). A permutation analysis found that both age groups showed a bias to incrementally map the first noun in a sentence onto an agent role. Regarding the second question, 25-month-olds showed some evidence of distinguishing the two structures in the eye-tracking study. However, the 25-month-olds did not distinguish active from passive sentences in the forced choice pointing task. In contrast, the 41-month-old children did reanalyse their initial first-NP-as-agent bias to the extent that they clearly distinguished between active and passive sentences both in the eye-tracking data and in the pointing task. The results are discussed in relation to the development of syntactic (re)parsing. PMID:29049390
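    For readers unfamiliar with permutation analysis, the toy Python sketch below shows the basic machinery on invented data: the observed difference between two conditions is compared against the distribution of differences obtained after repeatedly shuffling condition labels. It illustrates the general logic only and is not the cluster-based analysis used in the study.

```python
import numpy as np

def permutation_test(group_a, group_b, n_perm=10000, seed=0):
    """Two-sample permutation test on the difference in means."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([group_a, group_b])
    observed = group_a.mean() - group_b.mean()
    n_a = len(group_a)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                          # relabel by shuffling the pooled data
        diff = pooled[:n_a].mean() - pooled[n_a:].mean()
        if abs(diff) >= abs(observed):
            count += 1
    return observed, (count + 1) / (n_perm + 1)      # two-sided p with small-sample correction

# Illustrative data: proportion of looks to the first-named referent per child, by condition
active = np.array([0.62, 0.58, 0.71, 0.66, 0.60, 0.68, 0.57, 0.64])
passive = np.array([0.52, 0.49, 0.61, 0.55, 0.47, 0.58, 0.50, 0.53])
diff, p = permutation_test(active, passive)
print(f"mean difference = {diff:.3f}, permutation p = {p:.4f}")
```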

  8. Detecting determinism from point processes.

    PubMed

    Andrzejak, Ralph G; Mormann, Florian; Kreuz, Thomas

    2014-12-01

    The detection of a nonrandom structure from experimental data can be crucial for the classification, understanding, and interpretation of the generating process. We here introduce a rank-based nonlinear predictability score to detect determinism from point process data. Thanks to its modular nature, this approach can be adapted to whatever signature in the data one considers indicative of deterministic structure. After validating our approach using point process signals from deterministic and stochastic model dynamics, we show an application to neuronal spike trains recorded in the brain of an epilepsy patient. While we illustrate our approach in the context of temporal point processes, it can be readily applied to spatial point processes as well.
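    A toy stand-in for a predictability score is sketched below: the inter-event intervals of a point process are delay-embedded, each next interval is predicted from the successor of its nearest neighbour, and the prediction error is compared against shuffled surrogates. This captures the flavour of detecting determinism from point process data but is not the authors' rank-based statistic.

```python
import numpy as np

def prediction_error(intervals, m=3):
    """Nearest-neighbour one-step prediction error on delay-embedded inter-event intervals."""
    X = np.array([intervals[i:i + m] for i in range(len(intervals) - m)])
    targets = intervals[m:]
    errs = []
    for i in range(len(X) - 1):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                      # exclude the self-match
        j = np.argmin(d[:-1])              # neighbour must have a successor
        errs.append(abs(targets[j] - targets[i]))
    return np.mean(errs)

rng = np.random.default_rng(2)
# deterministic point process: inter-event intervals driven by the logistic map
x, det_intervals = 0.4, []
for _ in range(500):
    x = 3.9 * x * (1 - x)
    det_intervals.append(0.5 + x)
det_intervals = np.array(det_intervals)

e_obs = prediction_error(det_intervals)
e_surr = [prediction_error(rng.permutation(det_intervals)) for _ in range(19)]
print(f"observed error {e_obs:.3f} vs median surrogate {np.median(e_surr):.3f}")
# A clearly smaller observed error indicates deterministic structure in the intervals.
```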

  9. Dissociation reactions of potassiated glucose: deionization, potassium hydroxide loss, and cross-ring dissociation

    NASA Astrophysics Data System (ADS)

    Dyakov, Y. A.; Kazaryan, M. A.; Golubkov, M. G.; Gubanova, D. P.; Asratyan, A. A.

    2018-04-01

    The photochemical properties of carbohydrates, including mono- and polysaccharides as well as various kinds of glycoproteins, proteoglycans, and glycolipids, have attracted great attention in recent decades due to their significance for clarifying the physical and chemical processes occurring in biological molecules under irradiation. Understanding excitation and ionization processes is important for the interpretation of mass spectrometric (MS) experiments, which are the main instrument for quick and reliable analysis of biological samples. While polynucleotides and simple proteins can be easily studied by standard MS techniques (MALDI, ESI, and CID), carbohydrates and complicated biomolecules containing oligosaccharide residues are difficult to ionize. Carbohydrates give a low signal yield, and their detection and analysis require special equipment and technology. Therefore, the development of new, efficient methods for the identification of carbohydrates in biological samples is currently a critical scientific and technical problem. In this work we study dissociation processes taking place in potassiated α- and β-glucose, which can be considered a model molecule for the investigation of a wide range of carbohydrates and of carbohydrate fragments of biomolecules containing a potassium ion as the ionization source. We compare the deionization process with the H2O and KOH elimination channels, as well as their competition with cross-ring dissociation processes. Potential energy surfaces were optimized by the density functional B3LYP/6-31G* method. Single point energy calculations at minima and transition state points were performed with the G3(MP2,CCSD) ab initio method.

  10. Relevance of deterministic chaos theory to studies in functioning of dynamical systems

    NASA Astrophysics Data System (ADS)

    Glagolev, S. N.; Bukhonova, S. M.; Chikina, E. D.

    2018-03-01

    The paper considers chaotic behavior of dynamical systems typical for social and economic processes. Approaches to the analysis and evaluation of system development processes are studied from the point of view of controllability and determinateness. Explanations are given for the necessity of applying non-standard mathematical tools, based on fractal theory, to describe the states of dynamical social and economic systems. Features of fractal structures, such as non-regularity, self-similarity, dimensionality and fractionality, are considered.

  11. Research status of wave energy conversion (WEC) device of raft structure

    NASA Astrophysics Data System (ADS)

    Dong, Jianguo; Gao, Jingwei; Tao, Liang; Zheng, Peng

    2017-10-01

    This paper briefly describes the concept of wave energy generation and six typical conversion devices. For the raft structure, a detailed analysis is provided, from its development process to typical devices. Taking the design process and working principle of Pelamis as an example, the general principle of the raft structure is briefly described. After that, a variety of raft structure models are introduced. Finally, the advantages and disadvantages, and the development trend, of the raft structure are pointed out.

  12. Certification-Based Process Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended and contains no flaws.

  13. Point-by-point compositional analysis for atom probe tomography.

    PubMed

    Stephenson, Leigh T; Ceguerra, Anna V; Li, Tong; Rojhirunsakool, Tanaporn; Nag, Soumya; Banerjee, Rajarshi; Cairney, Julie M; Ringer, Simon P

    2014-01-01

    This new alternate approach to data processing for analyses that traditionally employed grid-based counting methods is necessary because it removes a user-imposed coordinate system that not only limits an analysis but also may introduce errors. We have modified the widely used "binomial" analysis for APT data by replacing grid-based counting with coordinate-independent nearest neighbour identification, improving the measurements and the statistics obtained, allowing quantitative analysis of smaller datasets, and datasets from non-dilute solid solutions. It also allows better visualisation of compositional fluctuations in the data. Our modifications include: (i) using spherical k-atom blocks identified by each detected atom's first k nearest neighbours; (ii) 3D data visualisation of block composition and nearest neighbour anisotropy; and (iii) using z-statistics to directly compare experimental and expected composition curves. Similar modifications may be made to other grid-based counting analyses (contingency table, Langer-Bar-on-Miller, sinusoidal model) and could be instrumental in developing novel data visualisation options.
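    A minimal sketch of the coordinate-independent block idea is given below: the solute fraction is computed in the k-atom block centred on every detected atom, and the spread of block compositions is compared with the binomial expectation for a random solid solution. The synthetic data and the simple spread comparison are illustrative assumptions, not the authors' full z-statistic treatment.

```python
import numpy as np
from scipy.spatial import cKDTree

def block_compositions(positions, solute_flags, k=50):
    """Solute fraction in the k-atom block centred on each detected atom."""
    tree = cKDTree(positions)
    _, idx = tree.query(positions, k=k)          # each atom's k nearest atoms (incl. itself)
    return solute_flags[idx].mean(axis=1)

rng = np.random.default_rng(3)
pos = rng.uniform(0, 30, size=(20000, 3))        # toy reconstruction, nm
solute = rng.random(20000) < 0.05                # nominal 5 at.% solute, randomly placed

comp = block_compositions(pos, solute.astype(float), k=50)

# Compare the spread of block compositions with the binomial expectation for
# a random solid solution: excess spread would indicate clustering/segregation.
p0, k = solute.mean(), 50
binomial_sd = np.sqrt(p0 * (1 - p0) / k)
print(f"observed sd of block compositions: {comp.std():.4f}")
print(f"binomial (random) expectation:     {binomial_sd:.4f}")
```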

  14. The effect of coconut oil and palm oil as substituted oils to cocoa butter on chocolate bar texture and melting point

    NASA Astrophysics Data System (ADS)

    Limbardo, Rebecca Putri; Santoso, Herry; Witono, Judy Retti

    2017-05-01

    Cocoa butter acts as the dispersion medium responsible for a stable chocolate bar. For economic reasons, cocoa butter is partially or wholly substituted with edible oils such as palm oil and coconut oil. The objective of this research was to observe the effect of oil substitution in the chocolate bar on its melting point and texture. The research was divided into three steps: a preliminary study comprising fat content analysis of the cocoa powder, melting point analysis of the substituted oils and cocoa butter, and iodine number analysis of the vegetable fats (cocoa butter, coconut oil, and palm oil); chocolate bar production with substitution of 0%, 20%, 40%, 60%, 80%, and 100% wt of the cocoa butter by each substituted oil; and analysis to determine the chocolate bar melting point with DSC and the chocolate bar hardness with a texture analyser. Increasing the amount of substituted oil reduced the melting point of the chocolate bar from 33.5 °C to 31.6 °C for palm oil substitution and from 33.5 °C to 30.75 °C for coconut oil substitution. The hardness of the chocolate with palm oil ranged from 88.5 to 139 g in the 1st cycle and from 22.75 to 132 g in the 2nd cycle. The hardness of the chocolate with coconut oil ranged from 74.75 to 152.5 g in the 1st cycle and from 53.25 to 132 g in the 2nd cycle. The maximum amount of fat substitution that still produces a chocolate bar with a stable texture is 60 wt%.

  15. Teaching by research at undergraduate schools: an experience

    NASA Astrophysics Data System (ADS)

    Costa, Manuel F. M.

    1997-12-01

    In this communication I report a pedagogical experience undertaken in the 1995 Image Processing class of the Applied Physics course at the University of Minho. The learning process always requires the active, critical participation of the student in an experience that is essentially personal and that should be rewarding and fulfilling. To us scientists, virtually nothing gives more pleasure and fulfillment than the research process; furthermore, it is our main way to improve our, and I stress our, knowledge. Thus I decided to center my undergraduate students' learning of the basics of digital image processing on a simple applied research project. The proposed project was to develop an inspection process to be introduced in a generic production line: the quantity to be measured was the transversal distance between an object and the edge of the conveyor belt transporting it. The proposed measurement method was optical triangulation combined with shadow analysis. The students were given almost complete liberty and responsibility; I limited myself to assessing the development of the project, orienting them and pointing out different or pertinent points of view only when strictly necessary.

  16. The effects of on-screen, point of care computer reminders on processes and outcomes of care

    PubMed Central

    Shojania, Kaveh G; Jennings, Alison; Mayhew, Alain; Ramsay, Craig R; Eccles, Martin P; Grimshaw, Jeremy

    2014-01-01

    Background The opportunity to improve care by delivering decision support to clinicians at the point of care represents one of the main incentives for implementing sophisticated clinical information systems. Previous reviews of computer reminder and decision support systems have reported mixed effects, possibly because they did not distinguish point of care computer reminders from e-mail alerts, computer-generated paper reminders, and other modes of delivering ‘computer reminders’. Objectives To evaluate the effects on processes and outcomes of care attributable to on-screen computer reminders delivered to clinicians at the point of care. Search methods We searched the Cochrane EPOC Group Trials register, MEDLINE, EMBASE and CINAHL and CENTRAL to July 2008, and scanned bibliographies from key articles. Selection criteria Studies of a reminder delivered via a computer system routinely used by clinicians, with a randomised or quasi-randomised design and reporting at least one outcome involving a clinical endpoint or adherence to a recommended process of care. Data collection and analysis Two authors independently screened studies for eligibility and abstracted data. For each study, we calculated the median improvement in adherence to target processes of care and also identified the outcome with the largest such improvement. We then calculated the median absolute improvement in process adherence across all studies using both the median outcome from each study and the best outcome. Main results Twenty-eight studies (reporting a total of thirty-two comparisons) were included. Computer reminders achieved a median improvement in process adherence of 4.2% (interquartile range (IQR): 0.8% to 18.8%) across all reported process outcomes, 3.3% (IQR: 0.5% to 10.6%) for medication ordering, 3.8% (IQR: 0.5% to 6.6%) for vaccinations, and 3.8% (IQR: 0.4% to 16.3%) for test ordering. In a sensitivity analysis using the best outcome from each study, the median improvement was 5.6% (IQR: 2.0% to 19.2%) across all process measures and 6.2% (IQR: 3.0% to 28.0%) across measures of medication ordering. In the eight comparisons that reported dichotomous clinical endpoints, intervention patients experienced a median absolute improvement of 2.5% (IQR: 1.3% to 4.2%). Blood pressure was the most commonly reported clinical endpoint, with intervention patients experiencing a median reduction in their systolic blood pressure of 1.0 mmHg (IQR: 2.3 mmHg reduction to 2.0 mmHg increase). Authors’ conclusions Point of care computer reminders generally achieve small to modest improvements in provider behaviour. A minority of interventions showed larger effects, but no specific reminder or contextual features were significantly associated with effect magnitude. Further research must identify design features and contextual factors consistently associated with larger improvements in provider behaviour if computer reminders are to succeed on more than a trial and error basis. PMID:19588323

  17. Daily precipitation grids for Austria since 1961—development and evaluation of a spatial dataset for hydroclimatic monitoring and modelling

    NASA Astrophysics Data System (ADS)

    Hiebl, Johann; Frei, Christoph

    2018-04-01

    Spatial precipitation datasets that are long-term consistent, highly resolved and extend over several decades are an increasingly popular basis for modelling and monitoring environmental processes and planning tasks in hydrology, agriculture, energy resources management, etc. Here, we present a grid dataset of daily precipitation for Austria meant to promote such applications. It has a grid spacing of 1 km, extends back till 1961 and is continuously updated. It is constructed with the classical two-tier analysis, involving separate interpolations for mean monthly precipitation and daily relative anomalies. The former was accomplished by kriging with topographic predictors as external drift utilising 1249 stations. The latter is based on angular distance weighting and uses 523 stations. The input station network was kept largely stationary over time to avoid artefacts on long-term consistency. Example cases suggest that the new analysis is at least as plausible as previously existing datasets. Cross-validation and comparison against experimental high-resolution observations (WegenerNet) suggest that the accuracy of the dataset depends on interpretation. Users interpreting grid point values as point estimates must expect systematic overestimates for light and underestimates for heavy precipitation as well as substantial random errors. Grid point estimates are typically within a factor of 1.5 from in situ observations. Interpreting grid point values as area mean values, conditional biases are reduced and the magnitude of random errors is considerably smaller. Together with a similar dataset of temperature, the new dataset (SPARTACUS) is an interesting basis for modelling environmental processes, studying climate change impacts and monitoring the climate of Austria.
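    The angular-distance-weighting step for the daily relative anomalies can be sketched as below: each station weight combines an inverse-distance term with a directional term that down-weights stations shielded by neighbours in the same direction. The decay constant, station data and exact weight form are illustrative and do not reproduce the SPARTACUS implementation.

```python
import numpy as np

def adw_interpolate(grid_xy, stn_xy, stn_anom, decay=25.0, m=2):
    """Angular-distance-weighted interpolation of daily relative anomalies (toy version)."""
    estimates = []
    for g in grid_xy:
        d = np.linalg.norm(stn_xy - g, axis=1)
        w_dist = np.exp(-d / decay) ** m                 # distance part of the weight
        # directional part: down-weight stations that have neighbours in the same direction
        theta = np.arctan2(stn_xy[:, 1] - g[1], stn_xy[:, 0] - g[0])
        w = np.empty_like(w_dist)
        for i in range(len(stn_xy)):
            others = np.delete(np.arange(len(stn_xy)), i)
            a = 1 - np.cos(theta[others] - theta[i])
            w[i] = w_dist[i] * (1 + np.sum(w_dist[others] * a) / np.sum(w_dist[others]))
        estimates.append(np.sum(w * stn_anom) / np.sum(w))
    return np.array(estimates)

rng = np.random.default_rng(4)
stations = rng.uniform(0, 100, size=(30, 2))             # station coordinates, km
anomalies = rng.gamma(2.0, 0.5, size=30)                  # daily precipitation relative anomalies
grid = np.array([[50.0, 50.0], [10.0, 90.0]])
print(adw_interpolate(grid, stations, anomalies))
```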

  18. Operation of a sampling train for the analysis of environmental species in coal gasification gas-phase process streams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pochan, M.J.; Massey, M.J.

    1979-02-01

    This report discusses the results of actual raw product gas sampling efforts and includes: rationale for raw product gas sampling efforts; design and operation of the CMU gas sampling train; development and analysis of a sampling train data base; and conclusions and future application of results. The results of sampling activities at the CO₂-Acceptor and Hygas pilot plants proved that: the CMU gas sampling train is a valid instrument for characterization of environmental parameters in coal gasification gas-phase process streams; depending on the particular process configuration, the CMU gas sampling train can reduce gasifier effluent characterization activity to a single location in the raw product gas line; and in contrast to the slower operation of the EPA SASS Train, CMU's gas sampling train can collect representative effluent data at a rapid rate (approx. 2 points per hour) consistent with the rate of change of process variables, and thus function as a tool for process engineering-oriented analysis of environmental characteristics.

  19. Experimental study and numerical simulation of the salinity effect on water-freezing point and ice-melting rate

    NASA Astrophysics Data System (ADS)

    Qin, N.; Wu, Y.; Wang, H. W.; Wang, Y. Y.

    2017-12-01

    In this paper, against the background of snowmelt de-icing tools, we studied the effect of salt on the freezing point and melting rate of ice through laboratory tests and FLUENT numerical simulation analysis. It was confirmed that the freezing point is inversely proportional to the salt content and that, with increasing salt content, the freezing process of salt water gradually approaches the solidification behaviour of non-crystalline solids. At the same temperature, the ice melting rate increases with the salt content, following an empirical formula linking melting time with temperature and salt content. The theoretical aspects of the solid/fluid transformation are discussed in detail.

  20. Assessment of documentation requirements under DOE 5481. 1, Safety Analysis and Review System (SARS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Browne, E.T.

    1981-03-01

    This report assesses the requirements of DOE Order 5481.1, Safety Analysis and Review System for DOE Operations (SARS) in regard to maintaining SARS documentation. Under SARS, all pertinent details of the entire safety analysis and review process for each DOE operation are to be traceable from the initial identification of a hazard. This report is intended to provide assistance in identifying the points in the SARS cycle at which documentation is required, what type of documentation is most appropriate, and where it ultimately should be maintained.

  1. EMC analysis of MOS-1

    NASA Astrophysics Data System (ADS)

    Ishizawa, Y.; Abe, K.; Shirako, G.; Takai, T.; Kato, H.

    The electromagnetic compatibility (EMC) control method, system EMC analysis method, and system test method which have been applied to test the components of the MOS-1 satellite are described. The merits and demerits of the problem solving, specification, and system approaches to EMC control are summarized, as are the data requirements of the SEMCAP (specification and electromagnetic compatibility analysis program) computer program for verifying the EMI safety margin of the components. Examples of EMC design are mentioned, and the EMC design process and selection method for EMC critical points are shown along with sample EMC test results.

  2. Language Awareness as a Challenge for Business

    ERIC Educational Resources Information Center

    Hunerberg, Reinhard; Geile, Andrea

    2012-01-01

    The following contribution is a meta-analysis of the language awareness discipline from a practical application point of view. It is based on a keynote speech at the "10th International Conference of the Association for Language Awareness," and deals with the implications of business requirements for language use in communication processes. The…

  3. Web Services as Public Services: Are We Supporting Our Busiest Service Point?

    ERIC Educational Resources Information Center

    Riley-Huff, Debra A.

    2009-01-01

    This article is an analysis of academic library organizational culture, patterns, and processes as they relate to Web services. Data gathered in a research survey is examined in an attempt to reveal current departmental and administrative attitudes, practices, and support for Web services in the library research environment. (Contains 10 tables.)

  4. An Analysis of Informal Reasoning Fallacy and Critical Thinking Dispositions among Malaysian Undergraduates

    ERIC Educational Resources Information Center

    Ramasamy, Shamala

    2011-01-01

    In this information age, the amount of complex information available due to technological advancement would require undergraduates to be extremely competent in processing information systematically. Critical thinking ability of undergraduates has been the focal point among educators, employers and the public at large. One of the dimensions of…

  5. Developing a Framework of Facilitator Competencies: Lessons from the Field

    ERIC Educational Resources Information Center

    Kolb, Judith A.; Jin, Sungmi; Song, Ji Hoon

    2008-01-01

    People in organizations are increasingly called upon to serve as small group facilitators or to assist in this role. This article uses data collected from practicing facilitators at three points of time and a building block process of collection, analysis, further collection, and consolidation to develop and refine a list of competencies. A…

  6. 76 FR 4918 - Agency Information Collection Activities; Announcement of Office of Management and Budget...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-27

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2010-N-0357...; Hazard Analysis and Critical Control Point Procedures for the Safe and Sanitary Processing and Importing of Juice AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food and Drug...

  7. 78 FR 47701 - Agency Information Collection Activities; Proposed Collection; Comment Request; Procedures for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-06

    ... Fishery Products--21 CFR Part 123 (OMB Control Number 0910-0354)-- Extension FDA regulations in part 123 (21 CFR part 123) mandate the application of hazard analysis and critical control point (HACCP) principles to the processing of seafood. HACCP is a preventive system of hazard control designed to help...

  8. 75 FR 18211 - Agency Information Collection Activities; Proposed Collection; Comment Request; Procedures for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-09

    ... Importing of Fish and Fishery Products--21 CFR Part 123 (OMB Control Number 0910-0354)-- Extension FDA regulations in part 123 (21 CFR part 123) mandate the application of hazard analysis and critical control point (HACCP) principles to the processing of seafood. HACCP is a preventive system of hazard control...

  9. Sales Assistants in the Making: Learning through Responsibility

    ERIC Educational Resources Information Center

    Reegård, Kaja

    2015-01-01

    The paper investigates how learning and processes of becoming are shaped and enacted in retail apprenticeship in Norway. The analysis draws upon a qualitative study of managers and apprentices in different retail sub-sectors. The empirical point of departure is managers who, more or less deliberately, throw apprentices into tasks from day one.…

  10. Infant Stimulation and the Etiology of Cognitive Processes.

    ERIC Educational Resources Information Center

    Fowler, William

    What data, problems, and concepts are most relevant in determining the role of stimulation in human development? A critical analysis of the relationships between long term stimulation, behavior, and cognitive functioning and development points up biases and gaps in past as well as contemporary approaches. Each of the four sections of this paper…

  11. Determination of Steering Wheel Angles during CAR Alignment by Image Analysis Methods

    NASA Astrophysics Data System (ADS)

    Mueller, M.; Voegtle, T.

    2016-06-01

    Optical systems for automatic visual inspections are of increasing importance in the field of automation in the industrial domain. A new application is the determination of steering wheel angles during wheel track setting of the final inspection of car manufacturing. The camera has to be positioned outside the car to avoid interruptions of the processes and therefore oblique images of the steering wheel must be acquired. Three different computer vision approaches are considered in this paper, i.e. a 2D shape-based matching (by means of a plane to plane rectification of the oblique images and detection of a shape model with a particular rotation), a 3D shape-based matching approach (by means of a series of different perspectives of the spatial shape of the steering wheel derived from a CAD design model) and a point-to-point matching (by means of the extraction of significant elements (e.g. multifunctional buttons) of a steering wheel and a pairwise connection of these points to straight lines). The HALCON system (HALCON, 2016) was used for all software developments and necessary adaptations. As a reference, a mechanical balance with an accuracy of 0.1° was used. The quality assessment was based on two different approaches, a laboratory test and a test during the production process. In the laboratory a standard deviation of ±0.035° (2D shape-based matching), ±0.12° (3D approach) and ±0.029° (point-to-point matching) could be obtained. The field test of 291 measurements (27 cars with varying poses and angles of the steering wheel) resulted in a detection rate of 100% and standard deviations of ±0.48° (2D matching) and ±0.24° (point-to-point matching). Both methods also fulfil the real-time processing requirement (three measurements per second).
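
    The point-to-point idea above reduces, in essence, to reading the wheel rotation from the slope of a line through two extracted features. A hedged sketch follows; the pixel coordinates and reference angle are made up, and real use would take the feature points from the image-matching step.

      # Angle of the line through two steering-wheel features, relative to a reference pose.
      import math

      def wheel_angle_deg(p_left, p_right, reference_deg=0.0):
          """Return the rotation (degrees) implied by two feature points in image coordinates."""
          dx = p_right[0] - p_left[0]
          dy = p_right[1] - p_left[1]
          return math.degrees(math.atan2(dy, dx)) - reference_deg

      # hypothetical pixel coordinates of two multifunction buttons
      print(wheel_angle_deg((412.0, 305.5), (688.0, 312.1)))  # small positive rotation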

  12. High efficiency processing for reduced amplitude zones detection in the HRECG signal

    NASA Astrophysics Data System (ADS)

    Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.

    2016-04-01

    This article presents part of a broader research effort proposed for the medium to long term, with the intention of establishing a new philosophy of surface electrocardiogram analysis. This research aims to find indicators of cardiovascular disease in its early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of processing software that combines some existing techniques and incorporates novel methods for the detection of reduced amplitude zones (RAZ) in the high resolution electrocardiographic signal (HRECG). The algorithm consists of three stages: an efficient QRS detection step, an averaging filter using correlation techniques, and a RAZ detection step. Preliminary results show the efficiency of the system and point to the incorporation of new signal-analysis techniques involving the 12 leads.

  13. Identification of the critical depth-of-cut through a 2D image of the cutting region resulting from taper cutting of brittle materials

    NASA Astrophysics Data System (ADS)

    Gu, Wen; Zhu, Zhiwei; Zhu, Wu-Le; Lu, Leyao; To, Suet; Xiao, Gaobo

    2018-05-01

    An automatic identification method for obtaining the critical depth-of-cut (DoC) of brittle materials with nanometric accuracy and sub-nanometric uncertainty is proposed in this paper. With this method, a two-dimensional (2D) microscopic image of the taper cutting region is captured and further processed by image analysis to extract the margin of generated micro-cracks in the imaging plane. Meanwhile, an analytical model is formulated to describe the theoretical curve of the projected cutting points on the imaging plane with respect to a specified DoC during the whole cutting process. By adopting differential evolution algorithm-based minimization, the critical DoC can be identified by minimizing the deviation between the extracted margin and the theoretical curve. The proposed method is demonstrated through both numerical simulation and experimental analysis. Compared with conventional 2D- and 3D-microscopic-image-based methods, determination of the critical DoC in this study uses the envelope profile rather than the onset point of the generated cracks, providing a more objective approach with smaller uncertainty.
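
    The identification step above amounts to a one-parameter minimisation, for which a differential evolution sketch is given below. The margin data, the form of the theoretical curve and the parameter bounds are placeholders, not the authors' implementation.

      # Estimate a critical depth-of-cut by minimising the deviation between a measured
      # crack margin and a theoretical projected curve (both synthetic here).
      import numpy as np
      from scipy.optimize import differential_evolution

      x = np.linspace(0.0, 1.0, 200)                       # position along the taper cut (mm)

      def theoretical_curve(doc_nm):                       # assumed model of the projected margin
          return doc_nm * 1e-3 * np.sqrt(x + 0.05)

      rng = np.random.default_rng(0)
      measured_margin = theoretical_curve(85.0) + rng.normal(0, 1e-3, x.size)  # synthetic data

      def deviation(params):
          return np.sum((measured_margin - theoretical_curve(params[0])) ** 2)

      result = differential_evolution(deviation, bounds=[(10.0, 200.0)], seed=1)
      print("estimated critical DoC (nm):", result.x[0])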

  14. Phase-plane analysis of the totally asymmetric simple exclusion process with binding kinetics and switching between antiparallel lanes

    PubMed Central

    Kuan, Hui-Shun; Betterton, Meredith D.

    2016-01-01

    Motor protein motion on biopolymers can be described by models related to the totally asymmetric simple exclusion process (TASEP). Inspired by experiments on the motion of kinesin-4 motors on antiparallel microtubule overlaps, we analyze a model incorporating the TASEP on two antiparallel lanes with binding kinetics and lane switching. We determine the steady-state motor density profiles using phase-plane analysis of the steady-state mean field equations and kinetic Monte Carlo simulations. We focus on the density-density phase plane, where we find an analytic solution to the mean field model. By studying the phase-space flows, we determine the model’s fixed points and their changes with parameters. Phases previously identified for the single-lane model occur for low switching rate between lanes. We predict a multiple coexistence phase due to additional fixed points that appear as the switching rate increases: switching moves motors from the higher-density to the lower-density lane, causing local jamming and creating multiple domain walls. We determine the phase diagram of the model for both symmetric and general boundary conditions. PMID:27627345

  15. Complexity and multifractal behaviors of multiscale-continuum percolation financial system for Chinese stock markets

    NASA Astrophysics Data System (ADS)

    Zeng, Yayun; Wang, Jun; Xu, Kaixuan

    2017-04-01

    A new financial agent-based time series model is developed and investigated by multiscale-continuum percolation system, which can be viewed as an extended version of continuum percolation system. In this financial model, for different parameters of proportion and density, two Poisson point processes (where the radii of points represent the ability of receiving or transmitting information among investors) are applied to model a random stock price process, in an attempt to investigate the fluctuation dynamics of the financial market. To validate its effectiveness and rationality, we compare the statistical behaviors and the multifractal behaviors of the simulated data derived from the proposed model with those of the real stock markets. Further, the multiscale sample entropy analysis is employed to study the complexity of the returns, and the cross-sample entropy analysis is applied to measure the degree of asynchrony of return autocorrelation time series. The empirical results indicate that the proposed financial model can simulate and reproduce some significant characteristics of the real stock markets to a certain extent.
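
    The building block of the model above is a Poisson point process whose points carry radii. A small sketch is given below; the densities and radii are invented illustration values, not parameters from the paper.

      # Two marked Poisson point processes in the unit square; each point carries a radius
      # standing for an investor's ability to receive or transmit information.
      import numpy as np

      def poisson_points(density, radius, size=1.0, rng=None):
          rng = rng or np.random.default_rng()
          n = rng.poisson(density * size * size)           # Poisson-distributed point count
          xy = rng.uniform(0.0, size, (n, 2))
          return xy, np.full(n, radius)

      rng = np.random.default_rng(42)
      receivers, r_recv = poisson_points(density=200, radius=0.03, rng=rng)
      transmitters, r_trans = poisson_points(density=50, radius=0.06, rng=rng)
      print(len(receivers), "receivers,", len(transmitters), "transmitters")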

  16. An eye-tracking paradigm for analyzing the processing time of sentences with different linguistic complexities.

    PubMed

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics.
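
    As a rough illustration of the analysis described above, the sketch below averages a binary target-fixation indicator over trials and reads off a "decision moment" when the curve first exceeds a threshold; the trial matrix, sampling interval and threshold are all hypothetical.

      # Average tendency to fixate the target picture over time, and a simple decision moment.
      import numpy as np

      rng = np.random.default_rng(3)
      # 40 trials x 300 time samples: probability of fixating the target grows over the sentence
      fix_on_target = rng.random((40, 300)) < np.linspace(0.4, 0.9, 300)
      p_target = fix_on_target.mean(axis=0)                # fixation-tendency curve over time

      threshold = 0.75
      above = np.nonzero(p_target > threshold)[0]
      decision_moment = above[0] * 10 if above.size else None   # assuming 10 ms per sample
      print("decision moment (ms):", decision_moment)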

  17. An Eye-Tracking Paradigm for Analyzing the Processing Time of Sentences with Different Linguistic Complexities

    PubMed Central

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics. PMID:24950184

  18. Investigation of Polyurethane Electrospinning Process Efficiency

    NASA Astrophysics Data System (ADS)

    Kimmer, Dusan; Zatloukal, Martin; Petras, David; Vincent, Ivo; Slobodian, Petr

    2009-07-01

    The electrospinning process efficiency of different PUs has been investigated. Specific attention has been paid to understanding the role of PU soft segments and synthesis type in the stability of the PU solution and the electrospinning process, as well as in the quality/property changes of the produced nanofibres. PU samples before and after the process were analyzed rheologically, and relaxation spectra were determined for all of them from frequency-dependent loss and storage moduli measurements. It has been found that rheological analysis of the PU used for electrospinning can be a useful tool for assessing and optimizing electrospinning process efficiency. The homogeneity of nanolayers produced over several hours of manufacture with the optimized electrospinning process is demonstrated by selected aerosol filtration properties.

  19. Effect of processing on Polymer/Composite structure and properties

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Advances in the vitality and economic health of the field of polymer forecasting are discussed. A consistent and rational point of view which considers processing as a participant in the underlying triad of relationships which comprise materials science and engineering is outlined. This triad includes processing as it influences material structure, and ultimately properties. Methods were developed in processing-structure-property relations, polymer science and engineering, polymer chemistry and synthesis, structure modification and optimization through processing, and melt-flow modeling of processing-structure-property relations of polymers. Mechanical properties of composites are considered, and biomedical materials research is extended to include polymer processing effects. An analysis of the design technology of advanced graphite/epoxy composites is also reported.

  20. Spatial Bayesian Latent Factor Regression Modeling of Coordinate-based Meta-analysis Data

    PubMed Central

    Montagna, Silvia; Wager, Tor; Barrett, Lisa Feldman; Johnson, Timothy D.; Nichols, Thomas E.

    2017-01-01

    Now over 20 years old, functional MRI (fMRI) has a large and growing literature that is best synthesised with meta-analytic tools. As most authors do not share image data, only the peak activation coordinates (foci) reported in the paper are available for Coordinate-Based Meta-Analysis (CBMA). Neuroimaging meta-analysis is used to 1) identify areas of consistent activation; and 2) build a predictive model of task type or cognitive process for new studies (reverse inference). To simultaneously address these aims, we propose a Bayesian point process hierarchical model for CBMA. We model the foci from each study as a doubly stochastic Poisson process, where the study-specific log intensity function is characterised as a linear combination of a high-dimensional basis set. A sparse representation of the intensities is guaranteed through latent factor modeling of the basis coefficients. Within our framework, it is also possible to account for the effect of study-level covariates (meta-regression), significantly expanding the capabilities of the current neuroimaging meta-analysis methods available. We apply our methodology to synthetic data and neuroimaging meta-analysis datasets. PMID:28498564
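
    To make the intensity construction above concrete, the sketch below builds a study-specific log-intensity as a linear combination of Gaussian basis functions on one coordinate axis and draws foci from the resulting inhomogeneous Poisson process. The basis centres, weights and axis are invented, and the sketch is not the authors' hierarchical model.

      # Log intensity = linear combination of basis functions; foci drawn per bin as Poisson counts.
      import numpy as np

      x = np.linspace(-60, 60, 241)                        # e.g. one brain coordinate axis (mm)
      centres = np.array([-30.0, 0.0, 35.0])
      weights = np.array([1.2, 0.4, 0.9])
      basis = np.exp(-0.5 * ((x[:, None] - centres[None, :]) / 10.0) ** 2)
      log_intensity = -3.0 + basis @ weights               # linear combination of basis functions
      intensity = np.exp(log_intensity)

      rng = np.random.default_rng(7)
      dx = x[1] - x[0]
      counts = rng.poisson(intensity * dx)                 # Poisson counts per small bin
      foci = np.repeat(x, counts)                          # simulated activation foci
      print(len(foci), "simulated foci")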

  1. Development of Optical System for ARGO-M

    NASA Astrophysics Data System (ADS)

    Nah, Jakyoung; Jang, Jung-Guen; Jang, Bi-Ho; Han, In-Woo; Han, Jeong-Yeol; Park, Kwijong; Lim, Hyung-Chul; Yu, Sung-Yeol; Park, Eunseo; Seo, Yoon-Kyung; Moon, Il-Kwon; Choi, Byung-Kyu; Na, Eunjoo; Nam, Uk-Won

    2013-03-01

    ARGO-M is a satellite laser ranging (SLR) system developed by the Korea Astronomy and Space Science Institute with the consideration of mobility and daytime and nighttime satellite observation. The ARGO-M optical system consists of 40 cm receiving telescope, 10 cm transmitting telescope, and detecting optics. For the development of ARGO-M optical system, the structural analysis was performed with regard to the optics and optomechanics design and the optical components. To ensure the optical performance, the quality was tested at the level of parts using the laser interferometer and ultra-high-precision measuring instruments. The assembly and alignment of ARGO-M optical system were conducted at an auto-collimation facility. As the transmission and reception are separated in the ARGO-M optical system, the pointing alignment between the transmitting telescope and receiving telescope is critical for precise target pointing. Thus, the alignment using the ground target and the radiant point observation of transmitting laser beam was carried out, and the lines of sight for the two telescopes were aligned within the required pointing precision. This paper describes the design, structural analysis, manufacture and assembly of parts, and entire process related with the alignment for the ARGO-M optical system.

  2. Anatomy-based image processing analysis of the running pattern of the perioral artery for minimally invasive surgery.

    PubMed

    Lee, Sang-Hee; Lee, Minho; Kim, Hee-Jin

    2014-10-01

    We aimed to elucidate the tortuous course of the perioral artery with the aid of image processing, and to suggest accurate reference points for minimally invasive surgery. We used 59 hemifaces from 19 Korean and 20 Thai cadavers. A perioral line was defined to connect the point at which the facial artery emerged on the mandibular margin, and the ramification point of the lateral nasal artery and the inferior alar branch. The course of the perioral artery was reproduced as a graph based on the perioral line and analysed by adding the image of the artery using MATLAB. The course of the artery could be classified into 2 according to the course of the alar branch - oblique and vertical. Two distinct inflection points appeared in the course of the artery along the perioral line at the ramification points of the alar branch and the inferior labial artery, respectively, and the course of the artery across the face can be predicted based on the following references: the perioral line, the ramification point of the alar branch (5∼10 mm medial to the perioral line at the level of the lower third of the upper lip) and the inferior labial artery (5∼10 mm medial to the perioral line at the level of the middle of the lower lip). Copyright © 2014 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  3. Semantic point cloud interpretation based on optimal neighborhoods, relevant features and efficient classifiers

    NASA Astrophysics Data System (ADS)

    Weinmann, Martin; Jutzi, Boris; Hinz, Stefan; Mallet, Clément

    2015-07-01

    3D scene analysis in terms of automatically assigning 3D points a respective semantic label has become a topic of great importance in photogrammetry, remote sensing, computer vision and robotics. In this paper, we address the issue of how to increase the distinctiveness of geometric features and select the most relevant ones among these for 3D scene analysis. We present a new, fully automated and versatile framework composed of four components: (i) neighborhood selection, (ii) feature extraction, (iii) feature selection and (iv) classification. For each component, we consider a variety of approaches which allow applicability in terms of simplicity, efficiency and reproducibility, so that end-users can easily apply the different components and do not require expert knowledge in the respective domains. In a detailed evaluation involving 7 neighborhood definitions, 21 geometric features, 7 approaches for feature selection, 10 classifiers and 2 benchmark datasets, we demonstrate that the selection of optimal neighborhoods for individual 3D points significantly improves the results of 3D scene analysis. Additionally, we show that the selection of adequate feature subsets may even further increase the quality of the derived results while significantly reducing both processing time and memory consumption.
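
    One ingredient of such frameworks is the computation of eigenvalue-based geometric features from a local neighbourhood; a sketch under assumed choices (k-nearest neighbours, a synthetic point cloud) follows.

      # Linearity, planarity and sphericity from the covariance eigenvalues of a k-NN neighbourhood.
      import numpy as np
      from scipy.spatial import cKDTree

      points = np.random.default_rng(0).normal(size=(1000, 3))
      tree = cKDTree(points)

      def eigen_features(p, k=20):
          _, idx = tree.query(p, k=k)                      # k nearest neighbours of point p
          nbrs = points[idx] - points[idx].mean(axis=0)
          evals = np.linalg.eigvalsh(np.cov(nbrs.T))[::-1] # eigenvalues sorted l1 >= l2 >= l3
          l1, l2, l3 = np.maximum(evals, 1e-12)
          return (l1 - l2) / l1, (l2 - l3) / l1, l3 / l1   # linearity, planarity, sphericity

      print(eigen_features(points[0]))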

  4. ICESat Laser Altimeter Pointing, Ranging and Timing Calibration from Integrated Residual Analysis: A Summary of Early Mission Results

    NASA Technical Reports Server (NTRS)

    Lutchke, Scott B.; Rowlands, David D.; Harding, David J.; Bufton, Jack L.; Carabajal, Claudia C.; Williams, Teresa A.

    2003-01-01

    On January 12, 2003 the Ice, Cloud and land Elevation Satellite (ICESat) was successfully placed into orbit. The ICESat mission carries the Geoscience Laser Altimeter System (GLAS), which consists of three near-infrared lasers that operate at 40 short pulses per second. The instrument has collected precise elevation measurements of the ice sheets, sea ice roughness and thickness, ocean and land surface elevations and surface reflectivity. The accurate geolocation of GLAS's surface returns, the spots from which the laser energy reflects on the Earth's surface, is a critical issue in the scientific application of these data. Pointing, ranging, timing and orbit errors must be compensated to accurately geolocate the laser altimeter surface returns. Towards this end, the laser range observations can be fully exploited in an integrated residual analysis to accurately calibrate these geolocation/instrument parameters. Early mission ICESat data have been simultaneously processed as direct altimetry from ocean sweeps along with dynamic crossovers, resulting in a preliminary calibration of laser pointing, ranging and timing. The calibration methodology and early mission analysis results are summarized in this paper along with future calibration activities.

  5. Foveal analysis and peripheral selection during active visual sampling

    PubMed Central

    Ludwig, Casimir J. H.; Davies, J. Rhys; Eckstein, Miguel P.

    2014-01-01

    Human vision is an active process in which information is sampled during brief periods of stable fixation in between gaze shifts. Foveal analysis serves to identify the currently fixated object and has to be coordinated with a peripheral selection process of the next fixation location. Models of visual search and scene perception typically focus on the latter, without considering foveal processing requirements. We developed a dual-task noise classification technique that enables identification of the information uptake for foveal analysis and peripheral selection within a single fixation. Human observers had to use foveal vision to extract visual feature information (orientation) from different locations for a psychophysical comparison. The selection of to-be-fixated locations was guided by a different feature (luminance contrast). We inserted noise in both visual features and identified the uptake of information by looking at correlations between the noise at different points in time and behavior. Our data show that foveal analysis and peripheral selection proceeded completely in parallel. Peripheral processing stopped some time before the onset of an eye movement, but foveal analysis continued during this period. Variations in the difficulty of foveal processing did not influence the uptake of peripheral information and the efficacy of peripheral selection, suggesting that foveal analysis and peripheral selection operated independently. These results provide important theoretical constraints on how to model target selection in conjunction with foveal object identification: in parallel and independently. PMID:24385588

  6. Accuracy assessment/validation methodology and results of 2010–11 land-cover/land-use data for Pools 13, 26, La Grange, and Open River South, Upper Mississippi River System

    USGS Publications Warehouse

    Jakusz, J.W.; Dieck, J.J.; Langrehr, H.A.; Ruhser, J.J.; Lubinski, S.J.

    2016-01-11

    Similar to an AA, validation involves generating random points based on the total area for each map class. However, instead of collecting field data, two or three individuals not involved with the photo-interpretative mapping separately review each of the points onscreen and record a best-fit vegetation type(s) for each site. Once the individual analyses are complete, results are joined together and a comparative analysis is performed. The objective of this initial analysis is to identify areas where the validation results were in agreement (matches) and areas where validation results were in disagreement (mismatches). The two or three individuals then perform an analysis, looking at each mismatched site, and agree upon a final validation class. (If two vegetation types at a specific site appear to be equally prevalent, the validation team is permitted to assign the site two best-fit vegetation types.) Following the validation team’s comparative analysis of vegetation assignments, the data are entered into a database and compared to the mappers’ vegetation assignments. Agreements and disagreements between the map and validation classes are identified, and a contingency table is produced. This document presents the AA processes/results for Pools 13 and La Grange, as well as the validation process/results for Pools 13 and 26 and Open River South.

  7. Advanced analysis of forest fire clustering

    NASA Astrophysics Data System (ADS)

    Kanevski, Mikhail; Pereira, Mario; Golay, Jean

    2017-04-01

    Analysis of point pattern clustering is an important topic in spatial statistics and for many applications: biodiversity, epidemiology, natural hazards, geomarketing, etc. There are several fundamental approaches used to quantify spatial data clustering using topological, statistical and fractal measures. In the present research, the recently introduced multi-point Morisita index (mMI) is applied to study the spatial clustering of forest fires in Portugal. The data set consists of more than 30000 fire events covering the time period from 1975 to 2013. The distribution of forest fires is very complex and highly variable in space. mMI is a multi-point extension of the classical two-point Morisita index. In essence, mMI is estimated by covering the region under study by a grid and by computing how many times more likely it is that m points selected at random will be from the same grid cell than it would be in the case of a complete random Poisson process. By changing the number of grid cells (size of the grid cells), mMI characterizes the scaling properties of spatial clustering. From mMI, the data intrinsic dimension (fractal dimension) of the point distribution can be estimated as well. In this study, the mMI of forest fires is compared with the mMI of random patterns (RPs) generated within the validity domain defined as the forest area of Portugal. It turns out that the forest fires are highly clustered inside the validity domain in comparison with the RPs. Moreover, they demonstrate different scaling properties at different spatial scales. The results obtained from the mMI analysis are also compared with those of fractal measures of clustering - box counting and sand box counting approaches. REFERENCES Golay J., Kanevski M., Vega Orozco C., Leuenberger M., 2014: The multipoint Morisita index for the analysis of spatial patterns. Physica A, 406, 191-202. Golay J., Kanevski M. 2015: A new estimator of intrinsic dimension based on the multipoint Morisita index. Pattern Recognition, 48, 4070-4081.
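
    Following the verbal definition of the mMI given above, a hedged sketch is shown below: points are binned into a grid of Q cells and the m-point index is the ratio of same-cell co-occurrence to its expectation under complete spatial randomness. The grid size, the order m and the synthetic point pattern are arbitrary illustration choices.

      # Multi-point Morisita index on a regular grid over the unit square.
      import numpy as np

      def multipoint_morisita(xy, m=2, n_cells_per_side=10):
          bins = np.linspace(0.0, 1.0, n_cells_per_side + 1)
          counts, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=[bins, bins])
          n = counts.ravel()                               # points per grid cell
          N, Q = n.sum(), n.size
          falling = np.prod([n - j for j in range(m)], axis=0)   # n(n-1)...(n-m+1) per cell
          num = falling.sum()
          den = np.prod([N - j for j in range(m)])               # N(N-1)...(N-m+1)
          return Q ** (m - 1) * num / den

      xy = np.random.default_rng(1).random((2000, 2))      # CSR pattern in the unit square
      print(multipoint_morisita(xy, m=3))                  # close to 1 under complete randomness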

  8. Acoustic emission monitoring of concrete columns and beams strengthened with fiber reinforced polymer sheets

    NASA Astrophysics Data System (ADS)

    Ma, Gao; Li, Hui; Zhou, Wensong; Xian, Guijun

    2012-04-01

    Acoustic emission (AE) technique is an effective method in the nondestructive testing (NDT) field of civil engineering. During the last two decades, fiber reinforced polymer (FRP) has been widely used in repairing and strengthening concrete structures. The damage state of FRP strengthened concrete structures has become an important issue during the service period of the structure, and it is worthwhile to use the AE technique as a nondestructive method to assess that damage state. The present study reports AE monitoring results of axial compression tests carried out on basalt fiber reinforced polymer (BFRP) confined concrete columns and three-point-bending tests carried out on BFRP reinforced concrete beams. AE parameter analysis was first utilized to give preliminary results on the concrete fracture process of these specimens. It was found that cumulative AE events can reflect the fracture development trend of both BFRP confined concrete columns and BFRP strengthened concrete beams, and that AE events increased abruptly at the point of BFRP breakage. The fracture process of BFRP confined concrete columns and BFRP strengthened concrete beams was then studied through RA value-average frequency analysis. The RA value-average frequency tendencies of BFRP confined concrete were found to differ from those of BFRP strengthened concrete beams. The variation tendency of concrete crack patterns during the loading process was revealed.

  9. A comprehensive multiphonon spectral analysis in MoS2

    NASA Astrophysics Data System (ADS)

    Livneh, Tsachi; Spanier, Jonathan E.

    2015-09-01

    We present a comprehensive multiphonon Raman and complementary infrared analysis for bulk and monolayer MoS2. For the bulk the analysis consists of symmetry assignment from which we obtain a broad set of allowed second-order transitions at the high symmetry M, K and Γ Brillouin zone (BZ) points. The attribution of about 80 transitions of up to fifth order processes is proposed in the low temperature (95 K) resonant Raman spectrum measured with excitation energy of 1.96 eV, which is slightly shifted in energy from the A exciton. We propose that the main contributions come from four phonons: A1g (M), E12g (M1), E22g (M1) (TA′ (M)) and E22g (M2) (LA′ (M)). The last three are single degenerate phonons at M with an origin of the E12g (Γ) and E22g (Γ) phonons. Among the four phonons, we identify in the resonant Raman spectra all (but one) of the second-order overtones, combination and difference-bands and many of the third order bands. Consistent with the expectation that at the M point only combinations with the same inversion symmetry (g or u) are Raman-allowed, the contribution of combinations with the longitudinal acoustic (LA(M)) mode cannot be considered with the above four phonons. Although minor, contributions from K-point and possibly Γ-point phonons are also evident. The ‘2LA band’, measured at ~460 cm^-1, is reassigned. Supported by the striking similarity between this band, measured under off-resonant conditions, and a recently published two-phonon density of states, we explain the lower part of the band, previously attributed to 2LA(M), as being due to a van Hove singularity between K and M. The higher part, previously attributed exclusively to the A2u (Γ) phonon, is mostly due to the LA and LA′ phonons at M. For the monolayer MoS2 the second-order phonon processes from the M and Γ BZ points are also analyzed and discussed within a similar framework to that of the bulk.

  10. Multilevel principal component analysis (mPCA) in shape analysis: A feasibility study in medical and dental imaging.

    PubMed

    Farnell, D J J; Popat, H; Richmond, S

    2016-06-01

    Methods used in image processing should reflect any multilevel structures inherent in the image dataset or they run the risk of functioning inadequately. We wish to test the feasibility of multilevel principal components analysis (PCA) to build active shape models (ASMs) for cases relevant to medical and dental imaging. Multilevel PCA was used to carry out model fitting to sets of landmark points and it was compared to the results of "standard" (single-level) PCA. Proof of principle was tested by applying mPCA to model basic peri-oral expressions (happy, neutral, sad) approximated to the junction between the mouth/lips. Monte Carlo simulations were used to create this data which allowed exploration of practical implementation issues such as the number of landmark points, number of images, and number of groups (i.e., "expressions" for this example). To further test the robustness of the method, mPCA was subsequently applied to a dental imaging dataset utilising landmark points (placed by different clinicians) along the boundary of mandibular cortical bone in panoramic radiographs of the face. Changes of expression that varied between groups were modelled correctly at one level of the model and changes in lip width that varied within groups at another for the Monte Carlo dataset. Extreme cases in the test dataset were modelled adequately by mPCA but not by standard PCA. Similarly, variations in the shape of the cortical bone were modelled by one level of mPCA and variations between the experts at another for the panoramic radiographs dataset. Results for mPCA were found to be comparable to those of standard PCA for point-to-point errors via miss-one-out testing for this dataset. These errors reduce with increasing number of eigenvectors/values retained, as expected. We have shown that mPCA can be used in shape models for dental and medical image processing. mPCA was found to provide more control and flexibility when compared to standard "single-level" PCA. Specifically, mPCA is preferable to "standard" PCA when multiple levels occur naturally in the dataset. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
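
    For context, the sketch below builds an ordinary single-level PCA shape model from landmark configurations (flatten, centre, decompose, reconstruct from a few modes); it does not reproduce the multilevel method of the paper, and the landmark data are synthetic.

      # Single-level PCA shape model: mean shape plus a few modes of variation.
      import numpy as np

      rng = np.random.default_rng(5)
      landmarks = rng.normal(size=(50, 12, 2))             # 50 shapes, 12 (x, y) landmark points
      X = landmarks.reshape(50, -1)
      mean_shape = X.mean(axis=0)
      U, s, Vt = np.linalg.svd(X - mean_shape, full_matrices=False)

      modes = Vt[:3]                                       # first three modes of shape variation
      b = (X[0] - mean_shape) @ modes.T                    # shape parameters of the first example
      reconstruction = mean_shape + b @ modes              # approximate shape from three modes
      print(np.abs(reconstruction - X[0]).max())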

  11. Comparison of the basin-scale effect of dredging operations and natural estuarine processes on suspended sediment concentration

    USGS Publications Warehouse

    Schoellhamer, D.H.

    2002-01-01

    Suspended sediment concentration (SSC) data from San Pablo Bay, California, were analyzed to compare the basin-scale effect of dredging and disposal of dredged material (dredging operations) and natural estuarine processes. The analysis used twelve 3-wk to 5-wk periods of mid-depth and near-bottom SSC data collected at Point San Pablo every 15 min from 1993-1998. Point San Pablo is within a tidal excursion of a dredged-material disposal site. The SSC data were compared to dredging volume, Julian day, and hydrodynamic and meteorological variables that could affect SSC. Kendall's τ, Spearman's ρ, and weighted (by the fraction of valid data in each period) Spearman's ρw correlation coefficients of the variables indicated which variables were significantly correlated with SSC. Wind-wave resuspension had the greatest effect on SSC. Median water-surface elevation was the primary factor affecting mid-depth SSC. Greater depths inhibit wind-wave resuspension of bottom sediment and indicate greater influence of less turbid water from down estuary. Seasonal variability in the supply of erodible sediment is the primary factor affecting near-bottom SSC. Natural physical processes in San Pablo Bay are more areally extensive, of equal or longer duration, and as frequent as dredging operations (when occurring), and they affect SSC at the tidal time scale. Natural processes control SSC at Point San Pablo even when dredging operations are occurring.
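
    The rank-correlation screening described above can be reproduced in outline as follows; the series are synthetic placeholders rather than the San Pablo Bay data.

      # Kendall's tau and Spearman's rho between SSC and a candidate explanatory variable.
      import numpy as np
      from scipy.stats import kendalltau, spearmanr

      rng = np.random.default_rng(2)
      wind_speed = rng.gamma(2.0, 2.0, 200)
      ssc = 5.0 * wind_speed + rng.normal(0, 4.0, 200)     # wind-wave resuspension driving SSC

      tau, p_tau = kendalltau(wind_speed, ssc)
      rho, p_rho = spearmanr(wind_speed, ssc)
      print(f"Kendall tau={tau:.2f} (p={p_tau:.1e}), Spearman rho={rho:.2f} (p={p_rho:.1e})")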

  12. Process analysis of recycled thermoplasts from consumer electronics by laser-induced plasma spectroscopy.

    PubMed

    Fink, Herbert; Panne, Ulrich; Niessner, Reinhard

    2002-09-01

    An experimental setup for direct elemental analysis of recycled thermoplasts from consumer electronics by laser-induced plasma spectroscopy (LIPS, or laser-induced breakdown spectroscopy, LIBS) was realized. The combination of an echelle spectrograph, featuring high resolution with broad spectral coverage, with multivariate methods, such as PLS, PCR, and variable subset selection via a genetic algorithm, resulted in considerable improvements in selectivity and sensitivity for this complex matrix. With a normalization to carbon as internal standard, the limits of detection were in the ppm range. A preliminary pattern recognition study points to the possibility of polymer recognition via the line-rich echelle spectra. Several experiments at an extruder within a recycling plant demonstrated successfully the capability of LIPS for different kinds of routine on-line process analysis.

  13. Automated SEM Modal Analysis Applied to the Diogenites

    NASA Technical Reports Server (NTRS)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  14. On-Line Robust Modal Stability Prediction using Wavelet Processing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Lind, Rick

    1998-01-01

    Wavelet analysis for filtering and system identification has been used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins is reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data is used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability are also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. The F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrates improved robust stability prediction by extension of the stability boundary beyond the flight regime. Guidelines and computation times are presented to show the efficiency and practical aspects of these procedures for on-line implementation. Feasibility of the method is shown for processing flight data from time-varying nonstationary test points.

  15. Evaluation and testing of image quality of the Space Solar Extreme Ultraviolet Telescope

    NASA Astrophysics Data System (ADS)

    Peng, Jilong; Yi, Zhong; Zhou, Shuhong; Yu, Qian; Hou, Yinlong; Wang, Shanshan

    2018-01-01

    For the space solar extreme ultraviolet telescope, the star point test cannot be performed in the x-ray band (19.5 nm band) because no light source is bright enough. In this paper, the point spread function of the optical system is calculated to evaluate the imaging performance of the telescope system. Taking into account the actual processing surface errors, such as those from small-grinding-head processing and magnetorheological processing, the optical design software Zemax and the data analysis software Matlab are used to directly calculate the system point spread function of the space solar extreme ultraviolet telescope. Matlab code is written to generate the required surface error grid data. These surface error data are loaded onto the specified surface of the telescope system using the DDE (Dynamic Data Exchange) communication technique, which connects Zemax and Matlab. As different processing methods lead to surface errors of different size, distribution and spatial frequency, their impact on imaging also differs. Therefore, the characteristics of the surface errors of different machining methods are studied. By combining each error's position in the optical system with simulations of its influence on image quality, the processing technology can be chosen appropriately. Additionally, we have also analyzed the relationship between the surface error and the image quality evaluation. To ensure that the final processing of the mirror meets the image quality requirements, one or several methods should be chosen to evaluate the surface error according to its different spatial frequency characteristics.

  16. On the performance of metrics to predict quality in point cloud representations

    NASA Astrophysics Data System (ADS)

    Alexiou, Evangelos; Ebrahimi, Touradj

    2017-09-01

    Point clouds are a promising alternative for immersive representation of visual contents. Recently, an increased interest has been observed in the acquisition, processing and rendering of this modality. Although subjective and objective evaluations are critical in order to assess the visual quality of media content, they still remain open problems for point cloud representation. In this paper we focus our efforts on subjective quality assessment of point cloud geometry, subject to typical types of impairments such as noise corruption and compression-like distortions. In particular, we propose a subjective methodology that is closer to real-life scenarios of point cloud visualization. The performance of the state-of-the-art objective metrics is assessed by considering the subjective scores as the ground truth. Moreover, we investigate the impact of adopting different test methodologies by comparing them. Advantages and drawbacks of every approach are reported, based on statistical analysis. The results and conclusions of this work provide useful insights that could be considered in future experimentation.

  17. [A descriptive analysis of cognitive activities involved in locating fishing points: In the case of fishermen of Toyoshima Island, Hiroshima Prefecture].

    PubMed

    Sawada, H

    1995-10-01

    This study aimed at descriptive understanding of traditional methods involved in locating fishing points and navigating to them in the sea, and investigate associated cognitive activities. Participant observations and interviews were conducted for more than 30 fishermen who employed hand-line or long-line fishing methods near Toyoshima Island, Hiroshima Prefecture. The main findings were: (1) Fishermen readily perceived environmental cues when locating fishing points, which enabled them to navigate to a correct point on the sea. (2) Their memory of fishing points was not verbal, but visual, directly tied to the cue perception, and was constantly renewed during fishing activities. (3) They grasped configurations of various natural conditions (e.g., swiftness of the tide, surface structure of the sea bottom) through tactile information from the fishing line, and comprehended their surroundings with accumulated knowledge and inductive inferences. And (4) their cognitive processes of perception, memory, and understanding were functionally coordinated in the series of fishing work.

  18. [Quality control in herbal supplements].

    PubMed

    Oelker, Luisa

    2005-01-01

    Quality and safety of food and herbal supplements are the result of a combination of different elements, such as good manufacturing practice and process control. The process control must be active and able to identify and correct all possible hazards. The main and most widely used instrument is the hazard analysis critical control point (HACCP) system, the correct application of which can guarantee the safety of the product. Herbal supplements need, in addition to standard quality control, a set of checks to assure the harmlessness and safety of the plants used.

  19. Some Applications of Fuzzy Sets and the Analytical Hierarchy Process to Decision Making.

    DTIC Science & Technology

    1984-09-01

    presented in Figure 1.1. Another point of view is that of Kellerman [Ref. 2] whose analysis is concerned with the conflicts, needs and personality traits of...iii). The personality traits that influence the decision making process and differ in each individual are the following: his tolerance for ambiguity or...a person with high tolerance for ambiguity is more likely to be patient in evaluating or collecting information before taking action. On the contrary

  20. Users guide to E859 phoswich analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costales, J.B.

    1992-11-30

    In this memo the authors describe the analysis path used to transform the phoswich data from raw data banks into cross sections suitable for publication. The primary purpose of this memo is not to document each analysis step in great detail but rather to point the reader to the fortran code used and to point out the essential features of the analysis path. A flow chart which summarizes the various steps performed to massage the data from beginning to end is given. In general, each step corresponds to a fortran program which was written to perform that particular task. The automation of the data analysis has been kept purposefully minimal in order to ensure the highest quality of the final product. However, tools have been developed which ease the non-automated steps. There are two major parallel routes for the data analysis: data reduction and acceptance determination using detailed GEANT Monte Carlo simulations. In this memo, the authors will first describe the data reduction up to the point where PHAD banks (Pass 1-like banks) are created. They will then describe the steps taken in the GEANT Monte Carlo route. Note that a detailed memo describing the methodology of the acceptance corrections has already been written. Therefore the discussion of the acceptance determination will be kept to a minimum and the reader will be referred to the other memo for further details. Finally, they will describe the cross section formation process and how final spectra are extracted.

  1. Multiview 3D sensing and analysis for high quality point cloud reconstruction

    NASA Astrophysics Data System (ADS)

    Satnik, Andrej; Izquierdo, Ebroul; Orjesek, Richard

    2018-04-01

    Multiview 3D reconstruction techniques enable digital reconstruction of 3D objects from the real world by fusing different viewpoints of the same object into a single 3D representation. This process is by no means trivial and the acquisition of high quality point cloud representations of dynamic 3D objects is still an open problem. In this paper, an approach for high fidelity 3D point cloud generation using low cost 3D sensing hardware is presented. The proposed approach runs in an efficient low-cost hardware setting based on several Kinect v2 scanners connected to a single PC. It performs autocalibration and runs in real-time exploiting an efficient composition of several filtering methods including Radius Outlier Removal (ROR), Weighted Median filter (WM) and Weighted Inter-Frame Average filtering (WIFA). The performance of the proposed method has been demonstrated through efficient acquisition of dense 3D point clouds of moving objects.
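
    A sketch of a Radius Outlier Removal filter of the kind mentioned above is given below: points with too few neighbours within a given radius are discarded. The radius, neighbour threshold and point cloud are placeholders.

      # Radius Outlier Removal (ROR) on a synthetic point cloud using a k-d tree.
      import numpy as np
      from scipy.spatial import cKDTree

      cloud = np.random.default_rng(4).normal(size=(5000, 3))
      tree = cKDTree(cloud)

      radius, min_neighbours = 0.2, 5
      counts = np.array([len(idx) for idx in tree.query_ball_point(cloud, r=radius)])
      filtered = cloud[counts >= min_neighbours + 1]       # +1 because each point finds itself
      print(len(cloud), "->", len(filtered), "points after ROR")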

  2. A comparative study of Conroy and Monte Carlo methods applied to multiple quadratures and multiple scattering

    NASA Technical Reports Server (NTRS)

    Deepak, A.; Fluellen, A.

    1978-01-01

    An efficient numerical method of multiple quadratures, the Conroy method, is applied to the problem of computing multiple scattering contributions in the radiative transfer through realistic planetary atmospheres. A brief error analysis of the method is given and comparisons are drawn with the more familiar Monte Carlo method. Both methods are stochastic problem-solving models of a physical or mathematical process and utilize the sampling scheme for points distributed over a definite region. In the Monte Carlo scheme the sample points are distributed randomly over the integration region. In the Conroy method, the sample points are distributed systematically, such that the point distribution forms a unique, closed, symmetrical pattern which effectively fills the region of the multidimensional integration. The methods are illustrated by two simple examples: one, of multidimensional integration involving two independent variables, and the other, of computing the second order scattering contribution to the sky radiance.
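
    For comparison, a plain Monte Carlo quadrature over the unit square looks as follows; the integrand is an arbitrary example and the systematic Conroy point pattern is not reproduced here.

      # Monte Carlo estimate of a 2D integral from uniformly random sample points.
      import numpy as np

      def mc_integrate(f, n=100_000, rng=None):
          rng = rng or np.random.default_rng()
          xy = rng.random((n, 2))                          # uniform samples on the unit square
          return f(xy[:, 0], xy[:, 1]).mean()              # region area is 1, so the mean suffices

      f = lambda x, y: np.exp(-(x**2 + y**2))
      print(mc_integrate(f, rng=np.random.default_rng(0)))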

  3. Representation and display of vector field topology in fluid flow data sets

    NASA Technical Reports Server (NTRS)

    Helman, James; Hesselink, Lambertus

    1989-01-01

    The visualization of physical processes in general and of vector fields in particular is discussed. An approach to visualizing flow topology that is based on the physics and mathematics underlying the physical phenomenon is presented. It involves determining critical points in the flow where the velocity vector vanishes. The critical points, connected by principal lines or planes, determine the topology of the flow. The complexity of the data is reduced without sacrificing the quantitative nature of the data set. By reducing the original vector field to a set of critical points and their connections, a representation of the topology of a two-dimensional vector field that is much smaller than the original data set but retains with full precision the information pertinent to the flow topology is obtained. This representation can be displayed as a set of points and tangent curves or as a graph. Analysis (including algorithms), display, interaction, and implementation aspects are discussed.
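
    The first step described above, locating critical points where the velocity vector vanishes, can be sketched on a grid by flagging cells in which both components straddle zero; the analytic field below is a toy saddle flow, not data from an actual flow solver.

      # Candidate critical-point cells of a 2D vector field sampled on a grid.
      import numpy as np

      x = np.linspace(-1, 1, 100)
      y = np.linspace(-1, 1, 100)
      X, Y = np.meshgrid(x, y, indexing="ij")
      U, V = X, -Y                                         # toy saddle flow (u, v) = (x, -y)

      def straddles_zero(a):
          corners = np.stack([a[:-1, :-1], a[1:, :-1], a[:-1, 1:], a[1:, 1:]])
          return (corners.min(axis=0) < 0) & (corners.max(axis=0) > 0)

      candidates = np.argwhere(straddles_zero(U) & straddles_zero(V))
      print(len(candidates), "candidate cell(s); one near", (x[candidates[0][0]], y[candidates[0][1]]))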

  4. [The added value of information summaries supporting clinical decisions at the point-of-care].

    PubMed

    Banzi, Rita; González-Lorenzo, Marien; Kwag, Koren Hyogene; Bonovas, Stefanos; Moja, Lorenzo

    2016-11-01

    Evidence-based healthcare requires the integration of the best research evidence with clinical expertise and patients' values. International publishers are developing evidence-based information services and resources designed to overcome the difficulties in retrieving, assessing and updating medical information as well as to facilitate a rapid access to valid clinical knowledge. Point-of-care information summaries are defined as web-based medical compendia that are specifically designed to deliver pre-digested, rapidly accessible, comprehensive, and periodically updated information to health care providers. Their validity must be assessed against marketing claims that they are evidence-based. We periodically evaluate the content development processes of several international point-of-care information summaries. The number of these products has increased along with their quality. The last analysis done in 2014 identified 26 products and found that three of them (Best Practice, Dynamed and Uptodate) scored the highest across all evaluated dimensions (volume, quality of the editorial process and evidence-based methodology). Point-of-care information summaries, as stand-alone products or integrated with other systems, are gaining ground to support clinical decisions. The choice of one product over another depends both on the properties of the service and the preference of users. However, even the most innovative information system must rely on transparent and valid contents. Individuals and institutions should regularly assess the value of point-of-care summaries as their quality changes rapidly over time.

  5. The importance of topographically corrected null models for analyzing ecological point processes.

    PubMed

    McDowall, Philip; Lynch, Heather J

    2017-07-01

    Analyses of point process patterns and related techniques (e.g., MaxEnt) make use of the expected number of occurrences per unit area and second-order statistics based on the distance between occurrences. Ecologists working with point process data often assume that points exist on a two-dimensional x-y plane or within a three-dimensional volume, when in fact many observed point patterns are generated on a two-dimensional surface existing within three-dimensional space. For many surfaces, however, such as the topography of landscapes, the projection from the surface to the x-y plane preserves neither area nor distance. As such, when these point patterns are implicitly projected to and analyzed in the x-y plane, our expectations of the point pattern's statistical properties may not be met. When used in hypothesis testing, we find that the failure to account for the topography of the generating surface may bias statistical tests that incorrectly identify clustering and, furthermore, may bias coefficients in inhomogeneous point process models that incorporate slope as a covariate. We demonstrate the circumstances under which this bias is significant, and present simple methods that allow point processes to be simulated with corrections for topography. These point patterns can then be used to generate "topographically corrected" null models against which observed point processes can be compared. © 2017 by the Ecological Society of America.
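
    One simple way to build such a topographically corrected null model, sketched under the assumption that cell inclusion probability should be proportional to the true surface area factor sqrt(1 + |grad z|^2), is shown below with an invented terrain.

      # Simulate a null point pattern whose x-y density is proportional to true surface area.
      import numpy as np

      rng = np.random.default_rng(11)
      n_side = 200
      x = np.linspace(0, 1, n_side)
      y = np.linspace(0, 1, n_side)
      X, Y = np.meshgrid(x, y, indexing="ij")
      Z = 0.3 * np.sin(4 * np.pi * X) * np.cos(2 * np.pi * Y)   # toy topography

      dzdx, dzdy = np.gradient(Z, x, y)
      area_factor = np.sqrt(1.0 + dzdx**2 + dzdy**2)       # local surface-area correction
      prob = (area_factor / area_factor.sum()).ravel()

      n_points = 500
      cells = rng.choice(prob.size, size=n_points, p=prob) # cells drawn proportional to surface area
      pts = np.column_stack((X.ravel()[cells], Y.ravel()[cells]))
      print(pts[:3])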

  6. Impulse excitation scanning acoustic microscopy for local quantification of Rayleigh surface wave velocity using B-scan analysis

    NASA Astrophysics Data System (ADS)

    Cherry, M.; Dierken, J.; Boehnlein, T.; Pilchak, A.; Sathish, S.; Grandhi, R.

    2018-01-01

    A new technique for performing quantitative scanning acoustic microscopy imaging of Rayleigh surface wave (RSW) velocity was developed based on B-scan processing. In this technique, the focused acoustic beam is moved through many defocus distances over the sample and excited with an impulse excitation, and advanced algorithms based on frequency filtering and the Hilbert transform are used to post-process the B-scans to estimate the Rayleigh surface wave velocity. The new method was used to estimate the RSW velocity on an optically flat E6 glass sample; the velocity was measured to within ±2 m/s and the scanning time per point was on the order of 1.0 s, both improvements over the previous two-point defocus method. The new method was also applied to the analysis of two titanium samples, and the velocity was estimated with very low standard deviation within certain large grains on the sample. A new behavior was observed with the B-scan analysis technique, in which the amplitude of the surface wave decayed dramatically for certain crystallographic orientations. Compared with previous results, the new technique was found to be much more reliable and to offer higher contrast than previously possible with impulse excitation.
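
    The published B-scan algorithm is not reproduced in the abstract; the sketch below only illustrates two generic ingredients it mentions, Hilbert-transform envelope detection and a time-of-flight estimate of the Rayleigh velocity. It uses the textbook relation Δt = 2z(1 − cos θ_R)/c_w with sin θ_R = c_w/c_R between the specular and Rayleigh echoes of a lens defocused toward the sample by z; the couplant speed, peak-picking rule, and function names are assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.signal import hilbert, find_peaks

    C_WATER = 1480.0  # couplant sound speed in m/s (assumed value)

    def rayleigh_velocity_from_ascan(ascan, fs, defocus_z):
        """Estimate the Rayleigh wave velocity from one defocused A-scan.
        The two largest envelope peaks are taken as the specular echo (first)
        and the Rayleigh echo (second); the classic time-resolved relation
        dt = 2*z*(1 - cos(theta_R))/c_w with sin(theta_R) = c_w/c_R is then
        inverted for c_R.  A sketch only: the published B-scan method adds
        frequency filtering and uses many defocus positions."""
        envelope = np.abs(hilbert(ascan))
        peaks, _ = find_peaks(envelope, height=0.1 * envelope.max())
        if len(peaks) < 2:
            raise ValueError("need both specular and Rayleigh echoes")
        dt = (peaks[1] - peaks[0]) / fs                    # seconds between echoes
        cos_theta = 1.0 - C_WATER * dt / (2.0 * defocus_z)
        theta_r = np.arccos(np.clip(cos_theta, -1.0, 1.0))
        return C_WATER / np.sin(theta_r)
    ```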

  7. Kinetic Simulations of Type II Radio Burst Emission Processes

    NASA Astrophysics Data System (ADS)

    Ganse, U.; Spanier, F. A.; Vainio, R. O.

    2011-12-01

    The fundamental emission process of Type II radio bursts has been under discussion for many decades. While analytic deliberations point to three-wave interaction as the source of fundamental and harmonic radio emissions, sparse in-situ observational data and the high computational demands of kinetic simulations have not allowed a definite conclusion to be reached. A popular model places the radio emission in the foreshock region of a coronal mass ejection's shock front, where shock drift acceleration can create electron beam populations in the otherwise quiescent foreshock plasma. Beam-driven instabilities are then assumed to create waves, forming the starting point of three-wave interaction processes. Using our kinetic particle-in-cell code, we have studied a number of emission scenarios based on electron beam populations in a CME foreshock, with focus on wave-interaction microphysics on kinetic scales. The self-consistent, fully kinetic simulations with the physical mass ratio show fundamental and harmonic emission of transverse electromagnetic waves and allow for detailed statistical analysis of all contributing wave modes and their couplings.

  8. Complementary experimental-simulational study of surfactant micellar phase in the extraction process of metallic ions: Effects of temperature and salt concentration

    NASA Astrophysics Data System (ADS)

    Soto-Ángeles, Alan Gustavo; Rodríguez-Hidalgo, María del Rosario; Soto-Figueroa, César; Vicente, Luis

    2018-02-01

    The thermoresponsive micellar phase behaviour exhibited by Triton-X-100 micelles under the effect of temperature and the addition of salt in the extraction process of metallic ions was explored from mesoscopic and experimental points of view. In the theoretical study, we analyse the formation of Triton-X-100 micelles, the loading and stabilization of dithizone molecules, and the extraction of metallic ions inside the micellar core at room temperature; finally, a thermal analysis is presented. In the experimental study, the spectrophotometric outcomes confirm the solubility of the copper-dithizone complex in the micellar core, as well as the extraction of metallic ions from the aqueous environment via a cloud point at 332.2 K. The micellar solutions with salt present a lower absorbance value compared with the micellar solutions without salt. The decrease in the absorbance value is attributed to a change in the size of the hydrophobic region of the colloidal micelles. All transitory stages of the extraction process are discussed and analysed.

  9. Conceptual design of the CZMIL data processing system (DPS): algorithms and software for fusing lidar, hyperspectral data, and digital images

    NASA Astrophysics Data System (ADS)

    Park, Joong Yong; Tuell, Grady

    2010-04-01

    The Data Processing System (DPS) of the Coastal Zone Mapping and Imaging Lidar (CZMIL) has been designed to automatically produce a number of novel environmental products through the fusion of lidar, spectrometer, and camera data in a single software package. These new products significantly transcend use of the system as a bathymeter, and support use of CZMIL as a complete coastal and benthic mapping tool. The DPS provides a spinning-globe capability for accessing data files; automated generation of combined topographic and bathymetric point clouds; a fully integrated manual editor and data analysis tool; automated generation of orthophoto mosaics; automated generation of reflectance data cubes from the imaging spectrometer; a coupled air-ocean spectral optimization model producing images of chlorophyll and CDOM concentrations; and a fusion-based capability to produce images and classifications of the shallow-water seafloor. Adopting a multitasking approach, we expect to achieve computation of the point clouds, DEMs, and reflectance images at a 1:1 processing-to-acquisition ratio.

  10. Development and Demonstration of an Ada Test Generation System

    NASA Technical Reports Server (NTRS)

    1996-01-01

    In this project we have built a prototype system that performs Feasible Path Analysis (FPA) on Ada programs: given a description of a set of control-flow paths through a procedure and a predicate at a program point, feasible path analysis determines whether there is input data that causes execution to flow down some path in the collection, reaching the point so that the predicate is true. Feasible path analysis can be applied to program testing, program slicing, array bounds checking, and other forms of anomaly checking. FPA is central to most applications of program analysis, but because this problem is formally unsolvable, syntactic approximations are used in its place. For example, in dead-code analysis the problem is to determine whether there are any input values which cause execution to reach a specified program point. Instead, an approximation to this problem is computed: determine whether there is a control-flow path from the start of the program to the point. This syntactic approximation is efficiently computable and conservative: if there is no such path the program point is clearly unreachable, but if there is such a path, the analysis is inconclusive and the code is assumed to be live. Such conservative analysis too often yields unsatisfactory results because the approximation is too weak. As another example, consider data-flow analysis. A du-pair is a pair of program points such that the first point is a definition of a variable and the second point a use, and for which there exists a definition-free path from the definition to the use. The sharper, semantic definition of a du-pair requires that there be a feasible definition-free path from the definition to the use. A compiler using du-pairs for detecting dead variables may miss optimizations by not considering feasibility. Similarly, a program analyzer computing program slices to merge parallel versions may report conflicts where none exist. In the context of software testing, feasibility analysis plays an important role in identifying testing requirements which are infeasible. This is especially true for data-flow testing and modified condition/decision coverage. Our system makes essential use of symbolic analysis and theorem-proving technology, and we believe this work represents one of the few successful uses of a theorem prover working in a completely automatic fashion to solve a problem of practical interest. We believe this work anticipates an important trend away from purely syntactic methods for program analysis toward semantic methods based on symbolic processing and inference technology. Other results demonstrating the practical use of automatic inference are being reported in hardware verification, although there are significant differences between the hardware work and ours. What is common and important, however, is that general-purpose theorem provers are being integrated with more special-purpose decision procedures to solve problems in analysis and verification. We are pursuing commercial opportunities for this work, and will use and extend the work in other projects we are engaged in. Ultimately we would like to rework the system to analyze C, C++, or Java as a key step toward commercialization.
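
    The core semantic question behind FPA, whether the conjunction of the branch conditions along a path is satisfiable, can be illustrated with a modern SMT solver. The hypothetical two-branch paths below are not taken from the Ada system described above; they only show how a syntactically present path can be proved infeasible.

    ```python
    # Minimal illustration of feasibility checking with the Z3 SMT solver
    # (pip install z3-solver).  The Ada-analysis system described above is far
    # more elaborate; this only shows the core satisfiability question.
    from z3 import Int, Solver, sat

    x = Int("x")

    # Path 1 through a hypothetical procedure: take branch (x > 10), then (x < 5).
    path1 = [x > 10, x < 5]
    # Path 2: take branch (x > 10), then (x < 20).
    path2 = [x > 10, x < 20]

    for name, conditions in [("path1", path1), ("path2", path2)]:
        s = Solver()
        s.add(*conditions)
        feasible = s.check() == sat
        print(name, "feasible" if feasible else "infeasible")
        # path1 is syntactically present in the control-flow graph but
        # semantically infeasible; a purely syntactic analysis would still
        # treat its target point as reachable.
    ```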

  11. Model averaging in linkage analysis.

    PubMed

    Matthysse, Steven

    2006-06-05

    Methods for genetic linkage analysis are traditionally divided into "model-dependent" and "model-independent," but there may be a useful place for an intermediate class, in which a broad range of possible models is considered as a parametric family. It is possible to average over model space with an empirical Bayes prior that weights models according to their goodness of fit to epidemiologic data, such as the frequency of the disease in the population and in first-degree relatives (and correlations with other traits in the pleiotropic case). For averaging over high-dimensional spaces, Markov chain Monte Carlo (MCMC) has great appeal, but it has a near-fatal flaw: it is not possible, in most cases, to provide rigorous sufficient conditions to permit the user safely to conclude that the chain has converged. A way of overcoming the convergence problem, if not of solving it, rests on a simple application of the principle of detailed balance. If the starting point of the chain has the equilibrium distribution, so will every subsequent point. The first point is chosen according to the target distribution by rejection sampling, and subsequent points by an MCMC process that has the target distribution as its equilibrium distribution. Model averaging with an empirical Bayes prior requires rapid estimation of likelihoods at many points in parameter space. Symbolic polynomials are constructed before the random walk over parameter space begins, to make the actual likelihood computations at each step of the random walk very fast. Power analysis in an illustrative case is described. (c) 2006 Wiley-Liss, Inc.
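
    The exact-start construction described above can be sketched generically: draw the first state from the target by rejection sampling, then run a Metropolis random walk whose equilibrium distribution is the target, so that by detailed balance every subsequent state is also marginally distributed according to the target. The one-dimensional stand-in density, proposal bounds, and step size below are illustrative assumptions, not the linkage-analysis model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def log_target(theta):
        # Stand-in unnormalised log-density; the linkage application would use
        # an empirical-Bayes posterior over genetic model parameters.
        return -0.5 * (theta - 1.0) ** 2 / 0.3 ** 2

    def rejection_sample():
        """Draw one exact sample from the target using a uniform proposal on a
        bounding interval; log_target <= 0 here, so exp(log_target) is a valid
        acceptance probability."""
        while True:
            theta = rng.uniform(-2.0, 4.0)
            if np.log(rng.random()) < log_target(theta):
                return theta

    def chain(n_steps, step=0.5):
        """Metropolis random walk started at an exact draw: by detailed balance,
        every subsequent state is also marginally distributed as the target."""
        theta = rejection_sample()
        samples = [theta]
        for _ in range(n_steps):
            prop = theta + step * rng.normal()
            if np.log(rng.random()) < log_target(prop) - log_target(theta):
                theta = prop
            samples.append(theta)
        return np.array(samples)

    print(chain(1000).mean())   # close to 1.0 for this stand-in target
    ```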

  12. Doubly stochastic Poisson process models for precipitation at fine time-scales

    NASA Astrophysics Data System (ADS)

    Ramesh, Nadarajah I.; Onof, Christian; Xie, Dichao

    2012-09-01

    This paper considers a class of stochastic point process models, based on doubly stochastic Poisson processes, in the modelling of rainfall. We examine the application of this class of models, a neglected alternative to the widely-known Poisson cluster models, in the analysis of fine time-scale rainfall intensity. These models are mainly used to analyse tipping-bucket raingauge data from a single site, but an extension to multiple sites is illustrated, which reveals the potential of this class of models to study the temporal and spatial variability of precipitation at fine time-scales.
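
    A minimal example of a doubly stochastic Poisson process is the two-state Markov-modulated Poisson process sketched below, in which a hidden "dry"/"wet" chain modulates the arrival rate. It is a generic Cox-process simulator under assumed parameter values, not the model fitted in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_mmpp(t_end, rates=(0.2, 5.0), switch=(0.05, 0.5)):
        """Simulate a two-state Markov-modulated Poisson process: a hidden
        Markov chain alternates between a 'dry' and a 'wet' state (exponential
        holding times with the rates in `switch`), and rain-event arrivals form
        a Poisson process whose intensity rates[state] depends on the state.
        Generic Cox-process sketch with invented parameters."""
        t, state, events = 0.0, 0, []
        while t < t_end:
            hold = rng.exponential(1.0 / switch[state])    # sojourn in this state
            t_next = min(t + hold, t_end)
            # Arrivals within the sojourn: homogeneous Poisson at rates[state].
            n = rng.poisson(rates[state] * (t_next - t))
            events.extend(np.sort(rng.uniform(t, t_next, n)))
            t, state = t_next, 1 - state
        return np.array(events)

    arrivals = simulate_mmpp(t_end=240.0)   # e.g. 240 hours of record
    print(len(arrivals), "events")
    ```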

  13. PAT-tools for process control in pharmaceutical film coating applications.

    PubMed

    Knop, Klaus; Kleinebudde, Peter

    2013-12-05

    Recent developments of analytical techniques to monitor the coating process of pharmaceutical solid dosage forms such as pellets and tablets are described. The progress from off- or at-line measurements to on- or in-line applications is shown for the spectroscopic methods near-infrared (NIR) and Raman spectroscopy, as well as for terahertz pulsed imaging (TPI) and image analysis. The common goal of all these methods is to control, or at least to monitor, the coating process and/or to estimate the coating end point through timely measurements. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Intelligent control system for continuous technological process of alkylation

    NASA Astrophysics Data System (ADS)

    Gebel, E. S.; Hakimov, R. A.

    2018-01-01

    The relevance of intelligent control for complex dynamic objects and processes is shown in this paper. A virtual analyzer model based on a neural network is proposed. Comparative analysis of mathematical models implemented in MATLAB showed that the most effective, from the point of view of reproducibility of the result, is the model with seven neurons in the hidden layer, trained using the scaled conjugate gradient method. Comparison of the laboratory analysis data with the theoretical model showed that the root-mean-square error does not exceed 3.5 and that the calculated correlation coefficient corresponds to a "strong" relationship between the values.
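
    A minimal stand-in for such a virtual analyzer (soft sensor) is a single-hidden-layer network with seven neurons regressing a laboratory-measured property on process measurements. The sketch below uses scikit-learn on synthetic data; scikit-learn offers no scaled conjugate gradient optimiser, so L-BFGS is substituted, and all data and names are hypothetical.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    # Hypothetical data: rows are process snapshots (temperatures, flows, ratios),
    # y is the laboratory-analysed product property the soft sensor must predict.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(500, 6))
    y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=500)

    model = make_pipeline(
        StandardScaler(),
        # Seven hidden neurons as in the paper; L-BFGS stands in for the
        # scaled conjugate gradient solver, which scikit-learn does not provide.
        MLPRegressor(hidden_layer_sizes=(7,), solver="lbfgs", max_iter=2000,
                     random_state=0),
    )
    model.fit(X, y)
    print("training RMSE:", np.sqrt(np.mean((model.predict(X) - y) ** 2)))
    ```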

  15. Application of Failure Mode and Effect Analysis (FMEA) and cause and effect analysis in conjunction with ISO 22000 to a snails (Helix aspersa) processing plant; A case study.

    PubMed

    Arvanitoyannis, Ioannis S; Varzakas, Theodoros H

    2009-08-01

    Failure Mode and Effect Analysis (FMEA) has been applied to the risk assessment of snail processing. A tentative application of FMEA to the snail industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the failure modes occurring in a food chain system (a snail processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical Control Points were identified and incorporated in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work a comparison of the ISO 22000 analysis with HACCP is carried out for snail processing and packaging. However, the main emphasis was placed on the quantification of risk assessment by determining the Risk Priority Number (RPN) per identified processing hazard. Sterilization of tins, bioaccumulation of heavy metals, packaging of shells, and poisonous mushrooms were the processes identified with the highest RPNs (280, 240, 147, and 144, respectively), and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause-and-effect, or tree) diagram led to converging results, thus corroborating the validity of the conclusions derived from the risk assessment and FMEA. Therefore, the incorporation of FMEA within the ISO 22000 system of a snail processing plant is considered imperative.
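
    In FMEA the RPN is conventionally the product of occurrence, severity, and detectability ratings, and hazards above an agreed limit (130 in this study) trigger corrective action. The ratings in the sketch below are illustrative reconstructions chosen only so that the products match the RPNs quoted in the abstract; the study's actual ratings are not given there.

    ```python
    # FMEA risk prioritisation: RPN = occurrence x severity x detectability.
    # The ratings are illustrative stand-ins chosen so that the products match
    # the RPN values reported in the abstract.
    RPN_LIMIT = 130

    hazards = {
        "sterilization of tins":           (8, 7, 5),   # -> 280
        "bioaccumulation of heavy metals": (6, 8, 5),   # -> 240
        "packaging of shells":             (7, 7, 3),   # -> 147
        "poisonous mushrooms":             (6, 8, 3),   # -> 144
    }

    for hazard, (occurrence, severity, detectability) in hazards.items():
        rpn = occurrence * severity * detectability
        action = "corrective action required" if rpn > RPN_LIMIT else "acceptable"
        print(f"{hazard:35s} RPN={rpn:3d}  {action}")
    ```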

  16. Practical, transparent prospective risk analysis for the clinical laboratory.

    PubMed

    Janssens, Pim Mw

    2014-11-01

    Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effect Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and the chances of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of the application of PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type in the collective process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis', and the failure type most frequently rated as suboptimal was 'identification error'. The PRA approach designed here is a useful semi-objective tool for identifying process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables make it easy to perform, practical and transparent. © The Author(s) 2014.

  17. [Multi-channel in vivo recording techniques: signal processing of action potentials and local field potentials].

    PubMed

    Xu, Jia-Min; Wang, Ce-Qun; Lin, Long-Nian

    2014-06-25

    Multi-channel in vivo recording techniques are used to record ensemble neuronal activity and local field potentials (LFP) simultaneously. One of the key points of the technique is how to process these two sets of recorded neural signals properly so that data accuracy can be assured. We introduce data processing approaches for action potentials and LFP based on the original data collected through a multi-channel recording system. Action potentials are high-frequency signals, hence a high sampling rate of 40 kHz is normally chosen for recording them. Based on the waveforms of extracellularly recorded action potentials, tetrode technology combined with principal component analysis can be used to discriminate spiking signals from spatially distinct neurons, in order to obtain accurate single-neuron spiking activity. LFPs are low-frequency signals (below 300 Hz), hence a sampling rate of 1 kHz is used for them. Digital filtering is required for LFP analysis to isolate different frequency oscillations, including the theta oscillation (4-12 Hz), which is dominant during active exploration and rapid-eye-movement (REM) sleep; the gamma oscillation (30-80 Hz), which accompanies theta during cognitive processing; and the high-frequency ripple oscillation (100-250 Hz), seen during awake immobility and slow-wave sleep (SWS) in the rodent hippocampus. For the obtained signals, common post-processing methods include inter-spike interval analysis, spike auto-correlation analysis, spike cross-correlation analysis, power spectral density analysis, and spectrogram analysis.
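
    The two routine steps described, zero-phase band-pass filtering of the 1 kHz LFP into theta/gamma/ripple bands and principal-component reduction of tetrode spike waveforms before clustering, can be sketched as follows. The filter order, component count, and synthetic inputs are assumptions; this is generic practice, not the authors' exact pipeline.

    ```python
    import numpy as np
    from scipy.signal import butter, sosfiltfilt
    from sklearn.decomposition import PCA

    LFP_FS = 1000.0  # Hz, LFP sampling rate from the text

    def bandpass(lfp, low, high, fs=LFP_FS, order=4):
        """Zero-phase Butterworth band-pass, e.g. theta (4-12 Hz),
        gamma (30-80 Hz) or ripples (100-250 Hz)."""
        sos = butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
        return sosfiltfilt(sos, lfp)

    def spike_features(waveforms, n_components=3):
        """Project spike waveforms (n_spikes x n_samples, tetrode channels
        concatenated) onto their first principal components; clustering on
        these scores is what separates single units in spike sorting."""
        return PCA(n_components=n_components).fit_transform(waveforms)

    # Example with synthetic stand-ins for recorded signals:
    lfp = np.random.randn(10 * int(LFP_FS))        # 10 s of fake LFP
    theta = bandpass(lfp, 4, 12)
    scores = spike_features(np.random.randn(200, 128))
    ```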

  18. The use of image analysis in evaluation of the fibers orientation in Wood-polymer composites (WPC)

    NASA Astrophysics Data System (ADS)

    Bednarz, Arkadiusz; Frącz, Wiesław; Janowski, Grzegorz

    2016-12-01

    In this paper a novel digital analysis of fiber orientation based on a five-step algorithm is presented. In the study, a dumbbell-shaped molded piece prepared from a wood-polymer composite was used. The injection molding process was examined both experimentally and numerically. Based on the developed mathematical algorithm, good agreement of the fiber orientation in different areas of the molded piece was obtained. The main aim of this work was the analysis of fiber orientation in wood-polymer composites; an additional goal was to compare the results of the numerical analysis with those obtained experimentally. The results of this research are relevant from both the scientific and the practical point of view. In future work the algorithm could be used to find optimal parameters for the injection molding process.
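
    The five steps of the algorithm are not spelled out in the abstract; a common generic way to estimate local fiber orientation from a grayscale image is the image structure tensor, sketched below under assumed smoothing parameters. It stands in for, and is not claimed to be, the authors' method.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def local_orientation(image, sigma=3.0):
        """Estimate the local fibre orientation (radians in [-pi/2, pi/2)) from
        the smoothed image structure tensor.  A generic texture-analysis
        technique, not necessarily the paper's five-step algorithm."""
        gy, gx = np.gradient(image.astype(float))
        jxx = gaussian_filter(gx * gx, sigma)
        jyy = gaussian_filter(gy * gy, sigma)
        jxy = gaussian_filter(gx * gy, sigma)
        # Direction of the strongest intensity change (major eigenvector)...
        theta_gradient = 0.5 * np.arctan2(2.0 * jxy, jxx - jyy)
        # ...and the fibres run perpendicular to it.
        return (theta_gradient + np.pi / 2.0) % np.pi - np.pi / 2.0
    ```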

  19. Integrated Assessment and Improvement of the Quality Assurance System for the Cosworth Casting Process

    NASA Astrophysics Data System (ADS)

    Yousif, Dilon

    The purpose of this study was to improve the Quality Assurance (QA) system at the Nemak Windsor Aluminum Plant (WAP). The project used the Six Sigma method based on Define, Measure, Analyze, Improve, and Control (DMAIC). Analysis of the in-process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition. This software features the Silicon Equivalency (SiEQ) method developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome traditional sampling limitations. The Thermal Analysis "Filters" software was developed for cooling curve analysis of the 3XX Al alloys using IRC techniques. The impact of low-melting-point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).
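
    As an illustration of the control-limit recalculation step, the sketch below applies conventional Shewhart-style individual-value limits (mean ± 3σ) to hypothetical stable-condition composition measurements; the study does not state which statistical rule was actually applied, and the data here are invented for illustration.

    ```python
    import numpy as np

    # Hypothetical stable-condition composition measurements (wt.%) for one
    # element of the W319 alloy; not the study's actual data.
    si_wt_pct = np.array([7.42, 7.38, 7.45, 7.40, 7.44, 7.39, 7.41, 7.43])

    mean = si_wt_pct.mean()
    sigma = si_wt_pct.std(ddof=1)
    # Conventional Shewhart-style individual-value limits (mean +/- 3 sigma);
    # the paper may have used a different rule.
    ucl, lcl = mean + 3 * sigma, mean - 3 * sigma
    print(f"centre line {mean:.3f}, LCL {lcl:.3f}, UCL {ucl:.3f}")
    ```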

  20. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface of medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. The means of the estimated failure rates provided quantitative data for the fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
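
    The fault-tree aggregation implied by the method can be sketched generically: mean elicited failure-rate estimates per step are combined through an OR gate, giving the probability that a task fails at some step. The step names and numbers below are hypothetical stand-ins, not the study's data.

    ```python
    import numpy as np

    def or_gate(probabilities):
        """Probability that at least one of several independent basic events
        occurs -- the usual OR-gate combination in a fault tree."""
        p = np.asarray(probabilities, dtype=float)
        return 1.0 - np.prod(1.0 - p)

    # Hypothetical failure-rate estimates elicited from several nurses for the
    # steps of one medication-processing task (invented for illustration).
    step_estimates = {
        "select patient record":  [0.02, 0.01, 0.03],
        "enter medication order": [0.05, 0.04],
        "verify dose":            [0.01, 0.02],
    }

    step_means = {step: float(np.mean(v)) for step, v in step_estimates.items()}
    print("per-step mean failure estimates:", step_means)
    print("probability the task fails somewhere:",
          round(or_gate(list(step_means.values())), 4))
    ```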
