Sample records for tool point frf

  1. Swept sine testing of rotor-bearing system for damping estimation

    NASA Astrophysics Data System (ADS)

    Chandra, N. Harish; Sekhar, A. S.

    2014-01-01

    Many types of rotating components commonly operate above the first or second critical speed and are subjected to frequent run-ups and shutdowns. The present study focuses on developing the frequency response function (FRF) of rotor-bearing systems for damping estimation from swept-sine excitation. The principle of active vibration control states that, with an increase in angular acceleration, the amplitude of vibration due to unbalance reduces and the FRF envelope shifts towards the right (i.e. towards higher frequencies). The FRF estimated by tracking filters or Co-Quad analyzers has been shown to carry an error in the FRF estimate. Using the Fast Fourier Transform (FFT) algorithm and stationary wavelet transform (SWT) decomposition, this FRF distortion can be reduced. For theoretical clarity, the shifting of the FRF envelope is incorporated into conventional FRF expressions, and validation is performed against the FRF estimated using the Fourier transform approach. The half-power bandwidth method is employed to extract damping ratios from the FRF estimates. In deriving the half-power points for both types of responses (acceleration and displacement), the damping ratio (ζ) is estimated with different approximations: the classical definition (neglecting damping-ratio terms of order higher than two), a third-order approximation (neglecting terms of order higher than four) and an exact expression (no assumptions on the damping ratio). The use of the stationary wavelet transform to denoise the noise-corrupted FRF data is explained. Finally, experiments are performed on a test rotor excited with different sweep rates to estimate the damping ratio.
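
    For context, a minimal sketch of the classical half-power bandwidth estimate referred to above, assuming a displacement FRF sampled densely around a single, well-separated resonance; the third-order and exact variants discussed in the abstract are not reproduced here.

      import numpy as np

      def half_power_damping(freq, frf_mag):
          # Classical half-power bandwidth estimate of the damping ratio:
          # zeta ~ (f2 - f1) / (2 * fn), where f1 and f2 are the frequencies at
          # which |FRF| drops to |FRF|peak / sqrt(2) on either side of the peak fn.
          i_pk = np.argmax(frf_mag)
          fn = freq[i_pk]
          level = frf_mag[i_pk] / np.sqrt(2.0)

          below = np.where(frf_mag[:i_pk] <= level)[0]   # last crossing before the peak
          above = np.where(frf_mag[i_pk:] <= level)[0]   # first crossing after the peak
          if below.size == 0 or above.size == 0:
              raise ValueError("half-power points are not bracketed by the data")

          lo, hi = below[-1], i_pk + above[0]
          f1 = np.interp(level, frf_mag[lo:i_pk + 1], freq[lo:i_pk + 1])
          f2 = np.interp(level, frf_mag[i_pk:hi + 1][::-1], freq[i_pk:hi + 1][::-1])
          return (f2 - f1) / (2.0 * fn)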

  2. Dominant modal decomposition method

    NASA Astrophysics Data System (ADS)

    Dombovari, Zoltan

    2017-03-01

    The paper deals with the automatic decomposition of experimental frequency response functions (FRFs) of mechanical structures. The decomposition of FRFs is based on the Green function representation of free vibratory systems. After determination of the impulse dynamic subspace, the system matrix is formulated and the poles are calculated directly. By means of the corresponding eigenvectors, the contribution of each element of the impulse dynamic subspace is determined and a sufficient decomposition of the corresponding FRF is carried out. With the presented dominant modal decomposition (DMD) method, the mode shapes, modal participation vectors and modal scaling factors are identified using the decomposed FRFs. An analytical example is presented along with experimental case studies taken from the machine tool industry.
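
    As background, the pole/residue (modal) superposition that such decompositions ultimately recover can be sketched as below; this is a generic reconstruction of an FRF from identified poles and residues, not the DMD algorithm of the paper.

      import numpy as np

      def frf_from_modes(omega, poles, residues):
          # Modal (pole/residue) superposition of a receptance FRF:
          #   H(j*w) = sum_k [ R_k / (j*w - lam_k) + conj(R_k) / (j*w - conj(lam_k)) ]
          # poles    : lam_k = -zeta_k*wn_k + 1j*wd_k  (one per underdamped mode)
          # residues : complex residues R_k for the same modes
          s = 1j * np.asarray(omega, dtype=float)
          H = np.zeros_like(s, dtype=complex)
          for lam, R in zip(poles, residues):
              H += R / (s - lam) + np.conj(R) / (s - np.conj(lam))
          return H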

  3. Chatter reduction in boring process by using piezoelectric shunt damping with experimental verification

    NASA Astrophysics Data System (ADS)

    Yigit, Ufuk; Cigeroglu, Ender; Budak, Erhan

    2017-09-01

    Chatter is a self-excited type of vibration that develops during machining due to process-structure dynamic interactions resulting in modulated chip thickness. Chatter is an important problem as it results in poor surface quality and reduced productivity and tool life. The stability of a cutting process is strongly influenced by the frequency response function (FRF) at the cutting point. In this study, the effect of piezoelectric shunt damping on chatter vibrations in a boring process is studied. In the piezoelectric shunt damping method, an electrical impedance is connected to a piezoelectric transducer bonded on the cutting tool. The electrical impedance of the circuit consisting of the piezoceramic transducer and the passive shunt is tuned to the desired natural frequency of the cutting tool in order to maximize damping. The optimum damping is achieved in analytical and finite element models (FEM) by using a genetic algorithm focusing on the real part of the tool point FRF rather than its amplitude. Later, a practical boring bar is considered, for which the optimum circuit parameters are obtained by the FEM. Afterwards, the effect of the optimized piezoelectric shunt damping on the dynamic rigidity and absolute stability limit of the cutting process is investigated experimentally by modal analysis and cutting tests. It is shown both theoretically and experimentally that the application of piezoelectric shunt damping results in a significant increase in the absolute stability limit in boring operations.
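
    The focus on the real part of the tool point FRF follows from the classical single-degree-of-freedom regenerative chatter relation, sketched below under the usual textbook assumptions (orthogonal cutting, one dominant mode); the function and the coefficient name Kf are illustrative, not taken from the paper.

      import numpy as np

      def absolute_stability_limit(tool_point_frf, Kf):
          # Classical relation for the absolute (minimum) stability limit of
          # regenerative chatter: a_lim = -1 / (2 * Kf * min(Re(G))); making
          # min(Re(G)) less negative therefore raises the stability limit.
          # tool_point_frf : complex G(j*w) sampled over the frequency range of interest
          # Kf             : specific cutting force coefficient (N/m^2)
          re_min = np.min(np.real(tool_point_frf))
          if re_min >= 0.0:
              raise ValueError("Re(G) must become negative for regenerative chatter")
          return -1.0 / (2.0 * Kf * re_min)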

  4. A new frequency matching technique for FRF-based model updating

    NASA Astrophysics Data System (ADS)

    Yang, Xiuming; Guo, Xinglin; Ouyang, Huajiang; Li, Dongsheng

    2017-05-01

    Frequency Response Function (FRF) residues have been widely used to update finite element models. As raw measurement information they are abundant and free of modal extraction errors. However, like other sensitivity-based methods, an FRF-based identification method must also contend with ill-conditioning, which is even more serious here since the sensitivity of the FRF in the vicinity of a resonance is much greater than elsewhere. Furthermore, for a given measured frequency, directly using the theoretical FRF at that frequency may lead to a huge difference between the theoretical FRF and the corresponding experimental FRF, which ultimately amplifies the effects of measurement errors and damping. Hence, correct selection of the frequency at which the theoretical FRF is evaluated in every iteration of the sensitivity-based approach is an effective way to improve the robustness of an FRF-based algorithm. A primary tool for such frequency selection based on the correlation of FRFs is the Frequency Domain Assurance Criterion. This paper presents a new frequency selection method which directly finds the frequency that minimizes the difference in order of magnitude between the theoretical and experimental FRFs. A simulated truss structure is used to compare the performance of the different frequency selection methods. For realism, it is assumed that not all degrees of freedom (DoFs) are available for measurement. The minimum number of DoFs required by each approach to correctly update the analytical model is used as the criterion for comparison.
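
    A minimal sketch of the Frequency Domain Assurance Criterion (FDAC) mentioned above, assuming the experimental and analytical FRFs are available as complex column vectors over the measured DoFs; the paper's own selection rule (matching the order of magnitude of the FRFs) is not reproduced here.

      import numpy as np

      def fdac(h_exp, h_ana):
          # Frequency Domain Assurance Criterion between an experimental FRF vector
          # at a measured frequency and an analytical FRF vector at a candidate
          # frequency; returns a value in [0, 1], where 1 means perfectly
          # correlated FRF shapes.
          num = np.abs(np.vdot(h_exp, h_ana)) ** 2          # vdot conjugates h_exp
          den = np.real(np.vdot(h_exp, h_exp)) * np.real(np.vdot(h_ana, h_ana))
          return float(num / den)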

  5. Fitting Flux Ropes to a Global MHD Solution: A Comparison of Techniques. Appendix 1

    NASA Technical Reports Server (NTRS)

    Riley, Pete; Linker, J. A.; Lionello, R.; Mikic, Z.; Odstrcil, D.; Hidalgo, M. A.; Cid, C.; Hu, Q.; Lepping, R. P.; Lynch, B. J.

    2004-01-01

    Flux rope fitting (FRF) techniques are an invaluable tool for extracting information about the properties of a subclass of CMEs in the solar wind. However, it has proven difficult to assess their accuracy since the underlying global structure of the CME cannot be independently determined from the data. In contrast, large-scale MHD simulations of CME evolution can provide both a global view as well as localized time series at specific points in space. In this study we apply 5 different fitting techniques to 2 hypothetical time series derived from MHD simulation results. Independent teams performed the analysis of the events in "blind tests", for which no information, other than the time series, was provided. From the results, we infer the following: (1) Accuracy decreases markedly with increasingly glancing encounters; (2) Correct identification of the boundaries of the flux rope can be a significant limiter; and (3) Results from techniques that infer global morphology must be viewed with caution. In spite of these limitations, FRF techniques remain a useful tool for describing in situ observations of flux rope CMEs.

  6. A new time-independent formulation of fractional release

    NASA Astrophysics Data System (ADS)

    Ostermöller, Jennifer; Bönisch, Harald; Jöckel, Patrick; Engel, Andreas

    2017-03-01

    The fractional release factor (FRF) gives information on the amount of a halocarbon that is released at some point into the stratosphere from its source form to the inorganic form, which can harm the ozone layer through catalytic reactions. The quantity is of major importance because it directly affects the calculation of the ozone depletion potential (ODP). In this context time-independent values are needed which, in particular, should be independent of the trends in the tropospheric mixing ratios (tropospheric trends) of the respective halogenated trace gases. For a given atmospheric situation, such FRF values would represent a molecular property. We analysed the temporal evolution of FRF from ECHAM/MESSy Atmospheric Chemistry (EMAC) model simulations for several halocarbons and nitrous oxide between 1965 and 2011 on different mean age levels and found that the widely used formulation of FRF yields highly time-dependent values. We show that this is caused by the way that the tropospheric trend is handled in the widely used calculation method of FRF. Taking into account chemical loss in the calculation of stratospheric mixing ratios reduces the time dependence in FRFs. Therefore we implemented a loss term in the formulation of the FRF and applied the parameterization of a mean arrival time to our data set. We find that the time dependence in the FRF can almost be compensated for by applying a new trend correction in the calculation of the FRF. We suggest that this new method should be used to calculate time-independent FRFs, which can then be used e.g. for the calculation of ODP.
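
    For reference, the widely used definition the abstract starts from can be written as below; this is only the conventional formulation, not the trend-corrected version proposed in the paper.

      def fractional_release(chi_entry, chi_strat):
          # Widely used definition of the fractional release factor:
          #   FRF = (chi_entry - chi_strat) / chi_entry
          # chi_entry : mixing ratio the air mass had when it entered the stratosphere
          # chi_strat : mixing ratio observed in the stratospheric air mass
          # i.e. the fraction of the source gas already converted to inorganic form.
          return (chi_entry - chi_strat) / chi_entry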

  7. [The financial impact of maintenance treatment in heroin addictive behavior: the case of Subutex].

    PubMed

    Kopp, P; Rumeau-Pichon, C; Le Pen, C

    2000-06-01

    The development of maintenance treatment for subjects with addictive behavior is an important public health issue. As such, the social effectiveness of maintenance products must be examined from an economic and social point of view. This paper aims at presenting the financial costs involved in the use of Subutex, a product marketed since 1996. A complete typology of the costs related to drug addiction and its consequences was set up. Some of these costs were estimated on the basis of data drawn from the literature. The cost of Subutex use for maintenance treatment was assessed and compared with the financial stakes, including the potential reduction of the economic and social cost of drug addiction. The monthly treatment cost of Subutex was 1,252 FrF per drug abuser on maintenance treatment. By extrapolation, for a population of 40,000 drug abusers, the direct medical cost of Subutex during a course of maintenance treatment with general practitioner follow-up was estimated at 600 million FrF. US data sources were applied to France to assess the cost of illnesses attributable to drug addiction; this cost reached 4.8 billion FrF. The cost of delinquency associated with drug addiction, which mostly concerns money laundered to purchase substances, was an estimated 6.4 billion FrF. Finally, the cost of public anti-drug abuse programs was nearly 4.7 billion FrF. Thus, the direct cost of the consequences of drug addiction reached 15.6 billion FrF. This cost should be compared with the annual cost of Subutex for public organizations, which was an estimated 600 million FrF. The "profit" threshold of maintenance treatment with Subutex in terms of direct costs is therefore very low: a decrease of only 4% in the costs associated with drug addiction would be enough to balance the budget for the community. Our analysis does not take into account absolutely all the public health and safety aspects involved in the use of Subutex. It does however provide a useful assessment of the financial aspects of the question and a justification for this therapeutic strategy from a budgetary point of view.

  8. Effects of modal truncation and condensation methods on the Frequency Response Function of a stage reducer connected by rigid coupling to a planetary gear system

    NASA Astrophysics Data System (ADS)

    Bouslema, Marwa; Frikha, Ahmed; Abdennadhar, Moez; Fakhfakh, Tahar; Nasri, Rachid; Haddar, Mohamed

    2017-12-01

    The present paper applies a substructuring methodology, based on the Frequency Response Function (FRF) simulation technique, to analyze the vibration of a stage reducer connected by a rigid coupling to a planetary gear system. The computation of the vibration response was achieved using the FRF-based substructuring method. First, the two subsystems were analyzed separately and their FRFs were obtained. Then the coupled model was analyzed indirectly using the substructuring technique. A comparison between the full-system response and the coupled-model response obtained with FRF substructuring was carried out to validate the coupling method. Furthermore, a parametric study of the effect of the shaft coupling stiffness on the FRF was discussed, and the effects of modal truncation and condensation methods on the FRFs of the subsystems were analyzed.

  9. Frequency response function (FRF) based updating of a laser spot welded structure

    NASA Astrophysics Data System (ADS)

    Zin, M. S. Mohd; Rani, M. N. Abdul; Yunus, M. A.; Sani, M. S. M.; Wan Iskandar Mirza, W. I. I.; Mat Isa, A. A.

    2018-04-01

    The objective of this paper is to present frequency response function (FRF) based updating as a method for matching the finite element (FE) model of a laser spot welded structure with a physical test structure. The FE model of the welded structure was developed using CQUAD4 and CWELD element connectors, and NASTRAN was used to calculate the natural frequencies, mode shapes and FRFs. Minimization of the discrepancies between the finite element and experimental FRFs was carried out using the numerical optimization capability of NASTRAN SOL 200. The experimental work was performed under free-free boundary conditions using LMS SCADAS. A vast improvement in the finite element FRF was achieved using the frequency response function (FRF) based updating with the two different objective functions proposed.

  10. Accurate Determination of the Frequency Response Function of Submerged and Confined Structures by Using PZT-Patches†.

    PubMed

    Presas, Alexandre; Valentin, David; Egusquiza, Eduard; Valero, Carme; Egusquiza, Mònica; Bossio, Matias

    2017-03-22

    To accurately determine the dynamic response of a structure is of considerable interest in many engineering applications. In particular, it is of paramount importance to determine the Frequency Response Function (FRF) of structures subjected to dynamic loads in order to avoid resonance and fatigue problems that can drastically reduce their useful life. One challenging case is the experimental determination of the FRF of submerged and confined structures, such as hydraulic turbines, which are greatly affected by dynamic problems, as reported in many cases in the past. The utilization of classical and calibrated exciters such as instrumented hammers or shakers to determine the FRF of such structures can be very complex due to the confinement of the structure and because their use can disturb the boundary conditions, affecting the experimental results. For such cases, Piezoelectric Patches (PZTs), which are very light, thin and small, could be a very good option. Nevertheless, the main drawback of these exciters is that their calibration as dynamic force transducers (voltage/force relationship) has not been successfully obtained in the past. Therefore, in this paper, a method to accurately determine the FRF of submerged and confined structures by using PZTs is developed and validated. The method consists of experimentally determining some characteristic parameters that define the FRF, with an uncalibrated PZT exciting the structure. These experimentally determined parameters are then introduced in a validated numerical model of the tested structure. In this way, the FRF of the structure can be estimated with good accuracy. With respect to previous studies, where only the natural frequencies and mode shapes were considered, this paper discusses and experimentally demonstrates the excitation characteristics best suited to obtaining the damping ratios as well, and proposes a procedure to fully determine the FRF. The method proposed here has been validated for the structure vibrating in air by comparing the FRF experimentally obtained with a calibrated exciter (impact hammer) and the FRF obtained with the described method. Finally, the same methodology has been applied to the structure submerged and close to a rigid wall, where it is extremely important not to modify the boundary conditions for an accurate determination of the FRF. As experimentally shown in this paper, in such cases the use of PZTs combined with the proposed methodology gives much more accurate estimations of the FRF than other calibrated exciters typically used for the same purpose. Therefore, the validated methodology proposed in this paper can be used to obtain the FRF of a generic submerged and confined structure without a previous calibration of the PZT.

  11. Accurate Determination of the Frequency Response Function of Submerged and Confined Structures by Using PZT-Patches †

    PubMed Central

    Presas, Alexandre; Valentin, David; Egusquiza, Eduard; Valero, Carme; Egusquiza, Mònica; Bossio, Matias

    2017-01-01

    To accurately determine the dynamic response of a structure is of considerable interest in many engineering applications. In particular, it is of paramount importance to determine the Frequency Response Function (FRF) of structures subjected to dynamic loads in order to avoid resonance and fatigue problems that can drastically reduce their useful life. One challenging case is the experimental determination of the FRF of submerged and confined structures, such as hydraulic turbines, which are greatly affected by dynamic problems, as reported in many cases in the past. The utilization of classical and calibrated exciters such as instrumented hammers or shakers to determine the FRF of such structures can be very complex due to the confinement of the structure and because their use can disturb the boundary conditions, affecting the experimental results. For such cases, Piezoelectric Patches (PZTs), which are very light, thin and small, could be a very good option. Nevertheless, the main drawback of these exciters is that their calibration as dynamic force transducers (voltage/force relationship) has not been successfully obtained in the past. Therefore, in this paper, a method to accurately determine the FRF of submerged and confined structures by using PZTs is developed and validated. The method consists of experimentally determining some characteristic parameters that define the FRF, with an uncalibrated PZT exciting the structure. These experimentally determined parameters are then introduced in a validated numerical model of the tested structure. In this way, the FRF of the structure can be estimated with good accuracy. With respect to previous studies, where only the natural frequencies and mode shapes were considered, this paper discusses and experimentally demonstrates the excitation characteristics best suited to obtaining the damping ratios as well, and proposes a procedure to fully determine the FRF. The method proposed here has been validated for the structure vibrating in air by comparing the FRF experimentally obtained with a calibrated exciter (impact hammer) and the FRF obtained with the described method. Finally, the same methodology has been applied to the structure submerged and close to a rigid wall, where it is extremely important not to modify the boundary conditions for an accurate determination of the FRF. As experimentally shown in this paper, in such cases the use of PZTs combined with the proposed methodology gives much more accurate estimations of the FRF than other calibrated exciters typically used for the same purpose. Therefore, the validated methodology proposed in this paper can be used to obtain the FRF of a generic submerged and confined structure without a previous calibration of the PZT. PMID:28327501

  12. Innovative FRF measurement technique for frequency based substructuring method

    NASA Astrophysics Data System (ADS)

    Mirza, W. I. I. Wan Iskandar; Rani, M. N. Abdul; Ayub, M. A.; Yunus, M. A.; Omar, R.; Mohd Zin, M. S.

    2018-04-01

    In this paper, frequency based substructuring (FBS) is used in an attempt to predict the dynamic behaviour of an assembled structure. The assembled structure, which consists of two beam substructures, namely substructure A (finite element model) and substructure B (experimental model), was tested. The FE model of substructure A was constructed by using 3D elements, and its Frequency Response Functions (FRFs) were derived via an FRF synthesis method. A specially customised bolt was used to allow the attachment of sensors and excitation at the interfaces of substructure B, and the FRFs were measured by using an impact testing method. Substructures A and B were then coupled by using the FBS method for the prediction of the FRFs. The coupled FRFs obtained were validated against their measured FRF counterparts. This work revealed that implementing a specially customised bolt during the measurement of the FRFs at the interface led to an improvement in the FBS-predicted results.
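
    For orientation, the Lagrange-multiplier form of FBS coupling commonly used for such predictions is sketched below, per frequency line; this is the generic textbook relation with hypothetical argument names, not the specific implementation of the paper (the customised-bolt interface treatment, for instance, is not modelled).

      import numpy as np
      from scipy.linalg import block_diag

      def lm_fbs_couple(Y_substructures, B):
          # LM-FBS coupling at one frequency line:
          #   Y_coupled = Y - Y B^T (B Y B^T)^-1 B Y,   Y = blockdiag(Y_A, Y_B, ...)
          # Y_substructures : list of complex FRF matrices of the uncoupled substructures
          # B               : signed Boolean compatibility matrix over the interface DoFs
          Y = block_diag(*Y_substructures).astype(complex)
          interface = B @ Y @ B.T
          return Y - Y @ B.T @ np.linalg.solve(interface, B @ Y)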

  13. Alterations in lipids & lipid peroxidation in rats fed with flavonoid rich fraction of banana (Musa paradisiaca) from high background radiation area.

    PubMed

    Krishnan, Kripa; Vijayalakshmi, N R

    2005-12-01

    A group of villages in the Kollam district of Kerala, in the southern part of India, is exposed to a higher dose of natural radiation than the global average. Yet no adverse health effects have been found in humans, animals or plants in these areas. The present study was carried out to understand whether radiation affects the quantity and quality of flavonoids in plants grown in this area of high radiation, and to assess the effect of feeding the flavonoid rich fraction (FRF) of two varieties of banana to rats on biochemical parameters such as lipids, lipid peroxides and antioxidant enzyme levels. A total of 42 albino rats were equally divided into 7 groups. Rats fed the laboratory diet alone formed group I (normal control). Groups II and V received the flavonoid rich fraction (FRF) from the fruits of the two varieties of Musa paradisiaca, Palayamkodan and Rasakadali respectively, from a normal background radiation area (Veli) and were treated as controls. Rats of groups III and IV received FRF of Palayamkodan from the high background radiation areas (HBRAs) Neendakara and Karunagappally respectively, while groups VI and VII received FRF of Rasakadali from the HBRAs. At the end of the experimental period of 45 days, lipids, lipid peroxides and antioxidant enzymes from liver, heart and kidney were analyzed. The FRF of the Palayamkodan and Rasakadali varieties showed significant hypolipidaemic and antioxidant activities, but these activities were lower in plants grown in the HBRAs, particularly in the Karunagappally area. Of the two, the Palayamkodan variety was more effective in reducing lipids and lipid peroxides. Malondialdehyde (MDA) and hydroperoxides were significantly diminished only in rats given FRF of banana from Veli (the control area). FRF from plants grown in the HBRAs inhibited the activities of antioxidant enzymes in the liver of rats, and this inhibitory effect was greatest in rats fed FRF from Karunagappally. Banana grown in HBRAs is thus of lower quality, with a less efficient antioxidant system; Palayamkodan was superior in its hypolipidaemic and antioxidant effects. High background radiation appears to have no enhancing effect on the radioprotective action of banana flavonoids, and hence offers no added benefit to those consuming these fruits.

  14. FRF decoupling of nonlinear systems

    NASA Astrophysics Data System (ADS)

    Kalaycıoğlu, Taner; Özgüven, H. Nevzat

    2018-03-01

    The structural decoupling problem, i.e. predicting the dynamic behavior of a particular substructure from knowledge of the dynamics of the coupled structure and of the other substructure, has been investigated for three decades and has led to several decoupling methods. In spite of the nonlinearities inherent in structural systems in various forms, such as clearances, friction and nonlinear stiffness, all previous decoupling studies concern linear systems. In this study, the decoupling problem for nonlinear systems is addressed for the first time. A method, named the FRF Decoupling Method for Nonlinear Systems (FDM-NS), is proposed for calculating the FRFs of a substructure decoupled from a coupled nonlinear structure in which the nonlinearity can be modeled as a single nonlinear element. The formulation differs depending on where the nonlinear element is, i.e. in the known or the unknown subsystem, or at the connection point. The method requires relative displacement information between the two end points of the nonlinear element, in addition to point and transfer FRFs at some points of the known subsystem. However, it is not necessary to excite the system from the unknown subsystem even when the nonlinear element is in that subsystem. The validation of FDM-NS is demonstrated with two different case studies using nonlinear lumped-parameter systems. Finally, a nonlinear experimental test structure is used in order to show the real-life applicability and accuracy of FDM-NS.

  15. Error analysis and new dual-cosine window for estimating the sensor frequency response function from the step response data

    NASA Astrophysics Data System (ADS)

    Yang, Shuang-Long; Liang, Li-Ping; Liu, Hou-De; Xu, Ke-Jun

    2018-03-01

    Aiming at reducing the estimation error of the sensor frequency response function (FRF) estimated by the commonly used window-based spectral estimation method, the error models for the interpolation and transient errors are derived in the form of non-parametric models. Accordingly, the window effects on these errors are analyzed, revealing that the commonly used Hanning window leads to a smaller interpolation error, which can be largely eliminated by cubic spline interpolation when estimating the FRF from step response data, and that a window with a smaller front-end value can suppress more of the transient error. Thus, a new dual-cosine window with its non-zero discrete Fourier transform bins at -3, -1, 0, 1, and 3 is constructed for FRF estimation. Compared with the Hanning window, the new dual-cosine window has equivalent interpolation error suppression capability and better transient error suppression capability when estimating the FRF from the step response; specifically, it reduces the asymptotic behavior of the transient error from O(N⁻²) for the Hanning window method to O(N⁻⁴), while increasing the uncertainty only slightly (about 0.4 dB). Then, one direction of a wind tunnel strain gauge balance, which is a high-order, lightly damped, non-minimum-phase system, is employed as an example for verifying the new dual-cosine window based spectral estimation method. The model simulation results show that the new dual-cosine window method is better than the Hanning window method for FRF estimation and, compared with the Gans method and the LPM method, it has the advantages of simple computation, less time consumption and short data requirements; the calculation results for the actual balance FRF data are consistent with the simulation results. Thus, the new dual-cosine window is effective and practical for FRF estimation.
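
    As a rough illustration of window-based FRF estimation from step response data, the sketch below forms the ratio of the windowed spectra of the recorded step input and response; a Hanning window is used purely as a stand-in, since the dual-cosine window coefficients are not given in the abstract and are not reproduced here.

      import numpy as np

      def frf_from_step(u, y, dt, window=np.hanning):
          # Crude window-based FRF estimate from a recorded step input u and the
          # measured step response y (same length, sample period dt).  Both records
          # are multiplied by the same window before the FFT; the paper's dual-cosine
          # window would simply replace np.hanning here.
          w = window(len(y))
          f = np.fft.rfftfreq(len(y), dt)
          H = np.fft.rfft(w * np.asarray(y)) / np.fft.rfft(w * np.asarray(u))
          return f, H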

  16. Malaysian brown seaweeds Sargassum siliquosum and Sargassum polycystum: Low density lipoprotein (LDL) oxidation, angiotensin converting enzyme (ACE), α-amylase, and α-glucosidase inhibition activities.

    PubMed

    Nagappan, Hemlatha; Pee, Poh Ping; Kee, Sandra Hui Yin; Ow, Ji Tsong; Yan, See Wan; Chew, Lye Yee; Kong, Kin Weng

    2017-09-01

    Two Malaysian brown seaweeds, Sargassum siliquosum and Sargassum polycystum, were first extracted using methanol to obtain the crude extract (CE) and then further fractionated to obtain the fucoxanthin-rich fraction (FRF). The samples were evaluated for their phenolic, flavonoid and fucoxanthin contents, as well as their inhibitory activities towards low density lipoprotein (LDL) oxidation, angiotensin converting enzyme (ACE), α-amylase and α-glucosidase. In the LDL oxidation assay, an increasing trend in antioxidant activity was observed as the concentration of FRF (0.04-0.2 mg/mL) and CE (0.2-1.0 mg/mL) increased, though the trend was not statistically significant. In the serum oxidation assay, a significant decrease in antioxidant activity was observed as the concentration of FRF increased, while CE showed no significant difference in inhibitory activity across the concentrations used. The IC50 values for the ACE inhibitory activity of CE (0.03-0.42 mg/mL) were lower than those of FRF (0.94-1.53 mg/mL). Compared with the reference drug Voglibose (IC50 value of 0.61 mg/mL) in inhibiting α-amylase, CE (0.58 mg/mL) gave a significantly lower IC50 value while FRF (0.68-0.71 mg/mL) had significantly higher IC50 values. The α-glucosidase inhibitory activities of CE (IC50 values of 0.57-0.69 mg/mL) and FRF (IC50 values of 0.50-0.53 mg/mL) were comparable to that of the reference drug (IC50 value of 0.54 mg/mL). The results showed the potential of S. siliquosum and S. polycystum in reducing risk factors related to cardiovascular diseases, following their inhibitory activities on ACE, α-amylase and α-glucosidase. In addition, FRF is likely to possess antioxidant activity at low concentrations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Feather retention force in broilers ante-, peri-, and post-mortem as influenced by electrical and carbon dioxide stunning.

    PubMed

    Buhr, R J; Cason, J A; Rowland, G N

    1997-11-01

    Stunning and slaughter trials were conducted to evaluate the influence of stunning method (electrical 50 V alternating current, CO2 gas: 0 to 40% for 90 s or 40 to 60% for 30 s) on feather retention force (FRF) in commercial broilers. Feathers from the pectoral, sternal, and femoral feather tracts were sampled with a force gauge before stunning (ante-mortem) and contralaterally either after stunning (peri-mortem from 0.5 to 4 min) or after stunning and bleeding (post-mortem from 2 to 6 min). Prior to stunning, ante-mortem FRF values varied among assigned stunning methods only for the pectoral (7%) feather tract. After stunning, peri-mortem FRF values were higher only for the sternal tract (11% for 40 to 60% CO2 for 30 s); whereas after stunning and bleeding, post-mortem FRF values were lower than ante- or peri-mortem only for the sternal tract (10% lower for 40 to 60% CO2 for 30 s). Peri- and post-mortem FRF values did not differ among stunning methods for the pectoral and femoral feather tracts. Small changes in FRF values occurred from ante-mortem to peri-mortem (-1 to +12%), and from ante-mortem to post-mortem (-2 to +8%) across stunning methods. A significant increase was determined for only the pectoral tract (7%) from ante- to peri-mortem across stunning methods. Electrically stunned broilers that were not bled gained weight in excess of the 36 feathers removed (0.16%), apparently due to body surface water pickup during the brine-stunning process, whereas CO2-stunned broilers lost weight due to excretion of cloacal contents (-0.31 to -0.98%). The change in body weight among stunning methods was significant (P < 0.0233). Peri- and post-mortem FRF, in addition to bleed-out body weight loss, were not substantially influenced by electrical or CO2 stunning methods, and, therefore, carcass defeathering efficiency may not differ after scalding.

  18. Nonlinear frequency response based adaptive vibration controller design for a class of nonlinear systems

    NASA Astrophysics Data System (ADS)

    Thenozhi, Suresh; Tang, Yu

    2018-01-01

    Frequency response functions (FRFs) are often used in vibration controller design problems for mechanical systems. Unlike for linear systems, the derivation of the FRF for nonlinear systems is not trivial due to their complex behaviors. To address this issue, the convergence property of nonlinear systems can be studied using convergence analysis. For a class of time-invariant nonlinear systems termed convergent systems, the nonlinear FRF can be obtained. The present paper proposes a nonlinear-FRF-based adaptive vibration controller design for a mechanical system with cubic damping nonlinearity and for a satellite system. Here the controller gains are tuned such that a desired closed-loop frequency response is achieved for a band of harmonic excitations. Unlike the system with cubic damping, the satellite system is not convergent; therefore, an additional controller is utilized to achieve the convergence property. Finally, numerical examples are provided to illustrate the effectiveness of the proposed controller.

  19. Improving Data Discovery, Access, and Analysis to More Than Three Decades of Oceanographic and Geomorphologic Observations

    NASA Astrophysics Data System (ADS)

    Forte, M.; Hesser, T.; Knee, K.; Ingram, I.; Hathaway, K. K.; Brodie, K. L.; Spore, N.; Bird, A.; Fratantonio, R.; Dopsovic, R.; Keith, A.; Gadomski, K.

    2016-02-01

    The U.S. Army Engineer Research and Development Center's (USACE ERDC) Coastal and Hydraulics Laboratory (CHL) Coastal Observations and Analysis Branch (COAB) Measurements Program has a 35-year record of coastal observations. These datasets include oceanographic point-source measurements, Real-Time Kinematic (RTK) GPS bathymetry surveys, and remote sensing data, both from the Field Research Facility (FRF) in Duck, NC and from other project and experiment sites around the nation. The data have been used to support a variety of USACE mission areas, including coastal wave model development, beach and bar response, coastal project design, coastal storm surge, and other coastal hazard investigations. Furthermore, these data have been widely used by a number of federal and state agencies, academic institutions, and private industries in hundreds of scientific and engineering investigations, publications, conference presentations and model advancement studies. A limiting factor in the use of FRF data has been the lack of rapid, reliable access and of publicly available metadata for each data type. The addition of web tools, accessible data files, and well-documented metadata will open the door to much future collaboration. With the help of industry partner RPS ASA and the U.S. Army Corps of Engineers Mobile District Spatial Data Branch, a Data Integration Framework (DIF) was developed. The DIF represents a combination of processes, standards, people, and tools used to transform disconnected enterprise data into useful, easily accessible information for analysis and reporting. A front-end data portal connects the user to the framework, which integrates both oceanographic observations and geomorphology measurements using a combination of ESRI and open-source technology while providing a seamless data discovery, access, and analysis experience to the user. The user interface was built with ESRI's JavaScript API, and all project metadata is managed using Geoportal. The geomorphology data are made available through ArcGIS Server, while the oceanographic data sets have been formatted to netCDF4 and made available through a THREDDS server. Additional web tools run alongside the THREDDS server to provide rapid statistical calculations and plotting, allowing for user-defined data access and visualization.
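
    As an illustration of what THREDDS-hosted netCDF4 access typically looks like from a user's perspective, a minimal sketch follows; the catalogue URL and variable name are hypothetical placeholders, not paths published in this abstract.

      # The OPeNDAP URL below is a made-up placeholder; the real dataset paths are
      # defined by the FRF THREDDS catalogue.
      import xarray as xr

      url = "https://example.org/thredds/dodsC/frf/oceanography/waves/example.nc"
      ds = xr.open_dataset(url)     # remote access via OPeNDAP, no full download
      print(ds.data_vars)           # inspect the available variables
      # waveHs = ds["waveHs"]       # hypothetical variable name for wave height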

  20. Modeling and characterization of an electromagnetic system for the estimation of Frequency Response Function of spindle

    NASA Astrophysics Data System (ADS)

    Tlalolini, David; Ritou, Mathieu; Rabréau, Clément; Le Loch, Sébastien; Furet, Benoit

    2018-05-01

    The paper presents an electromagnetic system that has been developed to measure the quasi-static and dynamic behavior of machine-tool spindles at different spindle speeds. This system consists of four Pulse Width Modulation amplifiers and four electromagnets that produce magnetic forces of ±190 N in the static mode and ±80 N in the dynamic mode up to 5 kHz. In order to measure the Frequency Response Function (FRF) of the spindle, the applied force is required, which is a key issue. A dynamic force model is proposed in order to obtain the load from the current measured in the amplifiers. The model depends on the excitation frequency and on the magnetic characteristics of the system. The predicted force at high speed is validated with a specific experiment, and the performance limits of the experimental device are investigated. The FRF obtained with the electromagnetic system is compared to a classical tap test measurement.

  1. Free-Suspension Residual Flexibility Testing of Space Station Pathfinder: Comparison to Fixed-Base Results

    NASA Technical Reports Server (NTRS)

    Tinker, Michael L.

    1998-01-01

    Application of the free-suspension residual flexibility modal test method to the International Space Station Pathfinder structure is described. The Pathfinder, a large structure of the general size and weight of Space Station module elements, was also tested in a large fixed-base fixture to simulate Shuttle Orbiter payload constraints. After correlation of the Pathfinder finite element model to residual flexibility test data, the model was coupled to a fixture model, and the constrained modes and frequencies were compared to the fixed-base test modes. The residual flexibility model compared very favorably to the results of the fixed-base test. This is the first known direct comparison of free-suspension residual flexibility and fixed-base test results for a large structure. The model correlation approach used by the author for residual flexibility data is presented. Frequency response functions (FRFs) for the regions of the structure that interface with the environment (a test fixture or another structure) are shown to be the primary tools for model correlation that distinguish or characterize the residual flexibility approach. A number of critical issues related to the use of the structure interface FRFs for correlating the model are then identified and discussed, including (1) the requirement of prominent stiffness lines, (2) overcoming problems with measurement noise which make the antiresonances or minima in the functions difficult to identify, and (3) the use of interface stiffness and lumped mass perturbations to bring the analytical responses into agreement with test data. It is shown that good comparison of analytical to experimental FRFs is the key to obtaining good agreement of the residual flexibility values.

  2. Faraday rotation fluctuations of MESSENGER radio signals through the equatorial lower corona near solar minimum

    NASA Astrophysics Data System (ADS)

    Wexler, D. B.; Jensen, E. A.; Hollweg, J. V.; Heiles, C.; Efimov, A. I.; Vierinen, J.; Coster, A. J.

    2017-02-01

    Faraday rotation (FR) of transcoronal radio transmissions from spacecraft near superior conjunction enables study of the temporal variations in coronal plasma density, velocity, and magnetic field. The MESSENGER spacecraft's 8.4 GHz radio signal, transmitted through the corona with closest line-of-sight approaches of 1.63-1.89 solar radii at near-equatorial heliolatitudes, was recorded soon after the deep solar minimum of solar cycle 23. During egress from superior conjunction, FR gradually decreased, and an overlay of wave-like FR fluctuations (FRFs) with periods of hundreds to thousands of seconds was found. The FRF power spectrum was characterized by a power-law relation, with a baseline spectral index of -2.64. A transient power increase showed relative flattening of the spectrum and bands of enhanced spectral power at 3.3 mHz and 6.1 mHz. Our results confirm the presence of coronal FRFs similar to those described previously at greater solar offsets. Interpreted as Alfvén waves crossing the line of sight radially near the proximate point, the low-frequency FRFs convey an energy flux density higher than that of the background solar wind kinetic energy, but only a fraction of that required to accelerate the solar wind. Even so, this fraction is quite variable and potentially escalates to energetically significant values with relatively modest changes in the estimated magnetic field strength and electron concentration. Given the uncertainties in these key parameters, as well as in solar wind properties close to the Sun at low heliolatitudes, we cannot yet confidently assign the quantitative role for Alfvén wave energy from this region in driving the slow solar wind.

  3. Prototyping of automotive components with variable width and depth

    NASA Astrophysics Data System (ADS)

    Abeyrathna, B.; Rolfe, B.; Harrasser, J.; Sedlmaier, A.; Ge, Rui; Pan, L.; Weiss, M.

    2017-09-01

    Roll forming enables the manufacturing of longitudinal components from materials that combine high strength with limited formability and is increasingly used in the automotive industry for the manufacture of structural and crash components. An extension of conventional roll forming is the Flexible Roll Forming (FRF) process, where the rolls are no longer fixed in space but are free to move, which enables the forming of components with a variable cross section over the length of the part. Even though FRF components have high weight-saving potential, the technology has found only limited application in the automotive industry. A new flexible forming facility has recently been developed that enables proof-of-concept studies and the production of FRF prototypes before a full FRF line is built; this may lead to a wider uptake of the FRF technology in the automotive industry. In this process, the pre-cut blank is placed between two clamps and the whole setup moves back and forth; a forming roll mounted on a servo-controlled platform with six degrees of freedom forms the pre-cut blank to the desired shape. In this study, an initial forming concept for the flexible roll forming of an automotive component with variable height is developed using COPRA® FEA RF. This is followed by experimental prototyping studies on the new concept forming facility. Using the optical strain measurement system Autogrid Compact, material deformation, part shape and wrinkling severity are analysed for several forming passes and compared with the numerical results. The results show that the numerical model gives a good representation of the material behaviour and that, with increasing forming severity, wrinkling issues need to be overcome in the process.

  4. Modelling and tuning for a time-delayed vibration absorber with friction

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoxu; Xu, Jian; Ji, Jinchen

    2018-06-01

    This paper presents an integrated analytical and experimental study of the modelling and tuning of a time-delayed vibration absorber (TDVA) with friction. In the system modelling, the paper first applies the method of averaging to obtain the frequency response function (FRF), and then uses the derived FRF to evaluate the fitness of different friction models. After the determination of the system model, the obtained FRF is employed to evaluate the vibration absorption performance with respect to the tunable parameters. A significant feature of the TDVA with friction is that its stability depends on the excitation parameters. To ensure the stability of the time-delayed control, a sufficient condition for stability estimation is defined. Experimental measurements show that the dynamic response of the TDVA with friction can be accurately predicted and that the time-delayed control can be precisely achieved by using the modelling and tuning technique provided in this paper.

  5. Estimating material viscoelastic properties based on surface wave measurements: A comparison of techniques and modeling assumptions

    PubMed Central

    Royston, Thomas J.; Dai, Zoujun; Chaunsali, Rajesh; Liu, Yifei; Peng, Ying; Magin, Richard L.

    2011-01-01

    Previous studies of the first author and others have focused on low audible frequency (<1 kHz) shear and surface wave motion in and on a viscoelastic material comprised of or representative of soft biological tissue. A specific case considered has been surface (Rayleigh) wave motion caused by a circular disk located on the surface and oscillating normal to it. Different approaches to identifying the type and coefficients of a viscoelastic model of the material based on these measurements have been proposed. One approach has been to optimize coefficients in an assumed viscoelastic model type to match measurements of the frequency-dependent Rayleigh wave speed. Another approach has been to optimize coefficients in an assumed viscoelastic model type to match the complex-valued frequency response function (FRF) between the excitation location and points at known radial distances from it. In the present article, the relative merits of these approaches are explored theoretically, computationally, and experimentally. It is concluded that matching the complex-valued FRF may provide a better estimate of the viscoelastic model type and parameter values; though, as the studies herein show, there are inherent limitations to identifying viscoelastic properties based on surface wave measurements. PMID:22225067
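
    To make the connection between a measured wave speed and a viscoelastic model concrete, a small sketch follows for a Kelvin-Voigt shear modulus; this is a generic textbook relation with illustrative parameter names, not one of the specific model types or fitting procedures compared in the paper.

      import numpy as np

      def shear_phase_speed(omega, mu, eta, rho):
          # Phase speed of shear waves in a Kelvin-Voigt material with complex
          # modulus G*(w) = mu + 1j*w*eta and density rho: the complex wavenumber
          # is k = w*sqrt(rho/G*), and the measurable phase speed is w / Re(k).
          # The surface (Rayleigh) wave speed in nearly incompressible soft
          # tissue is roughly 95% of this value.
          G = mu + 1j * omega * eta
          k = omega * np.sqrt(rho / G)
          return omega / np.real(k)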

  6. Recent progress of colloidal quantum dot based solar cells

    NASA Astrophysics Data System (ADS)

    Wei, Huiyun; Li, Dongmei; Zheng, Xinhe; Meng, Qingbo

    2018-01-01

    Abstract not available. Project supported by the National Natural Science Foundation of China (Grant Nos. 61274134, 91433205, 51372270, 51402348, 51421002, 21173260, 11474333, 51372272, and 51627803), the Knowledge Innovation Program of the Chinese Academy of Sciences, the Natural Science Foundation of Beijing, China (Grant No. 4173077), the USTB Talent Program, China (Grant No. 06500053), and the Fundamental Research Funds for the Central Universities, China (Grant Nos. FRF-BR-16-018A, FRF-TP-17-069A1, and 06198178).

  7. Impact of Fire Resistant Fuel Blends on Compression Ignition Engine Performance

    DTIC Science & Technology

    2011-07-01

    EFFECTS ON ENGINE PERFORMANCE: FRF blends were tested in the CAT C7 and GEP 6.5L(T) engines to determine the effects of FRF on engine ... impact on efficiency of the Stanadyne rotary injection pump used in the GEP 6.5L(T) engine, thus largely affecting its power output when varying ... exhaust backpressure. Emissions are sampled from an exhaust probe installed between the engine and the exhaust system butterfly valve.

  8. Accurate frequency domain measurement of the best linear time-invariant approximation of linear time-periodic systems including the quantification of the time-periodic distortions

    NASA Astrophysics Data System (ADS)

    Louarroudi, E.; Pintelon, R.; Lataire, J.

    2014-10-01

    Time-periodic (TP) phenomena occurring, for instance, in wind turbines, helicopters, anisotropic shaft-bearing systems, and cardiovascular/respiratory systems are often not addressed when classical frequency response function (FRF) measurements are performed. As the traditional FRF concept is based on linear time-invariant (LTI) system theory, it is only approximately valid for systems with varying dynamics. Accordingly, the quantification of any deviation from this ideal LTI framework is more than welcome. The “measure of deviation” allows us to define the notion of the best LTI (BLTI) approximation, which yields the best, in the mean-square sense, LTI description of a linear time-periodic (LTP) system. By taking the TP effects into consideration, it is shown in this paper that the variability of the BLTI measurement can be reduced significantly compared with that of classical FRF estimators. From a single experiment, the proposed identification methods can handle (non-)linear time-periodic [(N)LTP] systems in open loop with a quantification of (i) the noise and/or the nonlinear distortions, (ii) the TP distortions and (iii) the transient (leakage) errors. In addition, a geometrical interpretation of the BLTI approximation is provided, leading to a framework called vector FRF analysis. The theory presented is supported by numerical simulations as well as real measurements mimicking the well-known mechanical Mathieu oscillator.

  9. Rheological, physico-sensory, nutritional and storage characteristics of bread enriched with roller milled fractions of black gram (Phaseolus mungo L.).

    PubMed

    Indrani, D; Sakhare, Suresh D; Milind; Inamdar, Aashitosh A

    2015-08-01

    Black gram grains were fractionated using a roller flour mill. The effect of a combination of additives (CA), namely dry gluten powder, sodium stearoyl-2-lactylate and fungal α-amylase, on the rheological and bread-making characteristics of wheat flour partly replaced with roller-milled fractions of black gram was studied. With increasing addition of straight-run flour (SRF), protein-rich fraction (PRF) and protein-and-fiber-rich fraction (P&FRF) from 0 to 20 %, and of fiber-rich fraction (FRF) from 0 to 15 %, the farinograph water absorption increased and dough stability decreased; the amylograph pasting temperature increased and peak viscosity decreased; and bread volume decreased while the crumb firmness value increased, indicating an adverse effect of these fractions on the rheological and bread-making characteristics of wheat flour. Sensory evaluation showed that the breads were acceptable only up to a level of 15 % for SRF, PRF and P&FRF and 10 % for FRF. However, when the CA containing dry gluten powder, sodium stearoyl-2-lactylate and fungal α-amylase was incorporated, the overall quality of the products improved. Use of these fractions increased the protein and fiber contents of the bread by 1.24-1.66 and 1.48-3.79 times respectively. The results showed the possibility of utilising roller-milled black gram fractions along with CA to improve the taste, texture and nutritional quality of bread.

  10. Influence of imperfect end boundary condition on the nonlocal dynamics of CNTs

    NASA Astrophysics Data System (ADS)

    Fathi, Reza; Lotfan, Saeed; Sadeghi, Morteza H.

    2017-03-01

    Imperfections that unavoidably occur during the fabrication process of carbon nanotubes (CNTs) have a significant influence on the vibration behavior of CNTs. Among these imperfections, the boundary condition defect is studied in this investigation based on nonlocal elasticity theory. To this end, a mathematical model of the non-ideal end condition in a cantilever CNT is developed using a strongly nonlinear spring to study its effect on the vibration behavior. The weak-form equation of motion is derived via Hamilton's principle and solved using the Rayleigh-Ritz approach. Once the frequency response function (FRF) of the CNT is simulated, it is found that the defect parameter injects noise into the FRF in the lower frequency range, and as a result the small-scale effect on the FRF remains undisturbed at high frequencies. In addition, a process is introduced in this work to estimate the nonlocal and defect parameters for establishing the mathematical model of the CNT based on the FRF, which can be competitive because of its lower instrumentation and data analysis costs. The estimation process relies on the resonance frequencies and the magnitude of the noise in the frequency response function of the CNT. The results show that the dynamic response of the system reconstructed from the estimated parameters is in good agreement with the original response of the CNT.

  11. Enabling IP Header Compression in COTS Routers via Frame Relay on a Simplex Link

    NASA Technical Reports Server (NTRS)

    Nguyen, Sam P.; Pang, Jackson; Clare, Loren P.; Cheng, Michael K.

    2010-01-01

    NASA is moving toward a network-centric communications architecture and, in particular, is building toward use of Internet Protocol (IP) in space. The use of IP is motivated by its ubiquitous application in many communications networks and in available commercial off-the-shelf (COTS) technology. The Constellation Program intends to fit two or more voice (over IP) channels on both the forward link to, and the return link from, the Orion Crew Exploration Vehicle (CEV) during all mission phases. Efficient bandwidth utilization of the links is key for voice applications. In Voice over IP (VoIP), the IP packets are limited to small sizes to keep voice latency at a minimum. The common voice codec used in VoIP is G.729. This algorithm produces voice audio at 8 kbps and in packets of 10-millisecond duration. Constellation has designed the VoIP communications stack to use the combination of IP/UDP/RTP protocols, where IP carries a 20-byte header, UDP (User Datagram Protocol) carries an 8-byte header, and RTP (Real Time Transport Protocol) carries a 12-byte header. The protocol headers total 40 bytes and are equal in length to a 40-byte G.729 payload, doubling the VoIP latency. Since much of the IP/UDP/RTP header information does not change from IP packet to IP packet, IP/UDP/RTP header compression can avoid transmission of much redundant data as well as reduce VoIP latency. The benefits of IP header compression are more pronounced on low data rate links such as the forward and return links during CEV launch. IP/UDP/RTP header compression codecs are well supported by many COTS routers. A common interface to the COTS routers is through frame relay. However, enabling IP header compression over frame relay, according to the industry standard (Frame Relay IP Header Compression Agreement FRF.20), requires a duplex link and negotiations between the compressor router and the decompressor router. In Constellation, each forward link to and return link from the CEV in space is treated independently as a simplex link. Without negotiation, the COTS routers are prevented from entering IP header compression mode, and no IP header compression would be performed. An algorithm is proposed to enable IP header compression in COTS routers on a simplex link with no negotiation or with one-way messaging. In doing so, COTS routers can enter IP header compression mode without the need to handshake through a bidirectional link as required by FRF.20. This technique spoofs the routers locally and thereby allows them to enter IP header compression mode without the negotiations between routers actually occurring. The spoofing function is conducted by a frame relay adapter (also COTS) with the capability to generate control messages according to the FRF.20 descriptions. Therefore, negotiation is actually performed locally between the FRF.20 adapter and the connecting COTS router and never occurs over the space link. Through understanding of the handshaking protocol described by FRF.20, the necessary FRF.20 negotiation messages can be generated to control the connecting router, not only to turn on IP header compression but also to adjust the compression parameters. The FRF.20 negotiation (or control) message is composed in the FRF.20 adapter by interpreting the incoming router request message. Many of the fields are simply transcribed from request to response, while the control fields indicating response and type are modified.
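
    A quick back-of-the-envelope check of the header overhead described above; treating the 40-byte payload as four bundled 10-ms G.729 frames is an inference for illustration, not a figure stated in the abstract.

      # Header and payload arithmetic for the IP/UDP/RTP VoIP stack described above.
      IP_HDR, UDP_HDR, RTP_HDR = 20, 8, 12            # header bytes per packet
      G729_BITRATE = 8_000                            # bits per second
      FRAME_MS = 10                                   # one G.729 frame of audio

      frame_bytes = G729_BITRATE * FRAME_MS // (8 * 1000)   # 10 bytes per 10-ms frame
      header_bytes = IP_HDR + UDP_HDR + RTP_HDR              # 40 bytes of headers
      frames_per_packet = 40 // frame_bytes                  # 4 frames in a 40-byte payload

      # Headers equal the payload in size, so uncompressed headers double the packet.
      print(frame_bytes, header_bytes, frames_per_packet)    # -> 10 40 4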

  12. Analysis of Wave Predictions from the Coastal Model Test Bed using Operationally Estimated Bathymetry

    NASA Astrophysics Data System (ADS)

    Bak, S.; Smith, J. M.; Hesser, T.; Bryant, M. A.

    2016-12-01

    Near-coast wave models are generally validated with relatively small data sets that focus on analytical solutions, specialized experiments, or intense storms. Prior studies have compiled testbeds that include a few dozen experiments or storms to validate models (e.g., Ris et al. 2002), but few examples exist that allow for continued model evaluation in the nearshore and surf-zone in near-realtime. The limited nature of these validation sets is driven by a lack of high spatial and temporal resolution in-situ wave measurements and the difficulty in maintaining these instruments on the active profile over long periods of time. The US Army Corps of Engineers Field Research Facility (FRF) has initiated a Coastal Model Test-Bed (CMTB), which is an automated system that continually validates wave models (with morphological and circulation models to follow) utilizing the rich data set of oceanographic and bathymetric measurements collected at the FRF. The FRF's cross-shore wave array provides wave measurements along a cross-shore profile from 26 m of water depth to the shoreline, utilizing various instruments including wave-rider buoys, AWACs, aquadopps, pressure gauges, and a dune-mounted lidar (Brodie et al. 2015). This work uses the CMTB to evaluate the performance of a phase-averaged numerical wave model, STWAVE (Smith 2007, Massey et al. 2011) over the course of a year at the FRF in Duck, NC. Additionally, from the BathyDuck Experiment in October 2015, the CMTB was used to determine the impact of applying the depth boundary condition for the model from monthly acoustic bathymetric surveys in comparison to hourly estimates using a video-based inversion method (e.g., cBathy, Holman et al. 2013). The modeled wave parameters using both bathymetric boundary conditions are evaluated using the FRF's cross-shore wave array and two additional cross-shore arrays of wave measurements in 2 to 4 m water depth from BathyDuck in Fall, 2015.

  13. Role of the flavonoid-rich fraction in the antioxidant and cytotoxic activities of Bauhinia forficata Link. (Fabaceae) leaves extract.

    PubMed

    Miceli, Natalizia; Buongiorno, Luigina Pasqualina; Celi, Maria Grazia; Cacciola, Francesco; Dugo, Paola; Donato, Paola; Mondello, Luigi; Bonaccorsi, Irene; Taviano, Maria Fernanda

    2016-06-01

    Bauhinia forficata Link. is utilised as an antidiabetic in Brazilian folk medicine; furthermore, its antioxidant properties suggest a potential usefulness in the prevention of diabetes complications associated with oxidative stress. The contribution of a flavonoid-rich fraction (FRF), characterised by HPLC-PDA-ESI-MS, to the antioxidant and cytotoxic properties of B. forficata hydro-alcoholic leaf extract was evaluated for the first time. Both the extract and the FRF showed radical-scavenging activity and reducing power strongly related to the flavonoid content found; hence, flavonoids are mainly responsible for the primary antioxidant activity of B. forficata extract. The extract significantly decreased FO-1 cell viability at the higher concentrations. FRF did not exert any effect; thus, flavonoids do not appear to be responsible for the cytotoxicity of the extract. The extract proved virtually non-toxic against both Artemia salina and normal human lymphocytes, demonstrating potential selectivity in inhibiting cancer cell growth. Finally, no antimicrobial activity was observed against the bacteria and yeasts tested.

  14. Net global warming potential and greenhouse gas intensity as affected by different water management strategies in Chinese double rice-cropping systems.

    PubMed

    Wu, Xiaohong; Wang, Wei; Xie, Xiaoli; Yin, Chunmei; Hou, Haijun; Yan, Wende; Wang, Guangjun

    2018-01-15

    This study provides a complete account of global warming potential (GWP) and greenhouse gas intensity (GHGI) in relation to a long-term water management experiment in Chinese double-rice cropping systems. The three strategies of water management comprised continuous (year-round) flooding (CF), flooding during the rice season but with drainage during the midseason and harvest time (F-D-F), and irrigation only for flooding during transplanting and the tillering stage (F-RF). The CH₄ and N₂O fluxes were measured with the static chamber method. Soil organic carbon (SOC) sequestration rates were estimated based on the changes in the carbon stocks during 1998-2014. Longer periods of soil flooding led to increased CH₄ emissions, reduced N₂O emissions, and enhanced SOC sequestration. The net GWPs were 22,497, 8,895, and 1,646 kg CO₂-equivalent ha⁻¹ yr⁻¹ for the CF, F-D-F, and F-RF, respectively. The annual rice grain yields were comparable between the F-D-F and CF, but were reduced significantly (by 13%) in the F-RF. The GHGIs were 2.07, 0.87, and 0.18 kg CO₂-equivalent kg⁻¹ grain yr⁻¹ for the CF, F-D-F, and F-RF, respectively. These results suggest that F-D-F could be used to maintain the grain yields and simultaneously mitigate the climatic impact of double rice-cropping systems.
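
    The GHGI values quoted above follow directly from the net GWP and the annual grain yield (GHGI = net GWP / yield). As a quick consistency check, the short sketch below (Python; illustrative only) back-calculates the implied annual yields from the numbers reported in the record; no values beyond those are assumed.

      # GHGI (kg CO2-eq per kg grain per year) = net GWP / annual grain yield.
      # Net GWP and GHGI values are those reported in the record; the implied
      # yields are derived here purely for illustration.
      net_gwp = {"CF": 22497.0, "F-D-F": 8895.0, "F-RF": 1646.0}  # kg CO2-eq ha^-1 yr^-1
      ghgi    = {"CF": 2.07,    "F-D-F": 0.87,   "F-RF": 0.18}    # kg CO2-eq kg^-1 grain yr^-1

      for regime in net_gwp:
          implied_yield = net_gwp[regime] / ghgi[regime]          # kg grain ha^-1 yr^-1
          print(f"{regime:6s} implied annual yield ≈ {implied_yield:,.0f} kg ha^-1")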

  15. Non-parametric identification of multivariable systems: A local rational modeling approach with application to a vibration isolation benchmark

    NASA Astrophysics Data System (ADS)

    Voorhoeve, Robbert; van der Maas, Annemiek; Oomen, Tom

    2018-05-01

    Frequency response function (FRF) identification is often used as a basis for control systems design and as a starting point for subsequent parametric system identification. The aim of this paper is to develop a multiple-input multiple-output (MIMO) local parametric modeling approach for FRF identification of lightly damped mechanical systems with improved speed and accuracy. The proposed method is based on local rational models, which can efficiently handle the lightly damped resonant dynamics. A key aspect herein is the freedom in the multivariable rational model parametrizations. Several choices for such multivariable rational model parametrizations are proposed and investigated. For systems with many inputs and outputs the required number of model parameters can rapidly increase, adversely affecting the performance of the local modeling approach. Therefore, low-order model structures are investigated. The structure of these low-order parametrizations leads to an undesired directionality in the identification problem. To address this, an iterative local rational modeling algorithm is proposed. As a special case, recently developed SISO algorithms are recovered. The proposed approach is successfully demonstrated on simulations and on an active vibration isolation system benchmark, confirming good performance of the method using significantly fewer parameters than alternative approaches.
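
    To make the idea of local rational modeling concrete, the following single-input single-output sketch (Python/NumPy) fits a low-order rational model with a transient term in a small window around each frequency bin and takes the FRF estimate as the numerator constant. It is only an illustration of the general approach under assumed window length and polynomial orders; the multivariable parametrizations and the iterative algorithm that are the actual subject of the paper are not reproduced here.

      import numpy as np

      def lrm_frf(U, Y, order=2, half_window=10):
          """
          Minimal SISO sketch of a local rational modeling (LRM) FRF estimate.
          U, Y : complex input and output DFT spectra of equal length.
          In a window of +/- half_window bins around each frequency k, the data are
          fitted with Y ≈ (N/D) U + M/D, where N, D, M are low-order polynomials in
          the local bin offset r and D(0) = 1; the transient term M/D absorbs leakage.
          The FRF estimate at bin k is G(k) = N(0)/D(0) = n0.
          """
          K = len(U)
          G = np.full(K, np.nan + 1j * np.nan, dtype=complex)
          for k in range(half_window, K - half_window):
              r = np.arange(-half_window, half_window + 1)       # local offsets
              u, y = U[k + r], Y[k + r]
              # Linearized (Levy-type) regression: y = N(r) u + M(r) - y * (D(r) - 1)
              N_cols = [u * r**i for i in range(order + 1)]
              M_cols = [r**i for i in range(order + 1)]
              D_cols = [-y * r**i for i in range(1, order + 1)]
              A = np.column_stack(N_cols + M_cols + D_cols)
              theta, *_ = np.linalg.lstsq(A, y, rcond=None)
              G[k] = theta[0]                                    # n0 = N(0)/D(0)
          return G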

  16. Estimating outcomes of astronauts with myocardial infarction in exploration class space missions.

    PubMed

    Gillis, David B; Hamilton, Douglas R

    2012-02-01

    We estimate likelihood of presenting rhythms and survival to hospital discharge outcome after acute cardiac ischemia with arrhythmia and/or myocardial infarction (AMI) during long-duration space missions (LDSM) using selected terrestrial cohorts in medical literature. Medical scenarios were risk-stratified by coronary artery calcium score (CAC) and Framingham risk factors (FRF). AMI with and without sudden cardiac arrest (SCA) likelihoods and clinically significant rhythm scenarios and associated outcomes in "astronaut-like" cohorts were derived from two prospective trials identified by an evidence-based literature review. Results are presented using an event sequence diagram and event time line. The association of increasing CAC scores and FRF with AMI and SCA outcomes was calculated. Low AMI likelihoods are estimated in individuals with CAC scores of zero or < 100 and a low number of FRF. Survival rate to hospital discharge after out of hospital SCA in a large urban environment study was 5.2%. EMS-witnessed ventricular tachycardia and/or ventricular fibrillation survival rate of 37.5% represents < 1% of all urban out of hospital AMI, and these patients have a high proportion of known ischemic cardiovascular and pulmonary disease "disqualifying for spaceflight." Multiple factors may be expected to delay or defeat rapid access to "chain of survival" resources during LDSM, lowering survival rates below urban levels of 5.2%. Low CAC and FRF reflect lower risk for AMI events. Zero CAC was associated with the lowest risk of AMI after 3.5 yr of follow-up. Quantifiable incidence and outcome characterization suggests AMI in LDSM outcomes will be relatively independent of in-flight medical resources.

  17. Long-Term Clinical and Histological Effects of a Bipolar Fractional Radiofrequency System in the Treatment of Facial Atrophic Acne Scars and Acne Vulgaris in Japanese Patients: A Series of Eight Cases.

    PubMed

    Kaminaka, Chikako; Furukawa, Fukumi; Yamamoto, Yuki

    2016-12-01

    This retrospective case series was designed to compare the long-term safety and efficacy of bipolar fractional radiofrequency (FRF) therapy as a treatment for atrophic acne scars (ASs) and acne vulgaris. Few clinical and histological studies have examined the long-term utility of bipolar FRF therapy as a treatment for ASs and acne in people with darker skin. Eight Japanese patients with ASs and mild-to-severe acne on both cheeks were treated with a bipolar FRF system (eMatrix; Syneron). Five treatment sessions with the same settings (coverage rate: 10%; peak energy: 62 mJ/pin; two passes) were carried out at 1-month intervals, and the patients were followed up for at least 1 year after the final treatment. Assessments of ASs and acne severity were performed and samples were removed for histological examination. We demonstrated that mild ASs responded better than moderate and severe ASs, and at least 50% improvement in scar severity was seen in 50% of patients after the final treatment. Six patients remained disease free at 1.5 years without the use of any additional therapies. The biopsy specimens showed a marked improvement characterized by a decrease in dermal pilosebaceous units and perivascular inflammatory cell infiltrates with an increase in elastin content and collagen deposition in the upper dermis. Bipolar FRF treatment showed long-term effectiveness against mild ASs and acne in Asian patients and had minimal side effects.

  18. Comparison of FRF measurements and mode shapes determined using optically image based, laser, and accelerometer measurements

    NASA Astrophysics Data System (ADS)

    Warren, Christopher; Niezrecki, Christopher; Avitabile, Peter; Pingle, Pawan

    2011-08-01

    Today, accelerometers and laser Doppler vibrometers are widely accepted as valid measurement tools for structural dynamic measurements. However, limitations of these transducers prevent the accurate measurement of some phenomena. For example, accelerometers typically measure motion at a limited number of discrete points and can mass load a structure. Scanning laser vibrometers have a very wide frequency range and can measure many points without mass-loading, but are sensitive to large displacements and can have lengthy acquisition times due to sequential measurements. Image-based stereo-photogrammetry techniques provide additional measurement capabilities that complement the current array of measurement systems by providing an alternative that favors high-displacement and low-frequency vibrations typically difficult to measure with accelerometers and laser vibrometers. Within this paper, digital image correlation, three-dimensional (3D) point-tracking, 3D laser vibrometry, and accelerometer measurements are all used to measure the dynamics of a structure to compare each of the techniques. Each approach has its benefits and drawbacks, so comparative measurements are made using these approaches to show some of the strengths and weaknesses of each technique. Additionally, the displacements determined using 3D point-tracking are used to calculate frequency response functions, from which mode shapes are extracted. The image-based frequency response functions (FRFs) are compared to those obtained by collocated accelerometers. Extracted mode shapes are then compared to those of a previously validated finite element model (FEM) of the test structure and are shown to have excellent agreement between the FEM and the conventional measurement approaches when compared using the Modal Assurance Criterion (MAC) and Pseudo-Orthogonality Check (POC).
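
    The Modal Assurance Criterion used in the comparison above is straightforward to compute; a minimal sketch (Python/NumPy, mode shapes stored column-wise) is given below for reference. The array names are placeholders, not quantities from the paper.

      import numpy as np

      def mac(phi_a, phi_b):
          """
          Modal Assurance Criterion between two mode-shape sets (columns are modes).
          Values close to 1 indicate well-correlated shapes, close to 0 uncorrelated ones.
          """
          num = np.abs(phi_a.conj().T @ phi_b) ** 2
          den = np.outer(np.sum(np.abs(phi_a) ** 2, axis=0),
                         np.sum(np.abs(phi_b) ** 2, axis=0))
          return num / den

      # Usage sketch: mac(phi_camera, phi_fem)[i, j] near 1 means image-based mode i
      # matches FEM mode j, which is how the cross-comparison in the record is scored.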

  19. FRF-based structural damage detection of controlled buildings with podium structures: Experimental investigation

    NASA Astrophysics Data System (ADS)

    Xu, Y. L.; Huang, Q.; Zhan, S.; Su, Z. Q.; Liu, H. J.

    2014-06-01

    How to use control devices to enhance system identification and damage detection in relation to a structure that requires both vibration control and structural health monitoring is an interesting yet practical topic. In this study, the possibility of using the added stiffness provided by control devices and frequency response functions (FRFs) to detect damage in a building complex was explored experimentally. Scale models of a 12-storey main building and a 3-storey podium structure were built to represent a building complex. Given that the connection between the main building and the podium structure is most susceptible to damage, damage to the building complex was experimentally simulated by changing the connection stiffness. To simulate the added stiffness provided by a semi-active friction damper, a steel circular ring was designed and used to add the related stiffness to the building complex. By varying the connection stiffness, by using an eccentric wheel excitation system, and by adding or not adding the circular ring, eight cases were investigated and eight sets of FRFs were measured. The experimental results were used to detect damage (changes in connection stiffness) using a recently proposed FRF-based damage detection method. The experimental results showed that the FRF-based damage detection method could satisfactorily locate and quantify damage.

  20. A data driven model for dune morphodynamics

    NASA Astrophysics Data System (ADS)

    Palmsten, M.; Brodie, K.; Spore, N.

    2016-12-01

    Dune morphology results from a number of competing feedbacks between wave, aeolian, and biologic processes. Only now are conceptual and numerical models for dunes beginning to incorporate all aspects of the processes driving morphodynamics. Drawing on a 35-year record of observations of dune morphology and forcing conditions at the Army Corps of Engineers Field Research Facility (FRF) at Duck, NC, USA, we hypothesize that local dune morphology results from the competition between dune growth during dry windy periods and erosion during storms. We test our hypothesis by developing a data driven model using a Bayesian network to hindcast dune-crest elevation change, dune position change, and shoreline position change. Model inputs include a description of dune morphology from dune-crest elevation, dune-base elevation, dune width, and beach width. Wave forcing and the effect of moisture are parameterized in terms of the maximum total water level, the period during which waves impact the dunes, and precipitation. Aeolian forcing is parameterized in terms of the maximum wind speed, wind direction, and the period during which the wind exceeds a critical value for sediment transport. We test the sensitivity of our model to the forcing parameters and hindcast the 35-year record of dune morphodynamics at the FRF. We also discuss the role of vegetation on dune morphologic differences observed at the FRF.

  1. GRC-2010-C-01237

    NASA Image and Video Library

    2006-03-29

    Fiber-Reinforced-Foam (FRF) Core Composite Sandwich Panel Concept for Advanced Composites Technologies Project - Preliminary Manufacturing Demonstration Articles for Ares V Payload Shroud Barrel Acreage Structure

  2. GRC-2010-C-01234

    NASA Image and Video Library

    2006-03-29

    Fiber-Reinforced-Foam (FRF) Core Composite Sandwich Panel Concept for Advanced Composites Technologies Project - Preliminary Manufacturing Demonstration Articles for Ares V Payload Shroud Barrel Acreage Structure

  3. GRC-2010-C-01233

    NASA Image and Video Library

    2006-03-29

    Fiber-Reinforced-Foam (FRF) Core Composite Sandwich Panel Concept for Advanced Composites Technologies Project - Preliminary Manufacturing Demonstration Articles for Ares V Payload Shroud Barrel Acreage Structure

  4. Hydro and morphodynamic simulations for probabilistic estimates of munitions mobility

    NASA Astrophysics Data System (ADS)

    Palmsten, M.; Penko, A.

    2017-12-01

    Probabilistic estimates of waves, currents, and sediment transport at underwater munitions remediation sites are necessary to constrain probabilistic predictions of munitions exposure, burial, and migration. To address this need, we produced ensemble simulations of hydrodynamic flow and morphologic change with Delft3D, a coupled system of wave, circulation, and sediment transport models. We have set up the Delft3D model simulations at the Army Corps of Engineers Field Research Facility (FRF) in Duck, NC, USA. The FRF is the prototype site for the near-field munitions mobility model, which integrates far-field and near-field munitions mobility simulations. An extensive array of in-situ and remotely sensed oceanographic, bathymetric, and meteorological data are available at the FRF, as well as existing observations of munitions mobility for model testing. Here, we present results of ensemble Delft3D hydro- and morphodynamic simulations at Duck. A nested Delft3D simulation runs an outer grid that extends 12-km in the along-shore and 3.7-km in the cross-shore with 50-m resolution and a maximum depth of approximately 17-m. The inner nested grid extends 3.2-km in the along-shore and 1.2-km in the cross-shore with 5-m resolution and a maximum depth of approximately 11-m. The inner nested grid initial model bathymetry is defined as the most recent survey or remotely sensed estimate of water depth. Delft3D-WAVE and Delft3D-FLOW are driven with spectral wave measurements from a Waverider buoy in 17-m depth located on the offshore boundary of the outer grid. The spectral wave output and the water levels from the outer grid are used to define the boundary conditions for the inner nested high-resolution grid, in which the coupled Delft3D WAVE-FLOW-MORPHOLOGY model is run. The ensemble results are compared to the wave, current, and bathymetry observations collected at the FRF.

  5. Experimental determination of frequency response function estimates for flexible joint industrial manipulators with serial kinematics

    NASA Astrophysics Data System (ADS)

    Saupe, Florian; Knoblach, Andreas

    2015-02-01

    Two different approaches for the determination of frequency response functions (FRFs) are used for the non-parametric closed loop identification of a flexible joint industrial manipulator with serial kinematics. The two applied experiment designs are based on low power multisine and high power chirp excitations. The main challenge is to eliminate disturbances of the FRF estimates caused by the numerous nonlinearities of the robot. For the experiment design based on chirp excitations, a simple iterative procedure is proposed which allows exploiting the good crest factor of chirp signals in a closed loop setup. An interesting synergy of the two approaches, beyond validation purposes, is pointed out.
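
    The crest-factor argument for swept-sine excitation mentioned above can be illustrated numerically: for equal RMS level, a chirp peaks at only about 1.4 times its RMS value, whereas a random-phase multisine typically peaks considerably higher unless its phases are optimised. The sketch below (Python/NumPy) uses arbitrary band and duration settings purely for illustration and is not taken from the paper.

      import numpy as np

      fs, T = 2048, 4.0                      # sample rate (Hz) and record length (s); arbitrary
      t = np.arange(0, T, 1 / fs)
      f0, f1 = 1.0, 100.0                    # excited band (Hz); arbitrary

      # Linear chirp (swept sine) over the band
      k = (f1 - f0) / T
      chirp = np.sin(2 * np.pi * (f0 * t + 0.5 * k * t**2))

      # Multisine over the same band with random (non-optimised) phases
      lines = np.arange(int(f0 * T), int(f1 * T) + 1)     # excited DFT lines
      rng = np.random.default_rng(0)
      phases = rng.uniform(0, 2 * np.pi, lines.size)
      multisine = np.sin(2 * np.pi * np.outer(lines / T, t) + phases[:, None]).sum(axis=0)

      crest = lambda x: np.max(np.abs(x)) / np.sqrt(np.mean(x**2))
      print("chirp crest factor    :", round(crest(chirp), 2))      # ~1.4
      print("multisine crest factor:", round(crest(multisine), 2))  # typically 3-4 without phase optimisation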

  6. Modal analysis using a Fourier analyzer, curve-fitting, and modal tuning

    NASA Technical Reports Server (NTRS)

    Craig, R. R., Jr.; Chung, Y. T.

    1981-01-01

    The proposed modal test program differs from single-input methods in that preliminary data may be acquired using multiple inputs, and modal tuning procedures may be employed to define closely spaced frequency modes more accurately or to make use of frequency response functions (FRF's) which are based on several input locations. In some respects the proposed modal test program resembles earlier sine-sweep and sine-dwell testing in that broadband FRF's are acquired using several input locations, and tuning is employed to refine the modal parameter estimates. The major tasks performed in the proposed modal test program are outlined. Data acquisition and FFT processing, curve fitting, and modal tuning phases are described and examples are given to illustrate and evaluate them.

  7. Pre-impact lower extremity posture and brake pedal force predict foot and ankle forces during an automobile collision.

    PubMed

    Hardin, E C; Su, A; van den Bogert, A J

    2004-12-01

    The purpose of this study was to determine how a driver's foot and ankle forces during a frontal vehicle collision depend on initial lower extremity posture and brake pedal force. A 2D musculoskeletal model with seven segments and six right-side muscle groups was used. A simulation of a three-second braking task found 3647 sets of muscle activation levels that resulted in stable braking postures with realistic pedal force. These activation patterns were then used in impact simulations where vehicle deceleration was applied and driver movements and foot and ankle forces were simulated. Peak rearfoot ground reaction force (F(RF)), peak Achilles tendon force (F(AT)), peak calcaneal force (F(CF)) and peak ankle joint force (F(AJ)) were calculated. Peak forces during the impact simulation were 476 ± 687 N (F(RF)), 2934 ± 944 N (F(CF)) and 2449 ± 918 N (F(AJ)). Many simulations resulted in force levels that could cause fractures. Multivariate quadratic regression determined that the pre-impact brake pedal force (PF), knee angle (KA) and heel distance (HD) explained 72% of the variance in peak F(RF), 62% in peak F(CF) and 73% in peak F(AJ). Foot and ankle forces during a collision depend on initial posture and pedal force. Braking postures with increased knee flexion, while keeping the seat position fixed, are associated with higher foot and ankle forces during a collision.

  8. Tire-road friction coefficient estimation based on the resonance frequency of in-wheel motor drive system

    NASA Astrophysics Data System (ADS)

    Chen, Long; Bian, Mingyuan; Luo, Yugong; Qin, Zhaobo; Li, Keqiang

    2016-01-01

    In this paper, a resonance frequency-based tire-road friction coefficient (TRFC) estimation method is proposed by considering the dynamic performance of the in-wheel motor drive system under small slip ratio conditions. A frequency response function (FRF) is deduced for the drive system, which is composed of a dynamic tire model and a simplified motor model. A linear relationship between the squared system resonance frequency and the TRFC is derived from the FRF. Furthermore, the resonance frequency is identified by an Auto-Regressive eXogenous (ARX) model using the information of the motor torque and the wheel speed, and the TRFC is estimated thereafter by a recursive least squares filter with the identified resonance frequency. Finally, the effectiveness of the proposed approach is demonstrated through simulations and experimental tests on different road surfaces.
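
    The final estimation step described above can be illustrated with a scalar recursive least squares (RLS) filter. The sketch below (Python/NumPy) assumes the record's linear relation between the squared resonance frequency and the friction coefficient, with made-up coefficients and a synthetic measurement stream; the ARX identification of the resonance frequency from motor torque and wheel speed is not reproduced.

      import numpy as np

      # Assumed linear relation from the record: omega_res^2 ≈ c0 + c1 * mu.
      # The coefficients and noise level below are hypothetical, for illustration only.
      c0, c1 = 900.0, 2500.0          # hypothetical model coefficients
      lam = 0.98                      # forgetting factor
      theta, P = 0.5, 1e3             # initial mu estimate and covariance

      rng = np.random.default_rng(1)
      true_mu = 0.8
      for _ in range(200):
          omega_sq = c0 + c1 * true_mu + rng.normal(0, 30.0)   # noisy identified resonance freq^2
          y, phi = omega_sq - c0, c1                           # scalar regression y = phi * theta
          K = P * phi / (lam + phi * P * phi)                  # RLS gain
          theta = theta + K * (y - phi * theta)                # parameter update
          P = (P - K * phi * P) / lam                          # covariance update

      print(f"estimated mu ≈ {theta:.3f} (true {true_mu})")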

  9. Buckling Design and Analysis of a Payload Fairing One-Sixth Cylindrical Arc-Segment Panel

    NASA Technical Reports Server (NTRS)

    Kosareo, Daniel N.; Oliver, Stanley T.; Bednarcyk, Brett A.

    2013-01-01

    Design and analysis results are reported for a panel that is a one-sixth arc-segment of a full 33-ft diameter cylindrical barrel section of a payload fairing structure. Six such panels could be used to construct the fairing barrel, and, as such, compression buckling testing of a one-sixth arc-segment panel would serve as a validation test of the buckling analyses used to design the fairing panels. In this report, linear and nonlinear buckling analyses have been performed using finite element software for one-sixth arc-segment panels composed of aluminum honeycomb core with graphite-epoxy composite facesheets and an alternative fiber reinforced foam (FRF) composite sandwich design. The cross sections of both concepts were sized to represent realistic Space Launch System (SLS) payload fairing panels. Based on shell-based linear buckling analyses, smaller, more manageable buckling test panel dimensions were determined such that the panel would still be expected to buckle with a circumferential (as opposed to column-like) mode with significant separation between the first and second buckling modes. More detailed nonlinear buckling analyses were then conducted for honeycomb panels of various sizes using both Abaqus and ANSYS finite element codes, and for the smaller size panel, a solid-based finite element analysis was conducted. Finally, for the smaller size FRF panel, nonlinear buckling analysis was performed wherein geometric imperfections measured from an actual manufactured FRF panel were included. It was found that the measured imperfection did not significantly affect the panel's predicted buckling response.

  10. Study on the Effect of the Impact Location and the Type of Hammer Tip on the Frequency Response Function (FRF) in Experimental Modal Analysis of Rectangular Plates

    NASA Astrophysics Data System (ADS)

    Mali, K. D.; Singru, P. M.

    2018-03-01

    In this work, the effect of the impact location and the type of hammer tip on the frequency response function (FRF) is studied. Experimental modal analysis of rectangular plates is carried out for this purpose by using an impact hammer, an accelerometer and a fast Fourier transform (FFT) analyzer. It is observed that the impulse hammer hit location has no effect on the eigenfrequencies, yet a difference in the amplitude at the eigenfrequencies is obtained. The effect of the hammer tip on the pulse and the force spectrum is studied for three types of tips: metal, plastic and rubber. A solid rectangular plate was excited by using these tips one by one in three different tests. It is observed that, for the present experimental set-up, the plastic tip excites the useful frequency range.
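
    For readers reproducing such hammer tests, the FRFs themselves are typically formed with an H1 estimator from the measured force and acceleration records. A minimal sketch (Python/SciPy) is shown below; triggering, force/exponential windowing, and averaging over repeated hits, which a real impact test requires, are omitted, and the signal names are placeholders.

      import numpy as np
      from scipy.signal import csd, welch

      def h1_frf(force, accel, fs, nperseg=4096):
          """H1 FRF estimate: H1(f) = S_fa(f) / S_ff(f), with the hammer force as input."""
          f, S_fa = csd(force, accel, fs=fs, nperseg=nperseg)   # cross-spectrum force -> response
          _, S_ff = welch(force, fs=fs, nperseg=nperseg)        # force auto-spectrum
          return f, S_fa / S_ff

      # Usage sketch: f, H = h1_frf(hammer_force, plate_accel, fs=25600)
      # The magnitude of H over the band of interest shows how far each hammer tip
      # usefully excites the structure, which is the comparison made in the record.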

  11. Issues concerning the updating of finite-element models from experimental data

    NASA Technical Reports Server (NTRS)

    Dunn, Shane A.

    1994-01-01

    Some issues concerning the updating of dynamic finite-element models by incorporation of experimental data are examined here. It is demonstrated how the number of unknowns can be greatly reduced if the physical nature of the model is maintained. The issue of uniqueness is also examined and it is shown that a number of previous workers have been mistaken in their attempts to define both sufficient and necessary measurement requirements for the updating problem to be solved uniquely. The relative merits of modal and frequency response function (frf) data are discussed and it is shown that for measurements at fewer degrees of freedom than are present in the model, frf data will be unlikely to converge easily to a solution. It is then examined how such problems may become more tractable by using new experimental techniques which would allow measurements at all degrees of freedom present in the mathematical model.

  12. Control of a flexible link by shaping the closed loop frequency response function through optimised feedback filters

    NASA Astrophysics Data System (ADS)

    Del Vescovo, D.; D'Ambrogio, W.

    1995-01-01

    A frequency domain method is presented to design a closed-loop control for vibration reduction in flexible mechanisms. The procedure is developed on a single-link flexible arm, driven by a servomotor with one rotary degree of freedom, although the same technique may be applied to similar systems such as supports for aerospace antennae or solar panels. The method uses the structural frequency response functions (FRFs), thus avoiding system identification, which introduces modeling uncertainties. Two closed loops are implemented: the inner loop uses acceleration feedback with the aim of making the FRF similar to that of an equivalent rigid link; the outer loop feeds back displacements to achieve a fast positioning response and zero steady-state error. In both cases, the controller type is established a priori, while the actual characteristics are defined by an optimisation procedure in which the relevant FRF is constrained within prescribed bounds and stability is taken into account.

  13. Full-field modal analysis during base motion excitation using high-speed 3D digital image correlation

    NASA Astrophysics Data System (ADS)

    Molina-Viedma, Ángel J.; López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A.

    2017-10-01

    In recent years, many efforts have been made to exploit full-field measurement optical techniques for modal identification. Three-dimensional digital image correlation using high-speed cameras has been extensively employed for this purpose. Modal identification algorithms are applied to process the frequency response functions (FRF), which relate the displacement response of the structure to the excitation force. However, one of the most common tests for modal analysis involves the base motion excitation of a structural element instead of force excitation. In this case, the relationship between response and excitation is typically based on displacements, which are known as transmissibility functions. In this study, a methodology for experimental modal analysis using high-speed 3D digital image correlation and base motion excitation tests is proposed. In particular, a cantilever beam was excited from its base with a random signal, using a clamped edge join. Full-field transmissibility functions were obtained through the beam and converted into FRF for proper identification, considering a single degree-of-freedom theoretical conversion. Subsequently, modal identification was performed using a circle-fit approach. The proposed methodology facilitates the management of the typically large amounts of data points involved in the DIC measurement during modal identification. Moreover, it was possible to determine the natural frequencies, damping ratios and full-field mode shapes without requiring any additional tests. Finally, the results were experimentally validated by comparing them with those obtained by employing traditional accelerometers, analytical models and finite element method analyses. The comparison was performed by using the quantitative indicator modal assurance criterion. The results showed a high level of correspondence, consolidating the proposed experimental methodology.

  14. Aerobraking Maneuver (ABM) Report Generator

    NASA Technical Reports Server (NTRS)

    Fisher, Forrest; Gladden, Roy; Khanampornpan, Teerapat

    2008-01-01

    abmREPORT Version 3.1 is a Perl script that extracts vital summarization information from the Mars Reconnaissance Orbiter (MRO) aerobraking ABM build process. This information facilitates sequence reviews, and provides a high-level summarization of the sequence for mission management. The script extracts information from the ENV, SSF, FRF, SCMFmax, and OPTG files and burn magnitude configuration files and presents them in a single, easy-to-check report that provides the majority of the parameters necessary for cross check and verification during the sequence review process. This means that needed information, formerly spread across a number of different files and each in a different format, is all available in this one application. This program is built on the capabilities developed in dragReport and then the scripts evolved as the two tools continued to be developed in parallel.

  15. Investigation of the relationship between hurricane waves and extreme runup

    NASA Astrophysics Data System (ADS)

    Thompson, D. M.; Stockdon, H. F.

    2006-12-01

    In addition to storm surge, the elevation of wave-induced runup plays a significant role in forcing geomorphic change during extreme storms. Empirical formulations for extreme runup, defined as the 2% exceedence level, are dependent on some measure of significant offshore wave height. Accurate prediction of extreme runup, particularly during hurricanes when wave heights are large, depends on selecting the most appropriate measure of the wave height that provides energy to the nearshore system. Using measurements from deep-water wave buoys results in an overprediction of runup elevation. Under storm forcing these large waves dissipate across the shelf through friction, whitecapping and depth-limited breaking before reaching the beach and forcing swash processes. The use of a local, shallow-water wave height has been shown to provide a more accurate estimate of extreme runup elevation (Stockdon et al., 2006); however, a specific definition of this local wave height has yet to be defined. Using observations of nearshore waves from the U.S. Army Corps of Engineers' Field Research Facility (FRF) in Duck, NC during Hurricane Isabel, the most relevant measure of wave height for use in empirical runup parameterizations was examined. The spatial and temporal variability of the wave field of the hurricane, which made landfall on September 18, 2003, was modeled using SWAN. Comparisons with wave data from FRF gages and deep-water buoys operated by NOAA's National Data Buoy Center were used for model calibration. Various measures of local wave height (breaking, dissipation-based, etc.) were extracted from the model domain and used as input to the runup parameterizations. Video-based observations of runup collected at the FRF during the storm were used to ground truth the modeled values. Assessment of the most appropriate measure of wave height can be extended over a large area through comparisons to observations of storm-induced geomorphic change.
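
    For orientation, the class of empirical runup parameterizations referred to above (Stockdon et al., 2006) has roughly the form sketched below (Python). The coefficients are the commonly cited ones and should be verified against the original paper; the wave height, period, and foreshore slope used here are placeholder values, and the record's central question is precisely which wave height (deep-water versus local, shallow-water) should be fed into such a formulation during hurricanes.

      import numpy as np

      def r2_runup(H, T, beta_f, g=9.81):
          """Sketch of a Stockdon et al. (2006)-style 2% exceedence runup estimate (m)."""
          L = g * T**2 / (2 * np.pi)                             # deep-water wavelength
          setup = 0.35 * beta_f * np.sqrt(H * L)                 # wave-induced setup term
          swash = np.sqrt(H * L * (0.563 * beta_f**2 + 0.004))   # incident + infragravity swash
          return 1.1 * (setup + swash / 2)

      # Placeholder hurricane-scale inputs, for illustration only
      print(f"R2% ≈ {r2_runup(H=6.0, T=12.0, beta_f=0.10):.1f} m")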

  16. The United States of America and scientific research.

    PubMed

    Hather, Gregory J; Haynes, Winston; Higdon, Roger; Kolker, Natali; Stewart, Elizabeth A; Arzberger, Peter; Chain, Patrick; Field, Dawn; Franza, B Robert; Lin, Biaoyang; Meyer, Folker; Ozdemir, Vural; Smith, Charles V; van Belle, Gerald; Wooley, John; Kolker, Eugene

    2010-08-16

    To gauge the current commitment to scientific research in the United States of America (US), we compared federal research funding (FRF) with the US gross domestic product (GDP) and industry research spending during the past six decades. In order to address the recent globalization of scientific research, we also focused on four key indicators of research activities: research and development (R&D) funding, total science and engineering doctoral degrees, patents, and scientific publications. We compared these indicators across three major population and economic regions: the US, the European Union (EU) and the People's Republic of China (China) over the past decade. We discovered a number of interesting trends with direct relevance for science policy. The level of US FRF has varied between 0.2% and 0.6% of the GDP during the last six decades. Since the 1960s, the US FRF contribution has fallen from twice that of industrial research funding to roughly equal. Also, in the last two decades, the portion of the US government R&D spending devoted to research has increased. Although well below the US and the EU in overall funding, the current growth rate for R&D funding in China greatly exceeds that of both. Finally, the EU currently produces more science and engineering doctoral graduates and scientific publications than the US in absolute terms, but not per capita. This study's aim is to facilitate a serious discussion of key questions by the research community and federal policy makers. In particular, our results raise two questions with respect to: a) the increasing globalization of science: "What role is the US playing now, and what role will it play in the future of international science?"; and b) the ability to produce beneficial innovations for society: "How will the US continue to foster its strengths?"

  17. Water consumption, grain yield, and water productivity in response to field water management in double rice systems in China.

    PubMed

    Wu, Xiao Hong; Wang, Wei; Yin, Chun Mei; Hou, Hai Jun; Xie, Ke Jun; Xie, Xiao Li

    2017-01-01

    Rice cultivation has been challenged by increasing food demand and water scarcity. We examined the responses of water use, grain yield, and water productivity to various modes of field water management in Chinese double rice systems. Four treatments were studied in a long-term field experiment (1998-2015): continuous flooding (CF), flooding-midseason drying-flooding (F-D-F), flooding-midseason drying-intermittent irrigation without obvious standing water (F-D-S), and flooding-rain-fed (F-RF). The average precipitation was 483 mm in the early-rice season and 397 mm in the late-rice season. The irrigated water for CF, F-D-F, F-D-S, and F-RF, respectively, was 263, 340, 279, and 170 mm in the early-rice season, and 484, 528, 422, and 206 mm in the late-rice season. Grain yield for CF, F-D-F, F-D-S, and F-RF, respectively, was 4,722, 4,597, 4,479, and 4,232 kg ha⁻¹ in the early-rice season, and 5,420, 5,402, 5,366, and 4,498 kg ha⁻¹ in the late-rice season. Compared with CF, F-D-F consumed more irrigated water yet still produced a slightly lower grain yield, leading to a decrease in water productivity by 25% in the early-rice season and by 8% in the late-rice season. Compared with F-D-F, F-D-S saved much irrigated water with a small yield reduction, leading to an increase in water productivity by 22% in the early-rice season and by 26% in the late-rice season. The results indicate that CF is best for early rice and F-D-S is best for late rice in terms of grain yield and water productivity.

  18. First-principles calculations on elastic, magnetoelastic, and phonon properties of Ni2FeGa magnetic shape memory alloys

    NASA Astrophysics Data System (ADS)

    He, Wangqiang; Huang, Houbing; Liu, Zhuhong; Ma, Xingqiao

    2018-01-01

    Abstract not available. Project supported by the National Natural Science Foundation of China (Grant Nos. 11174030 and 11504020) and the Fundamental Research Funds for the Central Universities of China (Grant Nos. FRF-TP-16-064A1 and 06500031).

  19. Data-Informed Large-Eddy Simulation of Coastal Land-Air-Sea Interactions

    NASA Astrophysics Data System (ADS)

    Calderer, A.; Hao, X.; Fernando, H. J.; Sotiropoulos, F.; Shen, L.

    2016-12-01

    The study of atmospheric flows in coastal areas has not been fully addressed due to the complex processes emerging from the land-air-sea interactions, e.g., abrupt changes in land topography, strong current shear, wave shoaling, and depth-limited wave breaking. The available computational tools that have been applied to study such littoral regions are mostly based on open-ocean assumptions, which often do not lead to reliable solutions. The goal of the present study is to better understand some of these near-shore processes, employing the advanced computational tools developed in our research group. Our computational framework combines a large-eddy simulation (LES) flow solver for atmospheric flows, a sharp-interface immersed boundary method that can deal with real complex topographies (Calderer et al., J. Comp. Physics 2014), and a phase-resolved, depth-dependent wave model (Yang and Shen, J. Comp. Physics 2011). Using real measured data taken at the FRF station in Duck, North Carolina, we validate and demonstrate the predictive capabilities of the present computational framework, which are shown to be in overall good agreement with the measured data under different wind-wave scenarios. We also analyse the effects of some of the complex processes captured by our simulation tools.

  20. Structural and Acoustic Numerical Modeling of a Curved Composite Honeycomb Panel

    NASA Technical Reports Server (NTRS)

    Grosveld, Ferdinand W.; Buehrle, Ralph D.; Robinson, Jay H.

    2001-01-01

    The finite and boundary element modeling of the curved section of a composite honeycomb aircraft fuselage sidewall was validated for both structural response and acoustic radiation. The curved panel was modeled in the pre-processor MSC/PATRAN. Geometry models of the curved panel were constructed based on the physical dimensions of the test article. Material properties were obtained from the panel manufacturer. Finite element models were developed to predict the modal parameters for free and supported panel boundary conditions up to a frequency of 600 Hz. Free boundary conditions were simulated by providing soft foam support under the four corners of the panel or by suspending the panel from elastic bands. Supported boundary conditions were obtained by clamping the panel between plastic tubing seated in grooves along the perimeter of a stiff and heavy frame. The frame was installed in the transmission loss window of the Structural Acoustic Loads and Transmission (SALT) facility at NASA Langley Research Center. The structural response of the curved panel due to point force excitation was predicted using MSC/NASTRAN and the radiated sound was computed with COMET/Acoustics. The predictions were compared with the results from experimental modal surveys and forced response tests on the fuselage panel. The finite element models were refined and updated to provide optimum comparison with the measured modal data. Excellent agreement was obtained between the numerical and experimental modal data for the free as well as for the supported boundary conditions. Frequency response functions (FRF) were computed relating the input force excitation at one panel location to the surface acceleration response at five panel locations. Frequency response functions were measured at the same locations on the test specimen and were compared with the calculated FRF values. Good agreement was obtained for the real and imaginary parts of the transfer functions when modal participation was allowed up to 3000 Hz. The validated finite element model was used to predict the surface velocities due to the point force excitation. Good agreement was obtained between the spatial characteristics of the predicted and measured surface velocities. The measured velocity data were input into the acoustic boundary element code to compute the sound radiated by the panel. The predicted sound pressure levels in the far-field of the panel agreed well with the sound pressure levels measured at the same location.

  1. Experimental estimation of transmissibility matrices for industrial multi-axis vibration isolation systems

    NASA Astrophysics Data System (ADS)

    Beijen, Michiel A.; Voorhoeve, Robbert; Heertjes, Marcel F.; Oomen, Tom

    2018-07-01

    Vibration isolation is essential for industrial high-precision systems to suppress external disturbances. The aim of this paper is to develop a general identification approach to estimate the frequency response function (FRF) of the transmissibility matrix, which is a key performance indicator for vibration isolation systems. The major challenge lies in obtaining a good signal-to-noise ratio in view of a large system weight. A non-parametric system identification method is proposed that combines floor and shaker excitations. Furthermore, a method is presented to analyze the input power spectrum of the floor excitations, both in terms of magnitude and direction. In turn, the input design of the shaker excitation signals is investigated to obtain sufficient excitation power in all directions with minimum experiment cost. The proposed methods are shown to provide an accurate FRF of the transmissibility matrix in three relevant directions on an industrial active vibration isolation system over a large frequency range. This demonstrates that, despite their heavy weight, industrial vibration isolation systems can be accurately identified using this approach.

  2. Dynamics of a passive micro-vibration isolator based on a pretensioned plane cable net structure and fluid damper

    NASA Astrophysics Data System (ADS)

    Chen, Yanhao; Lu, Qi; Jing, Bo; Zhang, Zhiyi

    2016-09-01

    This paper addresses dynamic modelling and experiments on a passive vibration isolator for application in the space environment. The isolator is composed of a pretensioned plane cable net structure and a fluid damper in parallel. Firstly, the frequency response function (FRF) of a single cable is analysed according to taut-string theory, and the FRF synthesis method is adopted to establish a dynamic model of the plane cable net structure. Secondly, the equivalent damping coefficient of the fluid damper is analysed. Thirdly, experiments are carried out to compare the plane cable net structure, the fluid damper and the vibration isolator formed by the net and the damper, respectively. It is shown that the plane cable net structure can achieve substantial vibration attenuation but exhibits a large amplification at its resonance frequency due to the light damping of the cables. The damping effect of the fluid damper is acceptable if its poor load-carrying capacity is left out of consideration. Compared to the plane cable net structure and the fluid damper alone, the isolator has an acceptable resonance amplification as well as good vibration attenuation.

  3. Identifying Preserved Storm Events on Beaches from Trenches and Cores

    NASA Astrophysics Data System (ADS)

    Wadman, H. M.; Gallagher, E. L.; McNinch, J.; Reniers, A.; Koktas, M.

    2014-12-01

    Recent research suggests that even small scale variations in grain size in the shallow stratigraphy of sandy beaches can significantly influence large-scale morphology change. However, few quantitative studies of variations in shallow stratigraphic layers, as differentiated by variations in mean grain size, have been conducted, in no small part due to the difficulty of collecting undisturbed sediment cores in the energetic lower beach and swash zone. Due to this lack of quantitative stratigraphic grain size data, most coastal morphology models assume that uniform grain sizes dominate sandy beaches, allowing for little to no temporal or spatial variations in grain size heterogeneity. In a first-order attempt to quantify small-scale, temporal and spatial variations in beach stratigraphy, thirty-five vibracores were collected at the USACE Field Research Facility (FRF), Duck, NC, in March-April of 2014 using the FRF's Coastal Research and Amphibious Buggy (CRAB). Vibracores were collected at set locations along a cross-shore profile from the toe of the dune to a water depth of ~1m in the surf zone. Vibracores were repeatedly collected from the same locations throughout a tidal cycle, as well as pre- and post a nor'easter event. In addition, two ~1.5m deep trenches were dug in the cross-shore and along-shore directions (each ~14m in length) after coring was completed to allow better interpretation of the stratigraphic sequences observed in the vibracores. The elevations of coherent stratigraphic layers, as revealed in vibracore-based fence diagrams and trench data, are used to relate specific observed stratigraphic sequences to individual storm events observed at the FRF. These data provide a first-order, quantitative examination of the small-scale temporal and spatial variability of shallow grain size along an open, sandy coastline. The data will be used to refine morphological model predictions to include variations in grain size and associated shallow stratigraphy.

  4. The United States of America and Scientific Research

    PubMed Central

    Hather, Gregory J.; Haynes, Winston; Higdon, Roger; Kolker, Natali; Stewart, Elizabeth A.; Arzberger, Peter; Chain, Patrick; Field, Dawn; Franza, B. Robert; Lin, Biaoyang; Meyer, Folker; Ozdemir, Vural; Smith, Charles V.; van Belle, Gerald; Wooley, John; Kolker, Eugene

    2010-01-01

    To gauge the current commitment to scientific research in the United States of America (US), we compared federal research funding (FRF) with the US gross domestic product (GDP) and industry research spending during the past six decades. In order to address the recent globalization of scientific research, we also focused on four key indicators of research activities: research and development (R&D) funding, total science and engineering doctoral degrees, patents, and scientific publications. We compared these indicators across three major population and economic regions: the US, the European Union (EU) and the People's Republic of China (China) over the past decade. We discovered a number of interesting trends with direct relevance for science policy. The level of US FRF has varied between 0.2% and 0.6% of the GDP during the last six decades. Since the 1960s, the US FRF contribution has fallen from twice that of industrial research funding to roughly equal. Also, in the last two decades, the portion of the US government R&D spending devoted to research has increased. Although well below the US and the EU in overall funding, the current growth rate for R&D funding in China greatly exceeds that of both. Finally, the EU currently produces more science and engineering doctoral graduates and scientific publications than the US in absolute terms, but not per capita. This study's aim is to facilitate a serious discussion of key questions by the research community and federal policy makers. In particular, our results raise two questions with respect to: a) the increasing globalization of science: “What role is the US playing now, and what role will it play in the future of international science?”; and b) the ability to produce beneficial innovations for society: “How will the US continue to foster its strengths?” PMID:20808949

  5. Analysis of swept-sine runs during modal identification

    NASA Astrophysics Data System (ADS)

    Gloth, G.; Sinapius, M.

    2004-11-01

    Experimental modal analysis of large aerospace structures in Europe nowadays combines the benefits of the very reliable but time-consuming phase resonance method with phase separation techniques that evaluate frequency response functions (FRFs). FRFs of a test structure can be determined by a variety of means. Applied excitation signal waveforms include harmonic signals like stepped-sine excitation, periodic signals like multi-sine excitation, transient signals like impulse and swept-sine excitation, and stochastic signals like random excitation. The current article focuses on slow swept-sine excitation, which is a good trade-off between the excitation level needed for large aircraft and the testing time. However, recent ground vibration tests (GVTs) showed that reliable modal data from swept-sine test runs depend on proper data processing. The article elucidates the strategy of modal analysis based on swept-sine excitation. The standards for the application of slowly swept sinusoids defined by the International Organization for Standardization in ISO 7626 part 2 are critically reviewed. The theoretical background of swept-sine testing is expounded with particular emphasis on the transition through structural resonances. The effects of different standard data processing procedures, such as tracking filters, the fast Fourier transform (FFT), and data reduction via averaging, are investigated with respect to their influence on the FRFs and modal parameters. Particular emphasis is given to FRF distortions evoked by unsuitable data processing. All data processing methods are investigated on a numerical example. Their practical usefulness is demonstrated on test data taken from a recent GVT on a large aircraft. A revision of ISO 7626 part 2 is suggested regarding the application of slow swept-sine excitation. Recommendations on proper FRF estimation from slow swept-sine excitation are given in order to optimise these applications for future modal survey tests of large aerospace structures.

  6. High versus moderate energy use of bipolar fractional radiofrequency in the treatment of acne scars: a split-face double-blinded randomized control trial pilot study.

    PubMed

    Phothong, Weeranut; Wanitphakdeedecha, Rungsima; Sathaworawong, Angkana; Manuskiatti, Woraphong

    2016-02-01

    The bipolar fractional radiofrequency (FRF) device was first FDA-approved for treating atrophic acne scars in 2008, acting through dermal coagulation and minimal epidermal ablation. An average energy of 60 mJ/pin has been widely used to treat atrophic acne scars. However, the higher the delivered energy, the deeper the ablation and coagulation. A new generation of bipolar FRF device with an electrode-pin tip has been developed that can deliver energy up to 100 mJ/pin. The objective of the study was to explore and compare the efficacy of high energy (100 mJ/pin) and moderate energy (60 mJ/pin) bipolar fractional radiofrequency in the treatment of atrophic acne scars in Asians. This is a split-face, double-blinded, randomized controlled pilot study using a parallel-group design. Thirty healthy subjects with Fitzpatrick skin phototypes III-IV diagnosed with atrophic acne scars were enrolled. All subjects received four monthly sessions of bipolar FRF treatment. The left and right facial sides of each patient were randomly assigned to different energies (high energy at 100 mJ/pin versus moderate energy at 60 mJ/pin). Acne scar improvement was graded in a blinded fashion by a dermatologist using the global acne scarring score (GASS), evaluated at baseline and at the 1-, 3-, and 6-month follow-ups. Objective scar analysis was also performed using a UVA-light video camera to measure scar volume, skin smoothness, and wrinkles at baseline and at the 3- and 6-month follow-ups after the last treatment. Side effects including pain, erythema, swelling, and crusting were also recorded. Thirty subjects completed the study with the full four-treatment course. The mean GASS of the high-energy and moderate-energy sides was significantly reduced at the 1-, 3-, and 6-month follow-up visits. At the 1-month follow-up visit, the high-energy side demonstrated significant improvement compared with the moderate-energy side (p = 0.03). Postinflammatory hyperpigmentation (PIH) developed in 21/120 sessions on the high-energy side (17.5%) and 16/120 sessions on the moderate-energy side (13.3%). Pain scores and the duration of erythema after treatments were significantly higher on the side treated with high energy. The bipolar FRF device was safe and effective in the treatment of atrophic acne scars in Asians. The high-energy setting demonstrated significantly higher efficacy at the 1-month follow-up visit. However, the efficacy of both energy settings was comparable at the 3- and 6-month follow-ups. In addition, side effects were significantly more intense on the side treated with high energy.

  7. Detection of Metallic and Electronic Radar Targets by Acoustic Modulation of Electromagnetic Waves

    DTIC Science & Technology

    2017-07-01

    …the reradiated wave is captured by the radar's receive antenna (Rx). The presence of measurable EM energy at any discrete multiple of the audio frequency f_audio away from the original RF carrier f_RF (i.e., at any n…

  8. Feather retention force in broiler carcasses slaughtered and held up to 8 hours postmortem prior to scalding

    USDA-ARS?s Scientific Manuscript database

    One factor that could impact the feasibility of commercial on-farm slaughter of broilers is the time delay from on-farm slaughter to scalding and defeathering in the commercial plant that could be 4 h or more. This experiment evaluated feather retention force (FRF) in broilers that were slaughtered ...

  9. Demonstration and Validation of Materials for Corrosion-Resistant Fencing and Guard Railings in Aggressive Climates

    DTIC Science & Technology

    2015-10-01

    [Report front-matter excerpt: coupon mass-loss table headers (constant exposure period, area, pretest and post-test weight, mass loss, density) for exposure at Treat Island, ME; contents entries for coupon monitoring and post-exposure lab testing; figure captions for test sections installed at the FRF and for wire coupons on an ASTM G7 rack.]

  10. The Validity of Multiple Choice Practical Examinations as an Alternative to Traditional Free Response Examination Formats in Gross Anatomy

    ERIC Educational Resources Information Center

    Shaibah, Hassan Sami; van der Vleuten, Cees P. M.

    2013-01-01

    Traditionally, an anatomy practical examination is conducted using a free response format (FRF). However, this format is resource-intensive, as it requires a relatively large time investment from anatomy course faculty in preparation and grading. Thus, several interventions have been reported where the response format was changed to a selected…

  11. Full-degrees-of-freedom frequency based substructuring

    NASA Astrophysics Data System (ADS)

    Drozg, Armin; Čepon, Gregor; Boltežar, Miha

    2018-01-01

    Dividing the whole system into multiple subsystems and a separate dynamic analysis is common practice in the field of structural dynamics. The substructuring process improves the computational efficiency and enables an effective realization of the local optimization, modal updating and sensitivity analyses. This paper focuses on frequency-based substructuring methods using experimentally obtained data. An efficient substructuring process has already been demonstrated using numerically obtained frequency-response functions (FRFs). However, the experimental process suffers from several difficulties, among which, many of them are related to the rotational degrees of freedom. Thus, several attempts have been made to measure, expand or combine numerical correction methods in order to obtain a complete response model. The proposed methods have numerous limitations and are not yet generally applicable. Therefore, in this paper an alternative approach based on experimentally obtained data only, is proposed. The force-excited part of the FRF matrix is measured with piezoelectric translational and rotational direct accelerometers. The incomplete moment-excited part of the FRF matrix is expanded, based on the modal model. The proposed procedure is integrated in a Lagrange Multiplier Frequency Based Substructuring method and demonstrated on a simple beam structure, where the connection coordinates are mainly associated with the rotational degrees of freedom.
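
    The coupling step of the Lagrange Multiplier Frequency Based Substructuring method into which the proposed measurement and expansion procedure is integrated can be written in one line per frequency. The sketch below (Python/NumPy) shows that standard LM-FBS step with a two-substructure toy example; it illustrates the coupling equation only, not the paper's expansion of the moment-excited FRFs, and all numerical values are made up.

      import numpy as np

      def lmfbs_couple(Y, B):
          """LM-FBS coupling at one frequency line: Y_c = Y - Y B^T (B Y B^T)^-1 B Y."""
          BYBt = B @ Y @ B.T
          return Y - Y @ B.T @ np.linalg.solve(BYBt, B @ Y)

      # Toy example: two substructures, one interface DOF each, rigidly coupled.
      Ya = np.array([[1 / (100.0 - 4.0**2 + 0.5j)]])   # made-up receptances at one frequency
      Yb = np.array([[1 / (400.0 - 4.0**2 + 0.8j)]])
      Y = np.block([[Ya, np.zeros((1, 1))],
                    [np.zeros((1, 1)), Yb]])            # block-diagonal uncoupled FRF matrix
      B = np.array([[1.0, -1.0]])                       # compatibility: u_a - u_b = 0
      print(lmfbs_couple(Y, B))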

  12. Complex mode indication function and its applications to spatial domain parameter estimation

    NASA Astrophysics Data System (ADS)

    Shih, C. Y.; Tsuei, Y. G.; Allemang, R. J.; Brown, D. L.

    1988-10-01

    This paper introduces the concept of the Complex Mode Indication Function (CMIF) and its application in spatial domain parameter estimation. The concept of CMIF is developed by performing singular value decomposition (SVD) of the Frequency Response Function (FRF) matrix at each spectral line. The CMIF is defined as the eigenvalues, which are the squares of the singular values, of the normal matrix formed from the FRF matrix, [H(jω)]^H [H(jω)], at each spectral line. The CMIF appears to be a simple and efficient method for identifying the modes of a complex system. The CMIF identifies modes by showing the physical magnitude of each mode and the damped natural frequency for each root. Since multiple-reference data are used in the CMIF, repeated roots can be detected. The CMIF also gives global modal parameters, such as damped natural frequencies, mode shapes and modal participation vectors. Since the CMIF works in the spatial domain, unevenly spaced frequency data, such as data from spatial sine testing, can be used. A second-stage procedure for accurate damped natural frequency and damping estimation as well as mode shape scaling is also discussed in this paper.
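
    The CMIF described above reduces to a singular value decomposition of the FRF matrix at each spectral line. A minimal sketch (Python/NumPy) is given below; the FRF array name and its shape convention are assumptions made for illustration.

      import numpy as np

      def cmif(H):
          """
          Complex Mode Indication Function sketch.
          H : FRF array of shape (n_freq, n_outputs, n_inputs).
          Returns, per spectral line, the squared singular values of H(jw), i.e. the
          eigenvalues of H(jw)^H H(jw).  Plotting each column versus frequency on a
          log scale shows peaks at the modes; the secondary curves help reveal
          repeated or closely spaced roots.
          """
          s = np.linalg.svd(H, compute_uv=False)   # singular values, one set per line
          return s**2

      # Usage sketch: curves = cmif(H_measured); plot curves against the frequency axis.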

  13. Identification of characteristic frequencies of damaged railway tracks using field hammer test measurements

    NASA Astrophysics Data System (ADS)

    Oregui, M.; Li, Z.; Dollevoet, R.

    2015-03-01

    In this paper, the feasibility of the Frequency Response Function (FRF)-based statistical method to identify the characteristic frequencies of railway track defects is studied. The method compares a damaged track state to a healthy state based on non-destructive field hammer test measurements. First, a study is carried out to investigate the repeatability of hammer tests in railway tracks. By changing the excitation and measurement locations it is shown that the variability introduced by the test process is negligible. Second, following the concepts of control charts employed in process monitoring, a method to define an approximate healthy state is introduced by using hammer test measurements at locations without visual damage. Then, the feasibility study includes an investigation into squats (i.e. a major type of rail surface defect) of varying severity. The identified frequency ranges related to squats agree with those found in an extensively validated vehicle-borne detection system. Therefore, the FRF-based statistical method in combination with the non-destructive hammer test measurements has the potential to be employed to identify the characteristic frequencies of damaged conditions in railway tracks in the frequency range of 300-3000 Hz.
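
    The abstract does not give the exact statistic used, but a control-chart style comparison of a test FRF against limits built from healthy-state hammer tests could be sketched roughly as below; the log-magnitude feature and the 3-sigma limits are assumptions, not the paper's method.

      import numpy as np

      def flag_characteristic_frequencies(healthy_frfs, test_frf, n_sigma=3.0):
          """Flag frequency lines where a test FRF departs from the healthy state.

          healthy_frfs : (n_measurements, n_freq) complex FRFs from locations
                         without visible damage (approximate healthy state).
          test_frf     : (n_freq,) complex FRF measured at the inspected location.
          Returns a boolean mask of frequency lines outside the control limits.
          """
          mag = np.log10(np.abs(healthy_frfs))              # work on log-magnitude
          mu, sigma = mag.mean(axis=0), mag.std(axis=0, ddof=1)
          test = np.log10(np.abs(test_frf))
          return np.abs(test - mu) > n_sigma * sigma        # outside mean +/- n*sigma limits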

  14. Numerical analysis of the flexible roll forming of an automotive component from high strength steel

    NASA Astrophysics Data System (ADS)

    Abeyrathna, B.; Abvabi, A.; Rolfe, B.; Taube, R.; Weiss, M.

    2016-11-01

    Conventional roll forming is limited to components with uniform cross-section; the recently developed flexible roll forming (FRF) process can be used to form components which vary in both width and depth. It has been suggested that this process can be used to manufacture automotive components from Ultra High Strength Steel (UHSS) which has limited tensile elongation. In the flexible roll forming process, the pre-cut blank is fed through a set of rolls; some rolls are computer-numerically controlled (CNC) to follow the 3D contours of the part and hence parts with a variable cross-section can be produced. This paper introduces a new flexible roll forming technique which can be used to form a complex shape with the minimum tooling requirements. In this method, the pre-cut blank is held between two dies and the whole system moves back and forth past CNC forming rolls. The forming roll changes its angle and position in each pass to incrementally form the part. In this work, the process is simulated using the commercial software package Copra FEA. The distribution of total strain and final part quality are investigated as well as related shape defects observed in the process. Different tooling concepts are used to improve the strain distribution and hence the part quality.

  15. Daily hydro- and morphodynamic simulations at Duck, NC, USA using Delft3D

    NASA Astrophysics Data System (ADS)

    Penko, Allison; Veeramony, Jay; Palmsten, Margaret; Bak, Spicer; Brodie, Katherine; Hesser, Tyler

    2017-04-01

    Operational forecasting of the coastal nearshore has wide-ranging societal and humanitarian benefits, specifically for the prediction of natural hazards due to extreme storm events. However, understanding the model limitations and uncertainty is as important as the predictions themselves. By comparing and contrasting the predictions of multiple high-resolution models in a location with near real-time collection of observations, we are able to perform a vigorous analysis of the model results in order to achieve more robust and certain predictions. In collaboration with the U.S. Army Corps of Engineers Field Research Facility (USACE FRF) as part of the Coastal Model Test Bed (CMTB) project, we have set up Delft3D at Duck, NC, USA to run in near-real time, driven by measured wave data at the boundary. The CMTB at the USACE FRF allows for the unique integration of operational wave, circulation, and morphology models with real-time observations. The FRF has an extensive array of in-situ and remotely sensed oceanographic, bathymetric, and meteorological data that is broadcast in near-real time onto a publicly accessible server. Wave, current, and bed elevation instruments are permanently installed across the model domain including 2 waverider buoys in 17-m and 26-m water depths at 3.5-km and 17-km offshore, respectively, that record directional wave data every 30 min. Here, we present the workflow and output of the Delft3D hydro- and morphodynamic simulations at Duck, and show the tactical benefits and operational potential of such a system. A nested Delft3D simulation runs a parent grid that extends 12-km in the along-shore and 3.5-km in the cross-shore with 50-m resolution and a maximum depth of approximately 17-m. The bathymetry for the parent grid was obtained from a regional digital elevation model (DEM) generated by the Federal Emergency Management Agency (FEMA). The inner nested grid extends 1.8-km in the along-shore and 1-km in the cross-shore with 5-m resolution and a maximum depth of approximately 8-m. The inner nested grid initial model bathymetry is set to either the predicted bathymetry from the previous day's simulation or a survey, whichever is more recent. Delft3D-WAVE runs in the parent grid and is driven with the real-time spectral wave measurements from the waverider buoy in 17-m depth. The spectral output from Delft3D-WAVE in the parent grid is then used as the boundary condition for the inner nested high-resolution grid, in which the coupled Delft3D wave-flow-morphology model is run. The model results are then compared to the wave, current, and bathymetry observations collected at the FRF as well as other models that are run in the CMTB.

  16. Initialization and Setup of the Coastal Model Test Bed: STWAVE

    DTIC Science & Technology

    2017-01-01

    Laboratory (CHL) Field Research Facility (FRF) in Duck, NC. The improved evaluation methodology will promote rapid enhancement of model capability and focus...Blanton 2008) study. This regional digital elevation model (DEM), with a cell size of 10 m, was generated from numerous datasets collected at different...INFORMATION: For additional information, contact Spicer Bak, Coastal Observation and Analysis Branch, Coastal and Hydraulics Laboratory, 1261 Duck Road

  17. U.S. EPA, Pesticide Product Label, COOKE SEVIN BRAND LIQUID CARBARYL INSECTICIDE, 07/10/1989

    EPA Pesticide Factsheets

    2011-04-14


  18. Fire Resistant Fuel

    DTIC Science & Technology

    2011-12-15

    just two of the major technical hurdles that the FRF program was unable to clear. The logistical burden associated...Differences between JP-8 and DF-2 fuel are also discussed. The vehicle fuel fires experienced in combat situations occur in two distinct phases. The first...segregated in two groups, micro- and macro-emulsions. These groups differ by the size of the suspended water droplets. Most of the emulsions

  19. A new procedure of modal parameter estimation for high-speed digital image correlation

    NASA Astrophysics Data System (ADS)

    Huňady, Róbert; Hagara, Martin

    2017-09-01

    The paper deals with the use of 3D digital image correlation in determining modal parameters of mechanical systems. It is a non-contact optical method, which for the measurement of full-field spatial displacements and strains of bodies uses precise digital cameras with high image resolution. Most often this method is utilized for testing of components or determination of material properties of various specimens. In the case of using high-speed cameras for measurement, the correlation system is capable of capturing various dynamic behaviors, including vibration. This enables the potential use of the mentioned method in experimental modal analysis. For that purpose, the authors proposed a measuring chain for the correlation system Q-450 and developed a software application called DICMAN 3D, which allows the direct use of this system in the area of modal testing. The created application provides the post-processing of measured data and the estimation of modal parameters. It has its own graphical user interface, in which several algorithms for the determination of natural frequencies, mode shapes and damping of particular modes of vibration are implemented. The paper describes the basic principle of the new estimation procedure which is crucial in the light of post-processing. Since the FRF matrix resulting from the measurement is usually relatively large, the estimation of modal parameters directly from the FRF matrix may be time-consuming and may occupy a large part of computer memory. The procedure implemented in DICMAN 3D provides a significant reduction in memory requirements and computational time while achieving a high accuracy of modal parameters. Its computational efficiency is particularly evident when the FRF matrix consists of thousands of measurement DOFs. The functionality of the created software application is presented on a practical example in which the modal parameters of a composite plate excited by an impact hammer were determined. For the verification of the obtained results a verification experiment was conducted during which the vibration responses were measured using conventional acceleration sensors. In both cases MIMO analysis was realized.

  20. A Custom Data Logger for Real-Time Remote Field Data Collections

    DTIC Science & Technology

    2017-03-01

    A Custom Data Logger for Real-Time Remote Field Data...Field Research Facility (FRF), for remote real-time data collections. This custom data logger is compact and energy efficient but has the same...INTRODUCTION: Real-time data collections offer many advantages: 1. Instrument failures can be rapidly detected and repaired, thereby minimizing

  1. Experimental investigation of fan-folded piezoelectric energy harvesters for powering pacemakers

    PubMed Central

    Ansari, M H; Karami, M Amin

    2018-01-01

    This paper studies the fabrication and testing of a magnet-free piezoelectric energy harvester (EH) for powering biomedical devices and sensors inside the body. The design for the EH is a fan-folded structure consisting of bimorph piezoelectric beams folding on top of each other. An actual size experimental prototype is fabricated to verify the developed analytical models. The model is verified by matching the analytical results of the tip acceleration frequency response functions (FRF) and voltage FRF with the experimental results. The generated electricity is measured when the EH is excited by the heartbeat. A closed loop shaker system is utilized to reproduce the heartbeat vibrations. Achieving low fundamental natural frequency is a key factor to generate sufficient energy for pacemakers using heartbeat vibrations. It is shown that the natural frequency of the small-scale device is less than 20 Hz due to its unique fan-folded design. The experimental results show that the small-scale EH generates sufficient power for state-of-the-art pacemakers. The 1 cm3 EH with an 18.4 g tip mass generates more than 16 μW of power from a normal heartbeat waveform. The robustness of the device to the heart rate is also studied by measuring the relation between the power output and the heart rate. PMID:29674807

  2. Comparison of Moderate and High Energy of a Nano-Fractional Radiofrequency Treatment on a Photoaging Hairless Mice Model.

    PubMed

    Sun, Wenjia; Zhang, Chengfeng; Zhao, Juemin; Wu, Jiaqiang; Xiang, Leihong

    2018-04-01

    Fractional radiofrequency (FRF) has been widely used in skin rejuvenation. To explore optimal settings, it is important to compare different treatment parameters. This study was designed to compare the effect of moderate-energy and high-energy FRF treatment on a hairless mice model. Fifteen photoaged hairless mice were assigned to 3 groups: control, moderate energy, and high energy. Two treatment sessions (T × 1 and T × 2) were performed at 1-month interval. Transepidermal water loss was measured at baseline, immediately, 1, 2, and 4 weeks after T × 1. Skin samples were harvested before each treatment, 1 and 2 months after T × 2. Neocollagenesis was evaluated by hematoxylin and eosin staining, Masson staining, and immunohistochemistry analysis. Transepidermal water loss of high-energy group was significantly higher than the moderate-energy group (p = .008) immediately after T × 1. Remarkable fibroblast proliferation was observed at 1 month after T × 1, followed by significant dermal thickening, and increase of Type I collagen and Type III collagen. There was no significant difference between 2 energy groups in fibroblast proliferation, dermal thickness, and collagen density. The effect of moderate-energy treatment was comparable with that of high energy in neocollagenesis, whereas moderate energy yielded less damage to skin barrier function.

  3. Experimental investigation of fan-folded piezoelectric energy harvesters for powering pacemakers.

    PubMed

    Ansari, M H; Karami, M Amin

    2017-06-01

    This paper studies the fabrication and testing of a magnet-free piezoelectric energy harvester (EH) for powering biomedical devices and sensors inside the body. The design for the EH is a fan-folded structure consisting of bimorph piezoelectric beams folding on top of each other. An actual size experimental prototype is fabricated to verify the developed analytical models. The model is verified by matching the analytical results of the tip acceleration frequency response functions (FRF) and voltage FRF with the experimental results. The generated electricity is measured when the EH is excited by the heartbeat. A closed loop shaker system is utilized to reproduce the heartbeat vibrations. Achieving low fundamental natural frequency is a key factor to generate sufficient energy for pacemakers using heartbeat vibrations. It is shown that the natural frequency of the small-scale device is less than 20 Hz due to its unique fan-folded design. The experimental results show that the small-scale EH generates sufficient power for state-of-the-art pacemakers. The 1 cm3 EH with an 18.4 g tip mass generates more than 16 μW of power from a normal heartbeat waveform. The robustness of the device to the heart rate is also studied by measuring the relation between the power output and the heart rate.

  4. ABM Drag_Pass Report Generator

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat

    2008-01-01

    dragREPORT software was developed in parallel with abmREPORT, which is described in the preceding article. Both programs were built on the capabilities created during that process. This tool generates a drag_pass report that summarizes vital information from the MRO aerobraking drag_pass build process to facilitate sequence reviews and to provide a high-level summarization of the sequence for mission management. The script extracts information from the ENV, SSF, FRF, SCMFmax, and OPTG files, presenting them in a single, easy-to-check report providing the majority of parameters needed for cross-check and verification as part of the sequence review process. Prior to dragReport, all the needed information was spread across a number of different files, each in a different format. This software is a Perl script that extracts vital summarization information and build-process details from a number of source files into a single, concise report format used to aid the MPST sequence review process and to provide a high-level summarization of the sequence for mission management reference. This software could be adapted for future aerobraking missions to provide similar reports, review and summarization information.

  5. Improved tactile resonance sensor for robotic assisted surgery

    NASA Astrophysics Data System (ADS)

    Oliva Uribe, David; Schoukens, Johan; Stroop, Ralf

    2018-01-01

    This paper presents an improved tactile sensor using a piezoelectric bimorph able to differentiate soft materials with similar mechanical characteristics. The final aim is to develop intelligent surgical tools for brain tumour resection using integrated sensors in order to improve tissue tumour delineation and tissue differentiation. The bimorph sensor is driven using a random phase multisine, and the properties of contact between the sensor's tip and a certain load are evaluated by means of the nonparametric FRF. An analysis of the nonlinear contributions is presented to show that the use of a linear model is feasible for the measurement conditions. A series of gelatine phantoms were tested. The tactile sensor is able to identify minimal differences in the consistency of the measured samples considering viscoelastic behaviour. A variance analysis was performed to evaluate the reliability of the sensors and to identify possible error sources due to inconsistencies in the preparation method of the phantoms. The results of the variance analysis are discussed, showing the ability of the proposed tactile sensor to perform high-quality measurements.

  6. Penn State University ground software support for X-ray missions.

    NASA Astrophysics Data System (ADS)

    Townsley, L. K.; Nousek, J. A.; Corbet, R. H. D.

    1995-03-01

    The X-ray group at Penn State is charged with two software development efforts in support of X-ray satellite missions. As part of the ACIS instrument team for AXAF, the authors are developing part of the ground software to support the instrument's calibration. They are also designing a translation program for Ginga data, to change it from the non-standard FRF format, which closely parallels the original telemetry format, to FITS.

  7. Field Research Facility Data Integration Framework Data Management Plan: Survey Lines Dataset

    DTIC Science & Technology

    2016-08-01

    CHL and its District partners. The beach morphology surveys on which this report focuses provide quantitative measures of the dynamic nature of...topography and volume change...The morphology surveys are conducted over a series of 26 shore-perpendicular profile lines spaced 50...dataset input data and products (Table 1, FRF survey lines dataset input data and products).

  8. Preliminary Estimates of Frequency-Direction Spectra Derived from the Samson Pressure Gage Array, November 1990 to May 1991

    DTIC Science & Technology

    1991-09-01

    1990 TO MAY 1991, by Charles E. Long, Coastal Engineering Research Center, Waterways Experiment Station, Corps of Engineers...Institution of Oceanography at the Coastal Engineering Research Center (CERC) Field Research Facility (FRF) near Duck, NC, a two-dimensional array of 24

  9. Transient loads identification for a standoff metallic thermal protection system panel.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hundhausen, R. J.; Adams, Douglas E.; Derriso, Mark

    2004-01-01

    Standoff thermal protection system (TPS) panels are critical structural components in future aerospace vehicles because they protect the vehicle from the hostile environment encountered during space launch and reentry. Consequently, the panels are exposed to a variety of loads including high temperature thermal stresses, thermal shock, acoustic pressure, and foreign object impacts. Transient impacts are especially detrimental because they can cause immediate and severe degradation of the panel in the form of, for example, debonding and buckling of the face sheet, cracking of the fasteners, or deformation of the standoffs. Loads identification methods for determining the magnitude and location of impact loads provide an indication of TPS components that may be more susceptible to failure. Furthermore, a historical database of impact loads encountered can be retained for use in the development of statistical models that relate impact loading to panel life. In this work, simulated in-service transient loads are identified experimentally using two methods: a physics-based approach and an inverse Frequency Response Function (FRF) approach. It is shown that by applying the inverse FRF method, the location and magnitude of these simulated impacts can be identified with a high degree of accuracy. The identified force levels vary significantly with impact location due to the differences in panel deformation at the impact site, indicating that resultant damage due to impacts would vary with location as well.
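
    The inverse FRF approach mentioned here amounts to pseudo-inverting the measured FRF matrix at each frequency line to recover force spectra from response spectra; a minimal sketch under assumed array shapes (not the authors' implementation) follows.

      import numpy as np

      def identify_forces(H, X):
          """Inverse-FRF load identification sketch.

          H : (n_freq, n_resp, n_force) complex FRF matrix (response per unit force).
          X : (n_freq, n_resp) measured response spectra.
          Returns (n_freq, n_force) estimated force spectra F = pinv(H) X.
          """
          F = np.empty((H.shape[0], H.shape[2]), dtype=complex)
          for k in range(H.shape[0]):
              F[k] = np.linalg.pinv(H[k]) @ X[k]
          return F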

  10. Fire Resistant Fuel for Military Compression Ignition Engines

    DTIC Science & Technology

    2013-12-04

    (Figure captions: 6.5L Turbo Diesel Maximum Power Output; Figure 5, 6.5L Turbo Diesel Maximum Torque Output; Figure 6, 6.5L Turbo Diesel Brake Specific Fuel Consumption; JP8-FRF AMA, 5% H2O & 250 ppm.) From...mid-1980s, fire-resistant diesel fuel that self-extinguished when ignited by an explosive projectile was developed. Chemically, this fire resistant

  11. An alternative approach to measure similarity between two deterministic transient signals

    NASA Astrophysics Data System (ADS)

    Shin, Kihong

    2016-06-01

    In many practical engineering applications, it is often required to measure the similarity of two signals to gain insight into the conditions of a system. For example, an application that monitors machinery can regularly measure the vibration signal and compare it to a healthy reference signal in order to monitor whether or not any fault symptom is developing. Also in modal analysis, a frequency response function (FRF) from a finite element model (FEM) is often compared with an FRF from experimental modal analysis. Many different similarity measures are applicable in such cases, and correlation-based similarity measures are perhaps the most frequently used, such as the correlation coefficient in the time domain and the frequency response assurance criterion (FRAC) in the frequency domain. Although correlation-based similarity measures may be particularly useful for random signals because they are based on probability and statistics, we frequently deal with signals that are largely deterministic and transient. Thus, it may be useful to develop another similarity measure that takes the characteristics of the deterministic transient signal properly into account. In this paper, an alternative approach to measure the similarity between two deterministic transient signals is proposed. This newly proposed similarity measure is based on the fictitious system frequency response function, and it consists of the magnitude similarity and the shape similarity. Finally, a few examples are presented to demonstrate the use of the proposed similarity measure.
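
    For reference, the frequency response assurance criterion (FRAC) mentioned above, a frequency-domain analogue of a correlation coefficient between two FRFs, can be computed as in the short sketch below (standard definition; variable names are illustrative).

      import numpy as np

      def frac(H1, H2):
          """Frequency Response Assurance Criterion between two FRFs (0 to 1).

          H1, H2 : (n_freq,) complex FRFs of the same input/output DOF pair.
          FRAC = |sum(conj(H1)*H2)|^2 / (sum|H1|^2 * sum|H2|^2)
          """
          num = np.abs(np.vdot(H1, H2)) ** 2
          den = np.vdot(H1, H1).real * np.vdot(H2, H2).real
          return num / den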

  12. Converging Resonance Cones in the LAPTAG plasma

    NASA Astrophysics Data System (ADS)

    Katz, Cami; Ha, Chris; Gekelman, Walter; Pribyl, Patrick; Agmon, Nathan; Wise, Joe; Baker, Bob

    2013-10-01

    The LAPTAG laboratory is a high school outreach effort that has a 1.5 m long, 50 cm diameter magnetized plasma device. The plasma is produced by an ICP source (1×10^9 < n < 5×10^11 cm^-3) and has computer-controlled data acquisition. Ring antennas are used to produce converging resonance cones. The experiment was performed in the quiescent plasma afterglow. The electrostatic cones were produced by rf applied to the rings (80 < f < 120 MHz), where fRF < f

  13. A User’s Guide to the Coastal Engineering Research Center’s (CERC’S) Field Research Facility.

    DTIC Science & Technology

    1985-05-01

    PART I: INTRODUCTION...Use of the FRF...Description of...Federal interest in coastal engineering began in the 1920's as a result of the increasing shoreline erosion along the recreational...lower at the pier end. As waves move

  14. Spectral analysis for nonstationary and nonlinear systems: a discrete-time-model-based approach.

    PubMed

    He, Fei; Billings, Stephen A; Wei, Hua-Liang; Sarrigiannis, Ptolemaios G; Zhao, Yifan

    2013-08-01

    A new frequency-domain analysis framework for nonlinear time-varying systems is introduced based on parametric time-varying nonlinear autoregressive with exogenous input models. It is shown how the time-varying effects can be mapped to the generalized frequency response functions (FRFs) to track nonlinear features in frequency, such as intermodulation and energy transfer effects. A new mapping to the nonlinear output FRF is also introduced. A simulated example and the application to intracranial electroencephalogram data are used to illustrate the theoretical results.

  15. Applications of Scanning Tunneling Microscopy to Electrochemistry

    DTIC Science & Technology

    1988-10-28

    electrochemically pretreated platinum surfaces in air by Baro and coworkers (68) and by Fan and Bard (Fan, F-R.F.; Bard, A.J., Anal. Chem., submitted) have...

  16. Motion artifact and background noise suppression on optical microangiography frames using a naïve Bayes mask.

    PubMed

    Reif, Roberto; Baran, Utku; Wang, Ruikang K

    2014-07-01

    Optical coherence tomography (OCT) is a technique that allows for the three-dimensional (3D) imaging of small volumes of tissue (a few millimeters) with high resolution (∼10 μm). Optical microangiography (OMAG) is a method of processing OCT data, which allows for the extraction of the tissue vasculature with capillary resolution from the OCT images. Cross-sectional B-frame OMAG images present the location of the patent blood vessels; however, the signal-to-noise ratio of these images can be affected by several factors such as the quality of the OCT system and the tissue motion artifact. This background noise can appear in the en face projection view image. In this work we propose to develop a binary mask that can be applied on the cross-sectional B-frame OMAG images, which will reduce the background noise while leaving the signal from the blood vessels intact. The mask is created by using a naïve Bayes (NB) classification algorithm trained with a gold standard image which is manually segmented by an expert. The masked OMAG images present better contrast for binarizing the image and quantifying the result without the influence of noise. The results are compared with a previously developed frequency rejection filter (FRF) method which is applied on the en face projection view image. It is demonstrated that both the NB and FRF methods provide similar vessel length fractions. The advantage of the NB method is that the results are applicable in 3D and that its use is not limited to periodic motion artifacts.
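
    A naive Bayes pixel mask of the kind described could be trained along the following lines; the per-pixel features and the scikit-learn GaussianNB choice are assumptions for illustration, not the authors' exact pipeline.

      import numpy as np
      from sklearn.naive_bayes import GaussianNB

      def train_mask_classifier(features, gold_mask):
          """Train a naive Bayes pixel classifier from an expert-segmented frame.

          features  : (n_pixels, n_features) per-pixel features (e.g. OMAG flow
                      signal and local statistics; the feature set is assumed).
          gold_mask : (n_pixels,) binary labels from the manual gold standard.
          """
          clf = GaussianNB()
          clf.fit(features, gold_mask)
          return clf

      def apply_mask(clf, features, frame):
          """Apply the binary mask to a cross-sectional B-frame (row-major pixels)."""
          mask = clf.predict(features).reshape(frame.shape)
          return frame * mask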

  17. Modal identification of spindle-tool unit in high-speed machining

    NASA Astrophysics Data System (ADS)

    Gagnol, Vincent; Le, Thien-Phu; Ray, Pascal

    2011-10-01

    The accurate knowledge of high-speed motorised spindle dynamic behaviour during machining is important in order to ensure the reliability of machine tools in service and the quality of machined parts. More specifically, the prediction of stable cutting regions, which is a critical requirement for high-speed milling operations, requires the accurate estimation of tool/holder/spindle set dynamic modal parameters. These estimations are generally obtained through Frequency Response Function (FRF) measurements of the non-rotating spindle. However, significant changes in modal parameters are expected to occur during operation, due to high-speed spindle rotation. The spindle's modal variations are highlighted through an integrated finite element model of the dynamic high-speed spindle-bearing system, taking into account rotor dynamics effects. The dependency of dynamic behaviour on speed range is then investigated and determined with accuracy. The objective of the proposed paper is to validate these numerical results through an experiment-based approach. Hence, an experimental setup is elaborated to measure rotating tool vibration during the machining operation in order to determine the spindle's modal frequency variation with respect to spindle speed in an industrial environment. The identification of natural frequencies of the spindle under rotating conditions is challenging, due to the low number of sensors and the presence of many harmonics in the measured signals. In order to overcome these issues and to extract the characteristics of the system, the spindle modes are determined through a 3-step procedure. First, spindle modes are highlighted using the Frequency Domain Decomposition (FDD) technique, with a new formulation at the considered rotating speed. These extracted modes are then analysed through the value of their respective damping ratios in order to separate the harmonics component from structural spindle natural frequencies. Finally, the stochastic properties of the modes are also investigated by considering the probability density of the retained modes. Results show a good correlation between numerical and experiment-based identified frequencies. The identified spindle-tool modal properties during machining allow the numerical model to be considered as representative of the real dynamic properties of the system.
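
    For orientation, classical Frequency Domain Decomposition (not the paper's rotating-speed formulation) takes the SVD of the output cross-power spectral density matrix at each frequency and reads modes off the peaks of the first singular value; a rough sketch with an assumed signal layout follows.

      import numpy as np
      from scipy.signal import csd

      def fdd(Y, fs, nperseg=2048):
          """Classical FDD sketch. Y : (n_channels, n_samples) measured responses.

          Returns frequencies, the first singular value of the output PSD matrix
          (peaks suggest modes) and the first singular vectors (approximate
          operational mode shapes).
          """
          n_ch = Y.shape[0]
          f, _ = csd(Y[0], Y[0], fs=fs, nperseg=nperseg)
          G = np.empty((len(f), n_ch, n_ch), dtype=complex)   # cross-PSD matrix G(f)
          for i in range(n_ch):
              for j in range(n_ch):
                  _, G[:, i, j] = csd(Y[i], Y[j], fs=fs, nperseg=nperseg)
          U, s, _ = np.linalg.svd(G)
          return f, s[:, 0], U[:, :, 0]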

  18. STS-1 Pogo analysis

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Some of the pogo related data from STS-1 are documented. The measurements and data reduction are described. In the data analysis reference is made to FRF and single engine test results. The measurements are classified under major project elements of the space shuttle main engine, the external tank, and the orbiter. The subsystems are structural dynamics and main propulsion. Data were recorded onboard the orbiter with a minimum response rate of 1.5 to 50 Hz. The wideband, 14 track recorder was used, and the data required demultiplexing before reduction. The flight phase of interest was from liftoff through main engine cutoff.

  19. A 2-collinear-DoF strut with embedded negative-resistance electromagnetic shunt dampers for spacecraft micro-vibration

    NASA Astrophysics Data System (ADS)

    Stabile, Alessandro; Aglietti, Guglielmo S.; Richardson, Guy; Smet, Geert

    2017-04-01

    Micro-vibration on board a spacecraft is an important issue that affects payloads requiring high pointing accuracy. Although isolators have been extensively studied and implemented to tackle this issue, their application is far from being ideal due to the several drawbacks that they present, such as limited low-frequency attenuation for passive systems or high power consumption and reliability issues for active systems. In the present study, a novel 2-collinear-DoF strut with embedded electromagnetic shunt dampers (EMSD) is modelled, analysed and the concept is physically tested. The combination of high-inductance components and negative-resistance circuits is used in the two shunt circuits to improve the EMSD micro-vibration mitigation and to achieve an overall strut damping performance that is characterised by the elimination of the resonance peaks and a remarkable FRF final decay rate of -80 dB dec^-1. The EMSD operates without requiring any control algorithm and can be comfortably integrated on a satellite due to the low power required, the simplified electronics and the small mass. This work demonstrates, both analytically and experimentally, that the proposed strut is capable of producing better isolation performance than other well-established damping solutions over the whole temperature range of interest.

  20. Investigation on active vibration isolation of a Stewart platform with piezoelectric actuators

    NASA Astrophysics Data System (ADS)

    Wang, Chaoxin; Xie, Xiling; Chen, Yanhao; Zhang, Zhiyi

    2016-11-01

    A Stewart platform with piezoelectric actuators is presented for micro-vibration isolation. The Jacobi matrix of the Stewart platform, which reveals the relationship between the position/pointing of the payload and the extensions of the six struts, is derived by kinematic analysis. The dynamic model of the Stewart platform is established by the FRF (frequency response function) synthesis method. In the active control loop, the direct feedback of integrated forces is combined with the FxLMS based adaptive feedback to dampen vibration of inherent modes and suppress transmission of periodic vibrations. Numerical simulations were conducted to prove vibration isolation performance of the Stewart platform under random and periodical disturbances, respectively. In the experiment, the output consistencies of the six piezoelectric actuators were measured at first and the theoretical Jacobi matrix as well as the feedback gain of each piezoelectric actuator was subsequently modified according to the measured consistencies. The direct feedback loop was adjusted to achieve sufficient active damping and the FxLMS based adaptive feedback control was adopted to suppress vibration transmission in the six struts. Experimental results have demonstrated that the Stewart platform can achieve 30 dB attenuation of periodical disturbances and 10-20 dB attenuation of random disturbances in the frequency range of 5-200 Hz.
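
    The FxLMS-based adaptive feedback mentioned above follows the standard filtered-reference LMS update, where the reference is first passed through a model of the secondary path. Below is a minimal single-channel sketch under assumed signal names; the paper's multi-strut implementation and the direct force-feedback loop are not reproduced.

      import numpy as np

      def fxlms(reference, disturbance, sec_path, sec_path_model, n_taps=64, mu=1e-3):
          """Minimal single-channel FxLMS sketch for periodic disturbance rejection.

          reference      : (N,) reference signal correlated with the disturbance
          disturbance    : (N,) disturbance measured at the error sensor
          sec_path       : (L,) impulse response of the true secondary path
          sec_path_model : (M,) impulse response of its identified model (S-hat)
          Returns the residual error time history.
          """
          w = np.zeros(n_taps)                     # adaptive FIR control filter
          xbuf = np.zeros(n_taps)                  # reference buffer (newest first)
          fxbuf = np.zeros(n_taps)                 # filtered-reference buffer
          ybuf = np.zeros(len(sec_path))           # control-output buffer
          rbuf = np.zeros(len(sec_path_model))
          e = np.zeros(len(reference))
          for n in range(len(reference)):
              xbuf = np.roll(xbuf, 1)
              xbuf[0] = reference[n]
              y = w @ xbuf                                      # control output
              ybuf = np.roll(ybuf, 1)
              ybuf[0] = y
              e[n] = disturbance[n] + sec_path @ ybuf           # residual at error sensor
              rbuf = np.roll(rbuf, 1)
              rbuf[0] = reference[n]
              fxbuf = np.roll(fxbuf, 1)
              fxbuf[0] = sec_path_model @ rbuf                  # filtered reference x'
              w = w - mu * e[n] * fxbuf                         # FxLMS weight update
          return e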

  1. Subtyping cognitive profiles in Autism Spectrum Disorder using a Functional Random Forest algorithm.

    PubMed

    Feczko, E; Balba, N M; Miranda-Dominguez, O; Cordova, M; Karalunas, S L; Irwin, L; Demeter, D V; Hill, A P; Langhorst, B H; Grieser Painter, J; Van Santen, J; Fombonne, E J; Nigg, J T; Fair, D A

    2018-05-15

    DSM-5 Autism Spectrum Disorder (ASD) comprises a set of neurodevelopmental disorders characterized by deficits in social communication and interaction and repetitive behaviors or restricted interests, and may both affect and be affected by multiple cognitive mechanisms. This study attempts to identify and characterize cognitive subtypes within the ASD population using our Functional Random Forest (FRF) machine learning classification model. This model trained a traditional random forest model on measures from seven tasks that reflect multiple levels of information processing. 47 ASD diagnosed and 58 typically developing (TD) children between the ages of 9 and 13 participated in this study. Our RF model was 72.7% accurate, with 80.7% specificity and 63.1% sensitivity. Using the random forest model, the FRF then measures the proximity of each subject to every other subject, generating a distance matrix between participants. This matrix is then used in a community detection algorithm to identify subgroups within the ASD and TD groups, and revealed 3 ASD and 4 TD putative subgroups with unique behavioral profiles. We then examined differences in functional brain systems between diagnostic groups and putative subgroups using resting-state functional connectivity magnetic resonance imaging (rsfcMRI). Chi-square tests revealed a significantly greater number of between group differences (p < .05) within the cingulo-opercular, visual, and default systems as well as differences in inter-system connections in the somato-motor, dorsal attention, and subcortical systems. Many of these differences were primarily driven by specific subgroups suggesting that our method could potentially parse the variation in brain mechanisms affected by ASD. Copyright © 2017. Published by Elsevier Inc.
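
    The core of the Functional Random Forest step described here, turning a fitted random forest into a between-subject proximity (and hence distance) matrix for community detection, could be sketched as follows with scikit-learn; the feature/label names and the 1000-tree setting are assumptions.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def rf_proximity(X, y, n_trees=1000, random_state=0):
          """Random-forest proximity matrix for subsequent community detection.

          X : (n_subjects, n_measures) task measures; y : (n_subjects,) labels.
          Proximity between two subjects is the fraction of trees in which they
          land in the same terminal leaf; 1 - proximity is the distance matrix
          that a community detection algorithm would then partition.
          """
          rf = RandomForestClassifier(n_estimators=n_trees, random_state=random_state)
          rf.fit(X, y)
          leaves = rf.apply(X)                                  # (n_subjects, n_trees)
          prox = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)
          return rf, prox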

  2. Adaptation of multijoint coordination during standing balance in healthy young and healthy old individuals

    PubMed Central

    Pasma, J. H.; Schouten, A. C.; Aarts, R. G. K. M.; Meskers, C. G. M.; Maier, A. B.; van der Kooij, H.

    2015-01-01

    Standing balance requires multijoint coordination between the ankles and hips. We investigated how humans adapt their multijoint coordination to adjust to various conditions and whether the adaptation differed between healthy young participants and healthy elderly. Balance was disturbed by push/pull rods, applying two continuous and independent force disturbances at the level of the hip and between the shoulder blades. In addition, external force fields were applied, represented by an external stiffness at the hip, either stabilizing or destabilizing the participants' balance. Multivariate closed-loop system-identification techniques were used to describe the neuromuscular control mechanisms by quantifying the corrective joint torques as a response to body sway, represented by frequency response functions (FRFs). Model fits on the FRFs resulted in an estimation of time delays, intrinsic stiffness, reflexive stiffness, and reflexive damping of both the ankle and hip joint. The elderly generated similar corrective joint torques but had reduced body sway compared with the young participants, corresponding to the increased FRF magnitude with age. When a stabilizing or destabilizing external force field was applied at the hip, both young and elderly participants adapted their multijoint coordination by lowering or respectively increasing their neuromuscular control actions around the ankles, expressed in a change of FRF magnitude. However, the elderly adapted less compared with the young participants. Model fits on the FRFs showed that elderly had higher intrinsic and reflexive stiffness of the ankle, together with higher time delays of the hip. Furthermore, the elderly adapted their reflexive stiffness around the ankle joint less compared with young participants. These results imply that elderly were stiffer and were less able to adapt to external force fields. PMID:26719084

  3. Analogue modelling of thrust systems: Passive vs. active hanging wall strain accommodation and sharp vs. smooth fault-ramp geometries

    NASA Astrophysics Data System (ADS)

    Rosas, F. M.; Duarte, J. C.; Almeida, P.; Schellart, W. P.; Riel, N.; Terrinha, P.

    2017-06-01

    We present new analogue modelling results of crustal thrust-systems in which a deformable (brittle) hanging wall is assumed to endure passive internal deformation during thrusting, i.e. exclusively as a consequence of having to adapt its shape to the variable geometry of a rigid footwall. Building on previous experimental contributions, we specifically investigate the role of two so far overlooked critical variables: a) concave-convex (CC) vs. flat-ramp-flat (FRF) thrust ramp geometry; and b) presence vs. absence of a basal velocity discontinuity (VD). Regarding the first variable, we compare new results for considered (CC) smoother ramp types against classical experiments in which (FRF) sharp ramp geometries are always prescribed. Our results show that the considered sharp vs. smooth variation in the thrust-ramp geometry produces important differences in the distribution of the local stress field in the deformable hanging wall above both (lower and upper) fault bends, with corresponding styles of strain accommodation being expressed by marked differences in measured morpho-structural parameters. Regarding the second variable, we for the first time report analogue modelling results of this type of experiments in which basal VDs are experimentally prescribed to be absent. Our results critically show that true passive hanging wall deformation is only possible to simulate in the absence of any basal VD, since active shortening accommodation always necessarily occurs in the hanging wall above such a discontinuity (i.e. above the lower fault bend). In addition, we show that the morpho-structural configuration of model thrust-wedges formed for prescribed VD absence conditions complies well with natural examples of major overthrusts, wherein conditions must occur that approximate a frictionless state along the main basal thrust-plane.

  4. Substructure System Identification for Finite Element Model Updating

    NASA Technical Reports Server (NTRS)

    Craig, Roy R., Jr.; Blades, Eric L.

    1997-01-01

    This report summarizes research conducted under a NASA grant on the topic 'Substructure System Identification for Finite Element Model Updating.' The research concerns ongoing development of the Substructure System Identification Algorithm (SSID Algorithm), a system identification algorithm that can be used to obtain mathematical models of substructures, like Space Shuttle payloads. In the present study, particular attention was given to the following topics: making the algorithm robust to noisy test data, extending the algorithm to accept experimental FRF data that covers a broad frequency bandwidth, and developing a test analytical model (TAM) for use in relating test data to reduced-order finite element models.

  5. Using frequency response functions to manage image degradation from equipment vibration in the Daniel K. Inouye Solar Telescope

    NASA Astrophysics Data System (ADS)

    McBride, William R.; McBride, Daniel R.

    2016-08-01

    The Daniel K Inouye Solar Telescope (DKIST) will be the largest solar telescope in the world, providing a significant increase in the resolution of solar data available to the scientific community. Vibration mitigation is critical in long focal-length telescopes such as the Inouye Solar Telescope, especially when adaptive optics are employed to correct for atmospheric seeing. For this reason, a vibration error budget has been implemented. Initially, the FRFs for the various mounting points of ancillary equipment were estimated using the finite element analysis (FEA) of the telescope structures. FEA analysis is well documented and understood; the focus of this paper is on the methods involved in estimating a set of experimental (measured) transfer functions of the as-built telescope structure for the purpose of vibration management. Techniques to measure low-frequency single-input-single-output (SISO) frequency response functions (FRF) between vibration source locations and image motion on the focal plane are described. The measurement equipment includes an instrumented inertial-mass shaker capable of operation down to 4 Hz along with seismic accelerometers. The measurement of vibration at frequencies below 10 Hz with good signal-to-noise ratio (SNR) requires several noise reduction techniques including high-performance windows, noise-averaging, tracking filters, and spectral estimation. These signal-processing techniques are described in detail.
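
    A common way to estimate SISO FRFs of the kind described, with windowing and noise-averaging, is Welch-averaged spectra and the H1 estimator; the sketch below uses SciPy and assumed signal names, and is not the DKIST measurement code.

      import numpy as np
      from scipy.signal import csd, welch

      def h1_frf(x, y, fs, nperseg=4096):
          """H1 FRF estimate and coherence between excitation x and response y.

          x  : shaker force (or drive) time history
          y  : response time history (e.g. image motion or accelerometer output)
          fs : sample rate in Hz
          Averaged cross- and auto-spectra suppress uncorrelated measurement noise.
          """
          f, Sxy = csd(x, y, fs=fs, nperseg=nperseg, window='hann')
          _, Sxx = welch(x, fs=fs, nperseg=nperseg, window='hann')
          _, Syy = welch(y, fs=fs, nperseg=nperseg, window='hann')
          H1 = Sxy / Sxx
          coh = np.abs(Sxy) ** 2 / (Sxx * Syy)
          return f, H1, coh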

  6. Normal response function method for mass and stiffness matrix updating using complex FRFs

    NASA Astrophysics Data System (ADS)

    Pradhan, S.; Modak, S. V.

    2012-10-01

    Quite often a structural dynamic finite element model is required to be updated so as to accurately predict dynamic characteristics like natural frequencies and mode shapes. Since in many situations undamped natural frequencies and mode shapes need to be predicted, it has generally been the practice in these situations to seek updating of only the mass and stiffness matrices so as to obtain a reliable prediction model. Updating using frequency response functions (FRFs) has been one of the widely used approaches for updating, including updating of mass and stiffness matrices. However, the problem with FRF-based methods for updating mass and stiffness matrices is that these methods are based on the use of complex FRFs. Use of complex FRFs to update mass and stiffness matrices is not theoretically correct, as complex FRFs are not only affected by these two matrices but also by the damping matrix. Therefore, in situations where updating of only mass and stiffness matrices using FRFs is required, the use of a complex-FRF-based updating formulation is not fully justified and would lead to inaccurate updated models. This paper addresses this difficulty and proposes an improved FRF-based finite element model updating procedure using the concept of normal FRFs. The proposed method is a modified version of the existing response function method that is based on the complex FRFs. The effectiveness of the proposed method is validated through a numerical study of a simple but representative beam structure. The effects of coordinate incompleteness and the robustness of the method in the presence of noise are investigated. The results of updating obtained by the improved method are compared with the existing response function method. The performance of the two approaches is compared for cases of lightly, moderately and heavily damped structures. It is found that the proposed improved method is effective in updating the mass and stiffness matrices in all the cases of complete and incomplete data and with all levels and types of damping.
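
    One common construction of a normal (undamped) FRF, which may differ from the formulation in this paper, inverts the measured complex FRF matrix to dynamic stiffness, keeps the real part (K - ω²M), and inverts back; a hedged sketch assuming a full square FRF matrix follows.

      import numpy as np

      def normal_frf_from_complex(H):
          """Estimate normal (undamped) FRFs from a full complex FRF matrix.

          H : (n_freq, n, n) complex FRF matrix measured at all n DOFs.
          The dynamic stiffness is H^-1 = (K - w^2 M) + i*(damping term), so the
          real part of H^-1 is the undamped dynamic stiffness and its inverse is
          the normal FRF. Requires a complete, well-conditioned FRF matrix.
          """
          Hn = np.empty(H.shape)
          for k in range(H.shape[0]):
              Z = np.linalg.inv(H[k])          # dynamic stiffness at this line
              Hn[k] = np.linalg.inv(Z.real)    # normal FRF = [Re(Z)]^-1
          return Hn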

  7. Environmental forcing metrics to quantify short-term foredune morphodynamics

    NASA Astrophysics Data System (ADS)

    Spore, N.; Conery, I.; Brodie, K. L.; Palmsten, M.

    2016-12-01

    Coastal foredunes evolve continuously due to competing aeolian and hydrodynamic processes. Onshore to shore-parallel winds transport sand to the dune while storm-driven surge and wave runup remove sand from the dune. Dune growth requires periods of time when the wind exceeds a threshold velocity to initiate transport and the relative geometry of the dry beach to the wind direction to create large fetches. This study aims to derive an aeolian transport potential (ATP) metric from the precipitation, available fetch (a function of wind angle and dry-beach width), and a threshold wind speed to initiate transport. ATP is then combined with a hydrodynamic transport potential (HTP) metric, defined as the number of hours of wave impact to the foredune or upper beach, to assess the time-dependent magnitude of the forcing factors affecting morphological evolution of the foredune between monthly terrestrial lidar surveys. This study focuses on two distinctly different dune fields and their frontal or incipient dune ridges in Duck, NC at the USACE Field Research Facility (FRF): (1) an undisturbed, tall and narrow recently impacted dune with a near-vertical face; and (2) an undisturbed, shorter and wider dune with gentler and more hummocky slopes. The two sites are separated by < 1 km alongshore and experience similar environmental forcings due to their close proximity. We used hourly precipitation, wind, wave, and imagery-derived runup data from the FRF and surrounding weather stations as inputs to ATP and HTP for each site. We scanned each site at monthly intervals for 18 months with high-resolution terrestrial lidar and generated 10 cm digital elevation models (DEM) for each scan. Incremental and cumulative changes in elevation, volume, and dune toe position were extracted from the DEMs and compared to the ATP and HTP values between the surveys to evaluate the dominant factors affecting sediment flux to the system.

  8. Sensitivity Analysis of Delft3d Simulations at Duck, NC, USA

    NASA Astrophysics Data System (ADS)

    Penko, A.; Boggs, S.; Palmsten, M.

    2017-12-01

    Our objective is to set up and test Delft3D, a high-resolution coupled wave and circulation model, to provide real-time nowcasts of hydrodynamics at Duck, NC, USA. Here, we test the sensitivity of the model to various parameters and boundary conditions. In order to validate the model simulations, we compared the results to observational data. Duck, NC was chosen as our test site due to the extensive array of observational oceanographic, bathymetric, and meteorological data collected by the Army Corps of Engineers Field Research Facility (FRF). Observations were recorded with Acoustic Wave and Current meters (AWAC) at 6-m and 11-m depths as well as a 17-m depth Waverider buoy. The model is set up with an outer and inner nested domain. The outer grid extends 12-km in the along-shore and 3.5-km in the cross-shore with a 50-m resolution and a maximum depth of 17-m. Spectral wave measurements from the 17-m Waverider buoy drove Delft3D-WAVE in the outer grid. We compared the results of five outer grid simulations to wave and current observations collected at the FRF. The model simulations are then compared to the wave and current measurements collected at the 6-m and 11-m AWACs. To determine the best parameters and boundary conditions for the model set up at Duck, we calculated the root mean square error (RMSE) between the simulation results and the observations. Several conclusions were made: 1) the addition of astronomic tides has a significant effect on the circulation magnitude and direction, 2) incorporating an updated bathymetry in the bottom boundary condition has a small effect in shallower (<8-m) depths, 3) decreasing the wave bed friction by 50% did not affect the wave predictions, and 4) the accuracy of the simulated wave heights improved as wind and wave forcing at the lateral boundaries were included.

  9. Multidirectional mobilities: Advanced measurement techniques and applications

    NASA Astrophysics Data System (ADS)

    Ivarsson, Lars Holger

    Today high noise-and-vibration comfort has become a quality sign of products in sectors such as the automotive industry, aircraft, components, households and manufacturing. Consequently, already in the design phase of products, tools are required to predict the final vibration and noise levels. These tools have to be applicable over a wide frequency range with sufficient accuracy. During recent decades a variety of tools have been developed such as transfer path analysis (TPA), input force estimation, substructuring, coupling by frequency response functions (FRF) and hybrid modelling. While these methods have a well-developed theoretical basis, their application combined with experimental data often suffers from a lack of information concerning rotational DOFs. In order to measure response in all 6 DOFs (including rotation), a sensor has been developed, whose special features are discussed in the thesis. This transducer simplifies the response measurements, although in practice the excitation of moments appears to be more difficult. Several excitation techniques have been developed to enable measurement of multidirectional mobilities. For rapid and simple measurement of the loaded mobility matrix, a MIMO (Multiple Input Multiple Output) technique is used. The technique has been tested and validated on several structures of different complexity. A second technique for measuring the loaded 6-by-6 mobility matrix has been developed. This technique employs a model of the excitation set-up, and with this model the mobility matrix is determined from sequential measurements. Measurements on ``real'' structures show that both techniques give results of similar quality, and both are recommended for practical use. As a further step, a technique for measuring the unloaded mobilities is presented. It employs the measured loaded mobility matrix in order to calculate compensation forces and moments, which are later applied in order to compensate for the loading of the measurement equipment. The developed measurement techniques have been used in a hybrid coupling of a plate-and-beam structure to study different aspects of the coupling technique. Results show that RDOFs are crucial and have to be included in this case. The importance of stiffness residuals when mobilities are estimated from modal superposition is demonstrated. Finally it is shown that proper curve fitting can correct errors from inconsistently measured data.

  10. Preset pivotal tool holder

    DOEpatents

    Asmanes, Charles

    1979-01-01

    A tool fixture is provided for precise pre-alignment of a radiused edge cutting tool in a tool holder relative to a fixed reference pivot point established on said holder about which the tool holder may be selectively pivoted relative to the fixture base member to change the contact point of the tool cutting edge with a workpiece while maintaining the precise same tool cutting radius relative to the reference pivot point.

  11. Fetch-Trapping in Hurricane Isabel

    NASA Astrophysics Data System (ADS)

    Pearse, A. J.; Hanson, J. L.

    2005-12-01

    Hurricane Isabel made landfall near Drum Inlet on the Outer Banks of North Carolina on September 18, 2003, and caused extensive monetary and coastal damage. Storm surge and battering waves were a primary cause of damage, as in most hurricanes. Data collected at the US Army Corps of Engineers Field Research Facility (FRF) in Duck, NC, the National Data Buoy Center (NDBC), and the Coastal Data Information Program (CDIP) suggest that the waves generated by Hurricane Isabel were larger and had longer periods than would be suggested by a traditional semi-empirical wave growth model with similar fetch and wind speed values. It is likely that this enhanced growth was due to the trapping of storm waves within the moving fetch of the hurricane. The purpose of this study was to empirically confirm the enhancement and to identify the degree of fetch-trapping that occurred. Directional wave spectra from 577 individual wave records were collected from buoys in three locations: CDIP station 078 in King's Bay, GA, the FRF Waverider in NC, and NDBC Station 44025 off Long Island, NY. A wave partitioning approach was used to isolate the individual swell components from the evolving wave field at each station. A backward raytrace along great-circle routes was employed to identify the intersection of each swell system with the official National Hurricane Center (NHC) Isabel track. This allowed matching each observed swell component with a generation time, storm translation speed, and peak wind speed. Wave period, rather than amplitude, was used in this study because amplitude is significantly affected by the bottom topography whereas period is conserved. Using the identified wind speeds and an average fetch of 200 km (approximated using NOAA wind field charts), the actual waves showed wave period enhancements up to 60% over predictions using the standard wave growth model. A variety of resonance criteria are applied to evaluate fetch trapping in Hurricane Isabel. The most enhanced wave periods were found to occur when the wave group speeds most closely matched the storm translation speeds, strongly suggesting that fetch trapping was an important mechanism for wave growth in Isabel.

  12. An identification method for damping ratio in rotor systems

    NASA Astrophysics Data System (ADS)

    Wang, Weimin; Li, Qihang; Gao, Jinji; Yao, Jianfei; Allaire, Paul

    2016-02-01

    Centrifugal compressor testing with magnetic bearing excitations is the last step to assure the compressor rotordynamic stability in the designed operating conditions. To meet the challenges of stability evaluation, a new method combining the rational polynomials method (RPM) with the weighted instrumental variables (WIV) estimator to fit the directional frequency response function (dFRF) is presented. Numerical simulation results show that the method suggested in this paper can identify the damping ratio of the first forward and backward modes with high accuracy, even in a severe noise environment. Experimental tests were conducted to study the effect of different bearing configurations on the stability of rotor. Furthermore, two example centrifugal compressors (a nine-stage straight-through and a six-stage back-to-back) were employed to verify the feasibility of identification method in industrial configurations as well.

  13. Identification of biomechanical nonlinearity in whole-body vibration using a reverse path multi-input-single-output method

    NASA Astrophysics Data System (ADS)

    Huang, Ya; Ferguson, Neil S.

    2018-04-01

    The study implements a classic signal analysis technique, typically applied to structural dynamics, to examine the nonlinear characteristics seen in the apparent mass of a recumbent person during whole-body horizontal random vibration. The nonlinearity in the present context refers to the amount of 'output' that is not correlated or coherent to the 'input', usually indicated by values of the coherence function that are less than unity. The analysis is based on the longitudinal horizontal inline and vertical cross-axis apparent mass of twelve human subjects exposed to 0.25-20 Hz random acceleration vibration at 0.125 and 1.0 ms-2 r.m.s. The conditioned reverse path frequency response functions (FRF) reveal that the uncorrelated 'linear' relationship between physical input (acceleration) and outputs (inline and cross-axis forces) has much greater variation around the primary resonance frequency between 0.5 and 5 Hz. By reversing the input and outputs of the physical system, it is possible to assemble additional mathematical inputs from the physical output forces and mathematical constructs (e.g. square root of inline force). Depending on the specific construct, this can improve the summed multiple coherence at frequencies where the response magnitude is low. In the present case this is between 6 and 20 Hz. The statistical measures of the response force time histories of each of the twelve subjects indicate that there are potential anatomical 'end-stops' for the sprung mass in the inline axis. No previous study has applied this reverse path multi-input-single-output approach to human vibration kinematic and kinetic data before. The implementation demonstrated in the present study will allow new and existing data to be examined using this different analytical tool.

  14. Conditioning of FRF measurements for use with frequency based substructuring

    NASA Astrophysics Data System (ADS)

    Nicgorski, Dana; Avitabile, Peter

    2010-02-01

    Frequency based substructuring approaches have been used for the generation of system models from component data. While numerical models show successful results, there have been many difficulties with actual measurements in many instances. Previous work has identified some of these typical problems using simulated data to incorporate specific measurement difficulties commonly observed along with approaches to overcome some of these difficulties. This paper presents the results using actual measured data for a laboratory structure subjected to both analytical and experimental studies. Various commonly used approaches are shown to illustrate some of the difficulties with measured data. A new approach to better condition the measured functions and purge commonly found data measurement contaminants is utilized to provide dramatically improved results. Several cases are explored to show the difficulties commonly observed as well as the improved conditioning of the measured data to obtain acceptable results.

  15. Brize Norton RAF UK. Revised Uniform Summary of Surface Weather Observations. Parts A-F.

    DTIC Science & Technology

    1987-11-01

    BRIZE NORTON RAF UK MSC 036490 N 51 45 W 001 35 ELEV 285 FT EBVN PARTS A - F HOURS SUMMARIZED 0000 - 2300 LST PERIOD OF RECORD: HOURLY OBSERVATIONS: AUG 77

  16. Damping Property and Vibration Analysis of Blades with Viscoelastic Layers

    NASA Astrophysics Data System (ADS)

    Huang, Shyh-Chin; Chiu, Yi-Jui; Lu, Yao-Ju

    This paper showed the damping effect and the vibration analysis of a shaft-disk-blade system with viscoelastic layers on the blades. The focus of the research is on the shaft's torsional vibration and the blades' bending vibration. The equations of motion were derived from the energy approach. This model, unlike previous ones, used only two displacement functions for the layered blades. Then, the assumed-modes method was employed to discretize the equations. The analyses of natural frequencies and damping properties were discussed afterwards. The numerical results showed the damping effects due to various constraining layer (CL) and viscoelastic material (VEM) thicknesses. The research also compared FRF's of the systems with and without viscoelastic layers. It is concluded that both the CL and VEM layers promote the damping capability, but the marginal effect decreases with their thickness. The CLD treatment was also found to lower the natural frequencies slightly.

  17. Point-cloud-to-point-cloud technique on tool calibration for dental implant surgical path tracking

    NASA Astrophysics Data System (ADS)

    Lorsakul, Auranuch; Suthakorn, Jackrit; Sinthanayothin, Chanjira

    2008-03-01

    A dental implant is one of the most popular methods of tooth root replacement used in prosthetic dentistry. A computerized navigation system based on a pre-surgical plan is offered to minimize the potential risk of damage to critical anatomic structures of patients. Dental tool tip calibration is an important procedure in intraoperative surgery for determining the relation between the hand-piece tool tip and the hand-piece markers. When transferring coordinates from preoperative CT data to the operating field, this relation is one of the components of the typical registration problem and is a part of the navigation system to be developed for further integration. High accuracy is required, and the relation is obtained by point-cloud-to-point-cloud rigid transformations, using singular value decomposition (SVD) to minimize the rigid registration error. In earlier studies, commercial surgical navigation systems, such as those from BrainLAB and Materialize, have shown limited flexibility in tool tip calibration: they either require a special tool tip calibration device or are unable to accommodate a change of tool. The proposed procedure uses the pointing device or hand-piece to touch on a pivot, and the transformation matrix is calculated each time the hand-piece moves to a new position while the tool tip stays at the same point. The experiment builds on information from the tracking device, image acquisition and image processing algorithms. The key result is that the point-cloud-to-point-cloud approach requires only 3 pose images of the tool to converge to a minimum error of 0.77%, and the obtained result is correct when using the tool holder to track the path simulation line displayed in the graphic animation.
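
    The point-cloud-to-point-cloud step mentioned above is commonly solved with the SVD-based (Kabsch) rigid-registration algorithm. A minimal, self-contained sketch of that general algorithm follows; the marker coordinates are made-up values, and this is not claimed to be the authors' exact implementation.

      import numpy as np

      def rigid_transform_svd(P, Q):
          """Best-fit rotation R and translation t so that R @ P + t ~ Q.

          P, Q: (3, N) arrays of corresponding points (e.g. marker positions
          seen in two coordinate frames). Classic SVD/Kabsch solution.
          """
          cP, cQ = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
          H = (P - cP) @ (Q - cQ).T                 # 3x3 covariance of centred point sets
          U, _, Vt = np.linalg.svd(H)
          d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
          R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
          t = cQ - R @ cP
          return R, t

      # Hypothetical check with a known transform
      rng = np.random.default_rng(1)
      P = rng.random((3, 5))
      ang = np.deg2rad(30)
      R_true = np.array([[np.cos(ang), -np.sin(ang), 0],
                         [np.sin(ang),  np.cos(ang), 0],
                         [0, 0, 1]])
      t_true = np.array([[1.0], [2.0], [3.0]])
      Q = R_true @ P + t_true + 1e-4 * rng.standard_normal((3, 5))   # small measurement "noise"

      R, t = rigid_transform_svd(P, Q)
      rmse = np.sqrt(np.mean(np.sum((R @ P + t - Q) ** 2, axis=0)))
      print("registration RMSE:", rmse)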

  18. Breadth of Coverage, Ease of Use, and Quality of Mobile Point-of-Care Tool Information Summaries: An Evaluation

    PubMed Central

    Ren, Jinma

    2016-01-01

    Background With advances in mobile technology, accessibility of clinical resources at the point of care has increased. Objective The objective of this research was to identify if six selected mobile point-of-care tools meet the needs of clinicians in internal medicine. Point-of-care tools were evaluated for breadth of coverage, ease of use, and quality. Methods Six point-of-care tools were evaluated utilizing four different devices (two smartphones and two tablets). Breadth of coverage was measured using select International Classification of Diseases, Ninth Revision, codes if information on summary, etiology, pathophysiology, clinical manifestations, diagnosis, treatment, and prognosis was provided. Quality measures included treatment and diagnostic inline references and individual and application time stamping. Ease of use covered search within topic, table of contents, scrolling, affordance, connectivity, and personal accounts. Analysis of variance based on the rank of score was used. Results Breadth of coverage was similar among Medscape (mean 6.88), Uptodate (mean 6.51), DynaMedPlus (mean 6.46), and EvidencePlus (mean 6.41) (P>.05) with DynaMed (mean 5.53) and Epocrates (mean 6.12) scoring significantly lower (P<.05). Ease of use had DynaMedPlus with the highest score, and EvidencePlus was lowest (6.0 vs 4.0, respectively, P<.05). For quality, reviewers rated the same score (4.00) for all tools except for Medscape, which was rated lower (P<.05). Conclusions For breadth of coverage, most point-of-care tools were similar with the exception of DynaMed. For ease of use, only UpToDate and DynaMedPlus allow for search within a topic. All point-of-care tools have remote access with the exception of UpToDate and Essential Evidence Plus. All tools except Medscape covered criteria for quality evaluation. Overall, there was no significant difference between the point-of-care tools with regard to coverage on common topics used by internal medicine clinicians. Selection of point-of-care tools is highly dependent on individual preference based on ease of use and cost of the application. PMID:27733328

  19. A critical review of seven selected neighborhood sustainability assessment tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharifi, Ayyoob, E-mail: sharifi.ayyoob@a.mbox.nagoya-u.ac.jp; Murayama, Akito, E-mail: murayama@corot.nuac.nagoya-u.ac.jp

    2013-01-15

    Neighborhood sustainability assessment tools have become widespread since the turn of 21st century and many communities, mainly in the developed world, are utilizing these tools to measure their success in approaching sustainable development goals. In this study, seven tools from Australia, Europe, Japan, and the United States are selected and analyzed with the aim of providing insights into the current situations; highlighting the strengths, weaknesses, successes, and failures; and making recommendations for future improvements. Using a content analysis, the issues of sustainability coverage, pre-requisites, local adaptability, scoring and weighting, participation, reporting, and applicability are discussed in this paper. The results of this study indicate that most of the tools are not doing well regarding the coverage of social, economic, and institutional aspects of sustainability; there are ambiguities and shortcomings in the weighting, scoring, and rating; in most cases, there is no mechanism for local adaptability and participation; and, only those tools which are embedded within the broader planning framework are doing well with regard to applicability. - Highlights: • Seven widely used assessment tools were analyzed. • There is a lack of balanced assessment of sustainability dimensions. • Tools are not doing well regarding the applicability. • Refinements are needed to make the tools more effective. • Assessment tools must be integrated into the planning process.

  20. Nearly arc-length tool path generation and tool radius compensation algorithm research in FTS turning

    NASA Astrophysics Data System (ADS)

    Zhao, Minghui; Zhao, Xuesen; Li, Zengqiang; Sun, Tao

    2014-08-01

    In the generation of non-rotationally symmetric microstructure surfaces by turning with a Fast Tool Servo (FTS), non-uniform distribution of the interpolation data points leads to long processing cycles and poor surface quality. To improve this situation, a nearly arc-length tool path generation algorithm is proposed, which generates tool tip trajectory points at nearly equal arc lengths instead of the traditional interpolation rule of equal angle, and adds tool radius compensation. All the interpolation points are equidistant in the radial direction because of the constant feed speed of the X slider; the high frequency tool radius compensation components appear in both the X direction and the Z direction, which makes it difficult for the X slider to follow the commanded motion because of its large mass. The Newton iterative method is used to calculate the coordinates of the neighboring contour tangent point, with the interpolation point X position as the initial value; in this way the new Z coordinate value is obtained and the high frequency motion component in the X direction is decomposed into the Z direction. Taking as a test case a typical microstructure with 4 μm PV value, mixed from two sine waves of 70 μm wavelength, the maximum profile error at an angle of fifteen degrees is less than 0.01 μm when turning with a diamond tool with a large nose radius of 80 μm. The sinusoidal grid was machined successfully on an ultra-precision lathe; the wavelength is 70.2278 μm and the Ra value is 22.81 nm, evaluated from data points generated by filtering out the first five harmonics.
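
    A minimal sketch of the tangent-point idea described above: for a tool-centre X position fixed by the constant feed, a Newton iteration finds the contour point whose surface normal passes through the tool centre, which then gives the radius-compensated Z command. The surface, amplitude and wavelength below are placeholders chosen so that the 80 μm nose radius is everywhere smaller than the local radius of curvature; this is not the paper's exact formulation.

      import numpy as np

      # Illustrative designed contour: a single sinusoid (placeholder, in mm)
      A, lam = 0.002, 0.2          # 2 um amplitude, 200 um wavelength
      f   = lambda x: A * np.sin(2 * np.pi * x / lam)
      fp  = lambda x: A * (2 * np.pi / lam) * np.cos(2 * np.pi * x / lam)
      fpp = lambda x: -A * (2 * np.pi / lam) ** 2 * np.sin(2 * np.pi * x / lam)

      r = 0.08                     # tool nose radius, mm (80 um)

      def contact_point(x0, tol=1e-9, max_iter=20):
          """Newton solve for the contour point x_c whose normal passes through a
          tool centre located at X = x0 (the constant-feed interpolation position)."""
          xc = x0                  # interpolation X position used as the initial guess
          for _ in range(max_iter):
              s = np.sqrt(1 + fp(xc) ** 2)
              g = xc - r * fp(xc) / s - x0          # tangency condition g(x_c) = 0
              dg = 1 - r * fpp(xc) / s ** 3
              step = g / dg
              xc -= step
              if abs(step) < tol:
                  break
          return xc

      # Radius-compensated commands for equally spaced X positions over one wavelength
      for x0 in np.linspace(0, lam, 8):
          xc = contact_point(x0)
          s = np.sqrt(1 + fp(xc) ** 2)
          zc = f(xc) + r / s       # tool-centre Z; the nose tip sits at zc - r
          print(f"X command {x0:8.5f} mm   contact x {xc:8.5f} mm   Z command {zc - r:10.6f} mm")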

  1. Evaluating Gaze-Based Interface Tools to Facilitate Point-and-Select Tasks with Small Targets

    ERIC Educational Resources Information Center

    Skovsgaard, Henrik; Mateo, Julio C.; Hansen, John Paulin

    2011-01-01

    Gaze interaction affords hands-free control of computers. Pointing to and selecting small targets using gaze alone is difficult because of the limited accuracy of gaze pointing. This is the first experimental comparison of gaze-based interface tools for small-target (e.g. less than 12 x 12 pixels) point-and-select tasks. We conducted two…

  2. The Python Spectral Analysis Tool (PySAT) for Powerful, Flexible, and Easy Preprocessing and Machine Learning with Point Spectral Data

    NASA Astrophysics Data System (ADS)

    Anderson, R. B.; Finch, N.; Clegg, S. M.; Graff, T.; Morris, R. V.; Laura, J.

    2018-04-01

    The PySAT point spectra tool provides a flexible graphical interface, enabling scientists to apply a wide variety of preprocessing and machine learning methods to point spectral data, with an emphasis on multivariate regression.

  3. Detection of Earthquake-Induced Damage in a Framed Structure Using a Finite Element Model Updating Procedure

    PubMed Central

    Kim, Seung-Nam; Park, Taewon; Lee, Sang-Hyun

    2014-01-01

    Damage of a 5-story framed structure was identified from two types of measured data, which are frequency response functions (FRF) and natural frequencies, using a finite element (FE) model updating procedure. In this study, a procedure to determine the appropriate weightings for different groups of observations was proposed. In addition, a modified frame element which included rotational springs was used to construct the FE model for updating to represent concentrated damage at the member ends (a formulation for plastic hinges in framed structures subjected to strong earthquakes). The results of the model updating and subsequent damage detection when the rotational springs (RS model) were used were compared with those obtained using the conventional frame elements (FS model). Comparisons indicated that the RS model gave more accurate results than the FS model. That is, the errors in the natural frequencies of the updated models were smaller, and the identified damage showed clearer distinctions between damaged and undamaged members and was more consistent with observed damage. PMID:24574888

  4. 3D fluid-structure modelling and vibration analysis for fault diagnosis of Francis turbine using multiple ANN and multiple ANFIS

    NASA Astrophysics Data System (ADS)

    Saeed, R. A.; Galybin, A. N.; Popov, V.

    2013-01-01

    This paper discusses condition monitoring and fault diagnosis in a Francis turbine based on integration of numerical modelling with several different artificial intelligence (AI) techniques. In this study, a numerical approach for fluid-structure (turbine runner) analysis is presented. The results of the numerical analysis provide frequency response function (FRF) data sets along the x-, y- and z-directions under different operating loads and different positions and sizes of faults in the structure. To extract features and reduce the dimensionality of the obtained FRF data, principal component analysis (PCA) has been applied. Subsequently, the extracted features are formulated and fed into multiple artificial neural networks (ANN) and multiple adaptive neuro-fuzzy inference systems (ANFIS) in order to identify the size and position of the damage in the runner and estimate the turbine operating conditions. The results demonstrated the effectiveness of this approach and provide satisfactory accuracy even when the input data are corrupted with a certain level of noise.
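
    As an illustration of the PCA feature-reduction step described above, here is a small sketch on simulated FRF-like data; the data generation, dimensions and 95% variance threshold are assumptions for the example, not values from the paper.

      import numpy as np
      from sklearn.decomposition import PCA

      # Stand-in data set: n_cases rows (operating conditions / fault cases), each a
      # spectrum sampled at n_freq lines, driven by a few latent "mode" patterns.
      rng = np.random.default_rng(0)
      n_cases, n_freq = 120, 800
      freqs = np.linspace(0, 1, n_freq)
      patterns = np.vstack([np.sin(2 * np.pi * k * freqs) for k in (1, 3, 7)])   # 3 latent patterns
      weights = rng.standard_normal((n_cases, 3))                                # case-dependent participation
      X = weights @ patterns + 0.05 * rng.standard_normal((n_cases, n_freq))

      pca = PCA(n_components=0.95)        # keep enough components to explain 95% of the variance
      features = pca.fit_transform(X)     # low-dimensional inputs for the ANN/ANFIS stage

      print("original dimension :", X.shape[1])
      print("reduced dimension  :", features.shape[1])
      print("explained variance :", pca.explained_variance_ratio_.sum().round(3))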

  5. New Experiments and a Model-Driven Approach for Interpreting Middle Stone Age Lithic Point Function Using the Edge Damage Distribution Method.

    PubMed

    Schoville, Benjamin J; Brown, Kyle S; Harris, Jacob A; Wilkins, Jayne

    2016-01-01

    The Middle Stone Age (MSA) is associated with early evidence for symbolic material culture and complex technological innovations. However, one of the most visible aspects of MSA technologies are unretouched triangular stone points that appear in the archaeological record as early as 500,000 years ago in Africa and persist throughout the MSA. How these tools were being used and discarded across a changing Pleistocene landscape can provide insight into how MSA populations prioritized technological and foraging decisions. Creating inferential links between experimental and archaeological tool use helps to establish prehistoric tool function, but is complicated by the overlaying of post-depositional damage onto behaviorally worn tools. Taphonomic damage patterning can provide insight into site formation history, but may preclude behavioral interpretations of tool function. Here, multiple experimental processes that form edge damage on unretouched lithic points from taphonomic and behavioral processes are presented. These provide experimental distributions of wear on tool edges from known processes that are then quantitatively compared to the archaeological patterning of stone point edge damage from three MSA lithic assemblages-Kathu Pan 1, Pinnacle Point Cave 13B, and Die Kelders Cave 1. By using a model-fitting approach, the results presented here provide evidence for variable MSA behavioral strategies of stone point utilization on the landscape consistent with armature tips at KP1, and cutting tools at PP13B and DK1, as well as damage contributions from post-depositional sources across assemblages. This study provides a method with which landscape-scale questions of early modern human tool-use and site-use can be addressed.

  7. SMART-COP: a tool for predicting the need for intensive respiratory or vasopressor support in community-acquired pneumonia.

    PubMed

    Charles, Patrick G P; Wolfe, Rory; Whitby, Michael; Fine, Michael J; Fuller, Andrew J; Stirling, Robert; Wright, Alistair A; Ramirez, Julio A; Christiansen, Keryn J; Waterer, Grant W; Pierce, Robert J; Armstrong, John G; Korman, Tony M; Holmes, Peter; Obrosky, D Scott; Peyrani, Paula; Johnson, Barbara; Hooy, Michelle; Grayson, M Lindsay

    2008-08-01

    Existing severity assessment tools, such as the pneumonia severity index (PSI) and CURB-65 (tool based on confusion, urea level, respiratory rate, blood pressure, and age ≥65 years), predict 30-day mortality in community-acquired pneumonia (CAP) and have limited ability to predict which patients will require intensive respiratory or vasopressor support (IRVS). The Australian CAP Study (ACAPS) was a prospective study of 882 episodes in which each patient had a detailed assessment of severity features, etiology, and treatment outcomes. Multivariate logistic regression was performed to identify features at initial assessment that were associated with receipt of IRVS. These results were converted into a simple points-based severity tool that was validated in 5 external databases, totaling 7464 patients. In ACAPS, 10.3% of patients received IRVS, and the 30-day mortality rate was 5.7%. The features statistically significantly associated with receipt of IRVS were low systolic blood pressure (2 points), multilobar chest radiography involvement (1 point), low albumin level (1 point), high respiratory rate (1 point), tachycardia (1 point), confusion (1 point), poor oxygenation (2 points), and low arterial pH (2 points): SMART-COP. A SMART-COP score of ≥3 points identified 92% of patients who received IRVS, including 84% of patients who did not need immediate admission to the intensive care unit. Accuracy was also high in the 5 validation databases. Sensitivities of PSI and CURB-65 for identifying the need for IRVS were 74% and 39%, respectively. SMART-COP is a simple, practical clinical tool for accurately predicting the need for IRVS that is likely to assist clinicians in determining CAP severity.
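
    The point weights listed in the abstract translate directly into a scoring function. A minimal sketch follows; the individual criterion thresholds (e.g. the age-adjusted blood pressure and oxygenation cut-offs) are not reproduced here, so each criterion is simply passed in as a boolean.

      def smart_cop(low_sbp, multilobar_cxr, low_albumin, high_rr,
                    tachycardia, confusion, poor_oxygenation, low_ph):
          """SMART-COP score from the eight criteria listed in the abstract
          (systolic BP, multilobar involvement, albumin, respiratory rate,
          tachycardia, confusion, oxygenation, arterial pH)."""
          score = (2 * low_sbp + multilobar_cxr + low_albumin + high_rr
                   + tachycardia + confusion + 2 * poor_oxygenation + 2 * low_ph)
          return score, score >= 3      # a score of 3 or more flags likely need for IRVS

      # Hypothetical patient: low systolic blood pressure and poor oxygenation only
      score, needs_irvs = smart_cop(True, False, False, False, False, False, True, False)
      print(score, needs_irvs)          # -> 4 True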

  8. A survey of ground operations tools developed to simulate the pointing of space telescopes and the design for WISE

    NASA Technical Reports Server (NTRS)

    Fabinsky, Beth

    2006-01-01

    WISE, the Wide Field Infrared Survey Explorer, is scheduled for launch in June 2010. The mission operations system for WISE requires a software modeling tool to help plan, integrate and simulate all spacecraft pointing and verify that no attitude constraints are violated. In the course of developing the requirements for this tool, an investigation was conducted into the design of similar tools for other space-based telescopes. This paper summarizes the ground software and processes used to plan and validate pointing for a selection of space telescopes; with this information as background, the design for WISE is presented.

  9. CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 4

    DTIC Science & Technology

    2005-04-01

    older automated cost-estimating tools are no longer being actively marketed but are still in use such as CheckPoint, COCOMO, ESTIMACS, REVIC, and SPQR ...estimation tools: SPQR/20, Checkpoint, and KnowledgePlan. These software estimation tools pioneered the use of function point metrics for sizing and

  10. Processing Uav and LIDAR Point Clouds in Grass GIS

    NASA Astrophysics Data System (ADS)

    Petras, V.; Petrasova, A.; Jeziorska, J.; Mitasova, H.

    2016-06-01

    Today's methods of acquiring Earth surface data, namely lidar and unmanned aerial vehicle (UAV) imagery, non-selectively collect or generate large amounts of points. Point clouds from different sources vary in their properties such as number of returns, density, or quality. We present a set of tools with applications for different types of point clouds obtained by a lidar scanner, structure from motion technique (SfM), and a low-cost 3D scanner. To take advantage of the vertical structure of multiple return lidar point clouds, we demonstrate tools to process them using 3D raster techniques which allow, for example, the development of custom vegetation classification methods. Dense point clouds obtained from UAV imagery, often containing redundant points, can be decimated using various techniques before further processing. We implemented and compared several decimation techniques in regard to their performance and the final digital surface model (DSM). Finally, we will describe the processing of a point cloud from a low-cost 3D scanner, namely Microsoft Kinect, and its application for interaction with physical models. All the presented tools are open source and integrated in GRASS GIS, a multi-purpose open source GIS with remote sensing capabilities. The tools integrate with other open source projects, specifically Point Data Abstraction Library (PDAL), Point Cloud Library (PCL), and OpenKinect libfreenect2 library to benefit from the open source point cloud ecosystem. The implementation in GRASS GIS ensures long term maintenance and reproducibility by the scientific community but also by the original authors themselves.

  11. Bathymetric Changes Shaped by Longshore Currents on a Natural Beach

    NASA Astrophysics Data System (ADS)

    Reilly, W. L.; Slinn, D.; Plant, N.

    2004-12-01

    The goal of the project is to simulate beach morphology on time scales of hours to days. Our approach is to develop finite difference solutions from a coupled modeling system consisting of existing nearshore circulation, wave, and sediment flux models. We initialize the model with bathymetry from a dense data set north of the pier at the Field Research Facility (FRF) in Duck, NC. We integrate the model system forward in time and compare the results of the hind-cast of the beach evolution with the field observations. The model domain extends 1000 meters in the alongshore direction and 500 meters in the cross-shore direction with 5 meter grid spacing. The bathymetry is interpolated and filtered from CRAB transects. A second-degree exponential smoothing method is used to return the cross-shore beach profile near the edges of the modeled domain back to the mean alongshore profile, because the circulation model implements periodic boundary conditions in the alongshore direction. The offshore wave height and direction are taken from the 8-meter bipod at the FRF and input to the wave-model, SWAN (Simulating WAves Nearshore), with a Gaussian-shaped frequency spectrum and a directional spreading of 5 degrees. A constant depth induced wave breaking parameter of 0.73 is used. The resulting calculated wave induced force per unit surface area (gradient of the radiation stress) output from SWAN is used to drive the currents in the circulation model. The circulation model is based on the free-surface non-linear shallow water equations and uses the fourth order compact scheme to calculate spatial derivatives and a third order Adams-Bashforth time discretization scheme. Free slip, symmetry boundary conditions are applied at both the shoreline and offshore boundaries. The time averaged sediment flux is calculated at each location after one hour of circulation. The sediment flux model is based on the approach of Bagnold and includes approximations for both bed-load and suspended load. The bathymetry is then updated by computing the divergence of the time averaged sediment fluxes. The process is then repeated using the updated bathymetry in both SWAN and the circulation model. The cycle continues for a simulation of 10 hours. The results of bathymetric change vary for different time-dependent wave conditions and initial bathymetric profiles. Typical results indicate that for wave heights on the order of one meter, shoreline advancement and sandbar evolution is observed on the order of tens of centimeters.
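
    The bathymetry-update step described above (depth change from the divergence of the time-averaged sediment flux) can be sketched as a simple Exner-type calculation. The grid, porosity and flux fields below are placeholders, not output from the SWAN/circulation/Bagnold chain used in the study.

      import numpy as np

      dx = dy = 5.0                      # grid spacing (m), as in the 5 m model grid
      dt = 3600.0                        # s, one hour of time-averaged transport
      porosity = 0.4                     # assumed bed porosity (not stated in the abstract)

      # depth grid: axis 0 = cross-shore, axis 1 = alongshore (placeholder planar beach)
      n_cross, n_along = 100, 200
      h = np.tile(np.linspace(0.5, 8.0, n_cross)[:, None], (1, n_along))

      rng = np.random.default_rng(0)
      q_cross = 1e-5 * rng.standard_normal((n_cross, n_along))   # cross-shore volumetric flux (m^2/s)
      q_along = 1e-4 * np.ones((n_cross, n_along))               # alongshore volumetric flux (m^2/s)

      # divergence of the time-averaged sediment flux (central differences)
      div_q = np.gradient(q_cross, dx, axis=0) + np.gradient(q_along, dy, axis=1)

      # Exner-type update: (1 - n) d(zb)/dt = -div(q); with depth h = -zb this gives
      # dh/dt = div(q) / (1 - n)
      dh = dt * div_q / (1.0 - porosity)
      h = h + dh
      print(f"max depth change after one hour: {np.abs(dh).max():.3f} m")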

  12. An ultra-precision tool nanoindentation instrument for replication of single point diamond tool cutting edges

    NASA Astrophysics Data System (ADS)

    Cai, Yindi; Chen, Yuan-Liu; Xu, Malu; Shimizu, Yuki; Ito, So; Matsukuma, Hiraku; Gao, Wei

    2018-05-01

    Precision replication of the diamond tool cutting edge is required for non-destructive tool metrology. This paper presents an ultra-precision tool nanoindentation instrument designed and constructed for replication of the cutting edge of a single point diamond tool onto a selected soft metal workpiece by precisely indenting the tool cutting edge into the workpiece surface. The instrument has the ability to control the indentation depth with a nanometric resolution, enabling the replication of tool cutting edges with high precision. The motion of the diamond tool along the indentation direction is controlled by the piezoelectric actuator of a fast tool servo (FTS). An integrated capacitive sensor of the FTS is employed to detect the displacement of the diamond tool. The soft metal workpiece is attached to an aluminum cantilever whose deflection is monitored by another capacitive sensor, referred to as an outside capacitive sensor. The indentation force and depth can be accurately evaluated from the diamond tool displacement, the cantilever deflection and the cantilever spring constant. Experiments were carried out by replicating the cutting edge of a single point diamond tool with a nose radius of 2.0 mm on a copper workpiece surface. The profile of the replicated tool cutting edge was measured using an atomic force microscope (AFM). The effectiveness of the instrument in precision replication of diamond tool cutting edges is well-verified by the experimental results.
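
    A minimal sketch of how the indentation force and depth can be evaluated from the two capacitive-sensor readings and the cantilever spring constant, as described above; all numerical values are assumed for illustration.

      # Placeholder sensor readings and spring constant (illustrative only)
      k_cantilever = 1.2e4        # N/m, calibrated cantilever spring constant (assumed)
      tool_disp = 850e-9          # m, diamond tool displacement after first contact (FTS sensor)
      cantilever_defl = 120e-9    # m, workpiece cantilever deflection (outside capacitive sensor)

      force = k_cantilever * cantilever_defl      # indentation force from the cantilever deflection
      depth = tool_disp - cantilever_defl         # depth actually indented into the workpiece

      print(f"indentation force: {force * 1e3:.3f} mN, indentation depth: {depth * 1e9:.0f} nm")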

  13. Robotic tool positioning process using a multi-line off-axis laser triangulation sensor

    NASA Astrophysics Data System (ADS)

    Pinto, T. C.; Matos, G.

    2018-03-01

    Proper positioning of a friction stir welding head for pin insertion, driven by a closed chain robot, is important to ensure quality repair of cracks. A multi-line off-axis laser triangulation sensor was designed to be integrated to the robot, allowing relative measurements of the surface to be repaired. This work describes the sensor characteristics, its evaluation and the measurement process for tool positioning to a surface point of interest. The developed process uses a point of interest image and a measured point cloud to define the translation and rotation for tool positioning. Sensor evaluation and tests are described. Keywords: laser triangulation, 3D measurement, tool positioning, robotics.

  14. BASINs and WEPP Climate Assessment Tools (CAT): Case ...

    EPA Pesticide Factsheets

    EPA announced the release of the final report, BASINs and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications. This report supports application of two recently developed water modeling tools, the Better Assessment Science Integrating point & Non-point Sources (BASINS) and the Water Erosion Prediction Project Climate Assessment Tool (WEPPCAT). The report presents a series of short, illustrative case studies designed to demonstrate the capabilities of these tools for conducting scenario-based assessments of the potential effects of climate change on streamflow and water quality.

  15. X-ray mask and method for making

    DOEpatents

    Morales, Alfredo M.

    2004-10-26

    The present invention describes a method for fabricating an x-ray mask tool, a contact lithographic mask which can provide an x-ray exposure dose that is adjustable from point-to-point. The tool is useful in the preparation of LIGA plating molds made from PMMA, or similar materials. In particular the tool is useful for providing an ability to apply a graded, or "stepped" x-ray exposure dose across a photosensitive substrate. By controlling the x-ray radiation dose from point-to-point, it is possible to control the development process for removing exposed portions of the substrate, adjusting it such that each of these portions develops at a more or less uniform rate regardless of feature size or feature density distribution.

  16. Cutting tool form compensation system and method

    DOEpatents

    Barkman, W.E.; Babelay, E.F. Jr.; Klages, E.J.

    1993-10-19

    A compensation system for a computer-controlled machining apparatus having a controller and including a cutting tool and a workpiece holder which are movable relative to one another along a preprogrammed path during a machining operation utilizes a camera and a vision computer for gathering information at a preselected stage of a machining operation relating to the actual shape and size of the cutting edge of the cutting tool and for altering the preprogrammed path in accordance with detected variations between the actual size and shape of the cutting edge and an assumed size and shape of the cutting edge. The camera obtains an image of the cutting tool against a background so that the cutting tool and background possess contrasting light intensities, and the vision computer utilizes the contrasting light intensities of the image to locate points therein which correspond to points along the actual cutting edge. Following a series of computations involving the determining of a tool center from the points identified along the tool edge, the results of the computations are fed to the controller where the preprogrammed path is altered as aforedescribed. 9 figures.
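
    As a rough illustration of the edge-point and tool-centre computation described in the patent, the sketch below fits a least-squares circle (Kasa method) to hypothetical edge pixels extracted from a thresholded tool image; it is an illustrative stand-in, not the patented vision algorithm.

      import numpy as np

      def fit_circle(x, y):
          """Algebraic (Kasa) least-squares circle fit to edge points.

          Solves x^2 + y^2 = 2*a*x + 2*b*y + c for the centre (a, b) and the
          radius sqrt(c + a^2 + b^2).
          """
          A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
          rhs = x**2 + y**2
          (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
          return (a, b), np.sqrt(c + a**2 + b**2)

      # Hypothetical edge points along a circular cutting edge (pixel coordinates)
      rng = np.random.default_rng(2)
      theta = np.linspace(0.3, 1.2, 40)
      x = 310 + 85 * np.cos(theta) + rng.normal(0, 0.3, 40)
      y = 205 + 85 * np.sin(theta) + rng.normal(0, 0.3, 40)

      centre, radius = fit_circle(x, y)
      print("tool centre (px):", np.round(centre, 1), " edge radius (px):", round(float(radius), 1))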

  18. Generating DEM from LIDAR data - comparison of available software tools

    NASA Astrophysics Data System (ADS)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools: both commercial and open source ones. Two sites in a mountain area were selected for the study. The area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
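
    The raster comparison described above reduces to a few per-cell difference statistics. A minimal sketch with synthetic stand-in rasters (the grid size and error level are arbitrary assumptions):

      import numpy as np

      # Stand-in rasters of elevation (m); real data would be the reference DEM and
      # the DEM produced by a given ground-classification tool on the same grid.
      rng = np.random.default_rng(0)
      dem_reference = 800 + 50 * rng.random((500, 500))
      dem_tool = dem_reference + rng.normal(0.0, 0.15, (500, 500))   # simulated tool output

      diff = dem_tool - dem_reference
      stats = {
          "min difference (m)":  diff.min(),
          "max difference (m)":  diff.max(),
          "mean difference (m)": diff.mean(),
          "RMSE (m)":            np.sqrt(np.mean(diff**2)),
      }
      for name, value in stats.items():
          print(f"{name:20s} {value: .3f}")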

  19. Beyond Jeopardy and Lectures: Using "Microsoft PowerPoint" as a Game Design Tool to Teach Science

    ERIC Educational Resources Information Center

    Siko, Jason; Barbour, Michael; Toker, Sacip

    2011-01-01

    To date, research involving homemade PowerPoint games as an instructional tool has not shown statistically significant gains in student performance. This paper examines the results of a study comparing the performance of students in a high school chemistry course who created homemade PowerPoint games as a test review with the students who used a…

  20. Lathe leveler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lovelady, III, Michael W.J.

    A lathe leveler for centering a cutting tool in relation to a cylindrical work piece includes a first leveling arm having a first contact point disposed adjacent a distal end of the first leveling arm, a second leveling arm having a second contact point disposed adjacent a distal end of the second leveling arm, a leveling gage, and a leveling plate having a cutting tool receiving surface positioned parallel to a horizontal axis of the leveling gage and on a same plane as a midpoint of the first contact point and the second contact point. The leveling arms and leveling plate are dimensioned and configured such that the cutting tool receiving surface is centered in relation to the work piece when the first and second contact points are in contact with one of the inner surface and outer surface of the cylindrical work piece and the leveling gage is centered.

  1. There Is More than One Way to Crack an Oyster: Identifying Variation in Burmese Long-Tailed Macaque (Macaca fascicularis aurea) Stone-Tool Use

    PubMed Central

    Tan, Amanda; Tan, Say Hoon; Vyas, Dhaval; Malaivijitnond, Suchinda; Gumert, Michael D.

    2015-01-01

    We explored variation in patterns of percussive stone-tool use on coastal foods by Burmese long-tailed macaques (Macaca fascicularis aurea) from two islands in Laem Son National Park, Ranong, Thailand. We catalogued variation into three hammering classes and 17 action patterns, after examining 638 tool-use bouts across 90 individuals. Hammering class was based on the stone surface used for striking food, being face, point, and edge hammering. Action patterns were discriminated by tool material, hand use, posture, and striking motion. Hammering class was analyzed for associations with material and behavioural elements of tool use. Action patterns were not, owing to insufficient instances of most patterns. We collected 3077 scan samples from 109 macaques on Piak Nam Yai Island’s coasts, to determine the proportion of individuals using each hammering class and action pattern. Point hammering was significantly more associated with sessile foods, smaller tools, faster striking rates, smoother recoil, unimanual use, and more varied striking direction, than were face and edge hammering, while both point and edge hammering were significantly more associated with precision gripping than face hammering. Edge hammering also showed distinct differences depending on whether such hammering was applied to sessile or unattached foods, resembling point hammering for sessile foods and face hammering for unattached foods. Point hammering and sessile edge hammering compared to prior descriptions of axe hammering, while face and unattached edge hammering compared to pound hammering. Analysis of scans showed that 80% of individuals used tools, each employing one to four different action patterns. The most common patterns were unimanual point hammering (58%), symmetrical-bimanual face hammering (47%) and unimanual face hammering (37%). Unimanual edge hammering was relatively frequent (13%), compared to the other thirteen rare action patterns (<5%). We compare our study to other stone-using primates, and discuss implications for further research. PMID:25970286

  2. Handheld tools assess medical necessity at the point of care.

    PubMed

    Pollard, Dan

    2002-01-01

    An emerging strategy to manage financial risk in clinical practice is to involve the physician at the point of care. Using handheld technology, encounter-specific information along with medical necessity policy can be presented to physicians allowing them to integrate it into their medical decision-making process. Three different strategies are discussed: reference books or paper encounter forms, electronic reference tools, and integrated process tools. The electronic reference tool strategy was evaluated and showed a return on investment exceeding 1200% due to reduced overhead costs associated with rework of claim errors.

  3. Pointo - a Low Cost Solution to Point Cloud Processing

    NASA Astrophysics Data System (ADS)

    Houshiar, H.; Winkler, S.

    2017-11-01

    With advances in technology, access to data, especially 3D point cloud data, becomes more and more an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners or with very time consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field and usually comes as very large software packages containing a variety of methods and tools. This results in software that is usually very expensive to acquire and also very difficult to use. Difficulty of use is caused by the complicated user interfaces that are required to accommodate a large list of features. The aim of these complex software packages is to provide a powerful tool for a specific group of specialists. However, they are not necessarily required by the majority of the upcoming average users of point clouds. In addition to the complexity and high cost of these packages, they generally rely on expensive and modern hardware and are only compatible with one specific operating system. Many point cloud customers are not point cloud processing experts or willing to pay the high acquisition costs of these expensive software and hardware products. In this paper we introduce a solution for low cost point cloud processing. Our approach is designed to accommodate the needs of the average point cloud user. To reduce the cost and complexity of software, our approach focuses on one functionality at a time, in contrast with most available software and tools that aim to solve as many problems as possible at the same time. Our simple and user oriented design improves the user experience and allows us to optimize our methods for the creation of efficient software. In this paper we introduce the Pointo family as a series of connected software tools with a simple design for different point cloud processing requirements. PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family, providing fast and efficient visualization with the ability to add annotation and documentation to the point clouds.

  4. Acoustic emission from single point machining: Part 2, Signal changes with tool wear

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heiple, C.R.; Carpenter, S.H.; Armentrout, D.L.

    1989-01-01

    Changes in acoustic emission signal characteristics with tool wear were monitored during single point machining of 4340 steel and Ti-6Al-4V heat treated to several strength levels, 6061-T6 aluminum, 304 stainless steel, 17-4PH stainless steel, 410 stainless steel, lead, and teflon. No signal characteristic changed in the same way with tool wear for all materials tested. A single change in a particular AE signal characteristic with tool wear valid for all materials probably does not exist. Nevertheless, changes in various signal characteristics with wear for a given material may be sufficient to be used to monitor tool wear.

  5. Application of a Phase-resolving, Directional Nonlinear Spectral Wave Model

    NASA Astrophysics Data System (ADS)

    Davis, J. R.; Sheremet, A.; Tian, M.; Hanson, J. L.

    2014-12-01

    We describe several applications of a phase-resolving, directional nonlinear spectral wave model. The model describes a 2D surface gravity wave field approaching a mildly sloping beach with parallel depth contours at an arbitrary angle accounting for nonlinear, quadratic triad interactions. The model is hyperbolic, with the initial wave spectrum specified in deep water. Complex amplitudes are generated based on the random phase approximation. The numerical implementation includes unidirectional propagation as a special case. In directional mode, it solves the system of equations in the frequency-alongshore wave number space. Recent enhancements of the model include the incorporation of dissipation caused by breaking and propagation over a viscous mud layer and the calculation of wave induced setup. Applications presented include: a JONSWAP spectrum with a cos2s directional distribution, for shore-perpendicular and oblique propagation, a study of the evolution of a single directional triad, and several preliminary comparisons to wave spectra collected at the USACE-FRF in Duck, NC which show encouraging results although further validation with a wider range of beach slopes and wave conditions is needed.

  6. [A novel method for extracting leaf-level solar-induced fluorescence of typical crops under Cu stress].

    PubMed

    Qu, Ying; Liu, Su-hong; Li, Xiao-wen

    2012-05-01

    The leaf-level solar-induced fluorescence changes when typical crops are under Cu stress, which can be considered a sensitive indicator to estimate the stress level. In the present study, wheat (Triticum aestivum L.), pea (Pisum sativum L.) and Chinese cabbage (Brassica campestris L.) were selected and cultured with copper solutions or copper polluted soil with different levels of Cu stress. The apparent reflectance of leaves was measured by an ASD Fieldspec spectrometer and an integrating sphere. As the apparent reflectance was seldom affected by the fluorescence emission at 580-650 and 800-1000 nm, the apparent solar-induced fluorescence can be separated from the apparent reflectance based on the PROSPECT model. The re-absorption effect of chlorophyll was corrected by three methods, called GM (Gitelson et al.'s model), AM (Agati et al.'s model) and LM (Lagorio et al.'s model). After the re-absorption correction, the solar-induced fluorescence under different Cu stress was obtained, and a positive relationship was found between the height of the far-red fluorescence (FRF) and the copper content in leaves.

  7. Investigation of Concrete Floor Vibration Using Heel-Drop Test

    NASA Astrophysics Data System (ADS)

    Azaman, N. A. Mohd; Ghafar, N. H. Abd; Azhar, A. F.; Fauzi, A. A.; Ismail, H. A.; Syed Idrus, S. S.; Mokhjar, S. S.; Hamid, F. F. Abd

    2018-04-01

    In recent years, there has been an increase in floor vibration problems in structures such as residential and commercial buildings. Vibration is a serviceability issue related to the comfort of the occupants or damage to equipment. Human activities are the main source of vibration in buildings, and they can affect human comfort and cause annoyance to residents when the vibration exceeds the recommended level. A new building, Madrasah Tahfiz, located at Yong Peng, has a vibration problem when load is applied to the first floor of the building. However, the vibration limit of the building is unknown. Therefore, testing is needed to determine the vibration behaviour (frequency, damping ratio and mode shape) of the building. Heel-drop tests with a pace of 2 Hz were used in the field measurements to obtain the vibration response. Since heel-drop results vary with the performance of the person carrying out the drop, the test was repeated three times to reduce uncertainty. The natural frequencies obtained from Frequency Response Function (FRF) analysis were 17.4 Hz, 16.8 Hz and 17.4 Hz for the three tests, respectively.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haapio, Appu, E-mail: appu.haapio@vtt.fi

    Requirements for the assessment tools of buildings have increased; assessing building components or separate buildings is not enough. Neighbourhoods, the built environment, public transportation, and services should be considered simultaneously. The number of people living in urban areas is high and increasing rapidly. Urbanisation is a major concern due to its detrimental effects on the environment. The aim of this study is to clarify the field of assessment tools for urban communities by analysing the current situation. The focus is on internationally well known assessment tools; BREEAM Communities, CASBEE for Urban Development and LEED for Neighborhood Development. The interest towards certification systems is increasing amongst the authorities, and especially amongst the global investors and property developers. Achieved certifications are expected to bring measurable publicity for the developers. The assessment of urban areas enables the comparison of municipalities and urban areas, and notably supports decision making processes. Authorities, city planners, and designers would benefit most from the use of the tools during the decision making process. - Highlights: • The urban assessment tools have strong linkage to the region. • The tools promote complementary building and retrofitting existing sites. • Sharing knowledge and experiences is important in the development of the tools.

  9. Python Spectral Analysis Tool (PySAT) for Preprocessing, Multivariate Analysis, and Machine Learning with Point Spectra

    NASA Astrophysics Data System (ADS)

    Anderson, R. B.; Finch, N.; Clegg, S.; Graff, T.; Morris, R. V.; Laura, J.

    2017-06-01

    We present a Python-based library and graphical interface for the analysis of point spectra. The tool is being developed with a focus on methods used for ChemCam data, but is flexible enough to handle spectra from other instruments.

  10. Acoustic emission from single point machining: Part 2, Signal changes with tool wear. Revised

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heiple, C.R.; Carpenter, S.H.; Armentrout, D.L.

    1989-12-31

    Changes in acoustic emission signal characteristics with tool wear were monitored during single point machining of 4340 steel and Ti-6Al-4V heat treated to several strength levels, 6061-T6 aluminum, 304 stainless steel, 17-4PH stainless steel, 410 stainless steel, lead, and teflon. No signal characteristic changed in the same way with tool wear for all materials tested. A single change in a particular AE signal characteristic with tool wear valid for all materials probably does not exist. Nevertheless, changes in various signal characteristics with wear for a given material may be sufficient to be used to monitor tool wear.

  11. On constraining pilot point calibration with regularization in PEST

    USGS Publications Warehouse

    Fienen, M.N.; Muffels, C.T.; Hunt, R.J.

    2009-01-01

    Ground water model calibration has made great advances in recent years with practical tools such as PEST being instrumental for making the latest techniques available to practitioners. As models and calibration tools get more sophisticated, however, the power of these tools can be misapplied, resulting in poor parameter estimates and/or nonoptimally calibrated models that do not suit their intended purpose. Here, we focus on an increasingly common technique for calibrating highly parameterized numerical models - pilot point parameterization with Tikhonov regularization. Pilot points are a popular method for spatially parameterizing complex hydrogeologic systems; however, additional flexibility offered by pilot points can become problematic if not constrained by Tikhonov regularization. The objective of this work is to explain and illustrate the specific roles played by control variables in the PEST software for Tikhonov regularization applied to pilot points. A recent study encountered difficulties implementing this approach, but through examination of that analysis, insight into underlying sources of potential misapplication can be gained and some guidelines for overcoming them developed. ?? 2009 National Ground Water Association.
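
    As background to the role Tikhonov regularization plays here, the sketch below shows the generic regularized least-squares problem (a data-misfit term plus a weighted penalty on parameter roughness) on a toy pilot-point-like example. It illustrates the general concept only; it does not reproduce PEST's control variables or its iterative regularized inversion.

      import numpy as np

      def tikhonov_solve(G, d, L, lam):
          """Solve min ||G m - d||^2 + lam^2 * ||L m||^2 by stacking the
          regularization rows beneath the observation rows."""
          A = np.vstack([G, lam * L])
          b = np.concatenate([d, np.zeros(L.shape[0])])
          m, *_ = np.linalg.lstsq(A, b, rcond=None)
          return m

      # Toy example: 20 "pilot point" parameters constrained by only 8 noisy observations
      rng = np.random.default_rng(0)
      n_par, n_obs = 20, 8
      G = rng.standard_normal((n_obs, n_par))
      m_true = np.sin(np.linspace(0, np.pi, n_par))          # smooth "true" parameter field
      d = G @ m_true + 0.05 * rng.standard_normal(n_obs)

      # First-difference operator: the penalty prefers a smooth parameter field
      L = np.eye(n_par, k=1)[:-1] - np.eye(n_par)[:-1]

      for lam in (0.0, 0.1, 1.0, 10.0):
          m = tikhonov_solve(G, d, L, lam)
          print(f"lambda={lam:5.1f}  roughness={np.linalg.norm(L @ m):6.3f}  "
                f"misfit={np.linalg.norm(G @ m - d):6.3f}")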

  12. Developing and using a rubric for evaluating evidence-based medicine point-of-care tools.

    PubMed

    Shurtz, Suzanne; Foster, Margaret J

    2011-07-01

    The research sought to establish a rubric for evaluating evidence-based medicine (EBM) point-of-care tools in a health sciences library. The authors searched the literature for EBM tool evaluations and found that most previous reviews were designed to evaluate the ability of an EBM tool to answer a clinical question. The researchers' goal was to develop and complete rubrics for assessing these tools based on criteria for a general evaluation of tools (reviewing content, search options, quality control, and grading) and criteria for an evaluation of clinical summaries (searching tools for treatments of common diagnoses and evaluating summaries for quality control). Differences between EBM tools' options, content coverage, and usability were minimal. However, the products' methods for locating and grading evidence varied widely in transparency and process. As EBM tools are constantly updating and evolving, evaluation of these tools needs to be conducted frequently. Standards for evaluating EBM tools need to be established, with one method being the use of objective rubrics. In addition, EBM tools need to provide more information about authorship, reviewers, methods for evidence collection, and grading system employed.

  13. Microsoft Producer: A Software Tool for Creating Multimedia PowerPoint[R] Presentations

    ERIC Educational Resources Information Center

    Leffingwell, Thad R.; Thomas, David G.; Elliott, William H.

    2007-01-01

    Microsoft[R] Producer[R] is a powerful yet user-friendly PowerPoint companion tool for creating on-demand multimedia presentations. Instructors can easily distribute these presentations via compact disc or streaming media over the Internet. We describe the features of the software, system requirements, and other required hardware. We also describe…

  14. Predictive validity of the identification of seniors at risk screening tool in a German emergency department setting.

    PubMed

    Singler, Katrin; Heppner, Hans Jürgen; Skutetzky, Andreas; Sieber, Cornel; Christ, Michael; Thiem, Ulrich

    2014-01-01

    The identification of patients at high risk for adverse outcomes [death, unplanned readmission to emergency department (ED)/hospital, functional decline] plays an important role in emergency medicine. The Identification of Seniors at Risk (ISAR) instrument is one of the most commonly used and best-validated screening tools. To the authors' knowledge, there are so far no data on any screening tool for the identification of older patients at risk of a negative outcome in Germany. The aim was to evaluate the validity of the ISAR screening tool in a German ED. This was a prospective single-center observational cohort study in an ED of an urban university-affiliated hospital. Participants were 520 patients aged ≥75 years consecutively admitted to the ED. The German version of the ISAR screening tool was administered directly after triage of the patients. Follow-up telephone interviews to assess outcome variables were conducted 28 and 180 days after the index visit in the ED. The primary end point was death from any cause or hospitalization or recurrent ED visit or change of residency into a long-term care facility on day 28 after the index ED visit. The mean age ± SD was 82.8 ± 5.0 years. According to ISAR, 425 patients (81.7%) scored ≥2 points, and 315 patients (60.5%) scored ≥3 points. The combined primary end point was observed in 250 of 520 patients (48.1%) on day 28 and in 260 patients (50.0%) on day 180. Using a continuous ISAR score the area under the curve on day 28 was 0.621 (95% confidence interval, CI 0.573-0.669) and 0.661 (95% CI 0.615-0.708) on day 180, respectively. The German version of the ISAR screening tool acceptably identified elderly patients in the ED with an increased risk of a negative outcome. Using the cutoff ≥3 points instead of ≥2 points yielded better overall results.

  15. An integrated set of UNIX based system tools at control room level

    NASA Astrophysics Data System (ADS)

    Potepan, F.; Scafuri, C.; Bortolotto, C.; Surace, G.

    1994-12-01

    The design effort of providing a simple point-and-click approach to the equipment access has led to the definition and realization of a modular set of software tools to be used at the ELETTRA control room level. Point-to-point equipment access requires neither programming nor specific knowledge of the control system architecture. The development and integration of communication, graphic, editing and global database modules are described in depth, followed by a report of their use in the first commissioning period.

  16. Perpetual Points: New Tool for Localization of Coexisting Attractors in Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Dudkowski, Dawid; Prasad, Awadhesh; Kapitaniak, Tomasz

    Perpetual points (PPs) are special critical points for which the magnitude of acceleration describing the dynamics drops to zero, while the motion is still possible (stationary points are excluded), e.g. considering the motion of the particle in the potential field, at perpetual point, it has zero acceleration and nonzero velocity. We show that using PPs we can trace all the stable fixed points in the system, and that the structure of trajectories leading from former points to stable equilibria may be similar to orbits obtained from unstable stationary points. Moreover, we argue that the concept of perpetual points may be useful in tracing unexpected attractors (hidden or rare attractors with small basins of attraction). We show potential applicability of this approach by analyzing several representative systems of physical significance, including the damped oscillator, pendula, and the Henon map. We suggest that perpetual points may be a useful tool for localizing coexisting attractors in dynamical systems.
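
    To make the definition above concrete, the sketch below numerically locates perpetual points of a simple, arbitrarily chosen damped double-well oscillator: states where the acceleration J(x)F(x) vanishes while the velocity F(x) does not. The system and all settings are illustrative, not taken from the paper.

      import numpy as np
      from scipy.optimize import fsolve

      # Illustrative system only: x' = y, y' = -0.5*y + x - x**3
      def F(state):
          x, y = state
          return np.array([y, -0.5 * y + x - x**3])

      def J(state):
          x, y = state
          return np.array([[0.0, 1.0],
                           [1.0 - 3.0 * x**2, -0.5]])

      def acceleration(state):
          # acceleration along the flow: d/dt F(x(t)) = J(x) F(x)
          return J(state) @ F(state)

      # Search from a grid of initial guesses and keep distinct non-stationary roots
      found = []
      for x0 in np.linspace(-2, 2, 9):
          for y0 in np.linspace(-2, 2, 9):
              sol, _, ier, _ = fsolve(acceleration, [x0, y0], full_output=True)
              if ier == 1 and np.linalg.norm(F(sol)) > 1e-6:     # motion still possible
                  if not any(np.allclose(sol, p, atol=1e-4) for p in found):
                      found.append(sol)

      for p in found:
          print("perpetual point:", np.round(p, 4), " velocity there:", np.round(F(p), 4))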

  17. Multisurface fixture permits easy grinding of tool bit angles

    NASA Technical Reports Server (NTRS)

    Jones, C. R.

    1966-01-01

    Multisurface fixture with a tool holder permits accurate grinding and finishing of right- and left-hand single-point threading tools. All angles are ground by changing the fixture position to rest at various reference angles without removing the tool from the holder.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, R; Zhu, X; Li, S

    Purpose: High Dose Rate (HDR) brachytherapy forward planning is principally an iterative process; hence, plan quality is affected by planners' experiences and limited planning time. Thus, this may lead to sporadic errors and inconsistencies in planning. A statistical tool based on previously approved clinical treatment plans would help to maintain the consistency of planning quality and improve the efficiency of second checking. Methods: An independent dose calculation tool was developed from commercial software. Thirty-three previously approved cervical HDR plans with the same prescription dose (550cGy), applicator type, and treatment protocol were examined, and ICRU-defined reference point doses (bladder, vaginal mucosa, rectum, and points A/B) along with dwell times were collected. The dose calculation tool then calculated an appropriate range with a 95% confidence interval for each parameter obtained, which would be used as the benchmark for evaluation of those parameters in future HDR treatment plans. Model quality was verified using five randomly selected approved plans from the same dataset. Results: Dose variations appear to be larger at the reference points of the bladder and mucosa than at the rectum. Most reference point doses from the verification plans fell within the predicted range, except the doses of two rectum points and two points at reference position A (owing to rectal anatomical variations and clinical adjustment of prescription points, respectively). Similar results were obtained for tandem and ring dwell times despite relatively larger uncertainties. Conclusion: This statistical tool provides an insight into the clinically acceptable range of cervical HDR plans, which could be useful in plan checking and identifying potential planning errors, thus improving the consistency of plan quality.
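
    The check described above amounts to deriving, for each reference-point dose, an acceptance range from previously approved plans and flagging values that fall outside it. The sketch below is illustrative, not the clinical tool: the historical doses are random placeholders, and the use of a mean ± t·SD range is an assumption, since the abstract only states "appropriate range with a 95% confidence interval."

```python
# Illustrative sketch (not the clinical tool): derive a 95% range for one HDR
# reference-point dose from previously approved plans and flag an out-of-range
# value in a new plan. Historical doses below are placeholder data.
import numpy as np
from scipy import stats

historical = np.random.default_rng(0).normal(loc=380.0, scale=40.0, size=33)  # cGy, hypothetical

mean, sd, n = historical.mean(), historical.std(ddof=1), historical.size
t = stats.t.ppf(0.975, df=n - 1)
low, high = mean - t * sd, mean + t * sd   # assumed form of the acceptance range

new_plan_dose = 472.0
flagged = not (low <= new_plan_dose <= high)
print(f"range: {low:.1f}-{high:.1f} cGy, new plan {new_plan_dose} cGy, flagged: {flagged}")
```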

  19. Managing the Art Room/Tools and Equipment Use.

    ERIC Educational Resources Information Center

    Qualley, Charles A.

    1979-01-01

    The author looks at the different tools and processes used in the art classroom, pointing out areas of safety concern, and suggests tool maintenance and use standards which can prevent classroom accidents. (SJL)

  20. Hole-Center Locating Tool

    NASA Technical Reports Server (NTRS)

    Senter, H. F.

    1984-01-01

    Tool aligns center of new hole with existing hole. Tool marks center of new hole drilled while workpiece is in place. Secured with bolts while hole center marked with punch. Used for field installations where reference points unavailable or work area cramped and not easily accessible with conventional tools.

  1. Improving the correlation of structural FEA models by the application of automated high density robotized laser Doppler vibrometry

    NASA Astrophysics Data System (ADS)

    Chowanietz, Maximilian; Bhangaonkar, Avinash; Semken, Michael; Cockrill, Martin

    2016-06-01

    Sound has had an intricate relation with the wellbeing of humans since time immemorial. It has the ability to enhance the quality of life immensely when present as music; at the same time, it can degrade its quality when manifested as noise. Hence, understanding its sources and the processes by which it is produced gains acute significance. Although various theories exist with respect to evolution of bells, it is indisputable that they carry millennia of cultural significance, and at least a few centuries of perfection with respect to design, casting and tuning. Despite the science behind its design, the nuances pertaining to founding and tuning have largely been empirical, and conveyed from one generation to the next. Post-production assessment for bells remains largely person-centric and traditional. However, progressive bell manufacturers have started adopting methods such as finite element analysis (FEA) for informing and optimising their future model designs. To establish confidence in the FEA process it is necessary to correlate the virtual model against a physical example. This is achieved by performing an experimental modal analysis (EMA) and comparing the results with those from FEA. Typically to collect the data for an EMA, the vibratory response of the structure is measured with the application of accelerometers. This technique has limitations; principally these are the observer effect and limited geometric resolution. In this paper, 3-dimensional laser Doppler vibrometry (LDV) has been used to measure the vibratory response with no observer effect due to the non-contact nature of the technique; resulting in higher accuracy measurements as the input to the correlation process. The laser heads were mounted on an industrial robot that enables large objects to be measured and extensive data sets to be captured quickly through an automated process. This approach gives previously unobtainable geometric resolution resulting in a higher confidence EMA. This is used to correlate with FEA up to significantly higher frequencies. Automated, robotized measurements made it possible to easily capture 4000 geometric points per bell. Measurements were made for two carillon bells manufactured by John Taylor & Co., weighing about 100 and 150 kilos. The bells were mounted as freely as possible to allow them to resonate without constraint. They were excited with an electrodynamic shaker attached to an area of the bell where the clapper would normally strike. The frequency response functions (FRF) were collected for each geometry location, and solved to calculate the mode shape for each harmonic. Proprietary system software (Robovib and PSV from Polytec GmbH) was used to measure and capture data. The EMA was solved using industry standard tools from the Siemens PLM suite (LMS Test.Lab Polymax). The deviation for partials/harmonics (in cents) was found to be less than 1.6% from that predicted by the design rules. The mode shapes obtained from model based FEA analysis also correlated well with those from measurements.
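
    The measurement step described above (an FRF collected at each of the roughly 4000 geometry points, then solved for mode shapes) rests on a spectral FRF estimate such as the H1 estimator. The sketch below shows a generic H1 estimate from a shaker-force signal and one response channel using synthetic data; it is not the Polytec/Siemens toolchain used in the study.

```python
# Generic H1 FRF estimate, FRF(f) = S_xy(f) / S_xx(f), with x = shaker force and
# y = the vibratory response at one measurement point. Signals are synthetic: a
# single toy mode near 800 Hz stands in for the bell response.
import numpy as np
from scipy.signal import bilinear, csd, lfilter, welch

fs = 25600                                   # sampling rate, Hz (illustrative)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
force = rng.standard_normal(t.size)          # broadband excitation

w0, zeta = 2 * np.pi * 800, 0.01             # toy mode: 800 Hz, 1% damping
bz, az = bilinear([w0 ** 2], [1, 2 * zeta * w0, w0 ** 2], fs)
response = lfilter(bz, az, force) + 0.01 * rng.standard_normal(t.size)

f, Sxy = csd(force, response, fs=fs, nperseg=4096)
_, Sxx = welch(force, fs=fs, nperseg=4096)
H1 = Sxy / Sxx                               # complex FRF estimate at this point

print(f"estimated resonance near {f[np.argmax(np.abs(H1))]:.0f} Hz")
```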

  2. End points for validating early warning scores in the context of rapid response systems: a Delphi consensus study.

    PubMed

    Pedersen, N E; Oestergaard, D; Lippert, A

    2016-05-01

    When investigating early warning scores and similar physiology-based risk stratification tools, death, cardiac arrest and intensive care unit admission are traditionally used as end points. A large proportion of the patients identified by these end points cannot be saved, even with optimal treatment. This could pose a limitation to studies using these end points. We studied current expert opinion on end points for validating tools for the identification of patients in hospital wards at risk of imminent critical illness. The Delphi consensus methodology was used. We identified 22 experts based on objective criteria; 17 participated in the study. Each expert panel member's suggestions for end points were collected and distributed to the entire expert panel in anonymised form. The experts reviewed, rated and commented on the suggested end points through the rounds of the Delphi process, and the experts' combined rating of the usefulness of each suggestion was established. A gross list of 86 suggestions for end points, relating to 13 themes, was produced. No items were uniformly recognised as ideal. The themes cardiac arrest, death, and level of care contained the items receiving the highest ratings. End points relating to death, cardiac arrest and intensive care unit admission currently comprise the most obvious compromises for investigating early warning scores and similar risk stratification tools. Additional end points from the gross list of suggested end points could become feasible with the increased availability of large data sets with a multitude of recorded parameters. © 2015 The Acta Anaesthesiologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  3. Developing and using a rubric for evaluating evidence-based medicine point-of-care tools

    PubMed Central

    Foster, Margaret J

    2011-01-01

    Objective: The research sought to establish a rubric for evaluating evidence-based medicine (EBM) point-of-care tools in a health sciences library. Methods: The authors searched the literature for EBM tool evaluations and found that most previous reviews were designed to evaluate the ability of an EBM tool to answer a clinical question. The researchers' goal was to develop and complete rubrics for assessing these tools based on criteria for a general evaluation of tools (reviewing content, search options, quality control, and grading) and criteria for an evaluation of clinical summaries (searching tools for treatments of common diagnoses and evaluating summaries for quality control). Results: Differences between EBM tools' options, content coverage, and usability were minimal. However, the products' methods for locating and grading evidence varied widely in transparency and process. Conclusions: As EBM tools are constantly updating and evolving, evaluation of these tools needs to be conducted frequently. Standards for evaluating EBM tools need to be established, with one method being the use of objective rubrics. In addition, EBM tools need to provide more information about authorship, reviewers, methods for evidence collection, and grading system employed. PMID:21753917

  4. Proposed method of producing large optical mirrors: Single-point diamond crushing followed by polishing with a small-area tool

    NASA Technical Reports Server (NTRS)

    Wright, G.; Bryan, J. B.

    1986-01-01

    Faster production of large optical mirrors may result from combining single-point diamond crushing of the glass with polishing using a small area tool to smooth the surface and remove the damaged layer. Diamond crushing allows a surface contour accurate to 0.5 microns to be generated, and the small area computer-controlled polishing tool allows the surface roughness to be removed without destroying the initial contour. Final contours with an accuracy of 0.04 microns have been achieved.

  5. A Tale of Two Cultures: Cross Cultural Comparison in Learning the Prezi Presentation Software Tool in the US and Norway

    ERIC Educational Resources Information Center

    Brock, Sabra; Brodahl, Cornelia

    2013-01-01

    Presentation software is an important tool for both student and professorial communicators. PowerPoint has been the standard since it was introduced in 1990. However, new "improved" software platforms are emerging. Prezi is one of these, claiming to remedy the linear thinking that underlies PowerPoint by creating one canvas and…

  6. A New Predictive Tool for Optimization of the Treatment of Brain Metastases from Colorectal Cancer After Stereotactic Radiosurgery.

    PubMed

    Rades, Dirk; Dahlke, Markus; Gebauer, Niklas; Bartscht, Tobias; Hornung, Dagmar; Trang, Ngo Thuy; Phuong, Pham Cam; Khoa, Mai Trong; Gliemroth, Jan

    2015-10-01

    To develop a predictive tool for survival after stereotactic radiosurgery of brain metastases from colorectal cancer. Out of nine factors analyzed for survival, those showing significance (p<0.05) or a trend (p≤0.06) were included. For each factor, 0 (worse survival) or 1 (better survival) point was assigned. Total scores represented the sum of the factor scores. Performance status (p=0.010) and interval from diagnosis of colorectal cancer until radiosurgery (p=0.026) achieved significance, extracranial metastases showed a trend (p=0.06). These factors were included in the tool. Total scores were 0-3 points. Six-month survival rates were 17% for patients with 0, 25% for those with 1, 67% for those with 2 and 100% for those with 3 points; 12-month rates were 0%, 0%, 33% and 67%, respectively. Two groups were created: 0-1 and 2-3 points. Six- and 12-month survival rates were 20% vs. 78% and 0% vs. 44% (p=0.002), respectively. This tool helps optimize the treatment of patients after stereotactic radiosurgery for brain metastases from colorectal cancer. Copyright© 2015 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  7. A point-based tool to predict conversion from mild cognitive impairment to probable Alzheimer's disease.

    PubMed

    Barnes, Deborah E; Cenzer, Irena S; Yaffe, Kristine; Ritchie, Christine S; Lee, Sei J

    2014-11-01

    Our objective in this study was to develop a point-based tool to predict conversion from amnestic mild cognitive impairment (MCI) to probable Alzheimer's disease (AD). Subjects were participants in the first part of the Alzheimer's Disease Neuroimaging Initiative. Cox proportional hazards models were used to identify factors associated with development of AD, and a point score was created from predictors in the final model. The final point score could range from 0 to 9 (mean 4.8) and included: the Functional Assessment Questionnaire (2‒3 points); magnetic resonance imaging (MRI) middle temporal cortical thinning (1 point); MRI hippocampal subcortical volume (1 point); Alzheimer's Disease Cognitive Scale-cognitive subscale (2‒3 points); and the Clock Test (1 point). Prognostic accuracy was good (Harrell's c = 0.78; 95% CI 0.75, 0.81); 3-year conversion rates were 6% (0‒3 points), 53% (4‒6 points), and 91% (7‒9 points). A point-based risk score combining functional dependence, cerebral MRI measures, and neuropsychological test scores provided good accuracy for prediction of conversion from amnestic MCI to AD. Copyright © 2014 The Alzheimer's Association. All rights reserved.
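
    As a worked illustration of how a point-based score like this is applied, the sketch below sums component points and maps the total to the 3-year conversion bands quoted above. The per-component thresholds that convert raw test results to points are not given in the abstract, so already-assigned points are taken as inputs (a simplifying assumption).

```python
# Sketch of applying the point-based score described above. Component point values
# follow the abstract: FAQ (2-3), MRI middle temporal thinning (1), MRI hippocampal
# volume (1), cognitive subscale (2-3), Clock Test (1); 0 if a component does not apply.
def mci_to_ad_score(faq_pts, mri_temporal_pts, mri_hippocampus_pts, cog_subscale_pts, clock_pts):
    return faq_pts + mri_temporal_pts + mri_hippocampus_pts + cog_subscale_pts + clock_pts

def three_year_conversion_band(total):
    if total <= 3:
        return "low (~6% 3-year conversion)"
    if total <= 6:
        return "intermediate (~53% 3-year conversion)"
    return "high (~91% 3-year conversion)"

total = mci_to_ad_score(faq_pts=3, mri_temporal_pts=1, mri_hippocampus_pts=0,
                        cog_subscale_pts=2, clock_pts=1)
print(total, three_year_conversion_band(total))   # 7 -> high band
```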

  8. Geo-Cultural Analysis Tool (trademark) (GCAT)

    DTIC Science & Technology

    2008-03-10

    Performing organizations include the Construction Engineering Research Laboratory (CERL), Champaign, IL, and the Cold Regions Research and Engineering Laboratory (CRREL), Hanover, NH. The remainder of this record is fragmentary figure residue (an example route query listing a departure point, destination point, time of day, and district types such as mosque, leisure, market/retail area, and entertainment district).

  9. Opportunity Arm and Gagarin Rock, Sol 405

    NASA Image and Video Library

    2011-04-08

    NASA Mars Exploration Rover Opportunity used its rock abrasion tool on a rock informally named Gagarin, leaving a circular mark. At the end of the rover arm, the tool turret is positioned with the rock abrasion tool pointing upward.

  10. Evaluation of a clinical decision support tool for osteoporosis disease management: protocol for an interrupted time series design.

    PubMed

    Kastner, Monika; Sawka, Anna; Thorpe, Kevin; Chignel, Mark; Marquez, Christine; Newton, David; Straus, Sharon E

    2011-07-22

    Osteoporosis affects over 200 million people worldwide at a high cost to healthcare systems. Although guidelines on assessing and managing osteoporosis are available, many patients are not receiving appropriate diagnostic testing or treatment. Findings from a systematic review of osteoporosis interventions, a series of mixed-methods studies, and advice from experts in osteoporosis and human-factors engineering were used collectively to develop a multicomponent tool (targeted to family physicians and patients at risk for osteoporosis) that may support clinical decision making in osteoporosis disease management at the point of care. A three-phased approach will be used to evaluate the osteoporosis tool. In phase 1, the tool will be implemented in three family practices. It will involve ensuring optimal functioning of the tool while minimizing disruption to usual practice. In phase 2, the tool will be pilot tested in a quasi-experimental interrupted time series (ITS) design to determine if it can improve osteoporosis disease management at the point of care. Phase 3 will involve conducting a qualitative postintervention follow-up study to better understand participants' experiences and perceived utility of the tool and readiness to adopt the tool at the point of care. The osteoporosis tool has the potential to make several contributions to the development and evaluation of complex, chronic disease interventions, such as the inclusion of an implementation strategy prior to conducting an evaluation study. Anticipated benefits of the tool may be to increase awareness for patients about osteoporosis and its associated risks and provide an opportunity to discuss a management plan with their physician, which may all facilitate patient self-management.

  11. Evaluation of a clinical decision support tool for osteoporosis disease management: protocol for an interrupted time series design

    PubMed Central

    2011-01-01

    Background Osteoporosis affects over 200 million people worldwide at a high cost to healthcare systems. Although guidelines on assessing and managing osteoporosis are available, many patients are not receiving appropriate diagnostic testing or treatment. Findings from a systematic review of osteoporosis interventions, a series of mixed-methods studies, and advice from experts in osteoporosis and human-factors engineering were used collectively to develop a multicomponent tool (targeted to family physicians and patients at risk for osteoporosis) that may support clinical decision making in osteoporosis disease management at the point of care. Methods A three-phased approach will be used to evaluate the osteoporosis tool. In phase 1, the tool will be implemented in three family practices. It will involve ensuring optimal functioning of the tool while minimizing disruption to usual practice. In phase 2, the tool will be pilot tested in a quasi-experimental interrupted time series (ITS) design to determine if it can improve osteoporosis disease management at the point of care. Phase 3 will involve conducting a qualitative postintervention follow-up study to better understand participants' experiences and perceived utility of the tool and readiness to adopt the tool at the point of care. Discussion The osteoporosis tool has the potential to make several contributions to the development and evaluation of complex, chronic disease interventions, such as the inclusion of an implementation strategy prior to conducting an evaluation study. Anticipated benefits of the tool may be to increase awareness for patients about osteoporosis and its associated risks and provide an opportunity to discuss a management plan with their physician, which may all facilitate patient self-management. PMID:21781318

  12. Personal Digital Assistants as Point-of-Care Tools in Long-Term Care Facilities: A Pilot Study

    ERIC Educational Resources Information Center

    Qadri, Syeda S.; Wang, Jia; Ruiz, Jorge G.; Roos, Bernard A.

    2009-01-01

    This study used both survey and interview questionnaires. It was designed to assess the feasibility, usability, and utility of two point-of-care tools especially prepared with information relevant for dementia care by staff nurses in a small, a medium-sized, and a large nursing home in Florida. Twenty-five LPN or RN nurses were recruited for the…

  13. Unsteady Heat-Flux Measurements of Second-Mode Instability Waves in a Hypersonic Boundary Layer

    NASA Technical Reports Server (NTRS)

    Kegerise, Michael A.; Rufer, Shann J.

    2016-01-01

    In this paper we report on the application of the atomic layer thermopile (ALTP) heat-flux sensor to the measurement of laminar-to-turbulent transition in a hypersonic flat plate boundary layer. The centerline of the flat-plate model was instrumented with a streamwise array of ALTP sensors and the flat-plate model was exposed to a Mach 6 freestream over a range of unit Reynolds numbers. Here, we observed an unstable band of frequencies that are associated with second-mode instability waves in the laminar boundary layer that forms on the flat-plate surface. The measured frequencies, group velocities, phase speeds, and wavelengths of these instability waves are in agreement with data previously reported in the literature. Heat flux time series, and the Morlet-wavelet transforms of them, revealed the wave-packet nature of the second-mode instability waves. In addition, a laser-based radiative heating system was developed to measure the frequency response functions (FRF) of the ALTP sensors used in the wind tunnel test. These measurements were used to assess the stability of the sensor FRFs over time and to correct spectral estimates for any attenuation caused by the finite sensor bandwidth.

  14. Unsteady heat-flux measurements of second-mode instability waves in a hypersonic flat-plate boundary layer

    NASA Astrophysics Data System (ADS)

    Kegerise, Michael A.; Rufer, Shann J.

    2016-08-01

    In this paper, we report on the application of the atomic layer thermopile (ALTP) heat-flux sensor to the measurement of laminar-to-turbulent transition in a hypersonic flat-plate boundary layer. The centerline of the flat-plate model was instrumented with a streamwise array of ALTP sensors, and the flat-plate model was exposed to a Mach 6 freestream over a range of unit Reynolds numbers. Here, we observed an unstable band of frequencies that are associated with second-mode instability waves in the laminar boundary layer that forms on the flat-plate surface. The measured frequencies, group velocities, phase speeds, and wavelengths of these instability waves are consistent with data previously reported in the literature. Heat flux time series, and the Morlet wavelet transforms of them, revealed the wave-packet nature of the second-mode instability waves. In addition, a laser-based radiative heating system was used to measure the frequency response functions (FRF) of the ALTP sensors used in the wind tunnel test. These measurements were used to assess the stability of the sensor FRFs over time and to correct spectral estimates for any attenuation caused by the finite sensor bandwidth.
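
    The last step described above, correcting spectral estimates for attenuation caused by the finite sensor bandwidth, amounts to dividing the measured spectrum by the squared sensor FRF magnitude. The sketch below assumes a first-order low-pass sensor model purely for illustration; the actual ALTP FRFs were measured with the laser-based radiative heating system.

```python
# Sketch of compensating a measured power spectrum for finite sensor bandwidth:
# divide by |H_sensor(f)|^2. A first-order low-pass sensor model and all numbers
# below are assumptions, not the measured ALTP characteristics.
import numpy as np
from scipy.signal import welch

fs = 2_000_000                               # sampling rate, Hz (illustrative)
fc = 250_000                                 # assumed sensor -3 dB bandwidth, Hz
rng = np.random.default_rng(2)
q_measured = rng.standard_normal(fs // 10)   # 0.1 s of heat-flux signal (synthetic)

f, Pqq = welch(q_measured, fs=fs, nperseg=8192)

H_mag2 = 1.0 / (1.0 + (f / fc) ** 2)                  # |H(f)|^2 of the assumed sensor
Pqq_corrected = Pqq / np.maximum(H_mag2, 1e-3)        # limit gain far beyond the bandwidth

print(Pqq_corrected[:5])
```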

  15. New pattern recognition system in the e-nose for Chinese spirit identification

    NASA Astrophysics Data System (ADS)

    Hui, Zeng; Qiang, Li; Yu, Gu

    2016-02-01

    This paper presents a new pattern recognition system for Chinese spirit identification using a polymer quartz piezoelectric crystal sensor-based e-nose. The sensors are designed based on the quartz crystal microbalance (QCM) principle, and they capture different vibration frequency signal values for Chinese spirit identification. For each sensor in an 8-channel sensor array, seven characteristic values of the original vibration frequency signal, i.e., average value (A), root-mean-square value (RMS), shape factor value (Sf), crest factor value (Cf), impulse factor value (If), clearance factor value (CLf), and kurtosis factor value (Kv), are first extracted. Then the dimension of the characteristic values is reduced by the principal components analysis (PCA) method. Finally, the back-propagation (BP) neural network algorithm is used to recognize Chinese spirits. The experimental results show that the recognition rate of six kinds of Chinese spirits is 93.33% and our proposed new pattern recognition system can identify Chinese spirits effectively. Project supported by the National High Technology Research and Development Program of China (Grant No. 2013AA030901) and the Fundamental Research Funds for the Central Universities, China (Grant No. FRF-TP-14-120A2).
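
    A minimal sketch of this feature pipeline follows: the seven per-channel statistics, PCA for dimension reduction, and a small multilayer perceptron standing in for the BP network. The data shapes, class count, and values are placeholders, not the experimental data.

```python
# Sketch of the described pipeline: 7 statistics per sensor channel, PCA, and an MLP
# classifier (standing in for the BP network). Data are random placeholders.
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

def channel_features(x):
    x = np.asarray(x, dtype=float)
    a = np.mean(np.abs(x))                          # average value (A)
    rms = np.sqrt(np.mean(x ** 2))                  # root-mean-square (RMS)
    peak = np.max(np.abs(x))
    sf = rms / a                                    # shape factor (Sf)
    cf = peak / rms                                 # crest factor (Cf)
    iff = peak / a                                  # impulse factor (If)
    clearance = peak / np.mean(np.sqrt(np.abs(x))) ** 2   # clearance factor (CLf)
    kv = kurtosis(x)                                # kurtosis factor (Kv)
    return [a, rms, sf, cf, iff, clearance, kv]

rng = np.random.default_rng(3)
signals = rng.standard_normal((60, 8, 1024))        # 60 samples x 8 channels x 1024 points
labels = rng.integers(0, 6, size=60)                # 6 kinds of spirit (placeholder labels)

X = np.array([[v for ch in sample for v in channel_features(ch)] for sample in signals])
X_reduced = PCA(n_components=10).fit_transform(X)
model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X_reduced, labels)
print("training accuracy:", model.score(X_reduced, labels))
```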

  16. Defining Coastal Storm and Quantifying Storms Applying Coastal Storm Impulse Parameter

    NASA Astrophysics Data System (ADS)

    Mahmoudpour, Nader

    2014-05-01

    What constitutes a storm condition, and what initiates a "storm," has not been uniquely defined among scientists and engineers. Parameters that have been used to define a storm condition include wind speed, beach erosion, and storm hydrodynamic parameters such as wave height and water level. Some of these parameters are consequences of the storm, such as beach erosion, and some are not directly related to storm hydrodynamics, such as wind speed. For the purpose of the presentation, the different storm conditions based on wave height, water levels, wind speed, and beach erosion will be discussed and assessed; however, it is more scientifically sound to base the storm definition on hydrodynamic parameters such as wave height, water level, and storm duration. Once the storm condition is defined and a storm has initiated, its severity must be forecast in order to evaluate the hazard, analyze the risk, and determine the appropriate response. The correlation of storm damage with meteorological and hydrodynamic parameters can be expressed as a storm scale, storm index, or storm parameter, which is needed to reduce the many variables involved to a single scale for risk analysis and response management. The newly introduced Coastal Storm Impulse (COSI) parameter quantifies a storm as one number for a specific location and storm event. The COSI parameter is based on the conservation of linear, horizontal momentum, combining storm surge, wave dynamics, and currents over the storm duration. It applies the principle of conservation of momentum to physically combine the hydrodynamic variables per unit width of shoreline; this total momentum is then integrated over the duration of the storm to determine the storm's impulse to the coast. The COSI parameter employs the mean, time-averaged nonlinear (Fourier) wave momentum flux over the wave period, added to the horizontal storm surge momentum above Mean High Water (MHW), integrated over the storm duration. The COSI methodology has been applied to a 10-year data set (1994 to 2003) from the US Army Corps of Engineers Field Research Facility (FRF) on the Atlantic Ocean in Duck, North Carolina. The storm duration was taken as the length of time (hours) that the spectral significant wave height was equal to or greater than 1.6 meters for at least a 12-hour continuous period. Wave heights were measured in 8 meters water depth, and water levels were measured at the NOAA/NOS tide gauge at the end of the FRF pier. The 10-year data set was analyzed with the aforementioned storm criteria and produced 148 coastal events, including hurricanes and northeasters. The results of this analysis, and the application of the COSI parameter to determine "Extra Ordinary" storms in Federal Projects for the Gulf of Mexico 2012 hurricane season, will be discussed at the time of presentation.
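
    The storm-identification criterion quoted above (spectral significant wave height of at least 1.6 m sustained for at least 12 consecutive hours) is straightforward to apply to an hourly wave-height record, as in the sketch below; the full COSI momentum-flux integration is not reproduced here, and the sample record is invented.

```python
# Sketch of the storm criterion described above: significant wave height >= 1.6 m
# sustained for at least 12 consecutive hours of an hourly record.
import numpy as np

def find_storms(hs, threshold=1.6, min_hours=12):
    """Return (start_index, end_index_exclusive) for each qualifying event."""
    hs = np.asarray(hs)
    above = hs >= threshold
    storms, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_hours:
                storms.append((start, i))
            start = None
    if start is not None and len(above) - start >= min_hours:
        storms.append((start, len(above)))
    return storms

hourly_hs = np.concatenate([np.full(30, 0.8), np.full(20, 2.1), np.full(30, 1.0)])
print(find_storms(hourly_hs))   # one 20-hour event: [(30, 50)]
```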

  17. A Comparison of Simplified-Visually Rich and Traditional Presentation Styles

    ERIC Educational Resources Information Center

    Johnson, Douglas A.; Christensen, Jack

    2011-01-01

    Microsoft PowerPoint and similar presentation tools have become commonplace in higher education, yet there is very little research on the effectiveness of different PowerPoint formats for implementing this software. This study compared two PowerPoint presentation techniques: a more traditional format employing heavy use of bullet points with text…

  18. MPLM On-Orbit Interface Dynamic Flexibility Modal Test

    NASA Technical Reports Server (NTRS)

    Bookout, Paul S.; Rodriguez, Pedro I.; Tinson, Ian; Fleming, Paolo

    2001-01-01

    Now that the International Space Station (ISS) is being constructed, payload developers have to verify not only the Shuttle-to-payload interface but also the interfaces their payload will have with the ISS. The Multi Purpose Logistic Module (MPLM) being designed and built by Alenia Spazio in Torino, Italy, is one such payload. The MPLM is the primary carrier for the ISS Payload Racks, Re-supply Stowage Racks, and the Resupply Stowage Platforms to re-supply the ISS with food, water, experiments, maintenance equipment, etc. During the development of the MPLM there was no requirement for verification of the on-orbit interfaces with the ISS. When this oversight was discovered, all the dynamic test stands had already been disassembled. A method was needed that would not require an extensive testing stand and could be completed in a short amount of time. The residual flexibility testing technique was chosen. The residual flexibility modal testing method consists of measuring the free-free natural frequencies and mode shapes along with the interface frequency response functions (FRF's). Analytically, the residual flexibility method has been investigated in detail by MacNeal, Martinez, Carne, and Miller, and Rubin, but has not been implemented extensively for model correlation due to difficulties in data acquisition. In recent years, improvement of data acquisition equipment has made possible the implementation of the residual flexibility method, as in Admire, Tinker, and Ivey, and Klosterman and Lemon. The residual flexibility modal testing technique is applicable to a structure with distinct points (DOF) of contact with its environment, such as the MPLM-to-Station interface through the Common Berthing Mechanism (CBM). The CBM is bolted to a flange on the forward cone of the MPLM. During the fixed base test (to verify Shuttle interfaces) some data were gathered on the forward cone panels. Even though there were some data on the forward cones, an additional modal test was performed to better characterize their behavior. The CBM mounting flange is the only remaining structure of the MPLM for which no test data were available. This paper discusses the implementation of the residual flexibility modal testing technique on the CBM flange and the modal test of the forward cone panels.

  19. Floating-Point Units and Algorithms for field-programmable gate arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Underwood, Keith D.; Hemmert, K. Scott

    2005-11-01

    The software that we are attempting to copyright is a package of floating-point unit descriptions and example algorithm implementations using those units for use in FPGAs. The floating point units are best-in-class implementations of add, multiply, divide, and square root floating-point operations. The algorithm implementations are sample (not highly flexible) implementations of FFT, matrix multiply, matrix vector multiply, and dot product. Together, one could think of the collection as an implementation of parts of the BLAS library or something similar to the FFTW packages (without the flexibility) for FPGAs. Results from this work have been published multiple times, and we are working on a publication to discuss the techniques we use to implement the floating-point units. For some more background, FPGAs are programmable hardware. "Programs" for this hardware are typically created using a hardware description language (examples include Verilog, VHDL, and JHDL). Our floating-point unit descriptions are written in JHDL, which allows them to include placement constraints that make them highly optimized relative to some other implementations of floating-point units. Many vendors (Nallatech from the UK, SRC Computers in the US) have similar implementations, but our implementations seem to be somewhat higher performance. Our algorithm implementations are written in VHDL and models of the floating-point units are provided in VHDL as well. FPGA "programs" make multiple "calls" (hardware instantiations) to libraries of intellectual property (IP), such as the floating-point unit library described here. These programs are then compiled using a tool called a synthesizer (such as a tool from Synplicity, Inc.). The compiled file is a netlist of gates and flip-flops. This netlist is then mapped to a particular type of FPGA by a mapper and then a place-and-route tool. These tools assign the gates in the netlist to specific locations on the specific type of FPGA chip used and construct the required routes between them. The result is a "bitstream" that is analogous to a compiled binary. The bitstream is loaded into the FPGA to create a specific hardware configuration.

  20. TiConverter: A training image converting tool for multiple-point geostatistics

    NASA Astrophysics Data System (ADS)

    Fadlelmula F., Mohamed M.; Killough, John; Fraim, Michael

    2016-11-01

    TiConverter is a tool developed to ease the application of multiple-point geostatistics whether by the open source Stanford Geostatistical Modeling Software (SGeMS) or other available commercial software. TiConverter has a user-friendly interface and it allows the conversion of 2D training images into numerical representations in four different file formats without the need for additional code writing. These are the ASCII (.txt), the geostatistical software library (GSLIB) (.txt), the Isatis (.dat), and the VTK formats. It performs the conversion based on the RGB color system. In addition, TiConverter offers several useful tools including image resizing, smoothing, and segmenting tools. The purpose of this study is to introduce the TiConverter, and to demonstrate its application and advantages with several examples from the literature.
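
    The conversion TiConverter performs, from an RGB training image to a numeric grid written in a geostatistics file format, can be sketched in a few lines. The example below is illustrative, not the tool itself; the input file name, the bottom-up row order, and the minimal GSLIB-style header are assumptions.

```python
# Illustrative sketch (not TiConverter itself): map the RGB colors of a 2D training
# image to integer facies codes and write a simple GSLIB-style ASCII grid.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("training_image.png").convert("RGB"))   # hypothetical file
colors, codes = np.unique(img.reshape(-1, 3), axis=0, return_inverse=True)
grid = codes.reshape(img.shape[:2])              # one integer facies code per pixel

ny, nx = grid.shape
with open("training_image.gslib", "w") as f:
    f.write(f"training image {nx} {ny} 1\n1\nfacies\n")   # minimal GSLIB-like header
    for value in grid[::-1].ravel():             # bottom-up row order (an assumption)
        f.write(f"{value}\n")

print(f"{len(colors)} distinct colors mapped to codes 0..{len(colors) - 1}")
```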

  1. The Project Manager's Tool Kit

    NASA Technical Reports Server (NTRS)

    Cameron, W. Scott

    2003-01-01

    Project managers are rarely described as being funny. Moreover, a good sense of humor rarely seems to be one of the deciding factors in choosing someone to be a project manager, or something that pops up as a major discussion point at an annual performance review. Perhaps this is because people think you aren't serious about your work if you laugh. I disagree with this assessment, but that's not really my point. As I talk to people either pursuing a career in project management, or broadening their assignment to include project management, I encourage them to consider what tools they need to be successful. I suggest that they consider any strength they have to be part of their Project Management (PM) Tool Kit, and being funny could be one of the tools they need.

  2. BASINs and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications (Final Report)

    EPA Science Inventory

    EPA announced the release of the final report, BASINs and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications. This report supports application of two recently developed water modeling tools, the Better Assessment Science Integrating point & ...

  3. The verification of printability about marginal defects and the detectability at the inspection tool in sub 50nm node

    NASA Astrophysics Data System (ADS)

    Lee, Hyemi; Jeong, Goomin; Seo, Kangjun; Kim, Sangchul; Kim, Changreol

    2008-05-01

    As mask design rules shrink, defects have become one of the issues reducing mask yield, and the defect size that must be controlled during mask manufacturing becomes smaller as well. According to the 2007 ITRS roadmap, the controlled defect size on a mask is 46 nm at the 57 nm node and 36 nm at the 45 nm node. However, inspection tool development has lagged behind the pace of photolithography development. The mask manufacturing process is generally divided into three parts. The first part is patterning the mask; the second is inspecting the pattern and repairing defects on the mask. For this step, transmitted-light inspection tools are normally used and remain the most trusted of the inspection tools developed to date. The final part is shipping the mask after qualifying the issue points and weak points. Issue points on a mask are qualified using AIMS (aerial image measurement system). This system, however, carries an inherent possibility of error, because AIMS measures the issue points based on the inspection results; it assumes that defects printed on a wafer are larger than the minimum size detected by the inspection tools and that the inspection tool detects almost all defects. Even though there are no tools that detect the 46 nm and 36 nm defects suggested by the ITRS roadmap, this assumption is applied in manufacturing 57 nm and 45 nm devices. We therefore fabricated a programmed defect mask containing various defect types, such as spot, clear extension, dark extension, and CD variation, on line/space (L/S), contact hole (C/H), and active patterns at the 55 nm and 45 nm nodes. The programmed defect mask was inspected with a transmitted-light inspection tool and measured with AIMS 45-193i, and the marginal defects were compared between the inspection tool and AIMS. In this way we could verify whether the defect size that the ITRS roadmap suggests should be controlled on a mask is appropriate. This result could also suggest, among transmitted-light, reflected-light, and aerial-image inspection tools, which are appropriate for next-generation devices.

  4. Determination of real machine-tool settings and minimization of real surface deviation by computerized inspection

    NASA Technical Reports Server (NTRS)

    Litvin, Faydor L.; Kuan, Chihping; Zhang, YI

    1991-01-01

    A numerical method is developed for the minimization of deviations of real tooth surfaces from the theoretical ones. The deviations are caused by errors of manufacturing, errors of installment of machine-tool settings and distortion of surfaces by heat-treatment. The deviations are determined by coordinate measurements of gear tooth surfaces. The minimization of deviations is based on the proper correction of initially applied machine-tool settings. The contents of accomplished research project cover the following topics: (1) Descriptions of the principle of coordinate measurements of gear tooth surfaces; (2) Deviation of theoretical tooth surfaces (with examples of surfaces of hypoid gears and references for spiral bevel gears); (3) Determination of the reference point and the grid; (4) Determination of the deviations of real tooth surfaces at the points of the grid; and (5) Determination of required corrections of machine-tool settings for minimization of deviations. The procedure for minimization of deviations is based on numerical solution of an overdetermined system of n linear equations in m unknowns (m much less than n ), where n is the number of points of measurements and m is the number of parameters of applied machine-tool settings to be corrected. The developed approach is illustrated with numerical examples.
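
    The minimization step described in item (5) is an overdetermined linear least-squares problem: n measured deviations are explained by corrections to m machine-tool settings with m much less than n. The sketch below uses a random sensitivity matrix purely as a stand-in for the derivative of the tooth-surface deviations with respect to the settings.

```python
# Minimal sketch of the overdetermined least-squares step described above:
# deviations (n measurement points) ~= S @ corrections (m settings, m << n).
# The sensitivity matrix S would come from the tooth-surface model; here it is
# random for illustration only.
import numpy as np

rng = np.random.default_rng(4)
n, m = 45, 6                               # 45 grid points, 6 correctable settings
S = rng.standard_normal((n, m))            # d(deviation_i)/d(setting_j), hypothetical
true_corrections = np.array([0.02, -0.01, 0.005, 0.0, 0.03, -0.015])
measured_deviation = S @ true_corrections + 0.001 * rng.standard_normal(n)

corrections, residual, *_ = np.linalg.lstsq(S, measured_deviation, rcond=None)
print("estimated setting corrections:", np.round(corrections, 4))
```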

  5. Educational Tool for Optimal Controller Tuning Using Evolutionary Strategies

    ERIC Educational Resources Information Center

    Carmona Morales, D.; Jimenez-Hornero, J. E.; Vazquez, F.; Morilla, F.

    2012-01-01

    In this paper, an optimal tuning tool is presented for control structures based on multivariable proportional-integral-derivative (PID) control, using genetic algorithms as an alternative to traditional optimization algorithms. From an educational point of view, this tool provides students with the necessary means to consolidate their knowledge on…

  6. An evaluation tool for Myofascial Adhesions in Patients after Breast Cancer (MAP-BC evaluation tool): Concurrent, face and content validity.

    PubMed

    De Groef, An; Van Kampen, Marijke; Moortgat, Peter; Anthonissen, Mieke; Van den Kerckhove, Eric; Christiaens, Marie-Rose; Neven, Patrick; Geraerts, Inge; Devoogdt, Nele

    2018-01-01

    To investigate the concurrent, face and content validity of an evaluation tool for Myofascial Adhesions in Patients after Breast Cancer (MAP-BC evaluation tool). 1) Concurrent validity of the MAP-BC evaluation tool was investigated by exploring correlations (Spearman's rank correlation coefficient) between the subjective scores (0 - no adhesions to 3 - very strong adhesions) of the skin level using the MAP-BC evaluation tool and objective elasticity parameters (maximal skin extension and gross elasticity) generated by the Cutometer Dual MPA 580. Nine different examination points on and around the mastectomy scar were evaluated. 2) Face and content validity were explored by questioning therapists experienced with myofascial therapy in breast cancer patients about the comprehensibility and comprehensiveness of the MAP-BC evaluation tool. 1) Only three meaningful correlations were found on the mastectomy scar. For the most lateral examination point on the mastectomy scar, a moderate negative correlation (-0.44, p = 0.01) with the maximal skin extension and a moderate positive correlation with the resistance versus ability of returning, or 'gross elasticity' (0.42, p = 0.02), were found. For the middle point on the mastectomy scar an almost moderate positive correlation with gross elasticity was found as well (0.38, p = 0.04). 2) Content and face validity were found to be good. Eighty-nine percent of the respondents found the instructions understandable and 98% found the scoring system obvious. Thirty-seven percent of the therapists suggested adding the option to evaluate additional anatomical locations in case of reconstructive and/or bilateral surgery. The MAP-BC evaluation tool for myofascial adhesions in breast cancer patients has good face and content validity. Evidence for good concurrent validity of the skin level was found only on the mastectomy scar itself.

  7. Engineering specification and system design for CAD/CAM of custom shoes. Phase 5: UMC involvement (January 1, 1989 - June 30, 1989)

    NASA Technical Reports Server (NTRS)

    Bao, Han P.

    1989-01-01

    The CAD/CAM of custom shoes is discussed. The solid object for machining is represented by a wireframe model with its nodes or vertices specified systematically in a grid pattern covering its entire length (point-to-point configuration). Two sets of data from CENCIT and CYBERWARE were used for machining purposes. It was found that the indexing technique (turning the stock by a small angle then moving the tool on a longitudinal path along the foot) yields the best result in terms of ease of programming, savings in wear and tear of the machine and cutting tools, and resolution of fine surface details. The work done using the LASTMOD last design system results in a shoe last specified by a number of congruent surface patches of different sizes. This data format was converted into a form amenable to the machine tool. It involves a series of sorting algorithms and interpolation algorithms to provide the grid pattern that the machine tool needs as was the case in the point to point configuration discussed above. This report also contains an in-depth treatment of the design and production technique of an integrated sole to complement the task of design and manufacture of the shoe last. Clinical data and essential production parameters are discussed. Examples of soles made through this process are given.

  8. Using Microsoft PowerPoint as an Astronomical Image Analysis Tool

    NASA Astrophysics Data System (ADS)

    Beck-Winchatz, Bernhard

    2006-12-01

    Engaging students in the analysis of authentic scientific data is an effective way to teach them about the scientific process and to develop their problem solving, teamwork and communication skills. In astronomy several image processing and analysis software tools have been developed for use in school environments. However, the practical implementation in the classroom is often difficult because the teachers may not have the comfort level with computers necessary to install and use these tools, they may not have adequate computer privileges and/or support, and they may not have the time to learn how to use specialized astronomy software. To address this problem, we have developed a set of activities in which students analyze astronomical images using basic tools provided in PowerPoint. These include measuring sizes, distances, and angles, and blinking images. In contrast to specialized software, PowerPoint is broadly available on school computers. Many teachers are already familiar with PowerPoint, and the skills developed while learning how to analyze astronomical images are highly transferable. We will discuss several practical examples of measurements, including the following:
    - Variations in the distances to the sun and moon from their angular sizes
    - Magnetic declination from images of shadows
    - Diameter of the moon from lunar eclipse images
    - Sizes of lunar craters
    - Orbital radii of the Jovian moons and mass of Jupiter
    - Supernova and comet searches
    - Expansion rate of the universe from images of distant galaxies

  9. Improved tool grinding machine

    DOEpatents

    Dial, C.E. Sr.

    The present invention relates to an improved tool grinding mechanism for grinding single point diamond cutting tools to precise roundness and radius specifications. The present invention utilizes a tool holder which is longitudinally displaced with respect to the remainder of the grinding system due to contact of the tool with the grinding surface with this displacement being monitored so that any variation in the grinding of the cutting surface such as caused by crystal orientation or tool thicknesses may be compensated for during the grinding operation to assure the attainment of the desired cutting tool face specifications.

  10. Tool grinding machine

    DOEpatents

    Dial, Sr., Charles E.

    1980-01-01

    The present invention relates to an improved tool grinding mechanism for grinding single point diamond cutting tools to precise roundness and radius specifications. The present invention utilizes a tool holder which is longitudinally displaced with respect to the remainder of the grinding system due to contact of the tool with the grinding surface with this displacement being monitored so that any variation in the grinding of the cutting surface such as caused by crystal orientation or tool thickness may be compensated for during the grinding operation to assure the attainment of the desired cutting tool face specifications.

  11. Fatal hand tool injuries in construction.

    PubMed

    Trent, R B; Wyant, W D

    1990-08-01

    Past research on occupational hand tool injuries has generally focused on nonfatal injuries. Most such injuries occur at the point where energy is transferred to the material being worked, e.g., at the edge of a saw blade or the point of a drill. Assuming that hand tool injuries that are fatal will differ from nonfatal injuries, 62 Occupational Safety and Health Administration reports were analyzed. Four patterns emerged when the type of contact with energy was used to classify incidents. Fatal injuries occurred when (1) contact was made with energy that supplies power to the hand tool, (2) energy normally transferred to the material being worked is transferred to the worker, (3) workers or materials fall, and (4) potential energy is encountered in the work environment. Analysis showed that almost all such injuries could be prevented by application of existing safe work practices.

  12. A Statistical Project Control Tool for Engineering Managers

    NASA Technical Reports Server (NTRS)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer). The literature review also pointed to project success factors, traditional project control tools, and performance measures that are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failures are increasing; existing methods are limited, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs using the SPCT method plotting results of 3 successful projects and 3 failed projects are reviewed, with success and failure being defined by the owner.

  13. Pilot study of a point-of-use decision support tool for cancer clinical trials eligibility.

    PubMed

    Breitfeld, P P; Weisburd, M; Overhage, J M; Sledge, G; Tierney, W M

    1999-01-01

    Many adults with cancer are not enrolled in clinical trials because caregivers do not have the time to match the patient's clinical findings with varying eligibility criteria associated with multiple trials for which the patient might be eligible. The authors developed a point-of-use portable decision support tool (DS-TRIEL) to automate this matching process. The support tool consists of a hand-held computer with a programmable relational database. A two-level hierarchic decision framework was used for the identification of eligible subjects for two open breast cancer clinical trials. The hand-held computer also provides protocol consent forms and schemas to further help the busy oncologist. This decision support tool and the decision framework on which it is based could be used for multiple trials and different cancer sites.
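
    The matching logic described here, a two-level hierarchic check of patient findings against per-trial eligibility criteria, can be sketched as below. The trial definitions, criteria fields, and patient record are hypothetical placeholders, not the DS-TRIEL implementation.

```python
# Hypothetical sketch of a two-level eligibility check like the one described:
# level 1 screens coarse criteria (disease site, stage); level 2 evaluates each
# remaining trial's detailed criteria against the patient's findings.
TRIALS = [
    {"id": "BC-01", "site": "breast", "stages": {2, 3},
     "detail": lambda p: p["ecog"] <= 1 and not p["prior_chemo"]},
    {"id": "BC-02", "site": "breast", "stages": {1, 2},
     "detail": lambda p: p["er_positive"] and p["age"] >= 18},
]

def eligible_trials(patient):
    level1 = [t for t in TRIALS
              if t["site"] == patient["site"] and patient["stage"] in t["stages"]]
    return [t["id"] for t in level1 if t["detail"](patient)]

patient = {"site": "breast", "stage": 2, "ecog": 1,
           "prior_chemo": False, "er_positive": True, "age": 54}
print(eligible_trials(patient))   # ['BC-01', 'BC-02']
```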

  14. Pilot Study of a Point-of-use Decision Support Tool for Cancer Clinical Trials Eligibility

    PubMed Central

    Breitfeld, Philip P.; Weisburd, Marina; Overhage, J. Marc; Sledge, George; Tierney, William M.

    1999-01-01

    Many adults with cancer are not enrolled in clinical trials because caregivers do not have the time to match the patient's clinical findings with varying eligibility criteria associated with multiple trials for which the patient might be eligible. The authors developed a point-of-use portable decision support tool (DS-TRIEL) to automate this matching process. The support tool consists of a hand-held computer with a programmable relational database. A two-level hierarchic decision framework was used for the identification of eligible subjects for two open breast cancer clinical trials. The hand-held computer also provides protocol consent forms and schemas to further help the busy oncologist. This decision support tool and the decision framework on which it is based could be used for multiple trials and different cancer sites. PMID:10579605

  15. STS-57 Pilot Duffy uses TDS soldering tool in SPACEHAB-01 aboard OV-105

    NASA Technical Reports Server (NTRS)

    1993-01-01

    STS-57 Pilot Brian J. Duffy, at a SPACEHAB-01 (Commercial Middeck Augmentation Module (CMAM)) work bench, handles a soldering tool onboard the Earth-orbiting Endeavour, Orbiter Vehicle (OV) 105. Duffy is conducting a soldering experiment (SE) which is part of the Tools and Diagnostic Systems (TDS) project. He is soldering on a printed circuit board, positioned in a specially designed holder, containing 45 connection points and will later de-solder 35 points on a similar board. TDS' sponsor is the Flight Crew Support Division, Space and Life Sciences Directorate, JSC. It represents a group of equipment selected from tools and diagnostic hardware to be supported by the Space Station program. TDS was designed to demonstrate the maintenance of experiment hardware on-orbit and to evaluate the adequacy of its design and the crew interface.

  16. Onboard utilization of ground control points for image correction. Volume 3: Ground control point simulation software design

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The software developed to simulate the ground control point navigation system is described. The Ground Control Point Simulation Program (GCPSIM) is designed as an analysis tool to predict the performance of the navigation system. The system consists of two star trackers, a global positioning system receiver, a gyro package, and a landmark tracker.

  17. Technology Integration in Science Classrooms: Framework, Principles, and Examples

    ERIC Educational Resources Information Center

    Kim, Minchi C.; Freemyer, Sarah

    2011-01-01

    A great number of technologies and tools have been developed to support science learning and teaching. However, science teachers and researchers point out numerous challenges to implementing such tools in science classrooms. For instance, guidelines, lesson plans, Web links, and tools teachers can easily find through Web-based search engines often…

  18. Open Source for Knowledge and Learning Management: Strategies beyond Tools

    ERIC Educational Resources Information Center

    Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.

    2007-01-01

    In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies.…

  19. Tools and data acquisition of borehole geophysical logging for the Florida Power and Light Company Turkey Point Power Plant in support of a groundwater, surface-water, and ecological monitoring plan, Miami-Dade County, Florida

    USGS Publications Warehouse

    Wacker, Michael A.

    2010-01-01

    Borehole geophysical logs were obtained from selected exploratory coreholes in the vicinity of the Florida Power and Light Company Turkey Point Power Plant. The geophysical logging tools used and logging sequences performed during this project are summarized herein to include borehole logging methods, descriptions of the properties measured, types of data obtained, and calibration information.

  20. Automated selection of computed tomography display parameters using neural networks

    NASA Astrophysics Data System (ADS)

    Zhang, Di; Neu, Scott; Valentino, Daniel J.

    2001-07-01

    A collection of artificial neural networks (ANN's) was trained to identify simple anatomical structures in a set of x-ray computed tomography (CT) images. These neural networks learned to associate a point in an image with the anatomical structure containing the point by using the image pixels located on the horizontal and vertical lines that ran through the point. The neural networks were integrated into a computer software tool whose function is to select an index into a list of CT window/level values from the location of the user's mouse cursor. Based upon the anatomical structure selected by the user, the software tool automatically adjusts the image display to optimally view the structure.
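
    The input construction described above, the pixel values along the horizontal and vertical lines through the selected point feeding a network whose output indexes a window/level list, can be sketched as follows. The image data, labels, network size, and window/level table are placeholders, not the trained system.

```python
# Sketch of the described input construction: concatenate the pixel row and column
# through the cursor point and classify into an index of a window/level list.
# All data and the network below are placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier

WINDOW_LEVEL = [("lung", 1500, -600), ("soft tissue", 400, 40), ("bone", 1800, 400)]

def profile_features(ct_slice, row, col):
    return np.concatenate([ct_slice[row, :], ct_slice[:, col]]).astype(float)

rng = np.random.default_rng(5)
slices = rng.integers(-1000, 1500, size=(200, 64, 64))      # toy 64x64 "CT" images
rows, cols = rng.integers(0, 64, 200), rng.integers(0, 64, 200)
X = np.array([profile_features(s, r, c) for s, r, c in zip(slices, rows, cols)])
y = rng.integers(0, len(WINDOW_LEVEL), 200)                  # placeholder structure labels

net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000).fit(X, y)
idx = net.predict(X[:1])[0]
print("selected display setting:", WINDOW_LEVEL[idx])
```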

  1. Method of casting silicon into thin sheets

    DOEpatents

    Sanjurjo, Angel; Rowcliffe, David J.; Bartlett, Robert W.

    1982-10-26

    Silicon (Si) is cast into thin shapes within a flat-bottomed graphite crucible by providing a melt of molten Si along with a relatively small amount of a molten salt, preferably NaF. The Si in the resulting melt forms a spherical pool which sinks into and is wetted by the molten salt. Under these conditions the Si will not react with any graphite to form SiC. The melt in the crucible is pressed to the desired thinness with a graphite tool, which is then held in place until the mass in the crucible has cooled below the Si melting point, at which point the Si shape can be removed.

  2. The Audible Human Project: Modeling Sound Transmission in the Lungs and Torso

    NASA Astrophysics Data System (ADS)

    Dai, Zoujun

    Auscultation has been used qualitatively by physicians for hundreds of years to aid in the monitoring and diagnosis of pulmonary diseases. Alterations in the structure and function of the pulmonary system that occur in disease or injury often give rise to measurable changes in lung sound production and transmission. Numerous acoustic measurements have revealed the differences of breath sounds and transmitted sounds in the lung under normal and pathological conditions. Compared to the extensive cataloging of lung sound measurements, the mechanism of sound transmission in the pulmonary system and how it changes with alterations of lung structural and material properties has received less attention. A better understanding of sound transmission and how it is altered by injury and disease might improve interpretation of lung sound measurements, including new lung imaging modalities that are based on an array measurement of the acoustic field on the torso surface via contact sensors or are based on a 3-dimensional measurement of the acoustic field throughout the lungs and torso using magnetic resonance elastography. A long-term goal of the Audible Human Project (AHP ) is to develop a computational acoustic model that would accurately simulate generation, transmission and noninvasive measurement of sound and vibration within the pulmonary system and torso caused by both internal (e.g. respiratory function) and external (e.g. palpation) sources. The goals of this dissertation research, fitting within the scope of the AHP, are to develop specific improved theoretical understandings, computational algorithms and experimental methods aimed at transmission and measurement. The research objectives undertaken in this dissertation are as follows. (1) Improve theoretical modeling and experimental identification of viscoelasticity in soft biological tissues. (2) Develop a poroviscoelastic model for lung tissue vibroacoustics. (3) Improve lung airway acoustics modeling and its coupling to the lung parenchyma; and (4) Develop improved techniques in array acoustic measurement on the torso surface of sound transmitted through the pulmonary system and torso. Tissue Viscoelasticity. Two experimental identification approaches of shear viscoelasticity were used. The first approach is to directly estimate the frequency-dependent surface wave speed and then to optimize the coefficients in an assumed viscoelastic model type. The second approach is to measure the complex-valued frequency response function (FRF) between the excitation location and points at known radial distances. The FRF has embedded in it frequency-dependent information about both surface wave phase speed and attenuation that can be used to directly estimate the complex shear modulus. The coefficients in an assumed viscoelastic tissue model type can then be optimized. Poroviscoelasticity Model for Lung Vibro-acoustics. A poroviscoelastic model based on Biot theory of wave propagation in porous media was used for compression waves in the lungs. This model predicts a fast compression wave speed close to the one predicted by the effective medium theory at low frequencies and an additional slow compression wave due to the out of phase motion of the air and the lung parenchyma. Both compression wave speeds vary with frequency. The fast compression wave speed and attenuation were measured on an excised pig lung under two different transpulmonary pressures. Good agreement was achieved between the experimental observation and theoretical predictions. 
Sound Transmission in Airways and Coupling to Lung Parenchyma. A computer generated airway tree was simplified to 255 segments and integrated into the lung geometry from the Visible Human Male for numerical simulations. Acoustic impedance boundary conditions were applied at the ends of the terminal segments to represent the unmodeled downstream airway segments. Experiments were also carried out on a preserved pig lung and similar trends of lung surface velocity distribution were observed between the experiments and simulations. This approach provides a feasible way of simplifying the airway tree and greatly reduces the computation time. Acoustic Measurements of Sound Transmission in Human Subjects. Scanning laser Doppler vibrometry (SLDV) was used as a gold standard for transmitted sound measurements on a human subject. A low cost piezodisk sensor array was also constructed as an alternative to SLDV. The advantages and disadvantages of each technique are discussed.
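    The second identification approach above extracts phase speed and attenuation from complex FRFs measured at two radial distances from the excitation point. The following is a minimal sketch of that idea, not the dissertation's actual code; variable names, the cylindrical-spreading correction, and the sign convention for the complex wavenumber are assumptions.

    ```python
    # Sketch: estimate surface wave phase speed and attenuation from complex FRFs
    # measured at radial distances r1 < r2 from the excitation point (illustrative).
    import numpy as np

    def wave_speed_and_attenuation(f, H1, H2, r1, r2):
        """Phase speed c(f) [m/s] and attenuation alpha(f) [Np/m] from two-point FRFs."""
        dphi = np.unwrap(np.angle(H1)) - np.unwrap(np.angle(H2))  # phase lag accumulated over dr
        dr = r2 - r1
        c = 2.0 * np.pi * f * dr / dphi
        # Assumed ~1/sqrt(r) geometric spreading for a surface wave on a half-space
        alpha = -np.log((np.abs(H2) / np.abs(H1)) * np.sqrt(r2 / r1)) / dr
        return c, alpha

    def complex_shear_modulus(f, c, alpha, rho=1000.0):
        """Complex modulus G* = rho * (omega / k*)^2 with k* = omega/c - i*alpha (assumed convention)."""
        k = 2.0 * np.pi * f / c - 1j * alpha
        return rho * (2.0 * np.pi * f / k) ** 2
    ```

    The resulting complex modulus can then be fitted by the coefficients of an assumed viscoelastic model type, as described above.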

  3. Elevation Difference and Bouguer Anomaly Analysis Tool (EDBAAT) User's Guide

    USGS Publications Warehouse

    Smittle, Aaron M.; Shoberg, Thomas G.

    2017-06-16

    This report describes a software tool that imports gravity anomaly point data from the Gravity Database of the United States (GDUS) of the National Geospatial-Intelligence Agency and University of Texas at El Paso along with elevation data from The National Map (TNM) of the U.S. Geological Survey that lie within a user-specified geographic area of interest. Further, the tool integrates these two sets of data spatially and analyzes the consistency of the elevation of each gravity station from the GDUS with TNM elevation data; it also evaluates the consistency of gravity anomaly data within the GDUS data repository. The tool bins the GDUS data based on user-defined criteria of elevation misfit between the GDUS and TNM elevation data. It also provides users with a list of points from the GDUS data, which have Bouguer anomaly values that are considered outliers (two standard deviations or greater) with respect to other nearby GDUS anomaly data. “Nearby” can be defined by the user at time of execution. These outputs should allow users to quickly and efficiently choose which points from the GDUS would be most useful in reconnaissance studies or in augmenting and extending the range of individual gravity studies.
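    The outlier rule described above (Bouguer anomaly values two standard deviations or more from nearby stations, with "nearby" user-defined) can be illustrated with a short sketch. This is not EDBAAT's actual code; the column names and search radius are assumptions.

    ```python
    # Sketch: flag gravity stations whose Bouguer anomaly deviates from nearby stations
    # by two or more standard deviations (illustrative columns: x, y, bouguer_anomaly).
    import numpy as np
    import pandas as pd

    def flag_bouguer_outliers(df, radius_m=5000.0, n_sigma=2.0):
        xy = df[["x", "y"]].to_numpy()
        bouguer = df["bouguer_anomaly"].to_numpy()
        outlier = np.zeros(len(df), dtype=bool)
        for i in range(len(df)):
            d = np.hypot(xy[:, 0] - xy[i, 0], xy[:, 1] - xy[i, 1])
            nearby = (d <= radius_m) & (d > 0)        # neighbours, excluding the station itself
            if nearby.sum() < 3:                      # too few neighbours to judge
                continue
            mu, sigma = bouguer[nearby].mean(), bouguer[nearby].std()
            outlier[i] = sigma > 0 and abs(bouguer[i] - mu) >= n_sigma * sigma
        return df.assign(is_outlier=outlier)
    ```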

  4. Multi-modal imaging, model-based tracking, and mixed reality visualisation for orthopaedic surgery

    PubMed Central

    Fuerst, Bernhard; Tateno, Keisuke; Johnson, Alex; Fotouhi, Javad; Osgood, Greg; Tombari, Federico; Navab, Nassir

    2017-01-01

    Orthopaedic surgeons are still following the decades old workflow of using dozens of two-dimensional fluoroscopic images to drill through complex 3D structures, e.g. pelvis. This Letter presents a mixed reality support system, which incorporates multi-modal data fusion and model-based surgical tool tracking for creating a mixed reality environment supporting screw placement in orthopaedic surgery. A red–green–blue–depth camera is rigidly attached to a mobile C-arm and is calibrated to the cone-beam computed tomography (CBCT) imaging space via iterative closest point algorithm. This allows real-time automatic fusion of reconstructed surface and/or 3D point clouds and synthetic fluoroscopic images obtained through CBCT imaging. An adapted 3D model-based tracking algorithm with automatic tool segmentation allows for tracking of the surgical tools occluded by hand. This proposed interactive 3D mixed reality environment provides an intuitive understanding of the surgical site and supports surgeons in quickly localising the entry point and orienting the surgical tool during screw placement. The authors validate the augmentation by measuring target registration error and also evaluate the tracking accuracy in the presence of partial occlusion. PMID:29184659
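    The calibration step above relies on the iterative closest point (ICP) algorithm. The sketch below shows one generic ICP iteration (nearest-neighbour correspondences followed by an SVD-based rigid fit); it is illustrative only and is not the authors' calibration pipeline.

    ```python
    # Minimal sketch of one ICP iteration: nearest-neighbour matching plus a
    # Kabsch (SVD) rigid-transform fit between two 3D point sets.
    import numpy as np
    from scipy.spatial import cKDTree

    def icp_step(source, target):
        """source: (N,3), target: (M,3). Returns R, t and the transformed source."""
        tree = cKDTree(target)
        _, idx = tree.query(source)              # closest target point for each source point
        matched = target[idx]
        mu_s, mu_t = source.mean(axis=0), matched.mean(axis=0)
        H = (source - mu_s).T @ (matched - mu_t) # cross-covariance of centred point sets
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                 # guard against reflections
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        return R, t, source @ R.T + t            # feed the transformed points into the next iteration
    ```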

  5. Toxic Release Inventory (TRI) (2017 EIC)

    EPA Pesticide Factsheets

    Focusing on air releases, explore tried and true access points along with new ways to access the data, including the new P2 tool (currently available) and the TRI Analyzer tool (scheduled to go public in summer 2015).

  6. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  7. Development of Way Point Planning Tool in Response to NASA Field Campaign Challenges

    NASA Astrophysics Data System (ADS)

    He, M.; Hardin, D. M.; Conover, H.; Graves, S. J.; Meyer, P.; Blakeslee, R. J.; Goodman, M. L.

    2012-12-01

    Airborne real-time observations are a major component of NASA's Earth Science research and satellite ground validation studies. For mission scientists, planning a research aircraft mission within the context of meeting the science objectives is a complex task because it requires real-time situational awareness of the weather conditions that affect the aircraft track. Multiple aircraft are often involved in NASA field campaigns. The coordination of the aircraft with satellite overpasses, other airplanes and the constantly evolving, dynamic weather conditions often determines the success of the campaign. A flight planning tool is needed to provide situational awareness information to the mission scientists, and help them plan and modify the flight tracks. Scientists at the University of Alabama-Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool, an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints) with point-and-click mouse capabilities on a digital map filled with real-time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analysis during and after each campaign helped identify both issues and new requirements, and initiated the next wave of development. Currently the Waypoint Planning Tool has gone through three rounds of development and analysis processes. The development of this waypoint tool is directly affected by advances in GIS/mapping technologies. From the standalone Google Earth application and simple KML functionalities, to the Google Earth Plugin and Java Web Start/Applet on the web platform, and to the rising open source GIS tools with new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. The newly developed, cross-platform, modularly designed, JavaScript-controlled Waypoint Planning Tool is planned to be integrated with the NASA Airborne Science Mission Tool Suite. Adapting new technologies for the Waypoint Planning Tool ensures its success in helping scientists reach their mission objectives. This presentation will discuss the development processes of the Waypoint Planning Tool in responding to field campaign challenges, identify new information technologies, and describe the capabilities and features of the Waypoint Planning Tool, including its real-time aspect, interactive nature, and the resultant benefits to the airborne science community.

  8. Reliability of a Market Basket Assessment Tool (MBAT) for Use in SNAP-Ed Healthy Retail Initiatives.

    PubMed

    Misyak, Sarah A; Hedrick, Valisa E; Pudney, Ellen; Serrano, Elena L; Farris, Alisha R

    2018-05-01

    To evaluate the reliability of the Market Basket Assessment Tool (MBAT) for assessing the availability of fruits and vegetables, low-fat or nonfat dairy and eggs, lean meats, whole-grain products, and seeds, beans, and nuts in Supplemental Nutrition Assistance Program-authorized retail environments. Different trained raters used the MBAT simultaneously at 14 retail environments to measure interrater reliability. Raters returned to 12 retail environments (85.7%) 1 week later to measure test-retest reliability. Data were analyzed using paired-sample t tests and correlations. No significant differences were found for interrater reliability (mean differences for individual categories, 0.0 to 0.3 ± 0.2 points; total score, 0.5 ± 0.4 points) or for test-retest reliability (mean differences for individual categories, 0.0 to 0.3 ± 0.3 points; total score, 0.8 ± 0.4 points). Future steps include validation of the MBAT. A low-burden tool can facilitate evaluation of efforts to promote healthful foods in retail environments. Copyright © 2018 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
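    The reliability analysis described above (paired-sample t tests and correlations between raters) can be reproduced with standard statistical routines. The sketch below uses hypothetical score vectors, not the study's data.

    ```python
    # Sketch: interrater reliability via a paired-sample t-test and Pearson correlation
    # on two raters' total MBAT scores (hypothetical data for 14 retail environments).
    import numpy as np
    from scipy import stats

    rater_a = np.array([12, 15, 9, 14, 11, 13, 10, 16, 12, 14, 13, 15, 11, 12], float)
    rater_b = np.array([12, 14, 9, 15, 11, 13, 10, 16, 13, 14, 13, 15, 11, 12], float)

    t, p = stats.ttest_rel(rater_a, rater_b)     # paired t-test on total scores
    r, p_r = stats.pearsonr(rater_a, rater_b)    # inter-rater correlation
    print(f"mean difference = {np.mean(rater_a - rater_b):.2f}, "
          f"t = {t:.2f}, p = {p:.3f}, r = {r:.2f}")
    ```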

  9. Evaluation of an inpatient fall risk screening tool to identify the most critical fall risk factors in inpatients.

    PubMed

    Hou, Wen-Hsuan; Kang, Chun-Mei; Ho, Mu-Hsing; Kuo, Jessie Ming-Chuan; Chen, Hsiao-Lien; Chang, Wen-Yin

    2017-03-01

    To evaluate the accuracy of the inpatient fall risk screening tool and to identify the most critical fall risk factors in inpatients. Variations exist in several screening tools applied in acute care hospitals for examining risk factors for falls and identifying high-risk inpatients. Secondary data analysis. A subset of inpatient data for the period from June 2011-June 2014 was extracted from the nursing information system and adverse event reporting system of an 818-bed teaching medical centre in Taipei. Data were analysed using descriptive statistics, receiver operating characteristic curve analysis and logistic regression analysis. During the study period, 205 fallers and 37,232 nonfallers were identified. The results revealed that the inpatient fall risk screening tool (cut-off point of ≥3) had a low sensitivity level (60%), satisfactory specificity (87%), a positive predictive value of 2·0% and a negative predictive value of 99%. The receiver operating characteristic curve analysis revealed an area under the curve of 0·805 (sensitivity, 71·8%; specificity, 78%). To increase the sensitivity values, the Youden index suggests at least 1·5 points to be the most suitable cut-off point for the inpatient fall risk screening tool. Multivariate logistic regression analysis revealed a considerably increased fall risk in patients with impaired balance and impaired elimination. The fall risk factor was also significantly associated with days of hospital stay and with admission to surgical wards. The findings can raise awareness about the two most critical risk factors for falls among future clinical nurses and other healthcare professionals and thus facilitate the development of fall prevention interventions. This study highlights the needs for redefining the cut-off points of the inpatient fall risk screening tool to effectively identify inpatients at a high risk of falls. Furthermore, inpatients with impaired balance and impaired elimination should be closely monitored by nurses to prevent falling during hospitalisations. © 2016 John Wiley & Sons Ltd.
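    The cut-off point analysis above uses the Youden index to trade off sensitivity against specificity along the ROC curve. A minimal sketch of that calculation follows, with simulated scores standing in for the inpatient data.

    ```python
    # Sketch: pick a screening cut-off by maximizing Youden's J = sensitivity + specificity - 1.
    # The fall outcomes and risk scores below are simulated, not the study's data.
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(0)
    fell = rng.integers(0, 2, 500)                                   # 1 = faller, 0 = non-faller
    score = fell * rng.normal(3.5, 1.5, 500) + (1 - fell) * rng.normal(1.5, 1.5, 500)

    fpr, tpr, thresholds = roc_curve(fell, score)
    j = tpr - fpr                                                    # Youden's J at each threshold
    best = np.argmax(j)
    print(f"AUC = {roc_auc_score(fell, score):.3f}, "
          f"best cut-off = {thresholds[best]:.2f} "
          f"(sens = {tpr[best]:.2f}, spec = {1 - fpr[best]:.2f})")
    ```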

  10. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    NASA Astrophysics Data System (ADS)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. Then, discontinuity set orientation is calculated using Kernel Density Estimation and principal vector similarity criteria. Poles to points are assigned to individual discontinuity objects using easy custom vector clustering and Jaccard distance approaches, and each object is segmented into planar clusters using an improved version of the DBSCAN algorithm. Modal set orientations are then recomputed by cluster-based orientation statistics to avoid the effects of biases related to cluster size and density heterogeneity of the point cloud. Finally, spacing values are measured between individual discontinuity clusters along scanlines parallel to modal pole vectors, whereas individual feature size (persistence) is measured using 3D convex hull bounding boxes. Spacing and size are provided both as raw population data and as summary statistics. The tool is optimized for parallel computing on 64bit systems, and a Graphic User Interface (GUI) has been developed to manage data processing, provide several outputs, including reclassified point clouds, tables, plots, derived fracture intensity parameters, and export to modelling software tools. We present test applications performed both on synthetic 3D data (simple 3D solids) and real case studies, validating the results with existing geomechanical datasets.
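    The first processing step described above, identifying locally coplanar points from K-Nearest Neighbor and Principal Component Analysis, can be sketched as follows. This is an illustrative NumPy sketch, not the authors' Matlab tool; the neighbourhood size and planarity measure are assumptions.

    ```python
    # Sketch: estimate a local facet normal at each point by PCA over its k nearest neighbours.
    import numpy as np
    from scipy.spatial import cKDTree

    def facet_normals(points, k=20):
        """points: (N,3) array. Returns (N,3) unit normals and a planarity score per point."""
        tree = cKDTree(points)
        _, idx = tree.query(points, k=k)
        normals = np.empty_like(points)
        planarity = np.empty(len(points))
        for i, nb in enumerate(idx):
            nbrs = points[nb] - points[nb].mean(axis=0)
            # Eigenvector of the smallest covariance eigenvalue approximates the facet normal
            w, v = np.linalg.eigh(nbrs.T @ nbrs / len(nb))
            normals[i] = v[:, 0]
            planarity[i] = 1.0 - w[0] / w.sum()   # close to 1 for well-defined planar patches
        return normals, planarity
    ```

    Grouping these normals by orientation (e.g. with kernel density estimation and clustering, as the abstract describes) then yields candidate discontinuity sets.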

  11. Alarms Philosophy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Karen S; Kasemir, Kay

    2009-01-01

    An effective alarm system consists of a mechanism to monitor control points and generate alarm notifications, tools for operators to view, hear, acknowledge and handle alarms, and a good configuration. Despite the availability of numerous fully featured tools, accelerator alarm systems continue to be disappointing to operations, frequently to the point of alarms being permanently silenced or totally ignored. This is often due to configurations that produce an excessive number of alarms or fail to communicate the required operator response. Most accelerator controls systems do a good job of monitoring specified points and generating notifications when parameters exceed predefined limits. In some cases, improved tools can help, but more often, poor configuration is the root cause of ineffective alarm systems. At SNS, we have invested considerable effort in generating appropriate configurations using a rigorous set of rules based on best practices in the industrial process controls community. This paper will discuss our alarm configuration philosophy and operator response to our new system.

  12. Optimal Number and Allocation of Data Collection Points for Linear Spline Growth Curve Modeling: A Search for Efficient Designs

    ERIC Educational Resources Information Center

    Wu, Wei; Jia, Fan; Kinai, Richard; Little, Todd D.

    2017-01-01

    Spline growth modelling is a popular tool to model change processes with distinct phases and change points in longitudinal studies. Focusing on linear spline growth models with two phases and a fixed change point (the transition point from one phase to the other), we detail how to find optimal data collection designs that maximize the efficiency…

  13. Higher-order gravity in higher dimensions: geometrical origins of four-dimensional cosmology?

    NASA Astrophysics Data System (ADS)

    Troisi, Antonio

    2017-03-01

    Determining the cosmological field equations is still very much debated and has led to a wide discussion around different theoretical proposals. A suitable conceptual scheme could be represented by gravity models that naturally generalize Einstein theory, like higher-order gravity theories and higher-dimensional ones. Both of these two different approaches allow one to define, at the effective level, Einstein field equations equipped with source-like energy-momentum tensors of geometrical origin. In this paper, the possibility is discussed to develop a five-dimensional fourth-order gravity model whose lower-dimensional reduction could provide an interpretation of cosmological four-dimensional matter-energy components. We describe the basic concepts of the model, the complete field equations formalism and the 5-D to 4-D reduction procedure. Five-dimensional f(R) field equations turn out to be equivalent, on the four-dimensional hypersurfaces orthogonal to the extra coordinate, to an Einstein-like cosmological model with three matter-energy tensors related with higher derivative and higher-dimensional counter-terms. By considering the gravity model with f(R) = f_0 R^n, the possibility is investigated to obtain five-dimensional power-law solutions. The effective four-dimensional picture and the behaviour of the geometrically induced sources are finally outlined in correspondence to simple cases of such higher-dimensional solutions.

  14. Nonlinear damage identification of breathing cracks in Truss system

    NASA Astrophysics Data System (ADS)

    Zhao, Jie; DeSmidt, Hans

    2014-03-01

    Breathing cracks in a truss system are detected by a Frequency Response Function (FRF) based damage identification method. This method utilizes damage-induced changes of frequency response functions to estimate the severity and location of structural damage. This approach enables the possibility of arbitrary interrogation frequency and multiple inputs/outputs, which greatly enriches the dataset for damage identification. The dynamical model of the truss system is built using the finite element method and the crack model is based on fracture mechanics. Since the crack is driven by the tensional and compressive forces of the truss member, only one damage parameter is needed to represent the stiffness reduction of each truss member. Assuming that the crack constantly breathes with the exciting frequency, the linear damage detection algorithm is developed in the frequency/time domain using Least Squares and Newton-Raphson methods. Then, the dynamic response of the truss system with breathing cracks is simulated in the time domain, and meanwhile the crack breathing status for each member is determined by feedback from the real-time displacements of the member's nodes. Harmonic Fourier Coefficients (HFCs) of the dynamical response are computed by processing the data through convolution and moving average filters. Finally, the results show the effectiveness of the linear damage detection algorithm in identifying the nonlinear breathing cracks using different combinations of HFCs and sensors.
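    The Harmonic Fourier Coefficients used above are simply the complex Fourier coefficients of the steady-state response at integer multiples of the excitation frequency. A minimal sketch follows; the signal, sampling and implementation details are illustrative and not taken from the paper.

    ```python
    # Sketch: extract Harmonic Fourier Coefficients (HFCs) of a steady-state response
    # at multiples of the excitation frequency (illustrative implementation).
    import numpy as np

    def harmonic_fourier_coefficients(x, t, f_exc, n_harmonics=3):
        """Complex HFCs c_n = (2/T) * integral of x(t)*exp(-i*2*pi*n*f_exc*t) over whole periods."""
        T = t[-1] - t[0]
        dt = t[1] - t[0]
        coeffs = []
        for n in range(1, n_harmonics + 1):
            c = 2.0 / T * np.sum(x * np.exp(-2j * np.pi * n * f_exc * t)) * dt
            coeffs.append(c)
        return np.array(coeffs)

    # Example: response with a small second harmonic, as a breathing crack would produce
    t = np.linspace(0.0, 1.0, 10000)
    x = 1.0 * np.sin(2 * np.pi * 20 * t) + 0.1 * np.sin(2 * np.pi * 40 * t)
    print(np.abs(harmonic_fourier_coefficients(x, t, 20.0, 2)))   # approximately [1.0, 0.1]
    ```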

  15. An Investigation Into the Effects of Frequency Response Function Estimators on Model Updating

    NASA Astrophysics Data System (ADS)

    Ratcliffe, M. J.; Lieven, N. A. J.

    1999-03-01

    Model updating is a very active research field, in which significant effort has been invested in recent years. Model updating methodologies are invariably successful when used on noise-free simulated data, but tend to be unpredictable when presented with real experimental data that are, unavoidably, corrupted with uncorrelated noise content. In the development and validation of model-updating strategies, a random zero-mean Gaussian variable is added to simulated test data to tax the updating routines more fully. This paper proposes a more sophisticated model for experimental measurement noise, and this is used in conjunction with several different frequency response function estimators, from the classical H1 and H2 to more refined estimators that purport to be unbiased. Finite-element model case studies, in conjunction with a genuine experimental test, suggest that the proposed noise model is a more realistic representation of experimental noise phenomena. The choice of estimator is shown to have a significant influence on the viability of the FRF sensitivity method. These test cases find that the use of the H2 estimator for model updating purposes is contraindicated, and that there is no advantage to be gained by using the sophisticated estimators over the classical H1 estimator.
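    For reference, the classical estimators discussed above are H1 = Gxy/Gxx and H2 = Gyy/Gyx, formed from auto- and cross-spectral density estimates. The sketch below shows one common way to compute them from sampled excitation and response records; the signals and segment length are placeholders, and the sign/conjugation convention follows SciPy's cross-spectral density.

    ```python
    # Sketch: classical H1 and H2 FRF estimators from Welch auto- and cross-spectra.
    import numpy as np
    from scipy import signal

    def frf_h1_h2(x, y, fs, nperseg=1024):
        f, Gxx = signal.welch(x, fs, nperseg=nperseg)      # input autospectrum
        _, Gyy = signal.welch(y, fs, nperseg=nperseg)      # output autospectrum
        _, Gxy = signal.csd(x, y, fs, nperseg=nperseg)     # cross-spectrum of input and output
        H1 = Gxy / Gxx                                     # least biased by output noise
        H2 = Gyy / np.conj(Gxy)                            # Gyx = conj(Gxy); least biased by input noise
        coherence = np.abs(Gxy) ** 2 / (Gxx * Gyy)         # quality indicator for the estimates
        return f, H1, H2, coherence
    ```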

  16. Orion MPCV Service Module Avionics Ring Pallet Testing, Correlation, and Analysis

    NASA Technical Reports Server (NTRS)

    Staab, Lucas; Akers, James; Suarez, Vicente; Jones, Trevor

    2012-01-01

    The NASA Orion Multi-Purpose Crew Vehicle (MPCV) is being designed to replace the Space Shuttle as the main manned spacecraft for the agency. Based on the predicted environments in the Service Module avionics ring, an isolation system was deemed necessary to protect the avionics packages carried by the spacecraft. Impact, sinusoidal, and random vibration testing was conducted on a prototype Orion Service Module avionics pallet in March 2010 at the NASA Glenn Research Center Structural Dynamics Laboratory (SDL). The pallet design utilized wire rope isolators to reduce the vibration levels seen by the avionics packages. The current pallet design utilizes the same wire rope isolators (M6-120-10) that were tested in March 2010. In an effort to save cost and schedule, the Finite Element Models of the prototype pallet tested in March 2010 were correlated. Frequency Response Function (FRF) comparisons, mode shapes, and frequencies were all part of the correlation process. The non-linear behavior and the modeling of the wire rope isolators proved to be the most difficult part of the correlation process. The correlated models of the wire rope isolators were taken from the prototype design and integrated into the current design for future frequency response analysis and component environment specification.

  17. An active structural acoustic control approach for the reduction of the structure-borne road noise

    NASA Astrophysics Data System (ADS)

    Douville, Hugo; Berry, Alain; Masson, Patrice

    2002-11-01

    The reduction of the structure-borne road noise generated inside the cabin of an automobile is investigated using an Active Structural Acoustic Control (ASAC) approach. First, a laboratory test bench consisting of a wheel/suspension/lower suspension A-arm assembly has been developed in order to identify the vibroacoustic transfer paths (up to 250 Hz) for realistic road noise excitation of the wheel. Frequency Response Function (FRF) measurements between the excitation/control actuators and each suspension/chassis linkage are used to characterize the different transfer paths that transmit energy through the chassis of the car. Second, a FE/BE model (Finite/Boundary Elements) was developed to simulate the acoustic field of an automobile cab interior. This model is used to predict the acoustic field inside the cabin as a response to the measured forces applied on the suspension/chassis linkages. Finally, an experimental implementation of ASAC is presented. The control approach relies on the use of inertial actuators to modify the vibration behavior of the suspension and the automotive chassis such that its noise radiation efficiency is decreased. The implemented algorithm consists of a MIMO (Multiple-Input-Multiple-Output) feedforward configuration with a filtered-X LMS algorithm using an advanced reference signal (with FIR filters), implemented in the Simulink/dSPACE environment for control prototyping.
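    The controller above is a MIMO filtered-X LMS scheme. The sketch below is a deliberately simplified single-channel version to show the core update rule; the secondary-path estimate, signal names and step size are assumptions, and for brevity the same FIR model is used as both the true and the estimated secondary path.

    ```python
    # Single-channel filtered-x LMS sketch (illustrative, not the paper's MIMO controller).
    import numpy as np

    def fxlms(x, d, s_hat, n_taps=64, mu=1e-3):
        """x: advanced reference, d: disturbance at error sensor, s_hat: secondary-path FIR estimate."""
        w = np.zeros(n_taps)                       # adaptive control filter
        x_buf = np.zeros(n_taps)                   # reference history for the control filter
        xf = np.convolve(x, s_hat)[: len(x)]       # filtered-x: reference through secondary-path model
        xf_buf = np.zeros(n_taps)
        e = np.zeros(len(x))
        y_hist = np.zeros(len(s_hat))              # control output history through the (assumed) plant
        for n in range(len(x)):
            x_buf = np.roll(x_buf, 1); x_buf[0] = x[n]
            xf_buf = np.roll(xf_buf, 1); xf_buf[0] = xf[n]
            y = w @ x_buf                          # control signal
            y_hist = np.roll(y_hist, 1); y_hist[0] = y
            e[n] = d[n] + y_hist @ s_hat           # error = disturbance + control through secondary path
            w -= mu * e[n] * xf_buf                # LMS update with the filtered reference
        return w, e
    ```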

  18. Toxic Release Inventory Training Course (TRI) (2015 EIC)

    EPA Pesticide Factsheets

    Focusing on air releases, explore tried and true access points along with new ways to access the data, including the new P2 tool (currently available) and the TRI Analyzer tool (scheduled to go public in summer 2015).

  19. Storytelling: a leadership and educational tool.

    PubMed

    Kowalski, Karren

    2015-06-01

    A powerful tool that leaders and educators can use to engage listeners, both staff and learners, is storytelling. Stories demonstrate important points, valuable lessons, and the behaviors that are preferred by the leader. Copyright 2015, SLACK Incorporated.

  20. Effect of tool geometry and cutting parameters on delamination and thrust forces in drilling CFRP/Al-Li

    NASA Astrophysics Data System (ADS)

    El Bouami, Souhail; Habak, Malek; Franz, Gérald; Velasco, Raphaël; Vantomme, Pascal

    2016-10-01

    Composite materials are increasingly used for structural parts in the aeronautic industries. Carbon Fiber-Reinforced Plastics (CFRP) are often used in combination with metallic materials, mostly aluminium alloys. This raises new problems in aircraft assembly, and delamination is one of these problems. In this study, CFRP/Al-Li stacks were used as the experimental material to investigate the effect of the interaction of cutting parameters (cutting speed and feed rate) and tool geometry on delamination and thrust forces in drilling operations. A plan of experiments, based on the Taguchi design method, was employed to investigate the influence of tool geometry, in particular the point angle, and cutting parameters on delamination and axial force. The experimental results demonstrate that the feed rate is the major parameter, and the importance of the tool point angle for delamination and thrust forces in the stacks was shown.

  1. Slip-Sliding-Away: A Review of the Literature on the Constraining Qualities of PowerPoint

    ERIC Educational Resources Information Center

    Kernbach, Sebastian; Bresciani, Sabrina; Eppler, Martin J.

    2015-01-01

    PowerPoint is a dominant communication tool in business and education. It allows for creating professional-looking presentations easily, but without understanding its constraining qualities it can be used inappropriately. Therefore we conducted a systematic literature review structuring the literature on PowerPoint in three chronological phases…

  2. A guide for recording esthetic and biologic changes with photographs

    Treesearch

    Arthur W. Magill; R.H. Twiss

    1965-01-01

    Photography has long been a useful tool for recording and analyzing environmental conditions. Permanent camera points can be established to help detect and analyze changes in the esthetics and ecology of wildland resources. This note describes the usefulness of permanent camera points and outlines procedures for establishing points and recording data.

  3. Teach Graphic Design Basics with PowerPoint

    ERIC Educational Resources Information Center

    Lazaros, Edward J.; Spotts, Thomas H.

    2007-01-01

    While PowerPoint is generally regarded as simply software for creating slide presentations, it includes often overlooked--but powerful--drawing tools. Because it is part of the Microsoft Office package, PowerPoint comes preloaded on many computers and thus is already available in many classrooms. Since most computers are not preloaded with good…

  4. Investigating the tool marks on oracle bones inscriptions from the Yinxu site (ca., 1319-1046 BC), Henan province, China.

    PubMed

    Zhao, Xiaolong; Tang, Jigen; Gu, Zhou; Shi, Jilong; Yang, Yimin; Wang, Changsui

    2016-09-01

    Oracle Bone Inscriptions from the Shang dynasty (1600-1046 BC) are the earliest well-developed writing forms of the Chinese character system, and their carving techniques have not been studied by tool mark analysis with microscopy. In this study, a digital microscope with three-dimensional surface reconstruction based on extended depth of focus technology was used to investigate tool marks on the surface of four pieces of oracle bones excavated at the eastern area of Huayuanzhuang, Yinxu site (ca. 1319-1046 BC), the last capital of the Shang dynasty, Henan province, China. The results show that there were two procedures to carve the characters on the analyzed tortoise shells. The first procedure was direct carving. The second was "outlining design," which means to engrave a formal character after engraving a draft with a pointed tool. Most of the strokes developed by an engraver do not overlap the smaller draft, which implies that the outlining design would be a sound way to avoid errors such as wrong and missing characters. The strokes of these characters have different shapes at the two ends and variations in the width and depth of the grooves. Moreover, the bottom of the grooves is always rugged. Thus, the use of rotary wheel-cutting tools could be ruled out. In most cases, the starting points of the strokes are round or flat while the finishing points are always pointed. Moreover, the strokes appear to have been engraved from top to bottom. When vertical or horizontal strokes had been engraved, the shell would be turned about 90 degrees to engrave the crossed strokes from top to bottom. There was no preferred order for engraving vertical or horizontal strokes. Since both sides of the grooves of the characters are neat and there are no unorganized tool marks, it is suggested that some sharp tools were used for engraving characters on the shells. Microsc. Res. Tech. 79:827-832, 2016. © 2016 Wiley Periodicals, Inc.

  5. A numerical analysis on forming limits during spiral and concentric single point incremental forming

    NASA Astrophysics Data System (ADS)

    Gipiela, M. L.; Amauri, V.; Nikhare, C.; Marcondes, P. V. P.

    2017-01-01

    Sheet metal forming is one of the major manufacturing industries, producing numerous parts for the aerospace, automotive and medical industries. Due to the high demand in the vehicle industry and environmental regulations on fuel consumption, researchers are innovating new methods to build these parts with energy-efficient sheet metal forming processes, instead of the conventionally used punch and die, to achieve lightweight parts. One of the most recognized manufacturing processes in this category is Single Point Incremental Forming (SPIF). SPIF is a die-less sheet metal forming process in which a single point tool incrementally forces any single point of the sheet metal into the plastic deformation zone at any process time. In the present work, the finite element method (FEM) is applied to analyze the forming limits of a high strength low alloy steel formed by single point incremental forming (SPIF) with spiral and concentric tool paths. SPIF numerical simulations were modeled with 24 and 29 mm cup depths, and the results were compared with Nakajima results obtained by experiments and FEM. It was found that the cup formed with the Nakajima tool failed at 24 mm, while cups formed by SPIF surpassed the limit for both depths with both profiles. It was also noticed that the strains achieved in the concentric profile are lower than those in the spiral profile.

  6. Technology: Presentations in the Cloud with a Twist

    ERIC Educational Resources Information Center

    Siegle, Del

    2011-01-01

    Technology tools have come a long way from early word processing applications and opportunities for students to engage in simple programming. Many tools now exist for students to develop and share products in a variety of formats and for a wide range of audiences. PowerPoint is probably the most ubiquitously used tool for student projects. In…

  7. Effects of Attitudes and Behaviours on Learning Mathematics with Computer Tools

    ERIC Educational Resources Information Center

    Reed, Helen C.; Drijvers, Paul; Kirschner, Paul A.

    2010-01-01

    This mixed-methods study investigates the effects of student attitudes and behaviours on the outcomes of learning mathematics with computer tools. A computer tool was used to help students develop the mathematical concept of function. In the whole sample (N = 521), student attitudes could account for a 3.4 point difference in test scores between…

  8. Habitat classification modeling with incomplete data: Pushing the habitat envelope

    USGS Publications Warehouse

    Zarnetske, P.L.; Edwards, T.C.; Moisen, Gretchen G.

    2007-01-01

    Habitat classification models (HCMs) are invaluable tools for species conservation, land-use planning, reserve design, and metapopulation assessments, particularly at broad spatial scales. However, species occurrence data are often lacking and typically limited to presence points at broad scales. This lack of absence data precludes the use of many statistical techniques for HCMs. One option is to generate pseudo-absence points so that the many available statistical modeling tools can be used. Traditional techniques generate pseudo-absence points at random across broadly defined species ranges, often failing to include biological knowledge concerning the species-habitat relationship. We incorporated biological knowledge of the species-habitat relationship into pseudo-absence points by creating habitat envelopes that constrain the region from which points were randomly selected. We define a habitat envelope as an ecological representation of a species, or species feature's (e.g., nest), observed distribution (i.e., realized niche) based on a single attribute, or the spatial intersection of multiple attributes. We created HCMs for Northern Goshawk (Accipiter gentilis atricapillus) nest habitat during the breeding season across Utah forests with extant nest presence points and ecologically based pseudo-absence points using logistic regression. Predictor variables were derived from 30-m USDA Landfire and 250-m Forest Inventory and Analysis (FIA) map products. These habitat-envelope-based models were then compared to null envelope models which use traditional practices for generating pseudo-absences. Models were assessed for fit and predictive capability using metrics such as kappa, threshold-independent receiver operating characteristic (ROC) plots, adjusted deviance (Dadj²), and cross-validation, and were also assessed for ecological relevance. For all cases, habitat-envelope-based models outperformed null envelope models and were more ecologically relevant, suggesting that incorporating biological knowledge into pseudo-absence point generation is a powerful tool for species habitat assessments. Furthermore, given some a priori knowledge of the species-habitat relationship, ecologically based pseudo-absence points can be applied to any species, ecosystem, data resolution, and spatial extent. © 2007 by the Ecological Society of America.
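    The envelope-constrained pseudo-absence idea above can be sketched in a few lines: draw candidate absence points only where a simple single-attribute envelope around the presences is satisfied, then fit the logistic regression HCM. The column names, envelope rule and predictor set below are illustrative assumptions, not the study's variables.

    ```python
    # Sketch: habitat-envelope pseudo-absences plus a logistic regression habitat model.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    def fit_hcm(presences: pd.DataFrame, candidates: pd.DataFrame, n_absence: int, seed=0):
        lo, hi = presences["elevation"].quantile([0.05, 0.95])        # single-attribute envelope
        envelope = candidates[(candidates["elevation"] >= lo) & (candidates["elevation"] <= hi)]
        pseudo_abs = envelope.sample(n_absence, random_state=seed)    # constrained pseudo-absences

        predictors = ["elevation", "canopy_cover", "slope"]
        X = pd.concat([presences[predictors], pseudo_abs[predictors]])
        y = np.r_[np.ones(len(presences)), np.zeros(len(pseudo_abs))]
        return LogisticRegression(max_iter=1000).fit(X, y)
    ```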

  9. Calysto: Risk Management for Commercial Manned Spaceflight

    NASA Technical Reports Server (NTRS)

    Dillaman, Gary

    2012-01-01

    The Calysto: Risk Management for Commercial Manned Spaceflight study analyzes risk management in large enterprises and how to effectively communicate risks across organizations. The Calysto Risk Management tool developed by NASA Kennedy Space Center's SharePoint team is used and referenced throughout the study. Calysto is a web-based tool built on Microsoft's SharePoint platform. The risk management process at NASA is examined and incorporated in the study. Using risk management standards from industry and specific organizations at the Kennedy Space Center, three methods of communicating and elevating risk are examined. Each method describes details of the effectiveness and plausibility of using the method in the Calysto Risk Management Tool. At the end of the study suggestions are made for future renditions of Calysto.

  10. Orangutans (Pongo pygmaeus) and bonobos (Pan paniscus) point to inform a human about the location of a tool.

    PubMed

    Zimmermann, Felizitas; Zemke, Franziska; Call, Josep; Gómez, Juan Carlos

    2009-03-01

    Although pointing is not part of great apes' natural gestural repertoire, they can learn to point to food, in order to request it. To assess the flexibility with which they can use this gesture, one can vary the potential referent of the point. In two previous studies, three orangutans (two of them human-reared) have shown the ability to point to the location of a tool which a human experimenter needed in order to give them food. Here, we tested six orangutans and five bonobos using a set-up in which our subjects had to guide a human experimenter to the hiding place of a fork which was needed in order to retrieve a piece of food for the subject out of a vertical tube. We further examined the potential role of a competitive/deceptive context by varying the identity of the person responsible for hiding the tool. In addition, we implemented three different control conditions in which an object was hidden but it was not necessary to indicate its location to get the food. We found that the majority of subjects spontaneously guided the experimenter to the hiding place of the fork by pointing to it when it was necessary and they did so significantly less in control conditions. We did not find an effect of the person hiding the fork. Our results show that mother-reared orangutans and bonobos are able to point to inform a human about the location of an object that the human needs to procure food for the subject and that they can take into account whether it is relevant or not to do so.

  11. The Molecular Structure of Penicillin

    ERIC Educational Resources Information Center

    Bentley, Ronald

    2004-01-01

    Overviews of the observations that constitute a structure proof for penicillin, specifically aimed at the general student population, are presented. Melting points and boiling points were criteria of purity and a crucial tool was microanalysis leading to empirical formulas.

  12. Point of care information services: a platform for self-directed continuing medical education for front line decision makers

    PubMed Central

    Moja, Lorenzo; Kwag, Koren Hyogene

    2015-01-01

    The structure and aim of continuing medical education (CME) are shifting from the passive transmission of knowledge to a competency-based model focused on professional development. Self-directed learning is emerging as the foremost educational method for advancing competency-based CME. In a field marked by the constant expansion of knowledge, self-directed learning allows physicians to tailor their learning strategy to meet the information needs of practice. Point of care information services are innovative tools that provide health professionals with digested evidence at the front line to guide decision making. By mobilising self-directed learning to meet the information needs of clinicians at the bedside, point of care information services represent a promising platform for competency-based CME. Several points, however, must be considered to enhance the accessibility and development of these tools to improve competency-based CME and the quality of care. PMID:25655251

  13. Application of a multi-beam vibrometer on industrial components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bendel, Karl

    2014-05-27

    Laser Doppler vibrometry is a well proven tool for the non-contact measurement of vibration. Scanning several measurement points makes it possible to visualize the deflection shape of the component, ideally a 3D operating deflection shape if a 3-D scanner is applied. Measuring the points sequentially, however, requires stationary behavior during the measurement time. This cannot be guaranteed for many real objects. Therefore, a multipoint laser Doppler vibrometer has been developed by Polytec and the University of Stuttgart, with Bosch as industrial partner. A short description of the measurement system is given. Applications for the parallel measurement of the vibration of several points are shown for non-stationary vibrating Bosch components such as power tools or valves.

  14. Knee Arthroscopy Simulation: A Randomized Controlled Trial Evaluating the Effectiveness of the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) Tool.

    PubMed

    Bhattacharyya, Rahul; Davidson, Donald J; Sugand, Kapil; Bartlett, Matthew J; Bhattacharya, Rajarshi; Gupte, Chinmay M

    2017-10-04

    Virtual-reality and cadaveric simulations are expensive and not readily accessible. Innovative and accessible training adjuncts are required to help to meet training needs. Cognitive task analysis has been used extensively to train pilots and in other surgical specialties. However, the use of cognitive task analyses within orthopaedics is in its infancy. The purpose of this study was to evaluate the effectiveness of a novel cognitive task analysis tool to train novice surgeons in diagnostic knee arthroscopy in high-fidelity, phantom-limb simulation. Three expert knee surgeons were interviewed independently to generate a list of technical steps, decision points, and errors for diagnostic knee arthroscopy. A modified Delphi technique was used to generate the final cognitive task analysis. A video and a voiceover were recorded for each phase of this procedure. These were combined to produce the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) tool that utilizes written and audiovisual stimuli to describe each phase of a diagnostic knee arthroscopy. In this double-blinded, randomized controlled trial, a power calculation was performed prior to recruitment. Sixteen novice orthopaedic trainees who performed ≤10 diagnostic knee arthroscopies were randomized into 2 equal groups. The intervention group (IKACTA group) was given the IKACTA tool and the control group had no additional learning material. They were assessed objectively (validated Arthroscopic Surgical Skill Evaluation Tool [ASSET] global rating scale) on a high-fidelity, phantom-knee simulator. All participants, using the Likert rating scale, subjectively rated the tool. The mean ASSET score (and standard deviation) was 19.5 ± 3.7 points in the IKACTA group and 10.6 ± 2.3 points in the control group, resulting in an improvement of 8.9 points (95% confidence interval, 7.6 to 10.1 points; p = 0.002); the score was determined as 51.3% (19.5 of 38) for the IKACTA group, 27.9% (10.6 of 38) for the control group, and 23.4% (8.9 of 38) for the improvement. All participants agreed that the cognitive task analysis learning tool was a useful training adjunct to learning in the operating room. To our knowledge, this is the first cognitive task analysis in diagnostic knee arthroscopy that is user-friendly and inexpensive and has demonstrated significant benefits in training. The IKACTA will provide trainees with a demonstrably strong foundation in diagnostic knee arthroscopy that will flatten learning curves in both technical skills and decision-making.

  15. Enhancing the revenue cycle experience for patients.

    PubMed

    Consolver, Patti; Phillips, Scott

    2014-09-01

    In 2013, Texas Health Resources began to record discussions with patients at each revenue cycle touch point, from scheduling through registration. The recordings give leaders insight on the accuracy and consistency of information communicated at each touch point and provide a tool for improving customer service. The initiative has improved patient satisfaction and increased point-of-service collections.

  16. Discrete Structure-Point Testing: Problems and Alternatives. TESL Reporter, Vol. 9, No. 4.

    ERIC Educational Resources Information Center

    Aitken, Kenneth G.

    This paper presents some reasons for reconsidering the use of discrete structure-point tests of language proficiency, and suggests an alternative basis for designing proficiency tests. Discrete point tests are one of the primary tools of the audio-lingual method of teaching a foreign language and are based on certain assumptions, including the…

  17. PowerPoint and Concept Maps: A Great Double Act

    ERIC Educational Resources Information Center

    Simon, Jon

    2015-01-01

    This article explores how concept maps can provide a useful addition to PowerPoint slides to convey interconnections of knowledge and help students see how knowledge is often non-linear. While most accounting educators are familiar with PowerPoint, they are likely to be less familiar with concept maps and this article shows how the tool can be…

  18. Teaching Point-Group Symmetry with Three-Dimensional Models

    ERIC Educational Resources Information Center

    Flint, Edward B.

    2011-01-01

    Three tools for teaching symmetry in the context of an upper-level undergraduate or introductory graduate course on the chemical applications of group theory are presented. The first is a collection of objects that have the symmetries of all the low-symmetry and high-symmetry point groups and the point groups with rotational symmetries from 2-fold…

  19. Using iPads as a Data Collection Tool in Extension Programming Evaluation

    ERIC Educational Resources Information Center

    Rowntree, J. E.; Witman, R. R.; Lindquist, G. L.; Raven, M. R.

    2013-01-01

    Program evaluation is an important part of Extension, especially with the increased emphasis on metrics and accountability. Agents are often the point persons for evaluation data collection, and Web-based surveys are a commonly used tool. The iPad tablet with Internet access has the potential to be an effective survey tool. iPads were field tested…

  20. Method for machining steel with diamond tools

    DOEpatents

    Casstevens, J.M.

    1984-01-01

    The present invention is directed to a method for machining optical quality finishes and contour accuracies of workpieces of carbon-containing metals such as steel with diamond tooling. The wear rate of the diamond tooling is significantly reduced by saturating the atmosphere at the interface of the workpiece and the diamond tool with a gaseous hydrocarbon during the machining operation. The presence of the gaseous hydrocarbon effectively eliminates the deterioration of the diamond tool by inhibiting or preventing the conversion of the diamond carbon to graphite carbon at the point of contact between the cutting tool and the workpiece.

  1. Method for machining steel with diamond tools

    DOEpatents

    Casstevens, John M.

    1986-01-01

    The present invention is directed to a method for machining optical quality finishes and contour accuracies of workpieces of carbon-containing metals such as steel with diamond tooling. The wear rate of the diamond tooling is significantly reduced by saturating the atmosphere at the interface of the workpiece and the diamond tool with a gaseous hydrocarbon during the machining operation. The presence of the gaseous hydrocarbon effectively eliminates the deterioration of the diamond tool by inhibiting or preventing the conversion of the diamond carbon to graphite carbon at the point of contact between the cutting tool and the workpiece.

  2. Improving Students' Understanding of the Importance of Economic Consequences in Standard Setting: A Computerized Spreadsheet Tool.

    ERIC Educational Resources Information Center

    Ivancevich, Daniel M.; And Others

    1996-01-01

    Points out that political and economic pressures have sometimes caused the Financial Accounting Standards Board to alter standards. Presents a spreadsheet tool that demonstrates the economic consequences of adopting accounting standards. (SK)

  3. SE Requirements Development Tool User Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benson, Faith Ann

    2016-05-13

    The LANL Systems Engineering Requirements Development Tool (SERDT) is a data collection tool created in InfoPath for use with the Los Alamos National Laboratory's (LANL) SharePoint sites. Projects can fail if the final product requirements are not clearly defined. For projects to be successful, requirements must be defined early in the project and those requirements must be tracked during execution of the project to ensure the goals of the project are met. Therefore, the focus of this tool is requirements definition. The content of this form is based on International Council on Systems Engineering (INCOSE) and Department of Defense (DoD) process standards and allows for single or collaborative input. The “Scoping” section is where project information is entered by the project team prior to requirements development, and includes definitions and examples to assist the user in completing the forms. The data entered will be used to define the requirements, and once the form is filled out, a “Requirements List” is automatically generated and a Word document is created and saved to a SharePoint document library. SharePoint also includes the ability to download the requirements data defined in the InfoPath form into an Excel spreadsheet. This User Guide will assist you in navigating through the data entry process.

  4. Quantification of microscopic surface features of single point diamond turned optics with subsequent chemical polishing

    NASA Astrophysics Data System (ADS)

    Cardenas, Nelson; Kyrish, Matthew; Taylor, Daniel; Fraelich, Margaret; Lechuga, Oscar; Claytor, Richard; Claytor, Nelson

    2015-03-01

    Electro-Chemical Polishing is routinely used in the anodizing industry to achieve specular surface finishes on various metal products prior to anodizing. Electro-chemical polishing functions by leveling the microscopic peaks and valleys of the substrate, thereby increasing specularity and reducing light scattering. The rate of attack depends on the physical characteristics (height, depth, and width) of the microscopic structures that constitute the surface finish. To prepare the sample, mechanical polishing such as buffing or grinding is typically required before etching. This type of mechanical polishing produces random microscopic structures at varying depths and widths, thus the electropolishing parameters are determined on an ad hoc basis. Alternatively, single point diamond turning offers excellent repeatability and highly specific control of substrate polishing parameters. While polishing, the diamond tool leaves behind an associated tool mark, which is related to the diamond tool geometry and machining parameters. Machine parameters such as tool cutting depth, speed and step-over can be changed in situ, thus providing control of the spatial frequency of the microscopic structures characteristic of the surface topography of the substrate. By combining single point diamond turning with subsequent electro-chemical etching, ultra-smooth polishing of both rotationally symmetric and free-form mirrors and molds is possible. Additionally, machining parameters can be set to optimize post-polishing for increased surface quality and reduced processing times. In this work, we present a study of substrate surface finish based on diamond turning tool mark spatial frequency with subsequent electro-chemical polishing.
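    The link between machining parameters and tool-mark geometry mentioned above is commonly approximated by the standard textbook relation for single point turning, Rt ≈ f²/(8R) for feed per revolution f and tool nose radius R, with a tool-mark spatial period equal to the feed. The sketch below applies that relation with illustrative numbers; it is a textbook approximation, not a result from the paper.

    ```python
    # Sketch: theoretical peak-to-valley tool-mark height Rt ~ f^2 / (8 * R) and the
    # corresponding tool-mark spatial frequency 1/f (illustrative parameter values).
    feed_mm_per_rev = 0.005          # assumed feed per revolution
    nose_radius_mm = 0.5             # assumed tool nose radius

    rt_mm = feed_mm_per_rev ** 2 / (8.0 * nose_radius_mm)
    spatial_frequency_per_mm = 1.0 / feed_mm_per_rev

    print(f"theoretical Rt = {rt_mm * 1e6:.1f} nm, "
          f"tool-mark spatial frequency = {spatial_frequency_per_mm:.0f} cycles/mm")
    ```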

  5. Design and development of an interactive medical teleconsultation system over the World Wide Web.

    PubMed

    Bai, J; Zhang, Y; Dai, B

    1998-06-01

    The objective of the medical teleconsultation system presented in this paper is to demonstrate the use of the World Wide Web (WWW) for telemedicine and interactive medical information exchange. The system, which is developed based on Java, could provide several basic Java tools to fulfill the requirements of medical applications, including a file manager, data tool, bulletin board, and digital audio tool. The digital audio tool uses point-to-point structure to enable two physicians to communicate directly through voice. The others use multipoint structure. The file manager manages the medical images stored in the WWW information server, which come from a hospital database. The data tool supports cooperative operations on the medical data between the participating physicians. The bulletin board enables the users to discuss special cases by writing text on the board, send their personal or group diagnostic reports on the cases, and reorganize the reports and store them in its report file for later use. The system provides a hardware-independent platform for physicians to interact with one another as well as to access medical information over the WWW.

  6. Optimal Synthesis of Compliant Mechanisms using Subdivision and Commercial FEA (DETC2004-57497)

    NASA Technical Reports Server (NTRS)

    Hull, Patrick V.; Canfield, Stephen

    2004-01-01

    The field of distributed-compliance mechanisms has seen significant work in developing suitable topology optimization tools for their design. These optimal design tools have grown out of the techniques of structural optimization. This paper will build on the previous work in topology optimization and compliant mechanism design by proposing an alternative design space parameterization through control points and adding another step to the process, that of subdivision. The control points allow a specific design to be represented as a solid model during the optimization process. The process of subdivision creates an additional number of control points that help smooth the surface (for example, a C² continuous surface, depending on the method of subdivision chosen), creating a manufacturable design free of some traditional numerical instabilities. Note that these additional control points do not add to the number of design parameters. This alternative parameterization and description as a solid model effectively and completely separates the design variables from the analysis variables during the optimization procedure. The motivation behind this work is to create an automated design tool from task definition to functional prototype created on a CNC or rapid-prototype machine. This paper will describe the proposed compliant mechanism design process and will demonstrate the procedure on several examples common in the literature.

  7. Online machining error estimation method of numerical control gear grinding machine tool based on data analysis of internal sensors

    NASA Astrophysics Data System (ADS)

    Zhao, Fei; Zhang, Chi; Yang, Guilin; Chen, Chinyin

    2016-12-01

    This paper presents an online estimation method for cutting error based on the analysis of internal sensor readings. The internal sensors of a numerical control (NC) machine tool are selected to avoid installation problems. A mathematical model for cutting-error estimation is proposed to compute the relative position of the cutting point and the tool center point (TCP) from internal sensor readings based on the cutting theory of gears. In order to verify the effectiveness of the proposed model, it was simulated and tested experimentally in a gear generating grinding process. The cutting error of the gear was estimated and the factors that induce cutting error were analyzed. The simulation and experiments verify that the proposed approach is an efficient way to estimate the cutting error of the work-piece during the machining process.

  8. Effectiveness of the CANRISK tool in the identification of dysglycemia in First Nations and Métis in Canada

    PubMed Central

    Gina, Agarwal; Ying, Jiang; Susan, Rogers Van Katwyk; Chantal, Lemieux; Heather, Orpana; Yang, Mao; Brandan, Hanley; Karen, Davis; Laurel, Leuschen; Howard, Morrison

    2018-01-01

    Abstract Introduction: First Nations/Métis populations develop diabetes earlier and at higher rates than other Canadians. The Canadian diabetes risk questionnaire (CANRISK) was developed as a diabetes screening tool for Canadians aged 40 years or over. The primary aim of this paper is to assess the effectiveness of the existing CANRISK tool and risk scores in detecting dysglycemia in First Nations/Métis participants, including among those under the age of 40. A secondary aim was to determine whether alternative waist circumference (WC) and body mass index (BMI) cut-off points improved the predictive ability of logistic regression models using CANRISK variables to predict dysglycemia. Methods: Information from a self-administered CANRISK questionnaire, anthropometric measurements, and results of a standard oral glucose tolerance test (OGTT) were collected from First Nations and Métis participants (n = 1479). Sensitivity and specificity of CANRISK scores using published risk score cut-off points were calculated. Logistic regression was conducted with alternative ethnicity-specific BMI and WC cut-off points to predict dysglycemia using CANRISK variables. Results: Compared with OGTT results, using a CANRISK score cut-off point of 33, the sensitivity and specificity of CANRISK was 68% and 63% among individuals aged 40 or over; it was 27% and 87%, respectively among those under 40. Using a lower cut-off point of 21, the sensitivity for individuals under 40 improved to 77% with a specificity of 44%. Though specificity at this threshold was low, the higher level of sensitivity reflects the importance of the identification of high risk individuals in this population. Despite altered cut-off points of BMI and WC, logistic regression models demonstrated similar predictive ability. Conclusion: CANRISK functioned well as a preliminary step for diabetes screening in a broad age range of First Nations and Métis in Canada, with an adjusted CANRISK cutoff point for individuals under 40, and with no incremental improvement from using alternative BMI/WC cut-off points. PMID:29443485
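
    As an illustration of the screening-accuracy arithmetic reported above (and not the authors' code), the short Python sketch below classifies a score at or above a chosen cut-off as screen-positive and compares it with an OGTT-style reference diagnosis; the scores and outcomes in the example are invented.

      # Hedged sketch: sensitivity/specificity of a screening score against a
      # reference diagnosis at a chosen cut-off point (toy data, not study data).
      def sensitivity_specificity(scores, reference_positive, cutoff):
          tp = fp = tn = fn = 0
          for score, positive in zip(scores, reference_positive):
              flagged = score >= cutoff
              if flagged and positive:
                  tp += 1
              elif flagged and not positive:
                  fp += 1
              elif not flagged and positive:
                  fn += 1
              else:
                  tn += 1
          sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
          specificity = tn / (tn + fp) if (tn + fp) else float("nan")
          return sensitivity, specificity

      # Hypothetical CANRISK-style scores and OGTT-defined dysglycemia outcomes.
      scores = [12, 35, 40, 18, 27, 45, 21, 30]
      ogtt_positive = [False, True, True, False, False, True, True, False]
      for cutoff in (33, 21):  # published cut-off vs. lowered cut-off for under-40s
          sens, spec = sensitivity_specificity(scores, ogtt_positive, cutoff)
          print(f"cut-off {cutoff}: sensitivity={sens:.2f}, specificity={spec:.2f}")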

  9. Exposure Assessment Tools by Approaches - Direct Measurement (Point-of-Contact Measurement)

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases, mode

  10. Pain point system scale (PPSS): a method for postoperative pain estimation in retrospective studies

    PubMed Central

    Gkotsi, Anastasia; Petsas, Dimosthenis; Sakalis, Vasilios; Fotas, Asterios; Triantafyllidis, Argyrios; Vouros, Ioannis; Saridakis, Evangelos; Salpiggidis, Georgios; Papathanasiou, Athanasios

    2012-01-01

    Purpose Pain rating scales are widely used for pain assessment. Nevertheless, a new tool is required for pain assessment needs in retrospective studies. Methods The postoperative pain episodes, during the first postoperative day, of three patient groups were analyzed. Each pain episode was assessed by a visual analog scale, numerical rating scale, verbal rating scale, and a new tool – pain point system scale (PPSS) – based on the analgesics administered. The type of analgesic was defined based on the authors' clinic protocol, patient comorbidities, pain assessment tool scores, and preadministered medications by an artificial neural network system. At each pain episode, each patient was asked to complete the three pain scales. Bartlett's test and the Kaiser–Meyer–Olkin criterion were used to evaluate sample sufficiency. The proper scoring system was defined by varimax rotation. Spearman's and Pearson's coefficients assessed the correlation of the PPSS with the known pain scales. Results A total of 262 pain episodes were evaluated in 124 patients. The PPSS scored one point for each dose of paracetamol, three points for each nonsteroidal anti-inflammatory drug or codeine, and seven points for each dose of opioids. The correlation between the visual analog scale and the PPSS was found to be strong and linear (rho: 0.715; P < 0.001 and Pearson: 0.631; P < 0.001). Conclusion The PPSS correlated well with the known pain scales and could be used safely in the evaluation of postoperative pain in retrospective studies. PMID:23152699
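
    A minimal sketch of the scoring rule reported in the abstract (one point per paracetamol dose, three per NSAID or codeine dose, seven per opioid dose), with a Spearman correlation against illustrative VAS scores; the episode data are invented and the function names are hypothetical.

      # Hedged sketch of the PPSS point assignment, not the published implementation.
      from scipy.stats import spearmanr

      POINTS = {"paracetamol": 1, "nsaid": 3, "codeine": 3, "opioid": 7}

      def ppss_score(doses):
          # doses: list of analgesic types administered during one pain episode
          return sum(POINTS[d] for d in doses)

      # Hypothetical episodes: (analgesics given, matching VAS score).
      episodes = [(["paracetamol"], 2), (["nsaid", "paracetamol"], 4),
                  (["opioid"], 7), (["opioid", "nsaid"], 9)]
      ppss = [ppss_score(doses) for doses, _ in episodes]
      vas = [v for _, v in episodes]
      rho, p = spearmanr(ppss, vas)
      print(f"PPSS scores: {ppss}, Spearman rho = {rho:.2f} (p = {p:.3f})")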

  11. [Translation and validation of the Spanish version of the EAT-10 (Eating Assessment Tool-10) for the screening of dysphagia].

    PubMed

    Burgos, R; Sarto, B; Segurola, H; Romagosa, A; Puiggrós, C; Vázquez, C; Cárdenas, G; Barcons, N; Araujo, K; Pérez-Portabella, C

    2012-01-01

    The Eating Assessment Tool-10 (EAT-10) is a self-administered, analogical, direct-scoring screening tool for dysphagia. To translate and adapt the EAT-10 into Spanish, and to evaluate its psychometric properties. After the translation and back-translation process of the EAT-10 ES, a prospective study was performed in adult patients with preserved cognitive and functional abilities. Patients in 3 clinical situations, patients diagnosed with dysphagia (DD), patients at risk of dysphagia (RD), and patients not at risk of dysphagia (SRD), were recruited from 3 settings: a hospital Nutritional Support Unit (USN), a nursing home (RG) and a primary care centre (CAP). Patients completed the EAT-10 ES during a single visit. Both patients and researchers completed a specific questionnaire regarding the comprehensibility of the EAT-10 ES. 65 patients were included (age 75 ± 9.1 y), 52.3% women. Mean time of administration was 3.8 ± 1.7 minutes. 95.4% of patients considered that all tool items were comprehensible and 72.3% found it easy to assign scores. The internal consistency of the EAT-10 ES was good (Cronbach's alpha coefficient of 0.87). A high correlation was observed between all tool items and global scores (p < 0.001). Mean score for patients in group DD was 15 ± 8.9 points, 6.7 ± 7.7 points in group RD, and 2 ± 3.1 points in group SRD. Male patients, patients previously diagnosed with dysphagia, and patients from the USN showed significantly higher scores on the EAT-10 ES (p < 0.001). The EAT-10 ES has proven to be reliable, valid and internally consistent. It is an easy-to-understand tool that can be completed quickly, making it useful for the screening of dysphagia in routine clinical practice.

  12. EVA tools and equipment reference book

    NASA Technical Reports Server (NTRS)

    Fullerton, R. K.

    1993-01-01

    This document contains a mixture of tools and equipment used throughout the space shuttle-based extravehicular activity (EVA) program. Promising items which have reached the prototype stage of development are also included, but should not be considered certified ready for flight. Each item is described with a photo, a written discussion, technical specifications, dimensional drawings, and points of contact for additional information. Numbers on the upper left-hand corner of each photo may be used to order specific pictures from NASA and contractor photo libraries. Points of contact were classified as either operational or technical. An operational contact is an engineer from JSC Mission Operations Directorate who is familiar with the basic function and on-orbit use of the tool. A technical contact would be the best source of detailed technical specifications and is typically the NASA subsystem manager. The technical information table for each item uses the following terms to describe the availability or status of each hardware item: Standard - Flown on every mission as standard manifest; Flight specific - Potentially available for flight, not flown every mission (flight certification cannot be guaranteed and recertification may be required); Reference only - Item no longer in active inventory or not recommended for future use, some items may be too application-specific for general use; and Developmental - In the prototype stage only and not yet available for flight. The current availability and certification of any flight-specific tool should be verified with the technical point of contact. Those tools built and fit checked for Hubble Space Telescope maintenance are program dedicated and are not available to other customers. Other customers may have identical tools built from the existing, already certified designs as an optional service.

  13. Graphical Representations and Odds Ratios in a Distance-Association Model for the Analysis of Cross-Classified Data

    ERIC Educational Resources Information Center

    de Rooij, Mark; Heiser, Willem J.

    2005-01-01

    Although RC(M)-association models have become a generally useful tool for the analysis of cross-classified data, the graphical representation resulting from such an analysis can at times be misleading. The relationships present between row category points and column category points cannot be interpreted by inter point distances but only through…

  14. Why Thought Experiments Should Be Used as an Educational Tool to Develop Problem-Solving Skills and Creativity of the Gifted Students?

    ERIC Educational Resources Information Center

    Tortop, Hasan Said

    2016-01-01

    Many educational tools that are recommended for the training of normal students are often encountered in programs that do not work very well and are subsequently abandoned. One of the important points that program developers should now consider is that teaching tools are presented in accordance with individual differences. It is seen that the…

  15. The Development and Pilot Testing of the Marijuana Retail Surveillance Tool (MRST): Assessing Marketing and Point-of-Sale Practices among Recreational Marijuana Retailers

    ERIC Educational Resources Information Center

    Berg, Carla J.; Henriksen, Lisa; Cavazos-Rehg, Patricia; Schauer, Gillian L.; Freisthler, Bridget

    2017-01-01

    As recreational marijuana expands, it is critical to develop standardized surveillance measures to study the retail environment. To this end, our research team developed and piloted a tool assessing recreational marijuana retailers in a convenience sample of 20 Denver retailers in 2016. The tool assesses: (i) compliance and security (e.g.…

  16. Tools for automated acoustic monitoring within the R package monitoR

    USGS Publications Warehouse

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program, including survey metadata, template creation and manipulation, automated detection, and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe a typical workflow when using the tools in monitoR, which follows a generic sequence of functions with the option of either binary point matching or spectrogram cross-correlation detectors.
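
    monitoR itself is an R package; purely as an illustration of the spectrogram cross-correlation idea mentioned above (not the package's API), the Python sketch below slides a template spectrogram along a recording's spectrogram and reports the best-matching time on synthetic data.

      # Generic spectrogram cross-correlation detection sketch (synthetic data).
      import numpy as np
      from scipy.signal import spectrogram

      fs = 22050
      t = np.arange(0, 5.0, 1 / fs)
      recording = 0.01 * np.random.randn(t.size)
      call = np.sin(2 * np.pi * 3000 * np.arange(0, 0.2, 1 / fs))  # 3 kHz "call"
      recording[int(2.0 * fs):int(2.0 * fs) + call.size] += call   # insert at t = 2 s

      freqs, frame_times, rec_spec = spectrogram(recording, fs=fs, nperseg=512)
      _, _, tpl_spec = spectrogram(call, fs=fs, nperseg=512)

      # Slide the template along the time axis and score each offset.
      n_tpl = tpl_spec.shape[1]
      scores = np.array([(rec_spec[:, i:i + n_tpl] * tpl_spec).sum()
                         for i in range(rec_spec.shape[1] - n_tpl + 1)])
      best = int(np.argmax(scores))
      print(f"best match near t = {frame_times[best]:.2f} s (call injected at 2.0 s)")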

  17. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 15: Administrative Information, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This volume developed by the Machine Tool Advanced Skill Technology (MAST) program contains key administrative documents and provides additional sources for machine tool and precision manufacturing information and important points of contact in the industry. The document contains the following sections: a foreword; grant award letter; timeline for…

  18. Searches for point sources in the Galactic Center region

    NASA Astrophysics Data System (ADS)

    di Mauro, Mattia; Fermi-LAT Collaboration

    2017-01-01

    Several groups have demonstrated the existence of an excess in the gamma-ray emission around the Galactic Center (GC) with respect to the predictions from a variety of Galactic Interstellar Emission Models (GIEMs) and point source catalogs. The origin of this excess, peaked at a few GeV, is still under debate. A possible interpretation is that it comes from a population of unresolved Millisecond Pulsars (MSPs) in the Galactic bulge. We investigate the detection of point sources in the GC region using new tools which the Fermi-LAT Collaboration is developing in the context of searches for Dark Matter (DM) signals. These new tools perform very fast scans, iteratively testing for additional point sources at each of the pixels of the region of interest. We also show how to discriminate between point sources and structural residuals from the GIEM. We apply these methods to the GC region considering different GIEMs and testing the DM and MSP interpretations of the GC excess. Additionally, we create a list of promising MSP candidates that could represent the brightest sources of an MSP bulge population.

  19. Continuous Personal Improvement.

    ERIC Educational Resources Information Center

    Emiliani, M. L.

    1998-01-01

    Suggests that continuous improvement tools used in the workplace can be applied to self-improvement. Explains the use of such techniques as one-piece flow, kanban, visual controls, and total productive maintenance. Points out misapplications of these tools and describes the use of fishbone diagrams to diagnose problems. (SK)

  20. Evaluation of the phosphorus site assessment tools: lessons from the U.S.

    USDA-ARS?s Scientific Manuscript database

    Freshwater eutrophication is generally limited or accelerated by phosphorus (P) inputs, with agriculture considered a contributor along with point sources. To help assess the impairments, NRCS incorporated the P Indexing risk assessment tool into the 590 Nutrient Management Conservation Practice St...

  1. Water Quality Analysis Tool (WQAT)

    EPA Science Inventory

    The purpose of the Water Quality Analysis Tool (WQAT) software is to provide a means for analyzing and producing useful remotely sensed data products for an entire estuary, a particular point or area of interest (AOI or POI) in estuaries, or water bodies of interest where pre-pro...

  2. Comparisons among tools, surface orientation, and pencil grasp for children 23 months of age.

    PubMed

    Yakimishyn, Janet E; Magill-Evans, Joyce

    2002-01-01

    The purpose of this study was to determine whether writing tool type and angle of writing surface affect grasp. Fifty-one children 23 to 24 months of age who were typically developing drew with a primary marker, colored pencil, and small piece of crayon on a table and an easel. The marker and pencil were presented pointing left, right, and toward the child. The order of writing tool presentation was counterbalanced. Grasps were scored with a 5-point rating system and analyzed with dependent t tests. Children used a more mature grasp when drawing with a piece of crayon than with a pencil. No difference in grasp maturity was found when using a pencil compared with a marker. A more mature grasp when drawing on the easel compared with the table was used with the crayon but not with the marker or pencil. Results imply that a short writing tool combined with a vertical surface can influence the grasp of young children.

  3. STS-57 Pilot Duffy uses TDS soldering tool in SPACEHAB-01 aboard OV-105

    NASA Image and Video Library

    1993-07-01

    STS057-30-021 (21 June-1 July 1993) --- Astronaut Brian Duffy, pilot, handles a soldering tool onboard the Earth-orbiting Space Shuttle Endeavour. The Soldering Experiment (SE) called for a crew member to solder on a printed circuit board containing 45 connection points, then de-solder 35 points on a similar board. The SE was part of a larger project called the Tools and Diagnostic Systems (TDS), sponsored by the Space and Life Sciences Directorate at Johnson Space Center (JSC). TDS represents a group of equipment selected from the tools and diagnostic hardware to be supported by the International Space Station program. TDS was designed to demonstrate the maintenance of experiment hardware on-orbit and to evaluate the adequacy of its design and the crew interface. Duffy and five other NASA astronauts spent almost ten days aboard the Space Shuttle Endeavour in Earth-orbit supporting the SpaceHab mission, retrieving the European Retrievable Carrier (EURECA) and conducting various experiments.

  4. A research protocol for developing a Point-Of-Care Key Evidence Tool 'POCKET': a checklist for multidimensional evidence reporting on point-of-care in vitro diagnostics.

    PubMed

    Huddy, Jeremy R; Ni, Melody; Mavroveli, Stella; Barlow, James; Williams, Doris-Ann; Hanna, George B

    2015-07-10

    Point-of-care in vitro diagnostics (POC-IVD) are increasingly becoming widespread as an acceptable means of providing rapid diagnostic results to facilitate decision-making in many clinical pathways. Evidence on utility, usability and cost-effectiveness is currently provided in a fragmented and detached manner that is fraught with methodological challenges, given the disruptive nature these tests have on the clinical pathway. The Point-of-care Key Evidence Tool (POCKET) checklist aims to provide an integrated evidence-based framework that incorporates all required evidence to guide the evaluation of POC-IVD to meet the needs of policy and decision-makers in the National Health Service (NHS). A multimethod approach will be applied in order to develop the POCKET. A thorough literature review has formed the basis of a robust Delphi process and validation study. Semistructured interviews are being undertaken with POC-IVD stakeholders, including industry, regulators, commissioners, clinicians and patients, to understand what evidence is required to facilitate decision-making. Emergent themes will be translated into a series of statements to form a survey questionnaire that aims to reach a consensus in each stakeholder group on what needs to be included in the tool. Results will be presented to a workshop to discuss the statements brought forward and the optimal format for the tool. Once assembled, the tool will be field-tested through case studies to ensure validity and usability and to inform refinement, if required. The final version will be published online with a call for comments. Limitations include unpredictable sample representation, development of a compromise position rather than consensus, and absence of blinding in the validation exercise. The Imperial College Joint Research Compliance Office and the Imperial College Hospitals NHS Trust R&D department have approved the protocol. The checklist tool will be disseminated through a PhD thesis, a website, peer-reviewed publication, academic conferences and formal presentations. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  5. Combining abdominal and cosmetic breast surgery does not increase short-term complication rates: a comparison of each individual procedure and pretreatment risk stratification tool.

    PubMed

    Khavanin, Nima; Jordan, Sumanas W; Vieira, Brittany L; Hume, Keith M; Mlodinow, Alexei S; Simmons, Christopher J; Murphy, Robert X; Gutowski, Karol A; Kim, John Y S

    2015-11-01

    Combined abdominal and breast surgery presents a convenient and relatively cost-effective approach for accomplishing both procedures. This study is the largest to date assessing the safety of combined procedures, and it aims to develop a simple pretreatment risk stratification method for patients who desire a combined procedure. All women undergoing abdominoplasty, panniculectomy, augmentation mammaplasty, and/or mastopexy in the TOPS database were identified. Demographics and outcomes for combined procedures were compared to individual procedures using χ² and Student's t-tests. Multiple logistic regression provided adjusted odds ratios for the effect of a combined procedure on 30-day complications. Among combined procedures, a logistic regression model determined point values for pretreatment risk factors including diabetes (1 point), age over 53 (1), obesity (2), and 3+ ASA status (3), creating a 7-point pretreatment risk stratification tool. A total of 58,756 cases met inclusion criteria. Complication rates among combined procedures (9.40%) were greater than those of aesthetic breast surgery (2.66%; P < .001) but did not significantly differ from abdominal procedures (9.75%; P = .530). Nearly 77% of combined cases were classified as low-risk (0 points total) with a 9.78% complication rate. Medium-risk patients (1 to 3 points) had a 16.63% complication rate, and high-risk patients (4 to 7 points) had a 38.46% rate. Combining abdominal and breast procedures is safe in the majority of patients and does not increase 30-day complication rates. The risk stratification tool can continue to ensure favorable outcomes for patients who may desire a combined surgery. Level of Evidence: 4 (Risk). © 2015 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com.
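
    A minimal sketch of the 7-point pretreatment scoring described above (diabetes 1 point, age over 53 one point, obesity 2 points, ASA status 3+ 3 points) and its risk tiers; this illustrates the published point values only and is not the authors' software.

      # Hedged sketch of the pretreatment risk stratification tool.
      def risk_points(diabetes, age, obese, asa_status):
          points = 1 if diabetes else 0
          points += 1 if age > 53 else 0
          points += 2 if obese else 0
          points += 3 if asa_status >= 3 else 0
          return points

      def risk_tier(points):
          if points == 0:
              return "low"     # ~9.8 % 30-day complication rate in the study
          if points <= 3:
              return "medium"  # ~16.6 %
          return "high"        # ~38.5 %

      print(risk_tier(risk_points(diabetes=False, age=45, obese=False, asa_status=2)))  # low
      print(risk_tier(risk_points(diabetes=True, age=60, obese=True, asa_status=3)))    # high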

  6. Sasquatch Footprint Tool

    NASA Technical Reports Server (NTRS)

    Bledsoe, Kristin

    2013-01-01

    The Crew Exploration Vehicle Parachute Assembly System (CPAS) is the parachute system for NASA s Orion spacecraft. The test program consists of numerous drop tests, wherein a test article rigged with parachutes is extracted or released from an aircraft. During such tests, range safety is paramount, as is the recoverability of the parachutes and test article. It is crucial to establish an aircraft release point that will ensure that the article and all items released from it will land in safe locations. A new footprint predictor tool, called Sasquatch, was created in MATLAB. This tool takes in a simulated trajectory for the test article, information about all released objects, and atmospheric wind data (simulated or actual) to calculate the trajectories of the released objects. Dispersions are applied to the landing locations of those objects, taking into account the variability of winds, aircraft release point, and object descent rate. Sasquatch establishes a payload release point (e.g., where the payload will be extracted from the carrier aircraft) that will ensure that the payload and all objects released from it will land in a specified cleared area. The landing locations (the final points in the trajectories) are plotted on a map of the test range. Sasquatch was originally designed for CPAS drop tests and includes extensive information about both the CPAS hardware and the primary test range used for CPAS testing. However, it can easily be adapted for more complex CPAS drop tests, other NASA projects, and commercial partners. CPAS has developed the Sasquatch footprint tool to ensure range safety during parachute drop tests. Sasquatch is well correlated to test data and continues to ensure the safety of test personnel as well as the safe recovery of all equipment. The tool will continue to be modified based on new test data, improving predictions and providing added capability to meet the requirements of more complex testing.
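
    As a toy illustration of the footprint idea (not the Sasquatch code or its models), the sketch below drifts a released object through layered winds at a constant descent rate and reports the horizontal offset of its landing point; the wind profile, release altitude and descent rate are invented values.

      # Toy wind-drift landing-offset sketch (constant descent rate, layered winds).
      def landing_offset(release_alt_m, descent_rate_ms, wind_layers):
          # wind_layers: (layer top altitude [m], wind east [m/s], wind north [m/s]),
          # ordered from the ground up.
          east = north = 0.0
          prev_top = 0.0
          for top, wind_east, wind_north in wind_layers:
              layer_top = min(top, release_alt_m)
              if layer_top <= prev_top:
                  break
              dt = (layer_top - prev_top) / descent_rate_ms  # time spent in the layer
              east += wind_east * dt
              north += wind_north * dt
              prev_top = layer_top
          return east, north

      winds = [(1000.0, 3.0, 1.0), (3000.0, 8.0, -2.0), (6000.0, 15.0, 0.0)]
      print(landing_offset(release_alt_m=5000.0, descent_rate_ms=7.5, wind_layers=winds))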

  7. Impact of tool wear on cross wedge rolling process stability and on product quality

    NASA Astrophysics Data System (ADS)

    Gutierrez, Catalina; Langlois, Laurent; Baudouin, Cyrille; Bigot, Régis; Fremeaux, Eric

    2017-10-01

    Cross wedge rolling (CWR) is a metal forming process used in the automotive industry. One of its applications is in the manufacturing process of connecting rods. CWR transforms a cylindrical billet into a complex axisymmetrical shape with an accurate distribution of material. This preform is forged into shape in a forging die. In order to improve CWR tool lifecycle and product quality, it is essential to understand tool wear evolution and how the physical phenomena in the CWR process change as the tool geometry evolves with wear. Understanding CWR tool wear behavior requires numerical simulation; however, if the simulations are performed with the CAD geometry of the tool, the results are limited. To solve this difficulty, two numerical simulations with FORGE® were performed using the real geometry of the tools (both upper and lower rolls) at two different states: (1) before the start of the lifecycle and (2) at the end of the lifecycle. The tools were measured in 3D with an ATOS triple scan by GOM® using optical 3D measuring techniques. The result was a high-resolution point cloud of the entire geometry of each tool. Each 3D point cloud was digitized and converted into STL format, and the STL geometry of the tools was the input for the 3D simulations. Both simulations were compared. Defects of products obtained in simulation were compared to the main defects of products found industrially. The two main defects are: (a) surface defects on the preform that are not corrected in the die forging operation; and (b) a bent (no longer straight) preform, with two possible consequences: either the robot cannot grab it to take it to the forging stage, or an unfilled section results in the forging operation.

  8. Development of POINTS as a planetology instrument

    NASA Technical Reports Server (NTRS)

    Reasenberg, Robert D.

    1994-01-01

    During the reporting period, we carried out investigations required to enhance our design of POINTS as a tool for the search for and characterization of extra-solar planetary systems. The results of that work were included in a paper on POINTS as well as one on Newcomb, which will soon appear in the proceedings of SPIE Conference 2200. (Newcomb is a spinoff of POINTS. It is a small astrometric interferometer now being developed jointly by SAO and the U.S. Navy. It could help establish some of the technology needed for POINTS.) These papers are appended.

  9. Points of attention in designing tools for regional brownfield prioritization.

    PubMed

    Limasset, Elsa; Pizzol, Lisa; Merly, Corinne; Gatchett, Annette M; Le Guern, Cécile; Martinát, Stanislav; Klusáček, Petr; Bartke, Stephan

    2018-05-01

    The regeneration of brownfields has been increasingly recognized as a key instrument in sustainable land management, since free developable land (so-called "greenfields") has become a scarce and more expensive resource, especially in densely populated areas. However, the complexity of these sites requires considerable efforts to successfully complete their revitalization projects, thus requiring the development and application of appropriate tools to support decision makers in the selection of promising sites to which the limited financial resources can be allocated efficiently. The design of effective prioritization tools is a complex process, which requires the analysis and consideration of critical points of attention (PoAs), which have been identified considering the state of the art in the literature and lessons learned from previous developments of regional brownfield (BF) prioritization processes, frameworks and tools. Accordingly, we identified 5 PoAs, namely 1) Assessing end user needs and orientation discussions, 2) Availability and quality of the data needed for the BF prioritization tool, 3) Communication and stakeholder engagement, 4) Drivers of regeneration success, and 5) Financing and application costs. To deepen and collate the most recent knowledge on these topics from scientists and practitioners, we organized a focus group discussion within a special session at the AquaConSoil (ACS) conference 2017, where participants were asked to add their experience and thoughts to the discussion in order to identify the most significant and urgent points of attention in BF prioritization tool design. The result of this assessment is a comprehensive table (Table 2), which can support problem owners, investors, service providers, regulators, public and private land managers, decision makers etc. in identifying the main aspects (sub-topics) to be considered and their relative influences, and in understanding the general patterns and challenges to be faced when developing BF prioritization tools. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. GIS-based interactive tool to map the advent of world conquerors

    NASA Astrophysics Data System (ADS)

    Lakkaraju, Mahesh

    The objective of this thesis is to show the scale and extent of some of the greatest empires the world has ever seen. This is a hybrid project between a GIS-based interactive tool and a web-based JavaScript tool. This approach lets students learn effectively about the emperors themselves while understanding how long and how far their empires spread. In the GIS-based tool, a map is displayed with various points on it, and when a user clicks on one point, the relevant information about what happened at that particular place is displayed. Apart from this information, users can also select the interactive animation button and walk through a set of battles in chronological order. As mentioned, this uses Java as the main programming language, and MOJO (Map Objects Java Objects) provided by ESRI. MOJO is very effective as its GIS-related features can be included in the application itself. This app is a simple tool and has been developed for university or high school level students. D3.js is an interactive animation and visualization platform built on the JavaScript framework. Though HTML5, CSS3, JavaScript and SVG animations can be used to derive custom animations, this tool can help bring out results with less effort and more ease of use. Hence, it has become the most sought-after visualization tool for multiple applications. D3.js provides a map-based visualization feature so that text-based data can easily be displayed in a map-based interface. To draw the map and the points on it, D3.js uses data rendered in TopoJSON format. Latitudes and longitudes can be provided, which are interpolated onto the map SVG. One of the main advantages of doing it this way is that more information is retained when a visual medium is used.

  11. Asteroid Deflection Mission Design Considering On-Ground Risks

    NASA Astrophysics Data System (ADS)

    Rumpf, Clemens; Lewis, Hugh G.; Atkinson, Peter

    The deflection of an Earth-threatening asteroid requires high transparency of the mission design process. The goal of such a mission is to move the projected point of impact over the face of Earth until the asteroid is on a miss trajectory. During the course of deflection operations, the projected point of impact will pass over regions that were less affected before alteration of the asteroid's trajectory. These regions are at risk of sustaining considerable damage if the deflecting spacecraft becomes non-operational: the projected impact point would remain where the deflection mission put it at the time of mission failure. Hence, all regions that are potentially affected by the deflection campaign need to be informed about this risk and should be involved in the mission design process. A mission design compromise will have to be found that is acceptable to all affected parties (Schweickart, 2004). A software tool that assesses the on-ground risk due to deflection missions is under development. It will allow the accumulated on-ground risk along the path of the projected impact point to be studied, and will help determine a deflection mission design that minimizes the on-ground casualty and damage risk due to deflection operations. Currently, the tool is capable of simulating asteroid trajectories through the solar system and considers gravitational forces between solar system bodies. A virtual asteroid may be placed at an arbitrary point in the simulation for analysis and manipulation. Furthermore, the tool determines the asteroid's point of impact and provides an estimate of the population at risk. Validation has been conducted against the solar system ephemeris catalogue HORIZONS by NASA's Jet Propulsion Laboratory (JPL). Asteroids propagated over a period of 15 years show typical position discrepancies of 0.05 Earth radii relative to HORIZONS' output. Ultimately, results from this research will aid in the identification of requirements for deflection missions that enable effective, minimum-risk asteroid deflection. Schweickart, R. L. (2004). The Real Deflection Dilemma. In 2004 Planetary Defense Conference: Protecting Earth from Asteroids (pp. 1-6). Orange County, California. Retrieved from http://b612foundation.org/wp-content/uploads/2013/02/Real_Deflection_Dilemma.pdf

  12. Development and application of a modified dynamic time warping algorithm (DTW-S) to analyses of primate brain expression time series

    PubMed Central

    2011-01-01

    Background Comparing biological time series data across different conditions, or different specimens, is a common but still challenging task. Algorithms aligning two time series represent a valuable tool for such comparisons. While many powerful computation tools for time series alignment have been developed, they do not provide significance estimates for time shift measurements. Results Here, we present an extended version of the original DTW algorithm that allows us to determine the significance of time shift estimates in time series alignments, the DTW-Significance (DTW-S) algorithm. The DTW-S combines important properties of the original algorithm and other published time series alignment tools: DTW-S calculates the optimal alignment for each time point of each gene, it uses interpolated time points for time shift estimation, and it does not require alignment of the time-series end points. As a new feature, we implement a simulation procedure based on parameters estimated from real time series data, on a series-by-series basis, allowing us to determine the false positive rate (FPR) and the significance of the estimated time shift values. We assess the performance of our method using simulation data and real expression time series from two published primate brain expression datasets. Our results show that this method can provide accurate and robust time shift estimates for each time point on a gene-by-gene basis. Using these estimates, we are able to uncover novel features of the biological processes underlying human brain development and maturation. Conclusions The DTW-S provides a convenient tool for calculating accurate and robust time shift estimates at each time point for each gene, based on time series data. The estimates can be used to uncover novel biological features of the system being studied. The DTW-S is freely available as an R package TimeShift at http://www.picb.ac.cn/Comparative/data.html. PMID:21851598

  13. Development and application of a modified dynamic time warping algorithm (DTW-S) to analyses of primate brain expression time series.

    PubMed

    Yuan, Yuan; Chen, Yi-Ping Phoebe; Ni, Shengyu; Xu, Augix Guohua; Tang, Lin; Vingron, Martin; Somel, Mehmet; Khaitovich, Philipp

    2011-08-18

    Comparing biological time series data across different conditions, or different specimens, is a common but still challenging task. Algorithms aligning two time series represent a valuable tool for such comparisons. While many powerful computation tools for time series alignment have been developed, they do not provide significance estimates for time shift measurements. Here, we present an extended version of the original DTW algorithm that allows us to determine the significance of time shift estimates in time series alignments, the DTW-Significance (DTW-S) algorithm. The DTW-S combines important properties of the original algorithm and other published time series alignment tools: DTW-S calculates the optimal alignment for each time point of each gene, it uses interpolated time points for time shift estimation, and it does not require alignment of the time-series end points. As a new feature, we implement a simulation procedure based on parameters estimated from real time series data, on a series-by-series basis, allowing us to determine the false positive rate (FPR) and the significance of the estimated time shift values. We assess the performance of our method using simulation data and real expression time series from two published primate brain expression datasets. Our results show that this method can provide accurate and robust time shift estimates for each time point on a gene-by-gene basis. Using these estimates, we are able to uncover novel features of the biological processes underlying human brain development and maturation. The DTW-S provides a convenient tool for calculating accurate and robust time shift estimates at each time point for each gene, based on time series data. The estimates can be used to uncover novel biological features of the system being studied. The DTW-S is freely available as an R package TimeShift at http://www.picb.ac.cn/Comparative/data.html.
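
    To make the alignment step concrete, the sketch below implements only the classic dynamic time warping recursion and backtracking on two toy series, one shifted by a single time point; the significance machinery of DTW-S described above is not reproduced here.

      # Core DTW alignment sketch (not the DTW-S significance procedure).
      import numpy as np

      def dtw(a, b):
          n, m = len(a), len(b)
          cost = np.full((n + 1, m + 1), np.inf)
          cost[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  d = abs(a[i - 1] - b[j - 1])
                  cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
          # Backtrack to recover the warping path.
          path, i, j = [], n, m
          while i > 0 and j > 0:
              path.append((i - 1, j - 1))
              step = int(np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]]))
              i, j = (i - 1, j - 1) if step == 0 else ((i - 1, j) if step == 1 else (i, j - 1))
          return cost[n, m], path[::-1]

      # Toy expression time series; the second lags the first by one time point.
      x = np.array([0.0, 0.2, 1.0, 0.8, 0.3, 0.1])
      y = np.array([0.0, 0.0, 0.2, 1.0, 0.8, 0.3])
      total, path = dtw(x, y)
      print(f"DTW cost: {total:.2f}; aligned index pairs: {path}")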

  14. Analysis Tools for CFD Multigrid Solvers

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Thomas, James L.; Diskin, Boris

    2004-01-01

    Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.

  15. Determining the direct upland hydrological contribution area of estuarine wetlands using Arc/GIS tools

    EPA Science Inventory

    The delineation of a polygon layer representing the direct upland runoff contribution to estuarine wetland polygons can be a useful tool in estuarine wetland assessment. However, the traditional methods of watershed delineation using pour points and digital elevation models (DEMs)...

  16. Accessing care summaries at point-of-care: Implementation of mobile devices for personal carers in aged care.

    PubMed

    Brimelow, Rachel E; Gibney, Annie; Meakin, Suzanne; Wollin, Judy A

    2017-04-01

    Continued development of mobile technology now allows access to information at the point-of-care. This study was conducted to evaluate the use of one such tool on a mobile device, from the carer perspective. Caregivers across 12 aged-care facilities were supplied mobile devices to access a Picture Care Plan (PCP), a specific tool designed around the role of the personal carer. An anonymous questionnaire with questions relating to participants' experience was subsequently completed by 85 carers. Perceived helpfulness of the PCP at the point-of-care was high (87%). A significant number of participants believed the use of the PCP increased resident safety and quality of care (76%). Participants also raised practical issues related to carrying the device, network speed, and the need to maintain communication with senior members of staff to stay aware of updates. Findings suggest that staff are receptive to the adoption of mobile devices to access care directives at the point-of-care and that the technology is useful.

  17. Lift-based up-ender and methods using same to manipulate a shipping container containing unirradiated nuclear fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilles, Michael J.

    A shipping container containing an unirradiated nuclear fuel assembly is lifted off the ground by operating a crane to raise a lifting tool comprising a winch. The lifting tool is connected with the shipping container by a rigging line connecting with the shipping container at a lifting point located on the shipping container between the top and bottom of the shipping container, and by winch cabling connecting with the shipping container at the top of the shipping container. The shipping container is reoriented by operating the winch to adjust the length of the winch cabling so as to rotate the shipping container about the lifting point. Shortening the winch cabling rotates the shipping container about the lifting point from a horizontal orientation to a vertical orientation, while lengthening the winch cabling rotates the shipping container about the lifting point from the vertical orientation to the horizontal orientation.

  18. Virtual and Printed 3D Models for Teaching Crystal Symmetry and Point Groups

    ERIC Educational Resources Information Center

    Casas, Lluís; Estop, Eugènia

    2015-01-01

    Both virtual and printed 3D crystal models can help students and teachers deal with chemical education topics such as symmetry and point groups. In the present paper, two freely downloadable tools (interactive PDF files and a mobile app) are presented as examples of the application of 3D design to study point-symmetry. The use of 3D printing to…

  19. Flight Awareness Collaboration Tool Development

    NASA Technical Reports Server (NTRS)

    Mogford, Richard

    2016-01-01

    This is a PowerPoint presentation covering airline operations center (AOC) research. It reviews a dispatcher decision support tool called the Flight Awareness Collaboration Tool (FACT). FACT gathers information about winter weather onto one screen and includes predictive abilities. FACT should prove to be useful for airline dispatchers and airport personnel when they manage winter storms and their effect on air traffic. This material is very similar to other previously approved presentations.

  20. GIS learning tool for world's largest earthquakes and their causes

    NASA Astrophysics Data System (ADS)

    Chatterjee, Moumita

    The objective of this thesis is to increase awareness about earthquakes among people, especially young students, by showing the five largest and the two most predictable earthquake locations in the world and their plate tectonic settings. This is a geographically based interactive tool which can be used for learning about the causes of great earthquakes in the past and the safest places on earth to avoid the direct effects of earthquakes. This approach provides an effective way of learning for students as it is very user friendly and more aligned to the interests of the younger generation. In this tool the user can click on the various points located on the world map, which will open a picture and a link to the webpage for that point, showing detailed information on the earthquake history of that place, including the magnitude and year of past quakes and the plate tectonic settings that made the place earthquake prone. Apart from the earthquake-related information, students will also be able to customize the tool to suit their needs or interests. Students will be able to add/remove layers, measure the distance between any two points on the map, select any place on the map and view more information for that place, create a layer from this set to do a detailed analysis, run a query, change display settings, etc. At the end of this tool the user has to go through the earthquake safety guidelines in order to be safe during an earthquake. This tool uses Java as the programming language and uses Map Objects Java Edition (MOJO) provided by ESRI. The tool has been developed for educational purposes and hence its interface has been kept simple and easy to use so that students can gain maximum knowledge through it instead of having a hard time installing it. There are many details to explore which show what a GIS-based tool is capable of. The only thing needed to run the tool is the latest Java edition installed on the machine. This approach makes study more fun and interactive while educating students about a very important natural disaster which has been threatening us in recent years. This tool has been developed to increase awareness of the cause and effect of earthquakes and how to be safe if that kind of disaster happens.

  1. Delineating Beach and Dune Morphology from Massive Terrestrial Laser Scanning Data Using the Generic Mapping Tools

    NASA Astrophysics Data System (ADS)

    Zhou, X.; Wang, G.; Yan, B.; Kearns, T.

    2016-12-01

    Terrestrial laser scanning (TLS) techniques have been proven to be efficient tools to collect three-dimensional high-density and high-accuracy point clouds for coastal research and resource management. However, the processing and presentation of massive TLS data is always a challenge when targeting a large area at high resolution. This article introduces a workflow using shell-scripting techniques to chain together tools from the Generic Mapping Tools (GMT), Geographic Resources Analysis Support System (GRASS), and other command-based open-source utilities for automating TLS data processing. TLS point clouds acquired in the beach and dune area near Freeport, Texas in May 2015 were used for the case study. Shell scripts for rotating the coordinate system, removing anomalous points, assessing data quality, generating high-accuracy bare-earth DEMs, and quantifying beach and sand dune features (shoreline, cross-dune section, dune ridge, toe, and volume) are presented in this article. According to this investigation, the accuracy of the laser measurements (distance from the scanner to the targets) is within a couple of centimeters. However, the positional accuracy of TLS points with respect to a global coordinate system is about 5 cm, which is dominated by the accuracy of the GPS solutions for obtaining the positions of the scanner and reflector. The accuracy of the TLS-derived bare-earth DEM is primarily determined by the size of the grid cells and the roughness of the terrain surface for the case study. A DEM with grid cells of 4 m × 1 m (alongshore by cross-shore) provides a suitable spatial resolution and accuracy for deriving major beach and dune features.
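
    The published workflow chains GMT and GRASS command-line tools; purely as an illustration of one step, gridding a point cloud into a DEM, the Python sketch below bins points into 4 m × 1 m cells and keeps a per-cell elevation statistic (here the minimum, as a crude bare-earth proxy). The function name and the synthetic points are assumptions for illustration.

      # Hedged sketch: grid scattered (x, y, z) points into a DEM by cell statistics.
      import numpy as np

      def grid_dem(x, y, z, cell_x=4.0, cell_y=1.0):
          ix = ((x - x.min()) / cell_x).astype(int)
          iy = ((y - y.min()) / cell_y).astype(int)
          dem = np.full((ix.max() + 1, iy.max() + 1), np.nan)
          for i, j, elev in zip(ix, iy, z):
              # keep the lowest return per cell as a crude bare-earth estimate
              if np.isnan(dem[i, j]) or elev < dem[i, j]:
                  dem[i, j] = elev
          return dem

      rng = np.random.default_rng(0)
      xy = rng.uniform(0, 40, size=(5000, 2))                 # x, y in metres
      z = 1.5 + 0.02 * xy[:, 0] + rng.normal(0, 0.05, 5000)   # gently sloping beach
      dem = grid_dem(xy[:, 0], xy[:, 1], z)
      print(dem.shape, float(np.nanmin(dem)), float(np.nanmax(dem)))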

  2. A study of unstable slopes in permafrost areas : Alaskan case studies used as a training tool.

    DOT National Transportation Integrated Search

    2011-12-01

    This report is the companion to the PowerPoint presentation for the project A Study of Unstable Slopes in Permafrost: Alaskan Case Studies Used as a Training Tool. The objectives of this study are 1) to provide a comprehensive review of literat...

  3. Collision detection and modeling of rigid and deformable objects in laparoscopic simulator

    NASA Astrophysics Data System (ADS)

    Dy, Mary-Clare; Tagawa, Kazuyoshi; Tanaka, Hiromi T.; Komori, Masaru

    2015-03-01

    Laparoscopic simulators are viable alternatives for surgical training and rehearsal. Haptic devices can also be incorporated with virtual reality simulators to provide additional cues to the users. However, to provide realistic feedback, the haptic device must be updated at 1 kHz. On the other hand, realistic visual cues, that is, the collision detection and deformation between interacting objects, must be rendered at least at 30 fps. Our current laparoscopic simulator detects the collision between a single point on the tool tip and the organ surfaces; haptic devices are attached to actual tool tips for realistic tool manipulation. The triangular-mesh organ model is rendered using a mass-spring deformation model or finite element method-based models. In this paper, we investigated multi-point-based collision detection on the rigid tool rods. Based on the preliminary results, we propose a method to improve the collision detection scheme and speed up the organ deformation reaction. We discuss our proposal for an efficient method to compute simultaneous multiple collisions between rigid (laparoscopic tools) and deformable (organs) objects, and to perform the subsequent collision response, with haptic feedback, in real time.

  4. Martian Atmospheric Pressure Static Charge Elimination Tool

    NASA Technical Reports Server (NTRS)

    Johansen, Michael R.

    2014-01-01

    A Martian pressure static charge elimination tool is currently in development in the Electrostatics and Surface Physics Laboratory (ESPL) at NASA's Kennedy Space Center. In standard Earth atmosphere conditions, static charge can be neutralized from an insulating surface using air ionizers. These air ionizers generate ions through corona breakdown. The Martian atmosphere is 7 Torr of mostly carbon dioxide, which makes it inherently difficult to use similar methods as those used for standard atmosphere static elimination tools. An initial prototype has been developed to show feasibility of static charge elimination at low pressure, using corona discharge. A needle point and thin wire loop are used as the corona generating electrodes. A photo of the test apparatus is shown below. Positive and negative high voltage pulses are sent to the needle point. This creates positive and negative ions that can be used for static charge neutralization. In a preliminary test, a floating metal plate was charged to approximately 600 volts under Martian atmospheric conditions. The static elimination tool was enabled and the voltage on the metal plate dropped rapidly to -100 volts. This test data is displayed below. Optimization is necessary to improve the electrostatic balance of the static elimination tool.

  5. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    PubMed

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
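
    As a minimal illustration of systematic random sampling (the generic principle, not the RandomSpot web tool itself), the sketch below lays an equidistant grid of points over a rectangular region of interest with a single random offset; each point would then be inspected and classified by a pathologist. The function name and region size are illustrative assumptions.

      # Hedged sketch of systematic random sampling over a rectangular ROI.
      import random

      def systematic_random_points(width, height, spacing, seed=None):
          rng = random.Random(seed)
          off_x = rng.uniform(0, spacing)   # one random offset shared by the whole grid
          off_y = rng.uniform(0, spacing)
          points = []
          y = off_y
          while y < height:
              x = off_x
              while x < width:
                  points.append((x, y))
                  x += spacing
              y += spacing
          return points

      # e.g. sample a 2000 x 1000 pixel region every 250 pixels.
      pts = systematic_random_points(2000, 1000, 250, seed=1)
      print(len(pts), pts[:3])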

  6. Utilizing the Iterative Closest Point (ICP) algorithm for enhanced registration of high resolution surface models - more than a simple black-box application

    NASA Astrophysics Data System (ADS)

    Stöcker, Claudia; Eltner, Anette

    2016-04-01

    Advances in computer vision and digital photogrammetry (i.e. structure from motion) allow for fast and flexible high-resolution data supply. Within geoscience applications, and especially in the field of small-scale surface topography, high-resolution digital terrain models and dense 3D point clouds are valuable data sources for capturing current states as well as for multi-temporal studies. However, there are still some limitations regarding robust registration and accuracy demands (e.g. systematic positional errors) which impede the comparison and/or combination of multi-sensor data products. Therefore, post-processing of 3D point clouds can heavily enhance data quality. In this matter the Iterative Closest Point (ICP) algorithm represents an alignment tool which iteratively minimizes the distances between corresponding points in two datasets. Even though the tool is widely used, it is often applied as a black-box application within 3D data post-processing for surface reconstruction. Aiming for a precise and accurate combination of multi-sensor data sets, this study looks closely at different variants of the ICP algorithm, including the sub-steps of point selection, point matching, weighting, rejection, error metric and minimization. Therefore, an agriculturally used field was investigated simultaneously by terrestrial laser scanning (TLS) and unmanned aerial vehicle (UAV) sensors at two times (once covered with sparse vegetation and once bare soil). Due to the different perspectives, the two data sets differ in their shadowed areas and thus gaps, so that merging them would provide a more consistent surface reconstruction. Although photogrammetric processing already included sub-cm accurate ground control surveys, the UAV point cloud exhibits an offset towards the TLS point cloud. In order to obtain the transformation matrix for fine registration of the UAV point clouds, different ICP variants were tested. Statistical analyses of the results show that the final success of registration, and therefore data quality, depends particularly on the parameterization and the choice of error metric, especially for erroneous data sets as in the case of sparse vegetation cover. Here, the point-to-point metric is more sensitive to data "noise" than the point-to-plane metric, which results in considerably higher cloud-to-cloud distances. Concluding, given the accuracy demands of high-resolution surface reconstruction and the fact that ground control surveys can reach their limits both in time expenditure and terrain accessibility, the ICP algorithm represents a great tool to refine a rough initial alignment. Here, the different variants of the registration modules allow for individual application according to the quality of the input data.
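
    For readers unfamiliar with the algorithm's sub-steps, the sketch below implements a bare point-to-point ICP variant (nearest-neighbour matching, SVD-based rigid transform, no weighting or rejection) on synthetic clouds with a small offset, loosely mimicking the UAV-to-TLS registration problem; it is an illustration, not the processing chain used in the study.

      # Minimal point-to-point ICP sketch on synthetic, roughly pre-aligned clouds.
      import numpy as np
      from scipy.spatial import cKDTree

      def best_rigid_transform(src, dst):
          # Least-squares rotation R and translation t mapping src onto dst (Kabsch).
          src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
          H = (src - src_c).T @ (dst - dst_c)
          U, _, Vt = np.linalg.svd(H)
          R = Vt.T @ U.T
          if np.linalg.det(R) < 0:          # guard against reflections
              Vt[-1] *= -1
              R = Vt.T @ U.T
          return R, dst_c - R @ src_c

      def icp_point_to_point(src, ref, iterations=20):
          tree = cKDTree(ref)
          current = src.copy()
          for _ in range(iterations):
              _, idx = tree.query(current)             # point matching step
              R, t = best_rigid_transform(current, ref[idx])
              current = current @ R.T + t              # apply the increment
          return current

      rng = np.random.default_rng(0)
      ref = rng.uniform(0, 10, size=(500, 3))
      src = ref + np.array([0.30, -0.20, 0.05])        # simulated systematic offset
      aligned = icp_point_to_point(src, ref)
      print("mean residual:", np.linalg.norm(aligned - ref, axis=1).mean())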

  7. Information Power Grid (IPG) Tutorial 2003

    NASA Technical Reports Server (NTRS)

    Meyers, George

    2003-01-01

    For NASA and the general community today, Grid middleware: a) provides tools to access/use data sources (databases, instruments, ...); b) provides tools to access computing (unique and generic); c) is an enabler of large-scale collaboration. Dynamically responding to needs is a key selling point of a grid: independent resources can be joined as appropriate to solve a problem. The grid provides tools to enable the building of frameworks for applications. It provides value-added services to the NASA user base for utilizing resources on the grid in new and more efficient ways, and provides tools for the development of frameworks.

  8. SPARSKIT: A basic tool kit for sparse matrix computations

    NASA Technical Reports Server (NTRS)

    Saad, Youcef

    1990-01-01

    Presented here are the main features of a tool package for manipulating and working with sparse matrices. One of the goals of the package is to provide basic tools to facilitate the exchange of software and data between researchers in sparse matrix computations. The starting point is the Harwell/Boeing collection of matrices for which the authors provide a number of tools. Among other things, the package provides programs for converting data structures, printing simple statistics on a matrix, plotting a matrix profile, and performing linear algebra operations with sparse matrices.
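
    SPARSKIT itself is a Fortran tool kit; purely as a present-day analogue of the kind of utilities described above (format conversion, simple statistics, basic sparse operations), the sketch below round-trips a sparse matrix through the Harwell/Boeing format with SciPy. The file name and the random test matrix are arbitrary assumptions for illustration.

      # Illustrative analogue using SciPy (not SPARSKIT): Harwell/Boeing round-trip
      # plus a couple of simple statistics and a sparse matrix-vector product.
      import numpy as np
      import scipy.sparse as sp
      from scipy.io import hb_read, hb_write

      A = sp.random(50, 50, density=0.05, format="csc", random_state=0)
      hb_write("example.hb", A)            # write in Harwell/Boeing format
      B = hb_read("example.hb")            # read it back as a sparse matrix

      print("shape:", B.shape, "stored nonzeros:", B.nnz)
      print("mean nonzeros per row:", B.nnz / B.shape[0])
      y = B @ np.ones(B.shape[1])          # basic sparse linear algebra
      print("row-sum range:", float(y.min()), float(y.max()))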

  9. Constrained maximum likelihood modal parameter identification applied to structural dynamics

    NASA Astrophysics Data System (ADS)

    El-Kafafy, Mahmoud; Peeters, Bart; Guillaume, Patrick; De Troyer, Tim

    2016-05-01

    A new modal parameter estimation method to directly establish modal models of structural dynamic systems satisfying two physically motivated constraints will be presented. The constraints imposed in the identified modal model are the reciprocity of the frequency response functions (FRFs) and the estimation of normal (real) modes. The motivation behind the first constraint (i.e. reciprocity) comes from the fact that modal analysis theory shows that the FRF matrix, and therefore the residue matrices, are symmetric for non-gyroscopic, non-circulatory, and passive mechanical systems. In other words, such systems are expected to obey Maxwell-Betti's reciprocity principle. The second constraint (i.e. real mode shapes) is motivated by the fact that analytical models of structures are assumed to be either undamped or proportionally damped. Therefore, normal (real) modes are needed for comparison with these analytical models. The work done in this paper is a further development of a recently introduced modal parameter identification method called ML-MM that enables us to establish a modal model satisfying such physically motivated constraints. The proposed constrained ML-MM method is applied to two real experimental datasets measured on fully trimmed cars. This type of data is still considered a significant challenge in modal analysis. The results clearly demonstrate the applicability of the method to real structures with significant non-proportional damping and high modal densities.
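
    As a hedged illustration in standard modal-analysis notation (not necessarily the exact parameterization used by ML-MM), the identified modal model and the two constraints can be written as

      H(\omega) \;=\; \sum_{r=1}^{N_m}\left(\frac{\psi_r\,\psi_r^{T}}{j\omega-\lambda_r}
                   \;+\;\frac{\psi_r^{*}\,\psi_r^{*T}}{j\omega-\lambda_r^{*}}\right)
                   \;+\;\text{residual terms},
      \qquad H_{ij}(\omega)=H_{ji}(\omega),

    where reciprocity follows from the symmetric outer-product residue matrices \psi_r \psi_r^{T}, and the normal-mode constraint requires the mode-shape vectors \psi_r to be real-valued.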

  10. Interspecies differences and variability with time of protein precipitation activity of extractable tannins, crude protein, ash, and dry matter content of leaves from 13 species of Nepalese fodder trees.

    PubMed

    Wood, C D; Tiwari, B N; Plumb, V E; Powell, C J; Roberts, B T; Padmini Sirimane, V D; Rossiter, J T; Gill, M

    1994-12-01

    Dry matter, ash, crude protein, and protein precipitation activity (PPA) of 13 Nepalese tree fodder species were monitored in dried samples prepared monthly between November 1990 and May 1991, and additionally in November 1991, covering the season when they are particularly important as fodder. Monthly levels of dry matter, ash, and crude protein were fairly stable except when there was new leaf growth, although year to year differences in dry matter were found in Brassaiopsis hainla (Bh), Dendrocalamus strictus (Ds), Ficus roxburghii (Fr), and Quercus semecarpifolia (Qs). Tannin PPA fluctuated considerably in Artocarpus lakoocha (Al), Ficus glaberrima (Fg), F. nerrifolia (Fn), Fr, F. semicordata (Fs), Litsea polyantha (Lp), and Prunus cerasoides (Pc), and to a lesser extent in Bh, Castanopsis indica (Ci), C. tribuloides (Ct), Quercus lamellosa (Ql), and Qs. Similar fluctuations in PPA were observed in fresh leaf samples taken weekly. Ds did not have any detectable PPA. Trends in PPA fluctuation were generally similar for trees located at similar altitudes. Fr, Pc, Al, Fn, Ql, and Ci had falling PPAs before shedding leaves. Some of the fluctuations in Fr, Fs, Fg, Pc, and Lp were apparently due to changes in the extractability and quantity of condensed tannins. These fluctuations in PPA may affect the nutritive value of the fodders.

  11. Composite Payload Fairing Structural Architecture Assessment and Selection

    NASA Technical Reports Server (NTRS)

    Krivanek, Thomas M.; Yount, Bryan C.

    2012-01-01

    This paper provides a summary of the structural architecture assessments conducted and a recommendation for an affordable high performance composite structural concept to use on the next generation heavy-lift launch vehicle, the Space Launch System (SLS). The Structural Concepts Element of the Advanced Composites Technology (ACT) project and its follow-on, the Lightweight Spacecraft Structures and Materials (LSSM) project, were tasked with evaluating a number of composite construction technologies for specific Ares V components: the Payload Shroud, the Interstage, and the Core Stage Intertank. Team studies strove to address the structural challenges, risks and needs for each of these vehicle components. Leveraging this work, the subsequent Composites for Exploration (CoEx) effort is focused on providing a composite structural concept to support the Payload Fairing for SLS. This paper documents the evaluation and down selection of composite construction technologies and their evolution to the SLS Payload Fairing. Development of the evaluation criteria (also referred to as Figures of Merit or FOMs), their relative importance, and their association with vehicle requirements are presented. A summary of the evaluation results and a recommendation of the composite concept to baseline in the CoEx project are presented. The recommendation for the SLS Fairing is a honeycomb sandwich architecture, based primarily on affordability and performance, with two promising alternatives, hat-stiffened and Fiber Reinforced Foam (FRF), identified for an eventual program block upgrade.

  12. Microinstabilities in the pedestal region

    NASA Astrophysics Data System (ADS)

    Dickinson, David; Dudson, Benjamin; Wilson, Howard; Roach, Colin

    2014-10-01

    The regulation of transport at the pedestal top is important for the inter-ELM pedestal dynamics. Linear gyrokinetic analysis of the pedestal region during an ELM cycle on MAST has shown kinetic ballooning modes to be unstable at the knee of the pressure profile and in the steep pedestal region whilst microtearing modes (MTMs) dominate in the shallow gradient region inboard of the pedestal top. The transition between these instabilities at the pedestal knee has been observed in low and high collisionality MAST pedestals, and is likely to play an important role in the broadening of the pedestal. Nonlinear simulations are needed in this region to understand the microturbulence, the corresponding transport fluxes, and to gain further insight into the processes underlying the pedestal evolution. Such gyrokinetic simulations are numerically challenging and recent upgrades to the GS2 gyrokinetic code help improve their feasibility. We are also exploring reduced models that capture the relevant physics using the plasma simulation framework BOUT++. An electromagnetic gyrofluid model has recently been implemented with BOUT++ that has significantly reduced computational cost compared to the gyrokinetic simulations against which it will be benchmarked. This work was funded by the RCUK Energy programme, EURATOM and a EUROFusion fellowship WP14-FRF-CCFE/Dickinson and was carried out using: HELIOS at IFERC, Japan; ARCHER (EPSRC Grant No. EP/L000237/1); HECToR (EPSRC Grant No. EP/H002081/1).

  13. Identification of walking human model using agent-based modelling

    NASA Astrophysics Data System (ADS)

    Shahabpoor, Erfan; Pavic, Aleksandar; Racic, Vitomir

    2018-03-01

    The interaction of walking people with large vibrating structures, such as footbridges and floors, in the vertical direction is an important yet challenging phenomenon to describe mathematically. Several different models have been proposed in the literature to simulate the interaction of stationary people with vibrating structures. However, the research on moving (walking) human models, explicitly identified for vibration serviceability assessment of civil structures, is still sparse. In this study, the results of a comprehensive set of FRF-based modal tests were used, in which over a hundred test subjects walked in different group sizes and walking patterns on a test structure. An agent-based model was used to simulate discrete traffic-structure interactions. The occupied-structure modal parameters found in the tests were used to identify the parameters of the walking individual's single-degree-of-freedom (SDOF) mass-spring-damper model using a 'reverse engineering' methodology. The analysis of the results suggested that a normal distribution with mean μ = 2.85 Hz and standard deviation σ = 0.34 Hz can describe the human SDOF model natural frequency. Similarly, a normal distribution with μ = 0.295 and σ = 0.047 can describe the human model damping ratio. Compared to previous studies, the agent-based modelling methodology proposed in this paper offers significant flexibility in simulating multi-pedestrian walking traffic, external forces, and different mechanisms of human-structure and human-environment interaction at the same time.
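
    A minimal sketch of how the reported distributions could parameterize the walking agents; the body mass and the mapping to stiffness and damping coefficients are assumptions for illustration, not values from the study.

      import numpy as np

      rng = np.random.default_rng(0)

      def sample_pedestrian_sdof(n_agents, mass_kg=75.0):
          """Draw per-agent SDOF parameters from the reported normal distributions;
          the mass is an assumed placeholder value."""
          fn = rng.normal(2.85, 0.34, n_agents)       # natural frequency [Hz]
          zeta = rng.normal(0.295, 0.047, n_agents)   # damping ratio [-]
          wn = 2.0 * np.pi * fn
          k = mass_kg * wn**2                         # stiffness [N/m]
          c = 2.0 * zeta * mass_kg * wn               # damping coefficient [N s/m]
          return fn, zeta, k, c

      print(sample_pedestrian_sdof(3))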

  14. Identification of material properties of orthotropic composite plate using experimental frequency response function data

    NASA Astrophysics Data System (ADS)

    Tam, Jun Hui; Ong, Zhi Chao; Ismail, Zubaidah; Ang, Bee Chin; Khoo, Shin Yee

    2018-05-01

    The demand for composite materials is increasing due to their superior material properties, e.g., light weight, high strength and high corrosion resistance. As a result, composite materials with diverse properties are becoming prevalent, driving the development of material identification methods for them. Conventional identification methods are destructive, time-consuming and costly. Therefore, an accurate identification approach is proposed to circumvent these drawbacks, involving the use of a Frequency Response Function (FRF) error function defined by the correlation discrepancy between experimental and finite-element generated FRFs. A square E-glass epoxy composite plate is investigated under several different configurations of boundary conditions. The experimental FRFs are used as the correlation reference, such that, during computation, the predicted FRFs are continuously updated with reference to the experimental FRFs until a solution is reached. The finally identified elastic properties, namely the in-plane elastic moduli Ex and Ey, the in-plane shear modulus Gxy, and the major Poisson's ratio vxy of the composite plate, are subsequently compared to the benchmark parameters as well as to those obtained using a modal-based approach. Compared to the modal-based approach, the proposed method is found to yield relatively better results. This can be explained by the direct employment of raw data in the proposed method, which avoids errors that might be incurred during modal extraction.
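
    A hedged sketch of the updating loop this describes: the finite-element FRF prediction is replaced here by a toy single-DOF receptance whose stiffness stands in for the unknown modulus, so the example is self-contained but is not the plate model of the study.

      import numpy as np
      from scipy.optimize import minimize

      def predicted_frf(params, freqs, mass=1.0, zeta=0.02):
          """Toy stand-in for the finite-element FRF prediction: a single-DOF
          receptance whose stiffness plays the role of the unknown modulus."""
          k, = params
          w = 2.0 * np.pi * freqs
          return 1.0 / (k - mass * w**2 + 2j * zeta * np.sqrt(k * mass) * w)

      def frf_error(params, freqs, H_exp):
          # Discrepancy between experimental and predicted FRFs (normalized).
          H_pred = predicted_frf(params, freqs)
          return np.sum(np.abs(H_exp - H_pred)**2) / np.sum(np.abs(H_exp)**2)

      freqs = np.linspace(5.0, 200.0, 400)
      H_exp = predicted_frf([1.2e6], freqs)            # synthetic "experimental" FRF
      res = minimize(frf_error, x0=[0.8e6], args=(freqs, H_exp), method="Nelder-Mead")
      print("identified stiffness:", res.x[0])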

  15. Efficient data assimilation algorithm for bathymetry application

    NASA Astrophysics Data System (ADS)

    Ghorbanidehno, H.; Lee, J. H.; Farthing, M.; Hesser, T.; Kitanidis, P. K.; Darve, E. F.

    2017-12-01

    Information on the evolving state of the nearshore zone bathymetry is crucial to shoreline management, recreational safety, and naval operations. The high cost and complex logistics of using ship-based surveys for bathymetry estimation have encouraged the use of remote sensing techniques. Data assimilation methods combine the remote sensing data and nearshore hydrodynamic models to estimate the unknown bathymetry and the corresponding uncertainties. In particular, several recent efforts have combined Kalman Filter-based techniques such as ensemble-based Kalman filters with indirect video-based observations to address the bathymetry inversion problem. However, these methods often suffer from ensemble collapse and uncertainty underestimation. Here, the Compressed State Kalman Filter (CSKF) method is used to estimate the bathymetry based on observed wave celerity. In order to demonstrate the accuracy and robustness of the CSKF method, we consider twin tests with synthetic observations of wave celerity, while the bathymetry profiles are chosen based on surveys taken by the U.S. Army Corps of Engineers Field Research Facility (FRF) in Duck, NC. The first test case is a bathymetry estimation problem for a spatially smooth and temporally constant bathymetry profile. The second test case is a bathymetry estimation problem for a temporally evolving bathymetry from a smooth to a non-smooth profile. For both problems, we compare the results of CSKF with those obtained by the local ensemble transform Kalman filter (LETKF), which is a popular ensemble-based Kalman filter method.
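
    For orientation only, these are the generic linear Kalman filter analysis equations that both CSKF and LETKF approximate in different ways, not the specific formulations of either method:

      K = P^{f} H^{T}\left(H P^{f} H^{T} + R\right)^{-1}, \qquad
      x^{a} = x^{f} + K\left(y - H x^{f}\right), \qquad
      P^{a} = \left(I - K H\right) P^{f},

    where x^{f} and P^{f} are the forecast state (here the bathymetry) and its covariance, y the observations (wave celerity), H the observation operator, and R the observation error covariance.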

  16. Caracterisation mecanique dynamique de materiaux poro-visco-elastiques

    NASA Astrophysics Data System (ADS)

    Renault, Amelie

    Poro-viscoelastic materials are well modelled with the Biot-Allard equations. This model needs a number of geometrical parameters to describe the macroscopic geometry of the material and elastic parameters to describe the elastic properties of the material skeleton. Several characterisation methods for the viscoelastic parameters of porous materials are studied in this thesis. Firstly, quasistatic and resonant characterization methods are described and analyzed. Secondly, a new inverse dynamic characterization of the same modulus is developed. The latter involves a two-layer metal-porous beam which is excited at its centre; the input mobility is measured. The set-up is simplified compared to previous methods. The parameters are obtained via an inversion procedure based on the minimisation of a cost function comparing the measured and calculated frequency response functions (FRF). The calculation is done with a general laminate model. A parametric study identifies the optimal beam dimensions for maximum sensitivity of the inversion model. The advantage of using a code which does not take fluid-structure interactions into account is the low computation time; for most materials, the effect of this interaction on the elastic properties is negligible. Several materials are tested to demonstrate the performance of the method compared to the classical quasi-static approaches, and to establish its limitations and range of validity. Finally, conclusions about their utilisation are given. Keywords: elastic parameters, porous materials, anisotropy, vibration.

  17. Experimental validation of a numerical 3-D finite model applied to wind turbines design under vibration constraints: TREVISE platform

    NASA Astrophysics Data System (ADS)

    Sellami, Takwa; Jelassi, Sana; Darcherif, Abdel Moumen; Berriri, Hanen; Mimouni, Med Faouzi

    2018-04-01

    With the advancement of wind turbines towards complex structures, the requirement for trustworthy structural models has become more apparent. Hence, the vibration characteristics of the wind turbine components, such as the blades and the tower, have to be extracted under vibration constraints. Although extracting the modal properties of blades is a simple task, calculating precise modal data for the whole wind turbine coupled to its tower/foundation is still a challenging task. In this framework, this paper focuses on the investigation of the structural modeling approach for modern commercial micro-turbines. Thus, the structural model of a wind turbine of complex design, the Rutland 504, is established based on both experimental and numerical methods. A three-dimensional (3-D) numerical model of the structure was set up based on the finite volume method (FVM) using the academic finite element analysis software ANSYS. To validate the created model, experimental vibration tests were carried out using the vibration test system of the TREVISE platform at ECAM-EPMI. The tests were based on the experimental modal analysis (EMA) technique, which is one of the most efficient techniques for identifying structural parameters. Indeed, the poles and residues of the frequency response functions (FRF), between input and output spectra, were calculated to extract the mode shapes and the natural frequencies of the structure. Based on the obtained modal parameters, the numerical model was updated.

  18. Evaluating online diagnostic decision support tools for the clinical setting.

    PubMed

    Pryor, Marie; White, David; Potter, Bronwyn; Traill, Roger

    2012-01-01

    Clinical decision support tools available at the point of care are an effective adjunct to support clinicians to make clinical decisions and improve patient outcomes. We developed a methodology and applied it to evaluate commercially available online clinical diagnostic decision support (DDS) tools for use at the point of care. We identified 11 commercially available DDS tools and assessed these against an evaluation instrument that included 6 categories: general information, content, quality control, search, clinical results and other features. We developed diagnostically challenging clinical case scenarios based on real patient experience that were commonly missed by junior medical staff. The evaluation was divided into 2 phases: an initial evaluation of all identified and accessible DDS tools conducted by the Clinical Information Access Portal (CIAP) team, and a second phase that further assessed the top 3 tools identified in the initial evaluation phase. An evaluation panel consisting of senior and junior medical clinicians from NSW Health conducted the second phase. Of the eleven tools that were assessed against the evaluation instrument, only 4 tools completely met the DDS definition that was adopted for this evaluation and were able to produce a differential diagnosis. From the initial phase of the evaluation, 4 DDS tools scored 70% or more (maximum score 96%) for the content category, 8 tools scored 65% or more (maximum 100%) for the quality control category, 5 tools scored 65% or more (maximum 94%) for the search category, and 4 tools scored 70% or more (maximum 81%) for the clinical results category. The second phase of the evaluation was focused on assessing diagnostic accuracy for the top 3 tools identified in the initial phase. Best Practice ranked highest overall against the 6 clinical case scenarios used. Overall, the differentiating factor between the top 3 DDS tools was determined by diagnostic accuracy ranking, ease of use and the confidence and credibility of the clinical information. The evaluation methodology used here to assess the quality and comprehensiveness of clinical DDS tools was effective in identifying the most appropriate tool for the clinical setting. The use of clinical case scenarios is fundamental in determining the diagnostic accuracy and usability of the tools.

  19. [Going into the 21st century: should one dream or act?].

    PubMed

    Coosemans, M

    1991-01-01

    A historical review of vector control is made. Despite the available tools, vector-borne diseases are still a priority in public health. Magic tools, like DDT, were often misused. Adapted strategies and structures for vector control are now required. Progress will mainly result from research and evaluation done in the framework of vector control programmes. Newly discovered tools will find in these operational programmes a natural entry point for their application.

  20. The Macro Dynamics of Weapon System Acquisition: Shaping Early Decisions to Get Better Outcomes

    DTIC Science & Technology

    2012-05-17

    Briefing excerpt (fragments): defects and rework; design tools and processes; lack of feedback to key design and SE processes; lack of quantified risk and uncertainty at key...; Tools for Rapid Exploration of the Physical Design Space Coupling Operability, Interoperability, and Physical Feasibility Analyses – a Game Changer...; interoperability; training; quantified margins and uncertainties at each critical decision point; M&S; RDT&E; a continuum of tools underpinned with...

  1. Metal Vapor Arcing Risk Assessment Tool

    NASA Technical Reports Server (NTRS)

    Hill, Monika C.; Leidecker, Henning W.

    2010-01-01

    The Tin Whisker Metal Vapor Arcing Risk Assessment Tool has been designed to evaluate the risk of metal vapor arcing and to help facilitate a decision toward a researched risk disposition. Users can evaluate a system without having to open up the hardware. This process allows for investigating components at risk rather than spending time and money analyzing every component. The tool points to a risk level and provides direction for appropriate action and documentation.

  2. Quality Evaluation Tool for Computer-and Web-Delivered Instruction

    DTIC Science & Technology

    2005-06-01

    Report excerpt (fragments): Bryman, A., Mars, R., & Tapangco, L. (1996). When less is more: Meaningful learning from visual and verbal summaries of science textbook lessons... The objective of this effort was to develop an Instructional Quality Evaluation Tool to help... developed for each rating point on all scales. This report includes these anchored Likert scales, which can serve as a "stand-alone" Tool. The...

  3. What Chemists (or Chemistry Students) Need to Know about Computing.

    ERIC Educational Resources Information Center

    Swift, Mary L.; Zielinski, Theresa Julia

    1995-01-01

    Presents key points of an on-line conference discussion and integrates them with information from the literature. Key points included: computer as a tool for learning, study, research, and communication; hardware, software, computing concepts, and other teaching concerns; and the appropriate place for chemistry computer-usage instruction. (45…

  4. Digital Ink: In-Class Annotation of PowerPoint Lectures

    ERIC Educational Resources Information Center

    Johnson, Anne E.

    2008-01-01

    Digital ink is a tool that, in conjunction with Microsoft PowerPoint software, allows real-time freehand annotation of presentations. Annotation of slides during class encourages student engagement with the material and problems under discussion. Digital ink annotation is a technique suitable for teaching across many disciplines, but is especially…

  5. Understanding the Process of Contextualization

    ERIC Educational Resources Information Center

    Wyatt, Tasha

    2015-01-01

    The literature on culture and education points to the importance of using students' cultural knowledge in the teaching and learning process. While the theory of culturally relevant education has expanded in the last several decades, the practical implementation continues to lag far behind. This disparity points to the lack of tools and other…

  6. A new computer code for discrete fracture network modelling

    NASA Astrophysics Data System (ADS)

    Xu, Chaoshui; Dowd, Peter

    2010-03-01

    The authors describe a comprehensive software package for two- and three-dimensional stochastic rock fracture simulation using marked point processes. Fracture locations can be modelled by a Poisson, a non-homogeneous, a cluster or a Cox point process; fracture geometries and properties are modelled by their respective probability distributions. Virtual sampling tools such as plane, window and scanline sampling are included in the software together with a comprehensive set of statistical tools including histogram analysis, probability plots, rose diagrams and hemispherical projections. The paper describes in detail the theoretical basis of the implementation and provides a case study in rock fracture modelling to demonstrate the application of the software.
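
    A minimal sketch of the simplest of the point processes listed above, a homogeneous Poisson process for fracture centres in a box; the mark distributions (radius, dip, dip direction) are illustrative assumptions, not defaults of the described package.

      import numpy as np

      rng = np.random.default_rng(42)

      def simulate_poisson_fractures(intensity, box=(100.0, 100.0, 50.0)):
          """Homogeneous Poisson process for fracture centres in a box, with simple
          marks (radius, dip, dip direction) drawn from assumed distributions."""
          volume = np.prod(box)
          n = rng.poisson(intensity * volume)          # number of fractures
          centres = rng.uniform(0.0, box, size=(n, 3)) # fracture locations
          radii = rng.lognormal(mean=1.0, sigma=0.5, size=n)
          dip = rng.uniform(0.0, 90.0, size=n)
          dip_dir = rng.uniform(0.0, 360.0, size=n)
          return centres, radii, dip, dip_dir

      centres, radii, dip, dip_dir = simulate_poisson_fractures(intensity=1e-3)
      print(len(centres), "fractures simulated")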

  7. Single point diamond crushing of glass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, J.B.; Carter, D.L.; Clouser, R.W.

    1984-05-23

    Single point diamond crushing of glass was originally developed by Dr. R.E. Reason of Taylor and Hobson in England 34 years ago as a means of shaping glass aspheres prior to polishing. It has recently been tried at LLNL. A surface finish of 50 microinches peak-to-valley with occasional deeper pits has been achieved on Zerodur and BK-7 glass. A depth of cut of 0.008 inch or more can be taken at a surface speed of 900 feet per minute. Tool wear is on the order of 10 microinches after removal of one cubic inch of Zerodur. The tool's cost is $5.45 each.

  8. Tool mark striations in pig skin produced by stabs from a serrated blade.

    PubMed

    Pounder, Derrick J; Bhatt, Shivani; Cormack, Lesley; Hunt, Bill A C

    2011-03-01

    Stab wounds produced by serrated blades are generally indistinguishable from stab wounds produced by non-serrated blades, except when visible tool mark striations are left on severed cartilage. Using a pig-skin experimental model, we explored the possibility that similar striations may be left in skin. Stabs into pig skin were made using a straight spine coarsely serrated blade (121), a drop point finely serrated blade (20), a clip point irregular coarsely serrated blade (20), a drop point coarsely serrated blade (15), and as controls 2 non-serrated blades (40). Tool mark striations could be seen on the skin wall of the stab canal in all stabs made using serrated blades but in none with non-serrated blades. The striation pattern, reflecting the class characteristics of the serrated blade, was the same as that described in cartilage but less well defined. Fixation of the specimen with Carnoy's solution best preserved visible striations, and fixation with formaldehyde after staining with 5% Neutral Red was also satisfactory. Casting with vinyl polysiloxane dental impression material greatly facilitated photo-documentation. Applying the technique to homicidal stabbings may help identify stab wounds produced with serrated blades.

  9. FireProt: web server for automated design of thermostable proteins

    PubMed Central

    Musil, Milos; Stourac, Jan; Brezovsky, Jan; Prokop, Zbynek; Zendulka, Jaroslav; Martinek, Tomas

    2017-01-01

    There is continuing interest in increasing protein stability to enhance usability in numerous biomedical and biotechnological applications. A number of in silico tools for predicting the effect of mutations on protein stability have been developed recently. However, only single-point mutations with a small effect on protein stability are typically predicted with the existing tools, and these have to be followed by laborious protein expression, purification, and characterization. Here, we present FireProt, a web server for the automated design of multiple-point thermostable mutant proteins that combines structural and evolutionary information in its calculation core. FireProt utilizes sixteen tools and three protein engineering strategies for making reliable protein designs. The server is complemented with an interactive, easy-to-use interface that allows users to directly analyze and optionally modify designed thermostable mutants. FireProt is freely available at http://loschmidt.chemi.muni.cz/fireprot. PMID:28449074

  10. Fluorescent nucleobases as tools for studying DNA and RNA

    NASA Astrophysics Data System (ADS)

    Xu, Wang; Chan, Ke Min; Kool, Eric T.

    2017-11-01

    Understanding the diversity of dynamic structures and functions of DNA and RNA in biology requires tools that can selectively and intimately probe these biomolecules. Synthetic fluorescent nucleobases that can be incorporated into nucleic acids alongside their natural counterparts have emerged as a powerful class of molecular reporters of location and environment. They are enabling new basic insights into DNA and RNA, and are facilitating a broad range of new technologies with chemical, biological and biomedical applications. In this Review, we will present a brief history of the development of fluorescent nucleobases and explore their utility as tools for addressing questions in biophysics, biochemistry and biology of nucleic acids. We provide chemical insights into the two main classes of these compounds: canonical and non-canonical nucleobases. A point-by-point discussion of the advantages and disadvantages of both types of fluorescent nucleobases is made, along with a perspective into the future challenges and outlook for this burgeoning field.

  11. Cardiovascular point of care initiative: enhancements in clinical data management.

    PubMed

    Robertson, Jane

    2003-01-01

    The Department of Cardiovascular Surgery at East Alabama Medical Center (EAMC) initiated a program in 1996 to improve the quality and usefulness of clinical outcomes data. After years of using a commercial vendor product and enduring a tedious collection process, the department decided to develop its own tools to support quality improvement efforts. Using a hand-held personal data assistant (PDA), the team developed tools that allowed ongoing data collection at the point of care delivery. The tools and methods facilitated the collection of real time, accurate information that allowed EAMC to participate in multiple clinical quality initiatives. The ability to conduct rapid-cycle performance improvement studies propelled EAMC's Cardiovascular Surgery Program into the Top 100 as recognized by HCIA, now Solucient, for 3 consecutive years (1999-2001). This report will describe the evolution of the data collection process as well as the quality improvements that resulted.

  12. A PDA-based instructional tool to monitor students' cardiac auscultation during a medicine clerkship.

    PubMed

    Torre, D M; Sebastian, J L; Simpson, D E

    2005-09-01

    Cardiac auscultation is an important skill for medical students to master but students' exposure to cardiac auscultation is often unmonitored. The objective of this study was to gather data at the point of care about students' cardiac auscultation experience on a required medicine rotation using a Personal Digital Assistant (PDA) 'murmur form'. During an eight-month period, 120 M3 students used the authors' PDA-based learning tool to record information on 940 heart sounds and murmurs. Some 93% of all heart sounds/murmurs reported by students were verified by either a faculty member (56%) or a supervising resident (43%). A PDA can be a useful tool to monitor students' experiences of cardiac auscultation and to track direct observation of such skills by faculty or residents. Medical students are eager to use technology at the point of care to practice their clinical skills.

  13. [Dietopro.com: a new tool for dietotherapeutical management based on cloud computing technology].

    PubMed

    García, Candido Gabriel; Sebastià, Natividad; Blasco, Esther; Soriano, José Miguel

    2014-09-01

    Dietotherapeutic software is now a basic tool in the dietary management of patients, from both a physiological and a pathological point of view. New technologies and research in this area have favored the emergence of new applications for dietary and nutritional management that facilitate the running of a dietotherapy practice. The aim was to comparatively study the main dietotherapeutic applications on the market in order to give professional users in dietetics and nutrition criteria for selecting among these tools. Dietopro.com is, from our point of view, one of the most comprehensive applications for the dietotherapeutic management of patients. Depending on the user's needs, it offers a choice of different dietary software options. We conclude that no application is better or worse than another; rather, applications are more or less adapted to the needs of professionals. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  14. Spacecraft Station-Keeping Trajectory and Mission Design Tools

    NASA Technical Reports Server (NTRS)

    Chung, Min-Kun J.

    2009-01-01

    Two tools were developed for designing station-keeping trajectories and estimating delta-v requirements for missions to a small body such as a comet or asteroid. This innovation uses NPOPT, a non-sparse, general-purpose sequential quadratic programming (SQP) optimizer, and the Two-Level Differential Corrector (T-LDC) in LTool (Libration point mission design Tool) to design three kinds of station-keeping scripts: vertical hovering, horizontal hovering, and orbiting. The T-LDC is used to differentially correct several trajectory legs that join hovering points. In a vertical hover, the maximum and minimum range points must be connected smoothly while maintaining the spacecraft's range from the small body, all under the laws of gravity and solar radiation pressure. The same is true for a horizontal hover. A PatchPoint is an LTool class that denotes a space-time event with some extra information for differential correction, including a set of constraints to be satisfied by the T-LDC. Given a set of PatchPoints, each with its own constraint, the T-LDC differentially corrects the entire trajectory by connecting each trajectory leg joined by PatchPoints while satisfying all specified constraints at the same time. Both vertical and horizontal hovering are needed to minimize the delta-v spent for station keeping. A Python interface to NPOPT has been written so that it can be used from an LTool script. In vertical hovering, the spacecraft stays along the line joining the Sun and the small body; an instantaneous delta-v toward the anti-Sun direction is applied at the closest approach to the small body for station keeping. For example, the spacecraft hovers between the minimum range (2 km) point and the maximum range (2.5 km) point from the asteroid 1989ML. Horizontal hovering buys more time for the spacecraft to recover if, for any reason, a planned thrust fails, because the spacecraft returns nearly to its initial position some time later via a near-elliptical orbit around the small body. The mapping or staging orbit may be similarly generated using the T-LDC with a set of constraints. Delta-v tables are generated for several different asteroid masses.

  15. Identifying influential data points in hydrological model calibration and their impact on streamflow predictions

    NASA Astrophysics Data System (ADS)

    Wright, David; Thyer, Mark; Westra, Seth

    2015-04-01

    Highly influential data points are those that have a disproportionately large impact on model performance, parameters and predictions. However, in current hydrological modelling practice the relative influence of individual data points on hydrological model calibration is not commonly evaluated. This presentation illustrates and evaluates several influence diagnostic tools that hydrological modellers can use to assess the relative influence of data. The feasibility and importance of including influence detection diagnostics as a standard tool in hydrological model calibration is discussed. Two classes of influence diagnostics are evaluated: (1) computationally demanding numerical "case deletion" diagnostics; and (2) computationally efficient analytical diagnostics, based on Cook's distance. These diagnostics are compared against hydrologically orientated diagnostics that describe changes in the model parameters (measured through the Mahalanobis distance), performance (objective function displacement) and predictions (mean and maximum streamflow). These influence diagnostics are applied to two case studies: a stage/discharge rating curve model, and a conceptual rainfall-runoff model (GR4J). Removing a single data point from the calibration resulted in differences to mean flow predictions of up to 6% for the rating curve model, and differences to mean and maximum flow predictions of up to 10% and 17%, respectively, for the hydrological model. When using the Nash-Sutcliffe efficiency in calibration, the computationally cheaper Cook's distance metrics produce similar results to the case-deletion metrics at a fraction of the computational cost. However, Cook's distance is adapted from linear regression, with inherent assumptions about the data, and is therefore less flexible than case deletion. Influential point detection diagnostics show great potential to improve current hydrological modelling practices by identifying highly influential data points. The findings of this study establish the feasibility and importance of including influential point detection diagnostics as a standard tool in hydrological model calibration. They provide the hydrologist with important information on whether model calibration is susceptible to a small number of highly influential data points. This enables the hydrologist to make a more informed decision about whether to (1) remove/retain the calibration data; (2) adjust the calibration strategy and/or hydrological model to reduce the susceptibility of model predictions to a small number of influential observations.
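
    A hedged sketch of the analytical diagnostic mentioned above, Cook's distance for ordinary least squares; the hydrological models in the study are nonlinear, so this only illustrates the linear-regression form the diagnostic is adapted from, on made-up data.

      import numpy as np

      def cooks_distance(X, y):
          """Cook's distance for OLS: D_i = e_i^2 h_ii / (p * s^2 * (1 - h_ii)^2),
          with leverage h_ii taken from the hat matrix."""
          n, p = X.shape
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          resid = y - X @ beta
          H = X @ np.linalg.inv(X.T @ X) @ X.T        # hat matrix
          h = np.diag(H)                              # leverages
          s2 = resid @ resid / (n - p)
          return (resid**2 / (p * s2)) * h / (1.0 - h)**2

      rng = np.random.default_rng(1)
      X = np.column_stack([np.ones(30), rng.normal(size=30)])
      y = X @ np.array([2.0, 0.5]) + rng.normal(scale=0.1, size=30)
      y[5] += 3.0                                     # inject an influential observation
      print(np.argmax(cooks_distance(X, y)))          # flags observation 5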

  16. A new optimization tool path planning for 3-axis end milling of free-form surfaces based on efficient machining intervals

    NASA Astrophysics Data System (ADS)

    Vu, Duy-Duc; Monies, Frédéric; Rubio, Walter

    2018-05-01

    A large number of studies of 3-axis end milling of free-form surfaces seek to optimize tool path planning. These approaches try to reduce the machining time by reducing the total tool path length while respecting the criterion of maximum scallop height. Theoretically, the tool path trajectories that remove the most material follow the directions in which the machined width is largest. The free-form surface is often considered as a single machining area, so optimization over the entire surface is limited: it is difficult to define tool trajectories with optimal feed directions that generate the largest machined widths. Another limitation of previous approaches to effectively reducing machining time is an inadequate choice of tool. Researchers generally use a spherical tool over the entire surface, and the gains proposed by the methods developed with such tools lead to relatively small time savings. Therefore, this study proposes a new method, using toroidal milling tools, for generating tool paths in different regions of the machined surface. The surface is divided into several regions based on machining intervals. These intervals ensure that the effective radius of the tool, at each cutter-contact point on the surface, is always greater than the radius of the tool in an optimized feed direction. A parallel-plane strategy is then used on the sub-surfaces, with an optimal specific feed direction for each sub-surface. This method allows the entire surface to be milled with greater efficiency than with a spherical tool. The proposed method is implemented and modeled using Maple software to find the optimal regions and feed directions in each region. The new method is tested on a free-form surface, and a comparison with a spherical cutter shows the significant gains obtained with a toroidal milling cutter. Comparisons with CAM software and experimental validations are also performed. The results show the efficiency of the method.

  17. Development of a prototype clinical decision support tool for osteoporosis disease management: a qualitative study of focus groups.

    PubMed

    Kastner, Monika; Li, Jamy; Lottridge, Danielle; Marquez, Christine; Newton, David; Straus, Sharon E

    2010-07-22

    Osteoporosis affects over 200 million people worldwide, and represents a significant cost burden. Although guidelines are available for best practice in osteoporosis, evidence indicates that patients are not receiving appropriate diagnostic testing or treatment according to guidelines. The use of clinical decision support systems (CDSSs) may be one solution because they can facilitate knowledge translation by providing high-quality evidence at the point of care. Findings from a systematic review of osteoporosis interventions and consultation with clinical and human factors engineering experts were used to develop a conceptual model of an osteoporosis tool. We conducted a qualitative study of focus groups to better understand physicians' perceptions of CDSSs and to transform the conceptual osteoporosis tool into a functional prototype that can support clinical decision making in osteoporosis disease management at the point of care. The conceptual design of the osteoporosis tool was tested in 4 progressive focus groups with family physicians and general internists. An iterative strategy was used to qualitatively explore the experiences of physicians with CDSSs; and to find out what features, functions, and evidence should be included in a working prototype. Focus groups were conducted using a semi-structured interview guide using an iterative process where results of the first focus group informed changes to the questions for subsequent focus groups and to the conceptual tool design. Transcripts were transcribed verbatim and analyzed using grounded theory methodology. Of the 3 broad categories of themes that were identified, major barriers related to the accuracy and feasibility of extracting bone mineral density test results and medications from the risk assessment questionnaire; using an electronic input device such as a Tablet PC in the waiting room; and the importance of including well-balanced information in the patient education component of the osteoporosis tool. Suggestions for modifying the tool included the addition of a percentile graph showing patients' 10-year risk for osteoporosis or fractures, and ensuring that the tool takes no more than 5 minutes to complete. Focus group data revealed the facilitators and barriers to using the osteoporosis tool at the point of care so that it can be optimized to aid physicians in their clinical decision making.

  18. Development of a prototype clinical decision support tool for osteoporosis disease management: a qualitative study of focus groups

    PubMed Central

    2010-01-01

    Background Osteoporosis affects over 200 million people worldwide, and represents a significant cost burden. Although guidelines are available for best practice in osteoporosis, evidence indicates that patients are not receiving appropriate diagnostic testing or treatment according to guidelines. The use of clinical decision support systems (CDSSs) may be one solution because they can facilitate knowledge translation by providing high-quality evidence at the point of care. Findings from a systematic review of osteoporosis interventions and consultation with clinical and human factors engineering experts were used to develop a conceptual model of an osteoporosis tool. We conducted a qualitative study of focus groups to better understand physicians' perceptions of CDSSs and to transform the conceptual osteoporosis tool into a functional prototype that can support clinical decision making in osteoporosis disease management at the point of care. Methods The conceptual design of the osteoporosis tool was tested in 4 progressive focus groups with family physicians and general internists. An iterative strategy was used to qualitatively explore the experiences of physicians with CDSSs; and to find out what features, functions, and evidence should be included in a working prototype. Focus groups were conducted using a semi-structured interview guide using an iterative process where results of the first focus group informed changes to the questions for subsequent focus groups and to the conceptual tool design. Transcripts were transcribed verbatim and analyzed using grounded theory methodology. Results Of the 3 broad categories of themes that were identified, major barriers related to the accuracy and feasibility of extracting bone mineral density test results and medications from the risk assessment questionnaire; using an electronic input device such as a Tablet PC in the waiting room; and the importance of including well-balanced information in the patient education component of the osteoporosis tool. Suggestions for modifying the tool included the addition of a percentile graph showing patients' 10-year risk for osteoporosis or fractures, and ensuring that the tool takes no more than 5 minutes to complete. Conclusions Focus group data revealed the facilitators and barriers to using the osteoporosis tool at the point of care so that it can be optimized to aid physicians in their clinical decision making. PMID:20650007

  19. Process Parameters Optimization in Single Point Incremental Forming

    NASA Astrophysics Data System (ADS)

    Gulati, Vishal; Aryal, Ashmin; Katyal, Puneet; Goswami, Amitesh

    2016-04-01

    This work aims to optimize the formability and surface roughness of parts formed by the single-point incremental forming process for an Aluminium-6063 alloy. The tests are based on Taguchi's L18 orthogonal array, selected on the basis of degrees of freedom. The tests were carried out on a vertical machining center (DMC70V) using CAD/CAM software (SolidWorks V5/MasterCAM). Two levels of tool radius and three levels of sheet thickness, step size, tool rotational speed, feed rate and lubrication were considered as the input process parameters. Wall angle and surface roughness were considered as the process responses. The influential process parameters for formability and surface roughness were identified with the help of statistical tools (response table, main effect plot and ANOVA). The parameter with the greatest influence on both formability and surface roughness is lubrication. For formability, lubrication followed by tool rotational speed, feed rate, sheet thickness, step size and tool radius have influence in descending order, whereas for surface roughness, lubrication followed by feed rate, step size, tool radius, sheet thickness and tool rotational speed have influence in descending order. The predicted optimal values for the wall angle and surface roughness are 88.29° and 1.03225 µm. The confirmation experiments were conducted three times, and the wall angle and surface roughness were found to be 85.76° and 1.15 µm, respectively.
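
    A hedged sketch of how a response table and main-effect ranking are computed from such an orthogonal-array experiment; the runs and response values below are illustrative placeholders, not the study's L18 data.

      import pandas as pd

      # Hypothetical subset of an orthogonal-array experiment: factor levels per run
      # and the measured response (e.g. wall angle); values are illustrative only.
      runs = pd.DataFrame({
          "lubrication": ["dry", "oil", "grease", "dry", "oil", "grease"],
          "feed_rate":   [500, 750, 1000, 750, 1000, 500],
          "response":    [82.1, 85.4, 86.0, 83.0, 84.7, 85.9],
      })

      # Response table: mean response at each level of each factor; the factor with
      # the largest level-to-level range has the strongest main effect.
      for factor in ["lubrication", "feed_rate"]:
          means = runs.groupby(factor)["response"].mean()
          print(factor, means.to_dict(), "range:", round(means.max() - means.min(), 3))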

  20. Blogging as an Instructional Tool in the ESL Classroom

    ERIC Educational Resources Information Center

    Featro, Susan Mary; DiGregorio, Daniela

    2016-01-01

    Theories on emerging technologies have stated that using blogs in the classroom can engage students in discussion, support peer learning, and improve students' literacy skills. Research has pointed to many ways that blogging is beneficial to student learning when used as an instructional tool. The researchers conducted a project that investigated…

  1. Alexander Meets Michotte: A Simulation Tool Based on Pattern Programming and Phenomenology

    ERIC Educational Resources Information Center

    Basawapatna, Ashok

    2016-01-01

    Simulation and modeling activities, a key point of computational thinking, are currently not being integrated into the science classroom. This paper describes a new visual programming tool entitled the Simulation Creation Toolkit. The Simulation Creation Toolkit is a high level pattern-based phenomenological approach to bringing rapid simulation…

  2. The Puppet's Communicative Potential as a Mediating Tool in Preschool Education

    ERIC Educational Resources Information Center

    Ahlcrona, Mirella Forsberg

    2012-01-01

    This article describes a puppet as a mediating tool in early childhood education and the puppet's communicative properties, potential and use in preschool. In the empirical section, the puppet serves as a starting point for children's interaction, narratives and different ways of communicating. The research interest is directed…

  3. Multidisciplinary Analysis of a Hypersonic Engine

    NASA Technical Reports Server (NTRS)

    Suresh, Ambady; Stewart, Mark

    2003-01-01

    The objective is to develop high-fidelity tools that can influence ISTAR design, in particular tools for coupling fluid-thermal-structural simulations. RBCC/TBCC designers carefully balance aerodynamic, thermal, weight, and structural considerations; consistent multidisciplinary solutions reveal details (at modest cost). At the scram-mode design point, simulations give details of inlet and combustor performance, thermal loads, and structural deflections.

  4. The Application of Function Points to Predict Source Lines of Code for Software Development

    DTIC Science & Technology

    1992-09-01

    there are some disadvantages. Software estimating tools are expensive. A single tool may cost more than $15,000 due to the high market value of the... term and Lang variables simultaneously only added marginal improvements over models with these terms included singly. Using all the available

  5. Educational Leadership Effectiveness: A Rasch Analysis

    ERIC Educational Resources Information Center

    Sinnema, Claire; Ludlow, Larry; Robinson, Viviane

    2016-01-01

    Purpose: The purposes of this paper are, first, to establish the psychometric properties of the ELP tool, and, second, to test, using a Rasch item response theory analysis, the hypothesized progression of challenge presented by the items included in the tool. Design/ Methodology/ Approach: Data were collected at two time points through a survey of…

  6. Practical End-to-End Performance Testing Tool for High Speed 3G-Based Networks

    NASA Astrophysics Data System (ADS)

    Shinbo, Hiroyuki; Tagami, Atsushi; Ano, Shigehiro; Hasegawa, Toru; Suzuki, Kenji

    High speed IP communication is a killer application for 3rd generation (3G) mobile systems. Thus 3G network operators should perform extensive tests to check whether the expected end-to-end performance is provided to customers under various environments. An important objective of such tests is to check whether network nodes fulfill requirements on packet-processing durations, because a long processing duration causes performance degradation. This requires testers (the persons who perform the tests) to know precisely how long a packet is held by various network nodes. Without any tool's help, this task is time-consuming and error prone. Thus we propose a multi-point packet header analysis tool which extracts and records packet headers with synchronized timestamps at multiple observation points. Such recorded packet headers enable testers to calculate these holding durations. The notable feature of this tool is that it is implemented on off-the-shelf hardware platforms, i.e., laptop personal computers. The key challenges of the implementation are precise clock synchronization without any special hardware and a sophisticated header extraction algorithm without any drop.
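
    A minimal sketch of the post-processing such a tool enables: given synchronized header records from multiple observation points, the holding duration at a node is the timestamp difference for the same packet seen before and after it. The record format and point names here are assumptions for illustration.

      from collections import defaultdict

      # Assumed record format: (observation_point, packet_id, timestamp_seconds).
      records = [
          ("ingress", "pkt-001", 10.000123),
          ("egress",  "pkt-001", 10.004890),
          ("ingress", "pkt-002", 10.001500),
          ("egress",  "pkt-002", 10.009020),
      ]

      seen = defaultdict(dict)
      for point, pkt_id, ts in records:
          seen[pkt_id][point] = ts

      # Holding duration of the node between the two observation points.
      for pkt_id, ts in seen.items():
          if {"ingress", "egress"} <= ts.keys():
              print(pkt_id, "held for", ts["egress"] - ts["ingress"], "s")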

  7. A study on using pre-forming blank in single point incremental forming process by finite element analysis

    NASA Astrophysics Data System (ADS)

    Abass, K. I.

    2016-11-01

    The Single Point Incremental Forming (SPIF) process is a sheet-forming technique based on layered manufacturing principles. The edges of the sheet material are clamped while the forming tool is moved along the tool path; a CNC milling machine is used to manufacture the product. SPIF involves extensive plastic deformation, and the description of the process is further complicated by highly nonlinear boundary conditions, namely contact and frictional effects. Due to the complex nature of these models, numerical approaches dominated by Finite Element Analysis (FEA) are now in widespread use. The paper presents the data and main results of a study, conducted through FEA, on the effect of using a pre-forming blank in SPIF. The SPIF process considered has been studied under defined process conditions (test workpiece, tool, etc.) using ANSYS 11. The results show that the simulation model can predict an ideal profile of the processing track, the tool-workpiece contact behaviour, and the product accuracy, by evaluating the thickness, surface strain and stress distribution along the deformed blank section during the deformation stages.

  8. RipleyGUI: software for analyzing spatial patterns in 3D cell distributions

    PubMed Central

    Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik

    2013-01-01

    The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition, the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons, which is important for a detailed study of structure-function relationships. For example, neocortex, which can be subdivided into six layers based on cell density and cell types, can also be analyzed in terms of the organizational principles distinguishing the layers. PMID:23658544
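
    A hedged sketch of the underlying estimator only: a naive 3D Ripley K estimate without the edge corrections, statistical tests, or spatial transforms that the MATLAB package itself provides.

      import numpy as np
      from scipy.spatial.distance import pdist

      def ripley_k_3d(points, radii, volume):
          """Naive 3D Ripley K estimate (no edge correction): K(r) is the average
          number of neighbours within distance r of a point, divided by the intensity."""
          n = len(points)
          d = pdist(points)                            # all unordered pairwise distances
          lam = n / volume                             # point intensity
          return np.array([2.0 * np.sum(d <= r) / (n * lam) for r in radii])

      rng = np.random.default_rng(3)
      pts = rng.uniform(0.0, 100.0, size=(500, 3))     # complete spatial randomness
      radii = np.array([2.0, 5.0, 10.0])
      print(ripley_k_3d(pts, radii, volume=100.0**3))
      print(4.0 / 3.0 * np.pi * radii**3)              # CSR expectation for comparison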

  9. Influence of Fiber Orientation on Single-Point Cutting Fracture Behavior of Carbon-Fiber/Epoxy Prepreg Sheets.

    PubMed

    Wei, Yingying; An, Qinglong; Cai, Xiaojiang; Chen, Ming; Ming, Weiwei

    2015-10-02

    The purpose of this article is to investigate the influence of fiber orientation on the fracture mechanism of carbon fibers, both macroscopically and microscopically, using a single-point flying cutting method. Cutting tools of three different materials were used in this research, namely a PCD (polycrystalline diamond) tool, a CVD (chemical vapor deposition) diamond thin-film coated carbide tool and an uncoated carbide tool. The influence of fiber orientation on the cutting force and fracture topography was analyzed, and it was concluded that cutting forces are not affected by cutting speed but are significantly influenced by fiber orientation. Cutting forces were smaller at fiber orientations of 0/180° and 15/165° but highest at 30/150°. The fracture mechanism of the carbon fibers was studied under different cutting conditions, such as 0° orientation angle, 90° orientation angle, orientation angles along the fiber direction, and orientation angles against the fiber direction. In addition, a prediction model for the cutting defects of carbon fiber reinforced plastic was established based on acoustic emission (AE) signals.

  10. I-FORCAST: Rapid Flight Planning Tool

    NASA Technical Reports Server (NTRS)

    Oaida, Bogdan; Khan, Mohammed; Mercury, Michael B.

    2012-01-01

    I-FORCAST (Instrument - Field of Regard Coverage Analysis and Simulation Tool) is a flight planning tool specifically designed for quickly verifying the feasibility and estimating the cost of airborne remote sensing campaigns (see figure). Flights are simulated by being broken into three predefined routing algorithms as necessary: mapping in a snaking pattern, mapping the area around a point target (like a volcano) with a star pattern, and mapping the area between a list of points. The tool has been used to plan missions for radar, lidar, and in-situ atmospheric measuring instruments for a variety of aircraft. It has also been used for global and regional scale campaigns and automatically includes landings when refueling is required. The software has been compared to the flight times of known commercial aircraft route travel times, as well as a UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar) campaign, and was within 15% of the actual flight time. Most of the discrepancy is due to non-optimal flight paths taken by actual aircraft to avoid restricted airspace and used to follow landing and take-off corridors.

  11. Influence of Fiber Orientation on Single-Point Cutting Fracture Behavior of Carbon-Fiber/Epoxy Prepreg Sheets

    PubMed Central

    Wei, Yingying; An, Qinglong; Cai, Xiaojiang; Chen, Ming; Ming, Weiwei

    2015-01-01

    The purpose of this article is to investigate the influence of fiber orientation on the fracture mechanism of carbon fibers, both macroscopically and microscopically, using a single-point flying cutting method. Cutting tools of three different materials were used in this research, namely a PCD (polycrystalline diamond) tool, a CVD (chemical vapor deposition) diamond thin-film coated carbide tool and an uncoated carbide tool. The influence of fiber orientation on the cutting force and fracture topography was analyzed, and it was concluded that cutting forces are not affected by cutting speed but are significantly influenced by fiber orientation. Cutting forces were smaller at fiber orientations of 0/180° and 15/165° but highest at 30/150°. The fracture mechanism of the carbon fibers was studied under different cutting conditions, such as 0° orientation angle, 90° orientation angle, orientation angles along the fiber direction, and orientation angles against the fiber direction. In addition, a prediction model for the cutting defects of carbon fiber reinforced plastic was established based on acoustic emission (AE) signals. PMID:28793597

  12. Development of a fixation device for robot assisted fracture reduction of femoral shaft fractures: a biomechanical study.

    PubMed

    Weber-Spickschen, T S; Oszwald, M; Westphal, R; Krettek, C; Wahl, F; Gosling, T

    2010-01-01

    Robot assisted fracture reduction of femoral shaft fractures provides precise alignment while reducing the amount of intraoperative imaging. The connection between the robot and the fracture fragment should allow conventional intramedullary nailing, be minimally invasive and provide interim fracture stability. In our study we tested three different reduction tools: a conventional External Fixator, a Reposition-Plate and a Three-Point-Device with two variations (a 40 degrees and a 90 degrees version). We measured relative movements between the tools and the bone fragments in all translation and rotation planes. The Three-Point-Device 90 degrees showed the smallest average relative displacement and was the only device able to withstand the maximum applied load of 70 Nm without failure of any bone fragment. The Three-Point-Device 90 degrees complies with all the stipulated requirements and is a suitable interface for robot assisted fracture reduction of femoral shaft fractures.

  13. A reply to Sahle and Braun's reply to 'The pattern of emergence of a Middle Stone Age tradition at Gademotta and Kulkuletti (Ethiopia) through convergent tool and point technologies' [J. Hum. Evol. 91 (2016) 93-121].

    PubMed

    Douze, Katja; Delagnes, Anne; Rots, Veerle; Gravina, Brad

    2018-05-28

    Sahle and Braun's (in press) recent comments on our identification (Douze and Delagnes, 2016) of diachronic trends in Middle Stone Age point traditions in several lithic assemblages from the sites of Gademotta and Kulkuletti (Ethiopia) focus on pointed tool function rather than the gradual technological shifts we observed between sites. Here we address several of what we consider to be inaccuracies and misinterpretations concerning our work with the Gademotta and Kulkuletti lithic assemblages (Douze, 2012, 2014), more specifically Sahle and Braun's (in press) interpretation of the tranchet blow technique. This discussion is inseparable from a critical review of the evidence advanced by Sahle and Braun to support projectile technology being present in the Gademotta Formation as early as >279 ka. Copyright © 2018. Published by Elsevier Ltd.

  14. Identification and Analysis of National Airspace System Resource Constraints

    NASA Technical Reports Server (NTRS)

    Smith, Jeremy C.; Marien, Ty V.; Viken, Jeffery K.; Neitzke, Kurt W.; Kwa, Tech-Seng; Dollyhigh, Samuel M.; Fenbert, James W.; Hinze, Nicolas K.

    2015-01-01

    This analysis is the deliverable for the Airspace Systems Program, Systems Analysis Integration and Evaluation Project Milestone for the Systems and Portfolio Analysis (SPA) focus area SPA.4.06 Identification and Analysis of National Airspace System (NAS) Resource Constraints and Mitigation Strategies. "Identify choke points in the current and future NAS. Choke points refer to any areas in the en route, terminal, oceanic, airport, and surface operations that constrain actual demand in current and projected future operations. Use the Common Scenarios based on Transportation Systems Analysis Model (TSAM) projections of future demand developed under SPA.4.04 Tools, Methods and Scenarios Development. Analyze causes, including operational and physical constraints." The NASA analysis is complementary to a NASA Research Announcement (NRA) "Development of Tools and Analysis to Evaluate Choke Points in the National Airspace System" Contract # NNA3AB95C awarded to Logistics Management Institute, Sept 2013.

  15. Spatial Point Pattern Analysis of Neurons Using Ripley's K-Function in 3D

    PubMed Central

    Jafari-Mamaghani, Mehrdad; Andersson, Mikael; Krieger, Patrik

    2010-01-01

    The aim of this paper is to apply a non-parametric statistical tool, Ripley's K-function, to analyze the 3-dimensional distribution of pyramidal neurons. Ripley's K-function is a widely used tool in spatial point pattern analysis. There are several approaches in 2D domains in which this function is computed and analyzed. Drawing consistent inferences on the underlying 3D point pattern distributions in various applications is of great importance, as the acquisition of 3D biological data now poses less of a challenge due to technological progress. As of now, most applications of Ripley's K-function in 3D domains do not address edge correction, which is discussed thoroughly in this paper. The main goal is to extend the theoretical and practical utilization of Ripley's K-function and corresponding tests based on bootstrap resampling from 2D to 3D domains. PMID:20577588
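
    As a rough illustration of the statistic described above, the short Python sketch below estimates Ripley's K-function for a synthetic 3D point pattern and compares it with the value expected under complete spatial randomness. It deliberately omits the edge correction that the paper emphasises, and the function name and data are illustrative, not taken from the study.

        import numpy as np

        def ripley_k_3d(points, radii, volume):
            # Naive estimate of Ripley's K in 3D, without any edge correction.
            # points: (n, 3) coordinates; radii: distances r; volume: window volume.
            n = len(points)
            intensity = n / volume
            diff = points[:, None, :] - points[None, :, :]
            dist = np.sqrt((diff ** 2).sum(axis=-1))
            np.fill_diagonal(dist, np.inf)                   # exclude self-pairs
            counts = np.array([(dist < r).sum() for r in radii], dtype=float)
            return counts / (intensity * n)

        rng = np.random.default_rng(0)
        pts = rng.uniform(0.0, 1.0, size=(500, 3))           # CSR pattern in a unit cube
        radii = np.linspace(0.02, 0.2, 10)
        k_hat = ripley_k_3d(pts, radii, volume=1.0)
        k_csr = 4.0 / 3.0 * np.pi * radii ** 3               # theoretical K under CSR
        print(np.round(k_hat, 4))
        print(np.round(k_csr, 4))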

  16. Nearshore Processes, Currents and Directional Wave Spectra Monitoring Using Coherent and Non-coherent Imaging Radars

    NASA Astrophysics Data System (ADS)

    Trizna, D.; Hathaway, K.

    2007-05-01

    Two new radar systems have been developed for real-time measurement of near-shore processes, and results are presented for measurements of ocean wave spectra, near-shore sand bar structure, and ocean currents. The first is a non-coherent radar based on a modified version of the Sitex radar family, with a data acquisition system designed around an ISR digital receiver card. The card operates in a PC with inputs from a Sitex radar modified for extraction of analogue signals for digitization. Using a 9' antenna and a 25 kW transmit power system, data were collected at the U.S. Army Corps of Engineers Field Research Facility (FRF), Duck, NC during the winter and spring of 2007. The directional wave spectrum measurements are based on using a sequence of 64 to 640 antenna rotations to form a snapshot series of radar images of propagating waves. A square window is extracted from each image, typically 64 x 64 pixels at 3-m resolution. Ten sets of 64 windows are then submitted to a three-dimensional Fast Fourier Transform process to generate radar image spectra in frequency-wavenumber space. The relation between the radar image spectral intensity and the wave spectral intensity derived from the FRF pressure gauge array was used for a test set of data in order to establish a modulation transfer function (MTF) for each frequency component. For 640 rotations, ten such spectra are averaged for improved statistics. The wave spectrum so generated was compared for extended data sets beyond those used to establish the MTF, and those results are presented here. Some of the observed differences between the radar and pressure sensor data are found to be due to the influence of the wind field, as the radar echo image weakens in light winds. A model is developed to account for this effect and improve the radar estimate of the directional wave spectrum. The radar ocean wave imagery is severely degraded only by extremely heavy rainfall rates, so acceptable quality was assured for most weather conditions on a diurnal basis using a modest tower height. A new coherent microwave radar has recently been developed by ISR, and preliminary testing was conducted in the spring of 2007. The radar is based on the Quadrapus four-channel transceiver card, mixed up to microwave frequencies for pulse transmission and back down to base-band for reception. Frequency-modulated pulse compression methods are used to obtain 3-m spatial resolution. A standard marine radar pedestal houses the microwave components, and rotating radar PPI images similar to marine radar images are obtained. Many of the methods used for the marine radar system have been transferred to the coherent imaging radar. New processing methods applied to the coherent data allow summing of radial velocity images to map mean currents in the near-shore zone, such as rip currents. A pair of such radars operating with a separation of a few hundred meters can be used to map vector currents continuously in the near-shore zone and in harbors on a timely basis. Results of preliminary testing of the system will be presented.
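
    To make the image-spectrum step above concrete, the minimal NumPy sketch below applies a three-dimensional FFT to a stack of 64 x 64 image windows (one per antenna rotation) and averages the resulting power spectra over ten sets, as the abstract describes. The array shapes, pixel spacing, and rotation period used here are assumptions for illustration only.

        import numpy as np

        def image_power_spectrum(stack):
            # stack: (n_rotations, 64, 64) array of image windows.
            # Returns the frequency-wavenumber power spectrum, zero frequency centered.
            return np.fft.fftshift(np.abs(np.fft.fftn(stack)) ** 2)

        rng = np.random.default_rng(1)
        dx, dt = 3.0, 1.5                          # assumed pixel spacing (m) and rotation period (s)
        sets = rng.normal(size=(10, 64, 64, 64))   # ten sets of 64 windows (synthetic data)
        avg_spec = np.mean([image_power_spectrum(s) for s in sets], axis=0)

        # Frequency and wavenumber axes corresponding to the averaged spectrum.
        freq = np.fft.fftshift(np.fft.fftfreq(64, d=dt))               # Hz
        kx = 2.0 * np.pi * np.fft.fftshift(np.fft.fftfreq(64, d=dx))   # rad/m
        print(avg_spec.shape, freq.min(), kx.max())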

  17. GIS tool to locate major Sikh temples in USA

    NASA Astrophysics Data System (ADS)

    Sharma, Saumya

    This is a GIS-based interactive, graphical-user-interface tool that locates the major Sikh temples of the USA on a map. The tool is written in the Java programming language using MOJO (Map Object Java Object) provided by ESRI, the organization that supplies the underlying GIS software, and it also integrates Google APIs such as the Google Translator API. The application tells users about the origin of Sikhism in India and the USA and, for the major Sikh temples in each state of the USA, gives their location, name, and detailed information through their websites. The primary purpose of this application is to make people aware of this religion and culture. The tool can also measure the distance between two temple points on the map and display the result in miles and kilometers. In addition, there is support for converting each temple's website from English to Punjabi or any other language using a language-conversion tool, so that people of different nationalities can understand the culture. Clicking on a point on the map opens a new window showing a picture of the temple and a hyperlink that redirects to that temple's website. The application also contains links to related dance, music, and history resources, as well as a help menu that guides users in using the software efficiently.
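
    The distance measurement mentioned above is typically a great-circle calculation between two coordinate points. The tool itself is Java/MOJO-based; the snippet below is only a Python sketch of one common way to compute such a distance (the haversine formula), with illustrative coordinates rather than actual temple locations.

        import math

        def haversine(lat1, lon1, lat2, lon2):
            # Great-circle distance between two (lat, lon) points, in km and miles.
            r_km = 6371.0
            phi1, phi2 = math.radians(lat1), math.radians(lat2)
            dphi = math.radians(lat2 - lat1)
            dlam = math.radians(lon2 - lon1)
            a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
            d_km = 2.0 * r_km * math.asin(math.sqrt(a))
            return d_km, d_km * 0.621371

        km, miles = haversine(37.7749, -122.4194, 34.0522, -118.2437)  # illustrative city pair
        print(round(km, 1), "km,", round(miles, 1), "miles")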

  18. The Python Spectral Analysis Tool (PySAT): A Powerful, Flexible, Preprocessing and Machine Learning Library and Interface

    NASA Astrophysics Data System (ADS)

    Anderson, R. B.; Finch, N.; Clegg, S. M.; Graff, T. G.; Morris, R. V.; Laura, J.; Gaddis, L. R.

    2017-12-01

    Machine learning is a powerful but underutilized approach that can enable planetary scientists to derive meaningful results from the rapidly-growing quantity of available spectral data. For example, regression methods such as Partial Least Squares (PLS) and Least Absolute Shrinkage and Selection Operator (LASSO), can be used to determine chemical concentrations from ChemCam and SuperCam Laser-Induced Breakdown Spectroscopy (LIBS) data [1]. Many scientists are interested in testing different spectral data processing and machine learning methods, but few have the time or expertise to write their own software to do so. We are therefore developing a free open-source library of software called the Python Spectral Analysis Tool (PySAT) along with a flexible, user-friendly graphical interface to enable scientists to process and analyze point spectral data without requiring significant programming or machine-learning expertise. A related but separately-funded effort is working to develop a graphical interface for orbital data [2]. The PySAT point-spectra tool includes common preprocessing steps (e.g. interpolation, normalization, masking, continuum removal, dimensionality reduction), plotting capabilities, and capabilities to prepare data for machine learning such as creating stratified folds for cross validation, defining training and test sets, and applying calibration transfer so that data collected on different instruments or under different conditions can be used together. The tool leverages the scikit-learn library [3] to enable users to train and compare the results from a variety of multivariate regression methods. It also includes the ability to combine multiple "sub-models" into an overall model, a method that has been shown to improve results and is currently used for ChemCam data [4]. Although development of the PySAT point-spectra tool has focused primarily on the analysis of LIBS spectra, the relevant steps and methods are applicable to any spectral data. The tool is available at https://github.com/USGS-Astrogeology/PySAT_Point_Spectra_GUI. [1] Clegg, S.M., et al. (2017) Spectrochim Acta B. 129, 64-85. [2] Gaddis, L. et al. (2017) 3rd Planetary Data Workshop, #1986. [3] http://scikit-learn.org/ [4] Anderson, R.B., et al. (2017) Spectrochim. Acta B. 129, 49-57.
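
    Because the abstract notes that the tool leverages scikit-learn for multivariate regression, a minimal sketch of that kind of workflow is shown below: normalize point spectra, fit a PLS regression, and report cross-validated scores. The data are synthetic and the pipeline is only illustrative of the approach, not of the PySAT API itself.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import Normalizer

        # Synthetic stand-in for LIBS point spectra and a chemical concentration.
        rng = np.random.default_rng(42)
        spectra = rng.random((100, 500))
        concentration = 3.0 * spectra[:, 10] + rng.normal(scale=0.05, size=100)

        # Normalize each spectrum, then fit a partial least squares regression model.
        model = make_pipeline(Normalizer(norm="l1"), PLSRegression(n_components=5))
        scores = cross_val_score(model, spectra, concentration, cv=5, scoring="r2")
        print("cross-validated R^2:", scores.mean().round(3))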

  19. Quality assessment of expert answers to lay questions about cystic fibrosis from various language zones in Europe: the ECORN-CF project.

    PubMed

    d'Alquen, Daniela; De Boeck, Kris; Bradley, Judy; Vávrová, Věra; Dembski, Birgit; Wagner, Thomas O F; Pfalz, Annette; Hebestreit, Helge

    2012-02-06

    The European Centres of Reference Network for Cystic Fibrosis (ECORN-CF) established an Internet forum which provides the opportunity for CF patients and other interested people to ask experts questions about CF in their mother tongue. The objectives of this study were to: 1) develop a detailed quality assessment tool to analyze the quality of expert answers, 2) evaluate the intra- and inter-rater agreement of this tool, and 3) explore changes in the quality of expert answers over the time frame of the project. The quality assessment tool was developed by an expert panel. Five experts within the ECORN-CF project used the quality assessment tool to analyze the quality of 108 expert answers published on ECORN-CF from six language zones. 25 expert answers were scored at two time points, one year apart. Quality of answers was also assessed at an early and a later period of the project. Individual rater scores and group mean scores were analyzed for each expert answer. A scoring system and training manual were developed analyzing two quality categories of answers: content quality and formal quality. For content quality, the grades based on group mean scores for all raters showed substantial agreement between the two time points; however, this was not the case for the grades based on individual rater scores. For formal quality, the grades based on group mean scores showed only slight agreement between the two time points, and there was also poor agreement between time points for the individual grades. The inter-rater agreement for content quality was fair (mean kappa value 0.232 ± 0.036, p < 0.001), while only slight agreement was observed for the grades of formal quality (mean kappa value 0.105 ± 0.024, p < 0.001). The quality of expert answers was rated high (four language zones) or satisfactory (two language zones) and did not change over time. The quality assessment tool described in this study was feasible and reliable when content quality was assessed by a group of raters. Within ECORN-CF, the tool will help ensure that CF patients all over Europe have equal access to high quality expert advice on their illness. © 2012 d'Alquen et al; licensee BioMed Central Ltd.
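
    For readers unfamiliar with the agreement statistic reported above, the following short Python sketch computes Cohen's kappa for two raters assigning grades to the same set of answers. The grade labels and ratings are invented for illustration; the study itself reports mean kappa values across several raters.

        import numpy as np

        def cohens_kappa(rater_a, rater_b, categories):
            # Cohen's kappa: chance-corrected agreement between two raters.
            n = len(rater_a)
            observed = np.mean(np.array(rater_a) == np.array(rater_b))
            expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories)
            return (observed - expected) / (1.0 - expected)

        a = ["good", "good", "satisfactory", "poor", "good", "satisfactory"]
        b = ["good", "satisfactory", "satisfactory", "poor", "good", "good"]
        print(round(cohens_kappa(a, b, {"good", "satisfactory", "poor"}), 3))  # ~0.455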

  20. Meeting the Needs of Travel Clientele: Tried and True Strategies That Work.

    ERIC Educational Resources Information Center

    Blessing, Kathy; Whitney, Cherine

    This paper describes sources for meeting the information needs of travel clientele. Topics addressed include: (1) U.S. government Web sites; (2) collection development tools, including review journals, online bookstores, travel Web sites, and sources of point-by-point comparisons of guide books; (3) prominent guidebook series and publisher Web…

  1. Fostering Innovation Through Robotics Exploration

    DTIC Science & Technology

    2015-06-01

    This effort enhanced Robotics STEM activities by incorporating cognitive tutors at key points to make important mathematical decisions or implement critical calculations. The program utilized Cognitive Tutor Authoring tools for designing problem ...

  2. Bayesian Estimation of Fugitive Methane Point Source Emission Rates from a Single Downwind High-Frequency Gas Sensor

    EPA Science Inventory

    Bayesian Estimation of Fugitive Methane Point Source Emission Rates from a Single Downwind High-Frequency Gas Sensor With the tremendous advances in onshore oil and gas exploration and production (E&P) capability comes the realization that new tools are needed to support env...

  3. Integrating Wind Profiling Radars and Radiosonde Observations with Model Point Data to Develop a Decision Support Tool to Assess Upper-level Winds For Space Launch

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III; Flinn, Clay

    2012-01-01

    Launch directors need accurate upper-level wind forecasts. We developed an Excel-based GUI that displays upper-level winds from three sources: (1) rawinsondes at CCAFS, (2) wind profilers at KSC, and (3) model point data at CCAFS.

  4. Systems Thinking Tools as Applied to Community-Based Participatory Research: A Case Study

    ERIC Educational Resources Information Center

    BeLue, Rhonda; Carmack, Chakema; Myers, Kyle R.; Weinreb-Welch, Laurie; Lengerich, Eugene J.

    2012-01-01

    Community-based participatory research (CBPR) is being used increasingly to address health disparities and complex health issues. The authors propose that CBPR can benefit from a systems science framework to represent the complex and dynamic characteristics of a community and identify intervention points and potential "tipping points."…

  5. A Service Life Analysis of Coast Guard C-130 Aircraft

    DTIC Science & Technology

    2003-03-01

    the drawing tool bar. If you feel more comfortable with estimating at points, you could also use AutoShapes to identify your estimate at a particular year. If you prefer to ...

  6. Beyond the buildingcentric approach: A vision for an integrated evaluation of sustainable buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conte, Emilia, E-mail: conte@poliba.it; Monno, Valeria, E-mail: vmonno@poliba.it

    2012-04-15

    The available sustainable building evaluation systems have produced a new environmental design paradigm. However, there is an increasing need to overcome the buildingcentric approach of these systems, in order to further exploit their innovative potential for sustainable building practices. The paper takes up this challenge by developing a cross-scale evaluation approach focusing on the reliability of sustainable building design solutions for the context in which the building is situated. An integrated building-urban evaluation model is proposed based on the urban matrix, which is a conceptualisation of the built environment as a social-ecological system. The model aims at evaluating the sustainability of a building by considering it as an active entity contributing to the resilience of the urban matrix. A few holistic performance indicators are used for evaluating this contribution, thereby expressing the building's reliability. The discussion of the efficacy of the model shows that it works as a heuristic tool, supporting the acquisition of a better insight into the complexity that characterises the relationships between building and built-environment sustainability. Shedding new light on the meaning of sustainable buildings, the model can play a positive role in innovating sustainable building design practices, thus complementing current evaluation systems. Highlights: We model an integrated building-urban evaluation approach. The urban matrix represents the social-ecological functioning of the urban context. We introduce the concept of reliability to evaluate sustainable buildings. Holistic indicators express the building reliability. The evaluation model works as a heuristic tool and complements other tools.

  7. Digital Investigations of an Archaeological Smart Point Cloud: a Real Time Web-Based Platform to Manage the Visualisation of Semantical Queries

    NASA Astrophysics Data System (ADS)

    Poux, F.; Neuville, R.; Hallot, P.; Van Wersch, L.; Luczfalvy Jancsó, A.; Billen, R.

    2017-05-01

    While virtual copies of the real world tend to be created faster than ever through point clouds and their derivatives, their effective use by all professionals demands adapted tools to facilitate knowledge dissemination. Digital investigations are changing the way cultural heritage researchers, archaeologists, and curators work and collaborate to progressively aggregate expertise through one common platform. In this paper, we present a web application in a WebGL framework accessible on any HTML5-compatible browser. It allows real-time point cloud exploration of the mosaics in the Oratory of Germigny-des-Prés, and emphasises ease of use as well as performance. Our reasoning engine is constructed over a semantically rich point cloud data structure, where metadata has been injected a priori. We developed a tool that directly allows semantic extraction and visualisation of pertinent information for end users. This leads to efficient communication between actors by proposing optimal 3D viewpoints as a basis on which interactions can grow.

  8. Earth observing system instrument pointing control modeling for polar orbiting platforms

    NASA Technical Reports Server (NTRS)

    Briggs, H. C.; Kia, T.; Mccabe, S. A.; Bell, C. E.

    1987-01-01

    An approach to instrument pointing control performance assessment for large multi-instrument platforms is described. First, instrument pointing requirements and reference platform control systems for the Eos Polar Platforms are reviewed. Performance modeling tools are then described, including NASTRAN models of two large platforms, a modal selection procedure utilizing a balanced realization method, and reduced-order platform models with core and instrument pointing control loops added. Time history simulations of instrument pointing and stability performance in response to commanded slewing of adjacent instruments demonstrate the limits of tolerable slew activity. Simplified models of rigid body responses are also developed for comparison. Instrument pointing control methods required, in addition to the core platform control system, to meet instrument pointing requirements are considered.
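
    One common way to realise the modal selection step mentioned above is to rank candidate modes by their Hankel singular values, computed from the controllability and observability Gramians of a state-space model (a balanced-realization criterion). The Python/SciPy sketch below illustrates this on a toy two-mode system; the matrices and numbers are invented for illustration and are not taken from the Eos platform models.

        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov

        def hankel_singular_values(A, B, C):
            # Gramian-based ranking of states for a stable system x' = Ax + Bu, y = Cx.
            Wc = solve_continuous_lyapunov(A, -B @ B.T)    # controllability Gramian
            Wo = solve_continuous_lyapunov(A.T, -C.T @ C)  # observability Gramian
            return np.sort(np.sqrt(np.abs(np.linalg.eigvals(Wc @ Wo))))[::-1]

        # Toy model: two lightly damped modes; only the first couples strongly to input/output.
        w1, w2, z = 2.0, 20.0, 0.02
        A = np.zeros((4, 4))
        A[0:2, 0:2] = [[0.0, 1.0], [-w1 ** 2, -2.0 * z * w1]]
        A[2:4, 2:4] = [[0.0, 1.0], [-w2 ** 2, -2.0 * z * w2]]
        B = np.array([[0.0], [1.0], [0.0], [0.01]])
        C = np.array([[1.0, 0.0, 0.01, 0.0]])

        print(hankel_singular_values(A, B, C))  # large values flag the modes worth retaining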

  9. Diamond machine tool face lapping machine

    DOEpatents

    Yetter, H.H.

    1985-05-06

    An apparatus for shaping, sharpening and polishing diamond-tipped single-point machine tools. Isolating the rotating grinding wheel from its driving apparatus with an air bearing, and moving the tool being shaped, polished or sharpened across the surface of the grinding wheel so that it does not remain at one radius for more than a single rotation of the wheel, has been found to readily produce machine tools of a quality that previously could only be obtained by the most tedious and costly processing procedures and was unattainable by simple lapping techniques.

  10. Development and Overview of CPAS Sasquatch Airdrop Landing Location Predictor Software

    NASA Technical Reports Server (NTRS)

    Bledsoe, Kristin J.; Bernatovich, Michael A.

    2015-01-01

    The Capsule Parachute Assembly System (CPAS) is the parachute system for NASA's Orion spacecraft. CPAS is currently in the Engineering Development Unit (EDU) phase of testing. The test program consists of numerous drop tests, wherein a test article rigged with parachutes is extracted from an aircraft. During such tests, range safety is paramount, as is the recoverability of the parachutes and test article. It is crucial to establish a release point from the aircraft that will ensure that the article and all items released from it during flight will land in a designated safe area. The Sasquatch footprint tool was developed to determine this safe release point and to predict the probable landing locations (footprints) of the payload and all released objects. In 2012, a new version of Sasquatch, called Sasquatch Polygons, was developed that significantly upgraded the capabilities of the footprint tool. Key improvements were an increase in the accuracy of the predictions, and the addition of an interface with the Debris Tool (DT), an in-flight debris avoidance tool for use on the test observation helicopter. Additional enhancements include improved data presentation for communication with test personnel and a streamlined code structure. This paper discusses the development, validation, and performance of Sasquatch Polygons, as well as its differences from the original Sasquatch footprint tool.

  11. A novel image toggle tool for comparison of serial mammograms: automatic density normalization and alignment-development of the tool and initial experience.

    PubMed

    Honda, Satoshi; Tsunoda, Hiroko; Fukuda, Wataru; Saida, Yukihisa

    2014-12-01

    The purpose of this work is to develop a new image toggle tool with automatic density normalization (ADN) and automatic alignment (AA) for comparing serial digital mammograms (DMGs). We developed an ADN and AA process to compare the images of serial DMGs. For image density normalization, a linear interpolation was applied using two reference points taken from high- and low-brightness areas. The alignment was calculated by finding the shift between the current and prior images that gives the greatest correlation. These processes were performed on a PC with a 3.20-GHz Xeon processor and 8 GB of main memory. We selected 12 suspected breast cancer patients who had undergone screening DMGs in the past, and automatic processing was retrospectively performed on their images. Two radiologists evaluated the results subjectively. The developed algorithm took approximately 1 s per image. In our preliminary experience, two image sets could not be aligned properly; when images were aligned, image toggling allowed differences between examinations to be detected easily. We developed a new tool to facilitate comparative reading of DMGs on a mammography viewing system. Using this tool for toggling comparisons might improve the interpretation efficiency of serial DMGs.
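
    A compact NumPy sketch of the two steps described above is given below: a linear density mapping defined by one low- and one high-brightness reference point, and an exhaustive shift search that maximizes the correlation between the current and prior images. The reference-point locations, search range, and test images are assumptions for illustration, not the authors' implementation.

        import numpy as np

        def normalize_density(img, low_pt, high_pt, lo_target=0.0, hi_target=1.0):
            # Linear interpolation between two reference pixel values (low/high brightness).
            lo, hi = img[low_pt], img[high_pt]
            return (img - lo) / (hi - lo) * (hi_target - lo_target) + lo_target

        def best_shift(current, prior, max_shift=8):
            # Exhaustive search for the (dy, dx) shift of the prior image that
            # maximizes its correlation with the current image.
            best, best_corr = (0, 0), -np.inf
            for dy in range(-max_shift, max_shift + 1):
                for dx in range(-max_shift, max_shift + 1):
                    shifted = np.roll(prior, (dy, dx), axis=(0, 1))
                    corr = np.corrcoef(current.ravel(), shifted.ravel())[0, 1]
                    if corr > best_corr:
                        best, best_corr = (dy, dx), corr
            return best, best_corr

        rng = np.random.default_rng(0)
        prior = rng.random((64, 64))
        current = 1.4 * np.roll(prior, (3, -2), axis=(0, 1)) + 0.1   # brighter, shifted copy
        current = normalize_density(current, low_pt=(0, 0), high_pt=(5, 5))
        print(best_shift(current, prior))                            # recovers the (3, -2) shift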

  12. A 14-item Mediterranean diet assessment tool and obesity indexes among high-risk subjects: the PREDIMED trial.

    PubMed

    Martínez-González, Miguel Angel; García-Arellano, Ana; Toledo, Estefanía; Salas-Salvadó, Jordi; Buil-Cosiales, Pilar; Corella, Dolores; Covas, Maria Isabel; Schröder, Helmut; Arós, Fernando; Gómez-Gracia, Enrique; Fiol, Miquel; Ruiz-Gutiérrez, Valentina; Lapetra, José; Lamuela-Raventos, Rosa Maria; Serra-Majem, Lluís; Pintó, Xavier; Muñoz, Miguel Angel; Wärnberg, Julia; Ros, Emilio; Estruch, Ramón

    2012-01-01

    Independently of total caloric intake, a better quality of the diet (for example, conformity to the Mediterranean diet) is associated with lower obesity risk. It is unclear whether a brief dietary assessment tool, instead of full-length comprehensive methods, can also capture this association. In addition to reduced costs, a brief tool has the interesting advantage of allowing immediate feedback to participants in interventional studies. Another relevant question is which individual items of such a brief tool are responsible for this association. We examined these associations using a 14-item tool of adherence to the Mediterranean diet as exposure and body mass index, waist circumference and waist-to-height ratio (WHtR) as outcomes. Cross-sectional assessment of all participants in the "PREvención con DIeta MEDiterránea" (PREDIMED) trial. 7,447 participants (55-80 years, 57% women) free of cardiovascular disease, but with either type 2 diabetes or ≥ 3 cardiovascular risk factors. Trained dietitians used both a validated 14-item questionnaire and a full-length validated 137-item food frequency questionnaire to assess dietary habits. Trained nurses measured weight, height and waist circumference. Strong inverse linear associations between the 14-item tool and all adiposity indexes were found. For a two-point increment in the 14-item score, the multivariable-adjusted differences in WHtR were -0.0066 (95% confidence interval, -0.0088 to -0.0049) for women and -0.0059 (-0.0079 to -0.0038) for men. The multivariable-adjusted odds ratio for a WHtR>0.6 in participants scoring ≥ 10 points versus ≤ 7 points was 0.68 (0.57 to 0.80) for women and 0.66 (0.54 to 0.80) for men. High consumption of nuts and low consumption of sweetened/carbonated beverages presented the strongest inverse associations with abdominal obesity. A brief 14-item tool was able to capture a strong monotonic inverse association between adherence to a good quality dietary pattern (Mediterranean diet) and obesity indexes in a population of adults at high cardiovascular risk.

  13. A 14-Item Mediterranean Diet Assessment Tool and Obesity Indexes among High-Risk Subjects: The PREDIMED Trial

    PubMed Central

    Martínez-González, Miguel Angel; García-Arellano, Ana; Toledo, Estefanía; Salas-Salvadó, Jordi; Buil-Cosiales, Pilar; Corella, Dolores; Covas, Maria Isabel; Schröder, Helmut; Arós, Fernando; Gómez-Gracia, Enrique; Fiol, Miquel; Ruiz-Gutiérrez, Valentina; Lapetra, José; Lamuela-Raventos, Rosa Maria; Serra-Majem, Lluís; Pintó, Xavier; Muñoz, Miguel Angel; Wärnberg, Julia; Ros, Emilio; Estruch, Ramón

    2012-01-01

    Objective Independently of total caloric intake, a better quality of the diet (for example, conformity to the Mediterranean diet) is associated with lower obesity risk. It is unclear whether a brief dietary assessment tool, instead of full-length comprehensive methods, can also capture this association. In addition to reduced costs, a brief tool has the interesting advantage of allowing immediate feedback to participants in interventional studies. Another relevant question is which individual items of such a brief tool are responsible for this association. We examined these associations using a 14-item tool of adherence to the Mediterranean diet as exposure and body mass index, waist circumference and waist-to-height ratio (WHtR) as outcomes. Design Cross-sectional assessment of all participants in the “PREvención con DIeta MEDiterránea” (PREDIMED) trial. Subjects 7,447 participants (55–80 years, 57% women) free of cardiovascular disease, but with either type 2 diabetes or ≥3 cardiovascular risk factors. Trained dietitians used both a validated 14-item questionnaire and a full-length validated 137-item food frequency questionnaire to assess dietary habits. Trained nurses measured weight, height and waist circumference. Results Strong inverse linear associations between the 14-item tool and all adiposity indexes were found. For a two-point increment in the 14-item score, the multivariable-adjusted differences in WHtR were −0.0066 (95% confidence interval, –0.0088 to −0.0049) for women and –0.0059 (–0.0079 to –0.0038) for men. The multivariable-adjusted odds ratio for a WHtR>0.6 in participants scoring ≥10 points versus ≤7 points was 0.68 (0.57 to 0.80) for women and 0.66 (0.54 to 0.80) for men. High consumption of nuts and low consumption of sweetened/carbonated beverages presented the strongest inverse associations with abdominal obesity. Conclusions A brief 14-item tool was able to capture a strong monotonic inverse association between adherence to a good quality dietary pattern (Mediterranean diet) and obesity indexes in a population of adults at high cardiovascular risk. PMID:22905215

  14. Multi-tool accessibility assessment of government department websites:a case-study with JKGAD.

    PubMed

    Ismail, Abid; Kuppusamy, K S; Nengroo, Ab Shakoor

    2017-08-02

    Being accessible to all categories of users is one of the primary factors enabling the wider reach of resources published on the World Wide Web. The accessibility of websites can be analyzed against the W3C guidelines with the help of various tools. This paper presents a multi-tool accessibility assessment of government department websites belonging to the Indian state of Jammu and Kashmir. A comparative analysis of six accessibility tools is also presented across 14 different parameters. The accessibility analysis tools used in this study are aChecker, Cynthia Says, Tenon, WAVE, Mauve, and Hera. These tools report the accessibility status of the selected websites against the Web Content Accessibility Guidelines (WCAG) 1.0 and 2.0. It was found that there are variations in accessibility analysis results when different accessibility metrics are used to measure the accessibility of websites. In addition, we have identified the guidelines that are most frequently violated. It was observed that there is a need to incorporate accessibility features in the selected websites. This paper presents a set of suggestions to improve the accessibility status of these sites so that the information and services they provide can reach a wider spectrum of the audience without barriers. Implications for rehabilitation: This case study of JKGAD websites relates to rehabilitation focused on visually impaired users. Due to the universal nature of the web, it should be accessible to all according to the WCAG guidelines framed by the World Wide Web Consortium. In this paper we have identified multiple accessibility barriers for persons with visual impairment while browsing the Jammu and Kashmir Government websites. Multi-tool analysis has been done to pinpoint the potential barriers for persons with visual impairment, and usability analysis has been performed to check whether these websites are suitable for persons with visual impairment. We provide suggestions that developers and designers can follow to minimize these potential accessibility barriers; based on these key points, this article helps persons with disability, especially visually impaired users, to access web resources better once the identified suggestions are implemented.

  15. Data Visualization Using Immersive Virtual Reality Tools

    NASA Astrophysics Data System (ADS)

    Cioc, Alexandru; Djorgovski, S. G.; Donalek, C.; Lawler, E.; Sauer, F.; Longo, G.

    2013-01-01

    The growing complexity of scientific data poses serious challenges for effective visualization. Data sets, e.g., catalogs of objects detected in sky surveys, can have a very high dimensionality, ~ 100 - 1000. Visualizing such hyper-dimensional data parameter spaces is essentially impossible, but there are ways of visualizing up to ~ 10 dimensions in a pseudo-3D display. We have been experimenting with the emerging technologies of immersive virtual reality (VR) as a platform for a scientific, interactive, collaborative data visualization. Our initial experiments used the virtual world of Second Life, and more recently VR worlds based on its open source code, OpenSimulator. There we can visualize up to ~ 100,000 data points in ~ 7 - 8 dimensions (3 spatial and others encoded as shapes, colors, sizes, etc.), in an immersive virtual space where scientists can interact with their data and with each other. We are now developing a more scalable visualization environment using the popular (practically an emerging standard) Unity 3D Game Engine, coded using C#, JavaScript, and the Unity Scripting Language. This visualization tool can be used through a standard web browser, or a standalone browser of its own. Rather than merely plotting data points, the application creates interactive three-dimensional objects of various shapes, colors, and sizes, and of course the XYZ positions, encoding various dimensions of the parameter space, that can be associated interactively. Multiple users can navigate through this data space simultaneously, either with their own, independent vantage points, or with a shared view. At this stage ~ 100,000 data points can be easily visualized within seconds on a simple laptop. The displayed data points can contain linked information; e.g., upon clicking on a data point, a webpage with additional information can be rendered within the 3D world. A range of functionalities has already been deployed, and more are being added. We expect to make this visualization tool freely available to the academic community within a few months, on an experimental (beta testing) basis.
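
    The idea of encoding extra dimensions as shape, colour, and size can be illustrated outside a VR world as well. The matplotlib sketch below plots a synthetic five-dimensional catalogue as a 3D scatter, mapping two extra columns to colour and marker size; it is only a desktop analogue of the immersive display described above, not the Unity-based tool itself.

        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(5)
        data = rng.random((1000, 5))              # synthetic five-dimensional catalogue

        fig = plt.figure()
        ax = fig.add_subplot(projection="3d")
        # Three columns as XYZ position, one as colour, one as marker size.
        ax.scatter(data[:, 0], data[:, 1], data[:, 2],
                   c=data[:, 3], s=10 + 60 * data[:, 4], cmap="viridis")
        ax.set_xlabel("dim 1"); ax.set_ylabel("dim 2"); ax.set_zlabel("dim 3")
        plt.show()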

  16. Paranoia.Ada: A diagnostic program to evaluate Ada floating-point arithmetic

    NASA Technical Reports Server (NTRS)

    Hjermstad, Chris

    1986-01-01

    Many essential software functions in the mission-critical computer resource application domain depend on floating point arithmetic. Numerically intensive functions associated with the Space Station project, such as ephemeris generation or the implementation of Kalman filters, are likely to employ the floating point facilities of Ada. Paranoia.Ada appears to be a valuable program to ensure that Ada environments and their underlying hardware exhibit the precision and correctness required to satisfy mission computational requirements. As a diagnostic tool, Paranoia.Ada reveals many essential characteristics of an Ada floating point implementation. Equipped with such knowledge, programmers need not tremble before the complex task of floating point computation.
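
    Diagnostics of this kind probe basic properties of the floating-point implementation, such as the machine epsilon and the arithmetic radix. The short Python sketch below (Python standing in for Ada here) shows two classic probes of that sort; it is a simplified illustration, not part of Paranoia.Ada.

        import sys

        # Probe 1: estimate machine epsilon, the smallest eps with 1.0 + eps != 1.0.
        eps = 1.0
        while 1.0 + eps / 2.0 != 1.0:
            eps /= 2.0
        print("estimated epsilon:", eps, " reported:", sys.float_info.epsilon)

        # Probe 2: Malcolm's algorithm for the radix (base) of the representation.
        a = 1.0
        while (a + 1.0) - a == 1.0:   # grow a until adding 1.0 is lost to rounding
            a *= 2.0
        b = 1.0
        while (a + b) - a == 0.0:     # smallest increment that survives addition to a
            b += 1.0
        print("radix:", int((a + b) - a))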

  17. Training Plan (M29 Revision)

    ERIC Educational Resources Information Center

    Online Submission, 2008

    2008-01-01

    The objectives of training activities as stated in the DoW are: 1. to organize training within the project for the participants to learn to use the necessary tools; 2. to support training activities of the partners when they deliver and take in use the tools and practices during extended pilots. The approach with regards to the first point is that…

  18. Gnostic rings: usefulness in sensibility evaluation and sensory reeducation.

    PubMed

    Brunelli, G; Battiston, B; Dellon, A L

    1992-01-01

    The benefit of additional clinical tools for quantifying a patient's ability to recognize objects is clear, as is their correlation with the moving two-point discrimination test. The recognition of letters is one such tool. The authors describe gnostic rings, an additional technique that is useful for clinical sensibility testing as well as for sensory reeducation.

  19. Use of an Integrated Pest Management Assessment Administered through Turningpoint as an Educational, Needs Assessment, and Evaluation Tool

    ERIC Educational Resources Information Center

    Stahl, Lizabeth A. B.; Behnken, Lisa M.; Breitenbach, Fritz R.; Miller, Ryan P.; Nicolai, David; Gunsolus, Jeffrey L.

    2016-01-01

    University of Minnesota educators use an integrated pest management (IPM) survey conducted during private pesticide applicator training as an educational, needs assessment, and evaluation tool. By incorporating the IPM Assessment, as the survey is called, into a widely attended program and using TurningPoint audience response devices, Extension…

  20. UNIX as an environment for producing numerical software

    NASA Technical Reports Server (NTRS)

    Schryer, N. L.

    1978-01-01

    The UNIX operating system supports a number of software tools: a mathematical equation-setting language, a phototypesetting language, a FORTRAN preprocessor language, a text editor, and a command interpreter. The design, implementation, documentation, and maintenance of a portable FORTRAN test of the floating-point arithmetic unit of a computer is used to illustrate these tools at work.

  1. Study and Research Courses as an Epistemological Model for Didactics

    ERIC Educational Resources Information Center

    Winslow, Carl; Matheron, Yves; Mercier, Alain

    2013-01-01

    The aim of this paper is to explain how the notion of "study and research course", a recent construct in the anthropological theory of didactics, provides a general tool to model mathematical knowledge from a didactical perspective. We do this from two points of view. First, the notion itself arose as a tool for didactic design, particularly in…

  2. Hygrothermal Simulation: A Tool for Building Envelope Design Analysis

    Treesearch

    Samuel V. Glass; Anton TenWolde; Samuel L. Zelinka

    2013-01-01

    Is it possible to gauge the risk of moisture problems while designing the building envelope? This article provides a brief introduction to computer-based hygrothermal (heat and moisture) simulation, shows how simulation can be useful as a design tool, and points out a number of important considerations regarding model inputs and limitations. Hygrothermal simulation...

  3. Integrating Technology in Today's Undergraduate Classrooms: A Look at Students' Perspectives

    ERIC Educational Resources Information Center

    Meehan, Kimberly C.; Salmun, Haydee

    2016-01-01

    The authors present the findings of a small-scale study of student opinions drawn from an anonymous and voluntary survey in an undergraduate science classroom. The survey questions focused on the use of basic tools in a college classroom. The tools included in the survey were PowerPoint, overhead projectors/chalkboards, personal response units,…

  4. Design-Thinking, Making, and Innovating: Fresh Tools for the Physician's Toolbox

    ERIC Educational Resources Information Center

    Albala, L.; Bober, T.; Mallozzi, M.; Koeneke-Hernandez, L.; Ku, B.

    2018-01-01

    Medical school education should foster creativity by enabling students to become "makers" who prototype and design. Healthcare professionals and students experience pain points on a daily basis, but are not given the tools, training, or opportunity to help solve them in new, potentially better ways. The student physician of the future…

  5. The Engineering of Engineering Education: Curriculum Development from a Designer's Point of View

    ERIC Educational Resources Information Center

    Rompelman, Otto; De Graaff, Erik

    2006-01-01

    Engineers have a set of powerful tools at their disposal for designing robust and reliable technical systems. In educational design these tools are seldom applied. This paper explores the application of concepts from the systems approach in an educational context. The paradigms of design methodology and systems engineering appear to be suitable…

  6. Exercise Black Skies 2008: Enhancing Live Training Through Virtual Preparation -- Part Two: An Evaluation of Tools and Techniques

    DTIC Science & Technology

    2009-06-01

    visualisation tool. These tools are currently in use at the Surveillance and Control Training Unit (SACTU) in Williamtown, New South Wales, and the School...itself by facilitating the brevity and sharpness of learning points. The playback of video and audio was considered an extremely useful method of...The task assessor’s comments were supported by wall projections and audio replays of relevant mission segments that were controlled by an AAR

  7. TMDlib and TMDplotter: library and plotting tools for transverse-momentum-dependent parton distributions

    NASA Astrophysics Data System (ADS)

    Hautmann, F.; Jung, H.; Krämer, M.; Mulders, P. J.; Nocera, E. R.; Rogers, T. C.; Signori, A.

    2014-12-01

    Transverse-momentum-dependent distributions (TMDs) are extensions of collinear parton distributions and are important in high-energy physics from both theoretical and phenomenological points of view. In this manual we introduce the library TMDlib, a tool to collect transverse-momentum-dependent parton distribution functions (TMD PDFs) and fragmentation functions (TMD FFs), together with an online plotting tool, TMDplotter. We provide a description of the program components and of the different physical frameworks the user can access via the available parameterisations.

  8. TMDlib and TMDplotter: library and plotting tools for transverse-momentum-dependent parton distributions.

    PubMed

    Hautmann, F; Jung, H; Krämer, M; Mulders, P J; Nocera, E R; Rogers, T C; Signori, A

    Transverse-momentum-dependent distributions (TMDs) are extensions of collinear parton distributions and are important in high-energy physics from both theoretical and phenomenological points of view. In this manual we introduce the library TMDlib, a tool to collect transverse-momentum-dependent parton distribution functions (TMD PDFs) and fragmentation functions (TMD FFs), together with an online plotting tool, TMDplotter. We provide a description of the program components and of the different physical frameworks the user can access via the available parameterisations.

  9. Computational fluid dynamics: An engineering tool?

    NASA Astrophysics Data System (ADS)

    Anderson, J. D., Jr.

    1982-06-01

    Computational fluid dynamics in general, and time dependent finite difference techniques in particular, are examined from the point of view of direct engineering applications. Examples are given of the supersonic blunt body problem and gasdynamic laser calculations, where such techniques are clearly engineering tools. In addition, Navier-Stokes calculations of chemical laser flows are discussed as an example of a near engineering tool. Finally, calculations of the flowfield in a reciprocating internal combustion engine are offered as a promising future engineering application of computational fluid dynamics.
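
    As a minimal, generic illustration of the time-dependent finite-difference techniques the article discusses (not of the blunt-body, laser, or engine calculations themselves), the sketch below marches the 1-D linear advection equation forward in time with a first-order upwind scheme on a periodic domain.

        import numpy as np

        # Explicit upwind finite-difference solution of u_t + a*u_x = 0 on a periodic domain.
        a, L, nx, cfl = 1.0, 1.0, 200, 0.8
        dx = L / nx
        dt = cfl * dx / a
        x = np.linspace(0.0, L, nx, endpoint=False)
        u = np.exp(-200.0 * (x - 0.3) ** 2)            # initial Gaussian pulse

        for _ in range(100):                            # time marching
            u = u - a * dt / dx * (u - np.roll(u, 1))   # first-order upwind difference

        print("pulse peak after 100 steps is near x =", round(float(x[np.argmax(u)]), 3))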

  10. Gray scale x-ray mask

    DOEpatents

    Morales, Alfredo M [Livermore, CA; Gonzales, Marcela [Seattle, WA

    2006-03-07

    The present invention describes a method for fabricating an embossing tool or an x-ray mask tool, providing microstructures that vary smoothly in height from point to point in etched substrates, i.e., structures which can vary in all three dimensions. The process uses a lithographic technique to transfer an image pattern into the surface of a silicon wafer by exposing and developing the resist and then etching the silicon substrate. Importantly, the photoresist is variably exposed so that, when developed, some of the resist layer remains. The remaining undeveloped resist acts as an etchant barrier to the reactive plasma used to etch the silicon substrate and therefore provides the ability to etch structures of variable depth.

  11. Three tooth kinematic coupling

    DOEpatents

    Hale, Layton C.

    2000-01-01

    A three tooth kinematic coupling based on having three theoretical line contacts formed by mating teeth rather than six theoretical point contacts. The geometry requires one coupling half to have curved teeth and the other coupling half to have flat teeth. Each coupling half has a relieved center portion which does not affect the kinematics, but in the limit as the face width approaches zero, the three line contacts become six point contacts. As a result of having line contact, a three tooth coupling has greater load capacity and stiffness. The kinematic coupling has applications in precision fixturing for tools or workpieces, and as a registration device for a work or tool changer or for optics in various products.

  12. Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian

    2011-01-01

    Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.

  13. Big Data and Neuroimaging.

    PubMed

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  14. Precision tool holder with flexure-adjustable, three degrees of freedom for a four-axis lathe

    DOEpatents

    Bono, Matthew J [Pleasanton, CA; Hibbard, Robin L [Livermore, CA

    2008-03-04

    A precision tool holder for precisely positioning a single point cutting tool on a 4-axis lathe, such that the center of the radius of the tool nose is aligned with the B-axis of the machine tool, so as to facilitate the machining of precision meso-scale components with complex three-dimensional shapes to sub-μm accuracy on a four-axis lathe. The device is designed to fit on a commercial diamond turning machine and can adjust the cutting tool position in three orthogonal directions with sub-micrometer resolution. In particular, the tool holder adjusts the tool position using three flexure-based mechanisms, with two flexure mechanisms adjusting the lateral position of the tool to align the tool with the B-axis, and a third flexure mechanism adjusting the height of the tool. Preferably, the flexures are driven by manual micrometer adjusters. In this manner, this tool holder simplifies the process of setting a tool with sub-μm accuracy, substantially reducing the time required to set the tool.

  15. Next-generation concurrent engineering: developing models to complement point designs

    NASA Technical Reports Server (NTRS)

    Morse, Elizabeth; Leavens, Tracy; Cohanim, Barbak; Harmon, Corey; Mahr, Eric; Lewis, Brian

    2006-01-01

    Concurrent Engineering Design teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a next-generation CED team; in addition to a point design, the team develops a model of the local trade space. The process is a balance between the power of model-developing tools and the creativity of human experts, enabling the development of a variety of trade models for any space mission.

  16. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to, but not on, the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically so that, once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI 'C', the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. Memory is allocated dynamically; therefore, the memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects, such as points, curves, patches, sources and surfaces. At any given time there is always an active object, which is drawn in magenta or in its highlighted color as defined by the resource file, which will be discussed later.
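
    The linked-list organisation described above can be sketched briefly. The original code is ANSI C; the following Python fragment is only a schematic stand-in showing a doubly linked list of geometry objects (points, curves, patches, ...) with an "active object" reference, and all names here are illustrative rather than taken from GridTool.

        from dataclasses import dataclass, field
        from typing import Optional

        @dataclass
        class GeomObject:
            # One node in a doubly linked list of geometry objects.
            kind: str                                  # "point", "curve", "patch", ...
            data: dict = field(default_factory=dict)
            prev: Optional["GeomObject"] = None
            next: Optional["GeomObject"] = None

        class GeomList:
            def __init__(self):
                self.head = None
                self.tail = None
                self.active = None                     # the currently highlighted object

            def append(self, kind, **data):
                node = GeomObject(kind, data)
                if self.tail is None:
                    self.head = self.tail = node
                else:
                    self.tail.next, node.prev = node, self.tail
                    self.tail = node
                self.active = node                     # the newest object becomes active
                return node

        scene = GeomList()
        scene.append("point", xyz=(0.0, 0.0, 0.0))
        scene.append("curve", points=[(0, 0, 0), (1, 0, 0)])
        print(scene.active.kind)                       # -> "curve"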

  17. Reliability and validity of the Dutch pediatric Voice Handicap Index.

    PubMed

    Veder, Laura; Pullens, Bas; Timmerman, Marieke; Hoeve, Hans; Joosten, Koen; Hakkesteegt, Marieke

    2017-05-01

    The pediatric voice handicap index (pVHI) has been developed to provide better insight into parents' perception of their child's voice-related quality of life. The purpose of the present study was to validate the Dutch pVHI by evaluating its internal consistency and reliability. Furthermore, we determined the optimal cut-off point for a normal pVHI score. All items of the English pVHI were translated into Dutch. Parents of children in our dysphonic and control groups were asked to fill out the questionnaire. For the test-retest analysis we used a different study group, who filled out the pVHI twice as part of a large follow-up study. Internal consistency was analyzed with Cronbach's α coefficient. Test-retest reliability was assessed by determining Pearson's correlation coefficient. The Mann-Whitney test was used to compare the questionnaire scores of the control group with those of the dysphonic group. By calculating receiver operating characteristic (ROC) curves, sensitivity and specificity, we were able to set a cut-off point. We obtained data from 122 asymptomatic children and from 79 dysphonic children. The questionnaire scores differed significantly between the two groups. The internal consistency showed an overall Cronbach α coefficient of 0.96, and the total pVHI questionnaire showed excellent test-retest reliability with a Pearson's correlation coefficient of 0.90. A cut-off point for the total pVHI questionnaire was set at 7 points, with a specificity of 85% and a sensitivity of 100%. A cut-off point for the VAS score was set at 13, with a specificity of 93% and a sensitivity of 97%. The Dutch pVHI is a valid and reliable tool for the assessment of children with voice problems. With a cut-off point of 7 points for the total pVHI questionnaire and 13 for the VAS score, the pVHI can be used as a screening tool to assess dysphonic complaints and as a complementary tool to identify children with dysphonia. Copyright © 2017 Elsevier B.V. All rights reserved.
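
    The internal-consistency statistic reported above, Cronbach's α, is straightforward to compute from an item-score matrix. The Python sketch below does so for synthetic questionnaire data; the item count and 0-4 scale used here are assumptions for illustration, not the study's data.

        import numpy as np

        def cronbach_alpha(items):
            # items: (n_subjects, n_items) matrix of item scores.
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_var_sum = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1.0) * (1.0 - item_var_sum / total_var)

        rng = np.random.default_rng(3)
        latent = rng.normal(size=(200, 1))                    # one underlying trait per subject
        scores = np.clip(np.round(2.0 + latent + rng.normal(scale=0.7, size=(200, 20))), 0, 4)
        print("Cronbach's alpha:", round(cronbach_alpha(scores), 3))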

  18. Acceptability and potential effectiveness of commercial portion control tools amongst people with obesity.

    PubMed

    Almiron-Roig, Eva; Domínguez, Angélica; Vaughan, David; Solis-Trapala, Ivonne; Jebb, Susan A

    2016-12-01

    Exposure to large portion sizes is a risk factor for obesity. Specifically designed tableware may modulate how much is eaten and help with portion control. We examined the experience of using a guided crockery set (CS) and a calibrated serving spoon set (SS) by individuals trying to manage their weight. Twenty-nine obese adults who had completed 7-12 weeks of a community weight-loss programme were invited to use both tools for 2 weeks each, in a crossover design, with minimal health professional contact. A paper-based questionnaire was used to collect data on acceptance, perceived changes in portion size, frequency, and type of meal when the tool was used. Scores describing acceptance, ease of use and perceived effectiveness were derived from five-point Likert scales from which binary indicators (high/low) were analysed using logistic regression. Mean acceptance, ease of use and perceived effectiveness were moderate to high (3·7-4·4 points). Tool type did not have an impact on indicators of acceptance, ease of use and perceived effectiveness (P>0·32 for all comparisons); 55 % of participants used the CS on most days v. 21 % for the SS. The CS was used for all meals, whereas the SS was mostly used for evening meals. Self-selected portion sizes increased for vegetables and decreased for chips and potatoes with both tools. Participants rated both tools as equally acceptable, easy to use and with similar perceived effectiveness. Formal trials to evaluate the impact of such tools on weight control are warranted.

  19. Redesigning Task Sequences to Support Instrumental Genesis in the Use of Movable Points and Slider Bars

    ERIC Educational Resources Information Center

    Fahlgren, Maria

    2017-01-01

    This paper examines the process of instrumental genesis through which students develop their proficiency in making use of movable points and slider bars--two tools that dynamic mathematics software provides for working with variable coordinates and parameters in the field of functions. The paper analyses students' responses to task sequences…

  20. Distinguishing the Effects of Local Point Sources from Those Caused by Upstream Nonpoint Source (NPS) Inputs: Refinement of a Watershed Development Index for New England

    EPA Science Inventory

    Using EMAP data from the NE Wadeable Stream Survey and state datasets (CT, ME), assessment tools were developed to predict diffuse NPS effects from watershed development and distinguish these from local impacts (point sources, contaminated sediments). Classification schemes were...

  1. ClinicalKey: a point-of-care search engine.

    PubMed

    Vardell, Emily

    2013-01-01

    ClinicalKey is a new point-of-care resource for health care professionals. Through controlled vocabulary, ClinicalKey offers a cross section of resources on diseases and procedures, from journals to e-books and practice guidelines to patient education. A sample search was conducted to demonstrate the features of the database, and a comparison with similar tools is presented.

  2. Simple PowerPoint Animation

    NASA Astrophysics Data System (ADS)

    Takahashi, Leo

    2011-03-01

    The use of animation as a teaching tool has long been of interest to the readers of and contributors to this journal.1-5 While the sophisticated techniques presented in the cited papers are excellent and useful, there is one overlooked technique that may be of interest to the teacher who wants something quick and simple to enhance classroom presentations: PowerPoint animation.

  3. Pointing Gestures as a Cognitive Tool in Young Children: Experimental Evidence

    ERIC Educational Resources Information Center

    Delgado, Begona; Gomez, Juan Carlos; Sarria, Encarnacion

    2011-01-01

    This article explores the possible cognitive function associated with pointing gestures from a Vygotskian perspective. In Study 1, 39 children who were 2-4 years of age were observed in a solitary condition while solving a mnemonic task with or without an explicit memory demand. A discriminant analysis showed that children used noncommunicative…

  4. Mobile Cubesat Command and Control (Mc3) 3-Meter Dish Calibration and Capabilities

    DTIC Science & Technology

    2014-06-01

    accuracy of this simple calibration is tested by tracking the sun, an easily accessible celestial body. To track the sun, a Systems Tool Kit (STK)… visually verified. The shadow created by the dish system when it is pointed directly at the sun is symmetrical. If the dish system is not pointed

  5. Distinguishing Betwen Effects of Local Inputs (Contaminated Sediments, Point Sources) and Upstream Diffuse Nonpoint Source Input: Refinement of a Watershed Development Index for New England

    EPA Science Inventory

    Assessment tools are being developed to predict diffuse NPS effects from watershed development and distinguish these from local impacts (point sources, contaminated sediments). Using EMAP data from the New England Wadeable Stream Survey and two state datasets (CT, ME), we are de...

  6. Using Modern Digital Photography Tools to Guide Management Decisions on Forested Land

    ERIC Educational Resources Information Center

    Craft, Brandon; Barlow, Rebecca; Kush, John; Hemard, Charles

    2016-01-01

    Forestland management depends on assessing changes that occur over time. Long-term photo point monitoring is a low-cost method for documenting these changes. Using forestry as an example, this article highlights the idea that long-term photo point monitoring can be used to improve many types of land management decision making. Guidance on…

  7. Using Prezi in Higher Education

    ERIC Educational Resources Information Center

    Strasser, Nora

    2014-01-01

    PowerPoint can be viewed as boring and commonplace (Craig & Amernic, 2006). While it is a great tool, using a more dynamic presentation editor may better capture the attention of a class or any other group of people. Having an editor that is cloud-based allows for more flexibility and collaboration than is possible with PowerPoint (Settle,…

  8. MS Power Point vs Prezi in Higher Education

    ERIC Educational Resources Information Center

    Kiss, Gabor

    2016-01-01

    The teachers use different presentation tools in Higher Education to make the presentation enjoyable for the students. I used MS Power Point or Prezi in my presentations in two different groups of the freshmen students at the University. The aim of this research was an analysis of the paper results in two groups of students to reveal the influence…

  9. Aqui y Alla en California. (Here and There in California).

    ERIC Educational Resources Information Center

    Galarza, Ernesto

    One in the series "Coleccion Mini-Libros" (Mini-Book Collection) written in Spanish as an enrichment tool for the Spanish speaker, the booklet is a compilation of photographs accompanied by brief descriptions of various points of beauty and interest throughout the State of California. Among the points of interest described are La Sierra Nevada,…

  10. A 16th Suggestions for Educational Curriculum Improvement in Jordan, from the Experts Point of View

    ERIC Educational Resources Information Center

    Mahasneh, Omar

    2015-01-01

    The present research was conducted to identify the most important suggestions for educational curriculum improvement in Jordan, from the experts' point of view. A descriptive survey using a data and information collection tool (questionnaire) was employed as the approach. The study sample consisted of 620 educational experts in the field of…

  11. SU-E-T-211: Comparison of Seven New TrueBeam Linacs with Enhanced Beam Data Conformance Using a Beam Comparison Software Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grzetic, S; Hessler, J; Gupta, N

    2015-06-15

    Purpose: To develop an independent software tool to assist in commissioning linacs with enhanced beam conformance, as well as perform ongoing QA for dosimetrically equivalent linacs. Methods: Linac manufacturers offer enhanced beam conformance as an option to allow for clinics to complete commissioning efficiently, as well as implement dosimetrically equivalent linacs. The specification for enhanced conformance includes PDD as well as profiles within 80% FWHM. Recently, we commissioned seven Varian TrueBeam linacs with enhanced beam conformance. We developed a software tool in Visual Basic to allow us to load the reference beam data and compare our beam data during commissioning to evaluate enhanced beam conformance. This tool also allowed us to upload our beam data used for commissioning our dosimetrically equivalent beam models to compare and tweak each of our linac beams to match our modelled data in Varian's Eclipse TPS. This tool will also be used during annual QA of the linacs to compare our beam data to our baseline data, as required by TG-142. Results: Our software tool was used to check beam conformance for seven TrueBeam linacs that we commissioned in the past six months. Using our tool we found that the factory conformed linacs showed up to 3.82% difference in their beam profile data upon installation. Using our beam comparison tool, we were able to adjust the energy and profiles of our beams to accomplish a better than 1.00% point by point data conformance. Conclusion: The availability of quantitative comparison tools is essential to accept and commission linacs with enhanced beam conformance, as well as to beam match multiple linacs. We further intend to use the same tool to ensure our beam data conforms to the commissioning beam data during our annual QA in keeping with the requirements of TG-142.
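
    The conformance check described here amounts to a point-by-point comparison between measured and reference beam data sampled at the same positions. A minimal sketch of that idea (the authors' tool was written in Visual Basic; the array names below are illustrative):

    ```python
    # Point-by-point conformance check between measured and reference beam data
    # (e.g., a PDD curve sampled at the same depths). A sketch of the idea only;
    # the internals of the Visual Basic tool described above are not public.
    import numpy as np

    def max_point_difference(measured, reference):
        """Worst point-by-point percent difference, normalised to the local
        reference value (reference values at or near zero are excluded)."""
        measured = np.asarray(measured, dtype=float)
        reference = np.asarray(reference, dtype=float)
        mask = np.abs(reference) > 1e-6
        diff = 100.0 * np.abs(measured[mask] - reference[mask]) / np.abs(reference[mask])
        return float(diff.max())

    # e.g. flag a linac whose curve deviates by more than 1.00 % from the model data:
    # if max_point_difference(pdd_measured, pdd_reference) > 1.0: print("needs tuning")
    ```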

  12. MedAd-AppQ: A quality assessment tool for medication adherence apps on iOS and android platforms.

    PubMed

    Ali, Eskinder Eshetu; Teo, Amanda Kai Sin; Goh, Sherlyn Xue Lin; Chew, Lita; Yap, Kevin Yi-Lwern

    2018-02-02

    With the recent proliferation of smartphone medication adherence applications (apps), it is increasingly more difficult for patients and clinicians to identify the most useful app. To develop a quality assessment tool for medication adherence apps, and evaluate the quality of such apps from the major app stores. In this study, a Medication Adherence App Quality assessment tool (MedAd-AppQ) was developed and two evaluators independently assessed apps that fulfilled the following criteria: availability in English, had at least a medication reminder feature, non-specific to certain disease conditions (generic apps), free of technical malfunctions and availability on both the iPhone Operating System (iOS) and Android platforms. Descriptive statistics, Mann-Whitney U test, Pearson product moment correlation and Spearman rank-order correlation were used for statistical analysis. MedAd-AppQ was designed to have 24 items (total 43 points) categorized under three sections: content reliability (11 points), feature usefulness (29 points) and feature convenience (3 points). The three sections of MedAd-AppQ were found to have inter-rater correlation coefficients of 0.801 (p-value < .001) or higher. Based on analysis of 52 apps (27 iOS and 25 Android), quality scores ranged between 7/43 (16.3%) and 28/43 (65.1%). There was no significant difference between the quality scores of the Android and iOS versions. None of the apps had features for self-management of side effects. Only two apps in each platform provided disease-related and/or medication information. MedAd-AppQ can be used to reliably assess the quality of adherence apps. Clinicians can use the tool in selecting apps for use by patients. Developers of adherence apps should consider features that provide therapy-related information and help patients in medications and side-effects management. Copyright © 2018 Elsevier Inc. All rights reserved.
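
    The section structure stated in the abstract (content reliability 11 points, feature usefulness 29 points, feature convenience 3 points; 43 in total) can be transcribed directly. A small sketch, with section subtotals supplied by the rater since item-level content is not given here:

    ```python
    # Sketch of the MedAd-AppQ section structure as stated in the abstract:
    # 24 items across three sections, maximum points 11 + 29 + 3 = 43.
    # Item-level content is not given, so section subtotals are passed directly.
    SECTION_MAX = {"content_reliability": 11, "feature_usefulness": 29, "feature_convenience": 3}

    def total_quality(section_scores):
        for name, score in section_scores.items():
            assert 0 <= score <= SECTION_MAX[name], f"{name} out of range"
        total = sum(section_scores.values())
        return total, 100.0 * total / sum(SECTION_MAX.values())

    print(total_quality({"content_reliability": 6, "feature_usefulness": 20, "feature_convenience": 2}))
    # -> (28, 65.1...), i.e. the highest quality score reported in the study
    ```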

  13. Proposal of an environmental performance index to assess solid waste treatment technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goulart Coelho, Hosmanny Mauro, E-mail: hosmanny@hotmail.com; Lange, Lisete Celina; Coelho, Lineker Max Goulart

    2012-07-15

    Highlights: ► Proposal of a new concept in waste management: Cleaner Treatment. ► Development of an index to assess quantitatively waste treatment technologies. ► Delphi Method was carried out so as to define environmental indicators. ► Environmental performance evaluation of waste-to-energy plants. - Abstract: Although the concern with sustainable development and environmental protection has grown considerably in recent years, the majority of decision-making models and tools are still either excessively tied to economic aspects or geared to the production process. Moreover, existing models focus on the priority steps of solid waste management, beyond waste energy recovery and disposal. To address the lack of models and tools aimed at waste treatment and final disposal, a new concept is proposed: Cleaner Treatment, which is based on the Cleaner Production principles. This paper focuses on the development and validation of the Cleaner Treatment Index (CTI), to assess the environmental performance of waste treatment technologies based on the Cleaner Treatment concept. The index is formed by aggregation (summation or product) of several indicators that consist of operational parameters. The weights of the indicators were established by the Delphi Method and Brazilian environmental laws. In addition, sensitivity analyses were carried out comparing both aggregation methods. Finally, index validation was carried out by applying the CTI to data from 10 waste-to-energy plants. From the sensitivity analysis and validation results it is possible to infer that the summation model is the most suitable aggregation method. For the summation method, CTI results were above 0.5 (on a scale from 0 to 1) for most facilities evaluated. This study demonstrates that the CTI is a simple and robust tool to assess and compare the environmental performance of different treatment plants, and an excellent quantitative tool to support Cleaner Treatment implementation.
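
    The two aggregation schemes compared in the paper, weighted summation and weighted product of normalised indicators, can be illustrated generically; the indicator values and weights below are placeholders, not the published Delphi-derived CTI weights:

    ```python
    # Two generic aggregation schemes for a composite index built from normalised
    # indicators in [0, 1]: weighted summation and weighted product.
    # Indicator values and weights are placeholders, not the CTI parameters.
    import math

    def weighted_sum(indicators, weights):
        return sum(w * x for x, w in zip(indicators, weights)) / sum(weights)

    def weighted_product(indicators, weights):
        total_w = sum(weights)
        return math.prod(x ** (w / total_w) for x, w in zip(indicators, weights))

    indicators = [0.8, 0.6, 0.9]   # e.g. emissions, energy recovery, residues (illustrative)
    weights    = [0.5, 0.3, 0.2]
    print(weighted_sum(indicators, weights), weighted_product(indicators, weights))
    ```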

  14. Spectrum of classes of point emitters of electromagnetic wave fields.

    PubMed

    Castañeda, Román

    2016-09-01

    The spectrum of classes of point emitters has been introduced as a numerical tool suitable for the design, analysis, and synthesis of non-paraxial optical fields in arbitrary states of spatial coherence. In this paper, the polarization state of planar electromagnetic wave fields is included in the spectrum of classes, thus increasing its modeling capabilities. In this context, optical processing is realized as a filtering on the spectrum of classes of point emitters, performed by the complex degree of spatial coherence and the two-point correlation of polarization, which could be implemented dynamically by using programmable optical devices.

  15. The Effects of Using Microsoft Power Point on EFL Learners' Attitude and Anxiety: Case Study of Two Master Students of Didactics of English as a Foreign Language, Djillali Liabes University, Sidi Bel Abbes, Algeria

    ERIC Educational Resources Information Center

    Benghalem, Boualem

    2015-01-01

    This study aims to investigate the effects of using ICT tools such as Microsoft PowerPoint on EFL students' attitude and anxiety. The participants in this study were 40 Master 2 students of Didactics of English as a Foreign Language, Djillali Liabes University, Sidi Bel Abbes Algeria. In order to find out the effects of Microsoft PowerPoint on EFL…

  16. Analysis and Comparison of Various Requirements Management Tools for Use in the Shipbuilding Industry

    DTIC Science & Technology

    2006-09-01

    such products as MS Word, MS Excel, MS PowerPoint, Adobe Acrobat, Adobe FrameMaker, Claris FileMaker, Adobe PhotoShop and Adobe Illustrator, it is easy… Adobe FrameMaker, etc. Information can be exported out in the same formats as above plus HTML, MS PowerPoint, and MS Outlook. DOORS is very user… including Postscript, RTF (for PowerPoint), HTML, Interleaf, SVG, FrameMaker, HP LaserJet, HPGL, and EPS. Examples of such charts produced by DOORS

  17. Catalog of Existing Small Tools for Surface Preparation and Support Equipment for Blasters and Painters

    DTIC Science & Technology

    1977-05-01

    128 lbs./ft³; Specific Gravity 3.6; Hardness (Mohs) 7; Melting Point 2900°F; Coefficient of Expansion 7.8 × 10⁻⁶ (FIGURE 3.17: Properties of…) Beaumont, Texas; Bethlehem Steel Corporation, San Francisco, California; Bethlehem Steel Corporation, Sparrows Point, Maryland; Jacksonville Shipyards… The checklist can be used by operators and supervisors as a starting point for determining if the yard's abrasive blasting facility is operating at full

  18. Development and Integration of Control System Models

    NASA Technical Reports Server (NTRS)

    Kim, Young K.

    1998-01-01

    The computer simulation tool, TREETOPS, has been upgraded and used at NASA/MSFC to model various complicated mechanical systems and to perform their dynamics and control analysis with pointing control systems. A TREETOPS model of the Advanced X-ray Astrophysics Facility - Imaging (AXAF-I) dynamics and control system was developed to evaluate the AXAF-I pointing performance in Normal Pointing Mode. An optical model of the Shooting Star Experiment (SSE) was also developed, and its optical performance analysis was done using the MACOS software.

  19. Users Guide for Fire Image Analysis System - Version 5.0: A Tool for Measuring Fire Behavior Characteristics

    Treesearch

    Carl W. Adkins

    1995-01-01

    The Fire Image Analysis System is a tool for quantifying flame geometry and relative position at selected points along a spreading line fire. At present, the system requires uniform terrain (constant slope). The system has been used in field and laboratory studies for determining flame length, depth, cross sectional area, and rate of spread.

  20. Role-Playing Games for Capacity Building in Water and Land Management: Some Brazilian Experiences

    ERIC Educational Resources Information Center

    Camargo, Maria Eugenia; Jacobi, Pedro Roberto; Ducrot, Raphaele

    2007-01-01

    Role-playing games in natural resource management are currently being tested as research, training, and intervention tools all over the world. Various studies point out their potential to deal with complex issues and to contribute to training processes. The objective of this contribution is to analyze the limits and potentialities of this tool for…

  1. Homemade Powerpoint Games: Game Design Pedagogy Aligned to the TPACK Framework

    ERIC Educational Resources Information Center

    Siko, Jason P.; Barbour, Michael K.

    2012-01-01

    While researchers are examining the role of playing games to learn, others are looking at using game design as an instructional tool. However, game-design software may require additional time to train both teachers and students. In this article, the authors discuss the use of Microsoft PowerPoint as a tool for game-design instruction and the…

  2. Tools for the trade

    NASA Technical Reports Server (NTRS)

    Gillman, Wallace M.

    1990-01-01

    A brief review is given of daily operations in the airline business, with emphasis on the decisions made by pilots and the information used to make those decisions. Various wind shears are discussed as they affect daily operations. The discussion of tools focuses on airborne reactive and predictive systems. The escape maneuver used to fly out of a severe windshear is from a pilot's point of view.

  3. Tool Use and the Development of the Function Concept: From Repeated Calculations to Functional Thinking

    ERIC Educational Resources Information Center

    Doorman, Michiel; Drijvers, Paul; Gravemeijer, Koeno; Boon, Peter; Reed, Helen

    2012-01-01

    The concept of function is a central but difficult topic in secondary school mathematics curricula, which encompasses a transition from an operational to a structural view. The question in this paper is how the use of computer tools may foster this transition. With domain-specific pedagogical knowledge on the learning of function as a point of…

  4. Evaluating analytic and risk assessment tools to estimate sediment and nutrients losses from agricultural lands in the southern region of the USA

    USDA-ARS?s Scientific Manuscript database

    Non-point source pollution from agricultural fields is a critical problem associated with water quality impairment in the USA and a low-oxygen environment in the Gulf of Mexico. The use, development and enhancement of qualitative and quantitative models or tools for assessing agricultural runoff qua...

  5. Perceptions and Use of Learning Management System Tools and Other Technologies in Higher Education: A Preliminary Analysis

    ERIC Educational Resources Information Center

    Borboa, Danielle; Joseph, Mathew; Spake, Deborah; Yazdanparast, Atefeh

    2017-01-01

    This study examines student views and use of technology in conjunction with university coursework. Results reveal that there is widespread use of Microsoft PowerPoint and certain learning management system (LMS) features; however, there are significant differences in views concerning the degree to which these LMS tools enhance learning based on…

  6. "In silico" mechanistic studies as predictive tools in microwave-assisted organic synthesis.

    PubMed

    Rodriguez, A M; Prieto, P; de la Hoz, A; Díaz-Ortiz, A

    2011-04-07

    Computational calculations can be used as a predictive tool in Microwave-Assisted Organic Synthesis (MAOS). A DFT study on Intramolecular Diels-Alder reactions (IMDA) indicated that the activation energy of the reaction and the polarity of the stationary points are two fundamental parameters to determine "a priori" if a reaction can be improved by using microwave irradiation.

  7. "Tell Me a Story": The Use of Narrative as a Learning Tool for Natural Selection

    ERIC Educational Resources Information Center

    Prins, Renate; Avraamidou, Lucy; Goedhart, Martin

    2017-01-01

    Grounded within literature pointing to the value of narrative in communicating scientific information, the purpose of this study was to examine the use of stories as a tool for teaching about natural selection in the context of school science. The study utilizes a mixed method, case study approach which focuses on the design, implementation, and…

  8. Milestones on a Shoestring: A Cost-Effective, Semi-automated Implementation of the New ACGME Requirements for Radiology.

    PubMed

    Schmitt, J Eric; Scanlon, Mary H; Servaes, Sabah; Levin, Dayna; Cook, Tessa S

    2015-10-01

    The advent of the ACGME's Next Accreditation System represents a significant new challenge for residencies and fellowships, owing to its requirements for more complex and detailed information. We developed a system of online assessment tools to provide comprehensive coverage of the twelve ACGME Milestones and digitized them using freely available cloud-based productivity tools. These tools include a combination of point-of-care procedural assessments, electronic quizzes, online modules, and other data entry forms. Using free statistical analytic tools, we also developed an automated system for management, processing, and data reporting. After one year of use, our Milestones project has resulted in the submission of over 20,000 individual data points. The use of automated statistical methods to generate resident-specific profiles has allowed for dynamic reports of individual residents' progress. These profiles both summarize data and also allow program directors access to more granular information as needed. Informatics-driven strategies for data assessment and processing represent feasible solutions to Milestones assessment and analysis, reducing the potential administrative burden for program directors, residents, and staff. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  9. Using GIS to analyze animal movements in the marine environment

    USGS Publications Warehouse

    Hooge, Philip N.; Eichenlaub, William M.; Solomon, Elizabeth K.; Kruse, Gordon H.; Bez, Nicolas; Booth, Anthony; Dorn, Martin W.; Hills, Susan; Lipcius, Romuald N.; Pelletier, Dominique; Roy, Claude; Smith, Stephen J.; Witherell, David B.

    2001-01-01

    Advanced methods for analyzing animal movements have been little used in the aquatic research environment compared to the terrestrial. In addition, despite obvious advantages of integrating geographic information systems (GIS) with spatial studies of animal movement behavior, movement analysis tools have not been integrated into GIS for either aquatic or terrestrial environments. We therefore developed software that integrates one of the most commonly used GIS programs (ArcView®) with a large collection of animal movement analysis tools. This application, the Animal Movement Analyst Extension (AMAE), can be loaded as an extension to ArcView® under multiple operating system platforms (PC, Unix, and Mac OS). It contains more than 50 functions, including parametric and nonparametric home range analyses, random walk models, habitat analyses, point and circular statistics, tests of complete spatial randomness, tests for autocorrelation and sample size, point and line manipulation tools, and animation tools. This paper describes the use of these functions in analyzing animal location data; some limited examples are drawn from a sonic-tracking study of Pacific halibut (Hippoglossus stenolepis) in Glacier Bay, Alaska. The extension is available on the Internet at www.absc.usgs.gov/glba/gistools/index.htm.

  10. Assessment of diabetic polyneuropathy in Zanzibar: Comparison between traditional methods and an automated point-of-care nerve conduction device.

    PubMed

    Vogt, Elinor C; Øksnes, Marianne; Suleiman, Faiza; Juma, Buthayna Ali; Thordarson, Hrafnkell B; Ommedal, Ola; Søfteland, Eirik

    2017-12-01

    Scant information is available about the prevalence of diabetic polyneuropathy, as well as the applicability of screening tools in sub-Saharan Africa. We aimed to investigate these issues in Zanzibar (Tanzania). One hundred consecutive diabetes patients were included from the diabetes clinic at Mnazi Mmoja Hospital. Clinical characteristics were recorded. Further, we investigated: a) self-reported numbness of the lower limbs, b) ten-point monofilament test, c) the Sibbald 60-s Tool and d) nerve conduction studies (NCS, using an automated handheld point-of-care device, the NC-stat DPNCheck). Mean age was 54 years, 90% had type 2 diabetes, and the average disease duration was 9 years. Mean HbA1c was 8.5% (69 mmol/mol), blood pressure 155/88 mmHg. Sixty-two percent reported numbness, 61% had a positive monofilament test, and 79% a positive Sibbald tool. NCS defined neuropathy in 45% of the patients. Only the monofilament test showed appreciable concordance with the NCS (Cohen's κ = 0.43). The patient population was characterised by poor glycaemic control and hypertension. In line with this, neuropathy was rampant. The monofilament test tended to define more cases of probable neuropathy than the NCS; however, specificity was rather low. Plantar skin thickening may have led to false positives in this population. Overall concordance was, however, appreciable, and could support continued use of the monofilament as a neuropathy screening tool. The NC-stat DPNCheck could be useful in cases of diagnostic uncertainty or for research purposes in a low-resource setting.
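
    The concordance statistic quoted here is Cohen's κ between two binary screening outcomes. A minimal illustration with made-up 0/1 arrays (the abstract reports κ = 0.43 for the real data):

    ```python
    # Concordance between two binary screening outcomes (e.g. monofilament test vs
    # NCS-defined neuropathy) using Cohen's kappa. Inputs are illustrative 0/1
    # arrays, not the Zanzibar study data.
    from sklearn.metrics import cohen_kappa_score

    monofilament = [1, 1, 0, 1, 0, 0, 1, 1]   # 1 = positive screen
    ncs          = [1, 0, 0, 1, 0, 1, 1, 1]   # 1 = neuropathy on nerve conduction
    kappa = cohen_kappa_score(monofilament, ncs)
    print(f"Cohen's kappa = {kappa:.2f}")
    ```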

  11. Laser production of articles from powders

    DOEpatents

    Lewis, Gary K.; Milewski, John O.; Cremers, David A.; Nemec, Ronald B.; Barbe, Michael R.

    1998-01-01

    Method and apparatus for forming articles from materials in particulate form in which the materials are melted by a laser beam and deposited at points along a tool path to form an article of the desired shape and dimensions. Preferably the tool path and other parameters of the deposition process are established using computer-aided design and manufacturing techniques. A controller comprised of a digital computer directs movement of a deposition zone along the tool path and provides control signals to adjust apparatus functions, such as the speed at which a deposition head which delivers the laser beam and powder to the deposition zone moves along the tool path.

  12. Laser production of articles from powders

    DOEpatents

    Lewis, G.K.; Milewski, J.O.; Cremers, D.A.; Nemec, R.B.; Barbe, M.R.

    1998-11-17

    Method and apparatus for forming articles from materials in particulate form in which the materials are melted by a laser beam and deposited at points along a tool path to form an article of the desired shape and dimensions. Preferably the tool path and other parameters of the deposition process are established using computer-aided design and manufacturing techniques. A controller comprised of a digital computer directs movement of a deposition zone along the tool path and provides control signals to adjust apparatus functions, such as the speed at which a deposition head which delivers the laser beam and powder to the deposition zone moves along the tool path. 20 figs.

  13. HEXPANDO Expanding Head for Fastener-Retention Hexagonal Wrench

    NASA Technical Reports Server (NTRS)

    Bishop, John

    2011-01-01

    The HEXPANDO is an expanding-head hexagonal wrench designed to retain fasteners and keep them from being dislodged from the tool. The tool is intended to remove or install socket-head cap screws (SHCSs) in remote, hard-to-reach locations or in circumstances when a dropped fastener could cause damage to delicate or sensitive hardware. It is not intended for application of torque. This tool is made of two assembled portions. The first portion of the tool comprises tubing, or a hollow shaft, at a length that gives the user adequate reach to the intended location. At one end of the tubing is the expanding hexagonal head fitting with six radial slits cut into it (one at each of the points of the hexagonal shape) and a small hole drilled axially through the center, and the end opposite the hex is internally and externally threaded. This fitting is threaded into the shaft (via external threads) and staked or bonded so that it will not loosen. At the other end of the tubing is a knurled collar with a through hole into which the tubing is threaded. This knob is secured in place by a stop nut. The second assembled portion of the tool comprises a length of all thread or solid rod that is slightly longer than the steel tubing. One end has a slightly larger knurled collar affixed while the other end is tapered/pointed and threaded. When the two portions are assembled, the all thread/rod portion feeds through the tubing and is threaded into the expanding hex head fitting. The tapered point allows it to be driven into the through hole of the hex fitting. While holding the smaller collar on the shaft, the user turns the larger collar, and as the threads feed into the fitting, the hex head expands and grips the SHCS, thus providing a safe way to install and remove fasteners. The clamping force retaining the SHCS varies depending on how far the tapered end is inserted into the tool head. Initial tests of the prototype tool, designed for a 5 mm or #10 SHCS, have resulted in up to 8 lb (35.6 N) of pull force being required to dislodge the SHCS from the tool. The tool is designed with a lead-in angle from the diameter of the tubing to a diameter the same as the fastener head, to prevent the fastener head from catching on any obstructions encountered that could dislodge the fastener during retrieval.

  14. Electronic health record interventions at the point of care improve documentation of care processes and decrease orders for genetic tests commonly ordered by nongeneticists.

    PubMed

    Scheuner, Maren T; Peredo, Jane; Tangney, Kelly; Schoeff, Diane; Sale, Taylor; Lubick-Goldzweig, Caroline; Hamilton, Alison; Hilborne, Lee; Lee, Martin; Mittman, Brian; Yano, Elizabeth M; Lubin, Ira M

    2017-01-01

    To determine whether electronic health record (EHR) tools improve documentation of pre- and postanalytic care processes for genetic tests ordered by nongeneticists. We conducted a nonrandomized, controlled, pre-/postintervention study of EHR point-of-care tools (informational messages and template report) for three genetic tests. Chart review assessed documentation of genetic testing processes of care, with points assigned for each documented item. Multiple linear and logistic regressions assessed factors associated with documentation. Preimplementation, there were no significant site differences (P > 0.05). Postimplementation, mean documentation scores increased (5.9 (2.1) vs. 5.0 (2.2); P = 0.0001) and records with clinically meaningful documentation increased (score >5: 59 vs. 47%; P = 0.02) at the intervention versus the control site. Pre- and postimplementation, a score >5 was positively associated with abnormal test results (OR = 4.0; 95% CI: 1.8-9.2) and trainee provider (OR = 2.3; 95% CI: 1.2-4.6). Postimplementation, a score >5 was also positively associated with intervention site (OR = 2.3; 95% CI: 1.1-5.1) and specialty clinic (OR = 2.0; 95% CI: 1.1-3.6). There were also significantly fewer tests ordered after implementation (264/100,000 vs. 204/100,000; P = 0.03), with no significant change at the control site (280/100,000 vs. 257/100,000; P = 0.50). EHR point-of-care tools improved documentation of genetic testing processes and decreased utilization of genetic tests commonly ordered by nongeneticists. Genet Med 19(1), 112-120.

  15. [Validation of the Montgomery-Åsberg Depression Rating Scale (MADRS) in Colombia].

    PubMed

    Cano, Juan Fernando; Gomez Restrepo, Carlos; Rondón, Martín

    2016-01-01

    To adapt and to validate the Montgomery-Åsberg Depression Rating Scale (MADRS) in Colombia. Observational study for scale validation. Validity criteria were used to determine the severity cut-off points of the tool. Taking into account sensitivity and specificity values, those cut points were contrasted with ICD-10 criteria for depression severity. A factor analysis was performed. The internal consistency was determined with the same sample of patients used for the validity criteria. Inter-rater reliability was assessed by evaluating the 22 records of the patients that consented to a video interview. Sensitivity to change was established through a second application of the scale in 28 subjects after a lapse of 14 to 28 days. The study was performed in Bogotá; the tool was applied to 150 patients suffering from major depressive disorder. The cut-off point for moderate depression was 20 (sensitivity, 98%; specificity, 96%), and the cut-off point for severe depression was 34 (sensitivity, 98%; specificity, 92%). The tool appears to be a unidimensional scale with good internal consistency (α = 0.9168). The inter-rater reliability evaluation showed the scale to be highly reliable (intraclass correlation coefficient = 0.9833). The instrument has good sensitivity to change. The Colombian version of the Montgomery-Åsberg Depression Rating Scale has good psychometric properties and can be used in clinical practice and in clinical research in the field of depressive disorder. Copyright © 2015 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  16. New Tooling System for Forming Aluminum Beverage Can End Shell

    NASA Astrophysics Data System (ADS)

    Yamazaki, Koetsu; Otsuka, Takayasu; Han, Jing; Hasegawa, Takashi; Shirasawa, Taketo

    2011-08-01

    This paper proposes a new tooling system for forming shells of aluminum beverage can ends. First, the forming process of a conventional tooling system has been simulated using three-dimensional finite element models. Simulation results have been confirmed to be consistent with those of axisymmetric models, so simulations for further study have been performed using axisymmetric models to save computational time. A comparison shows that thinning of the shell formed by the proposed tooling system has been improved by about 3.6%. Influences of the tool upmost surface profiles and tool initial positions in the new tooling system have been investigated and the design optimization method based on the numerical simulations has then been applied to search for optimum design points, in order to minimize thinning subject to constraints on the geometrical dimensions of the shell. Finally, the performance of the shell subjected to internal pressure has been confirmed to meet design requirements.

  17. Rotary fast tool servo system and methods

    DOEpatents

    Montesanti, Richard C.; Trumper, David L.

    2007-10-02

    A high bandwidth rotary fast tool servo provides tool motion in a direction nominally parallel to the surface-normal of a workpiece at the point of contact between the cutting tool and workpiece. Three or more flexure blades having all ends fixed are used to form an axis of rotation for a swing arm that carries a cutting tool at a set radius from the axis of rotation. An actuator rotates a swing arm assembly such that a cutting tool is moved in and away from the lathe-mounted, rotating workpiece in a rapid and controlled manner in order to machine the workpiece. A pair of position sensors provides rotation and position information for a swing arm to a control system. A control system commands and coordinates motion of the fast tool servo with the motion of a spindle, rotating table, cross-feed slide, and in-feed slide of a precision lathe.

  18. Rotary fast tool servo system and methods

    DOEpatents

    Montesanti, Richard C [Cambridge, MA; Trumper, David L [Plaistow, NH; Kirtley, Jr., James L.

    2009-08-18

    A high bandwidth rotary fast tool servo provides tool motion in a direction nominally parallel to the surface-normal of a workpiece at the point of contact between the cutting tool and workpiece. Three or more flexure blades having all ends fixed are used to form an axis of rotation for a swing arm that carries a cutting tool at a set radius from the axis of rotation. An actuator rotates a swing arm assembly such that a cutting tool is moved in and away from the lathe-mounted, rotating workpiece in a rapid and controlled manner in order to machine the workpiece. One or more position sensors provides rotation and position information for a swing arm to a control system. A control system commands and coordinates motion of the fast tool servo with the motion of a spindle, rotating table, cross-feed slide, and in-feed slide of a precision lathe.

  19. Old model organisms and new behavioral end-points: Swimming alteration as an ecotoxicological response.

    PubMed

    Faimali, Marco; Gambardella, Chiara; Costa, Elisa; Piazza, Veronica; Morgana, Silvia; Estévez-Calvar, Noelia; Garaventa, Francesca

    2017-07-01

    Behavioral responses of aquatic organisms have received much less attention than developmental or reproductive ones due to the scarce presence of user-friendly tools for their acquisition. The technological development of data acquisition systems for quantifying behavior in the aquatic environment and the increase of studies on the understanding the relationship between the behavior of aquatic organisms and the physiological/ecological activities have generated renewed interest in using behavioral responses also in marine ecotoxicology. Recent reviews on freshwater environment show that behavioral end-points are comparatively fast and sensitive, and warrant further attention as tools for assessing the toxicological effects of environmental contaminants. In this mini-review, we perform a systematic analysis of the most recent works that have used marine invertebrate swimming alteration as behavioral end-point in ecotoxicological studies by assessing the differences between behavioral and acute responses in a wide range of species, in order to compare their sensitivity. Copyright © 2016. Published by Elsevier Ltd.

  20. Enhancing efficiency and quality of statistical estimation of immunogenicity assay cut points through standardization and automation.

    PubMed

    Su, Cheng; Zhou, Lei; Hu, Zheng; Weng, Winnie; Subramani, Jayanthi; Tadkod, Vineet; Hamilton, Kortney; Bautista, Ami; Wu, Yu; Chirmule, Narendra; Zhong, Zhandong Don

    2015-10-01

    Biotherapeutics can elicit immune responses, which can alter the exposure, safety, and efficacy of the therapeutics. A well-designed and robust bioanalytical method is critical for the detection and characterization of relevant anti-drug antibody (ADA) and the success of an immunogenicity study. As a fundamental criterion in immunogenicity testing, assay cut points need to be statistically established with a risk-based approach to reduce subjectivity. This manuscript describes the development of a validated, web-based, multi-tier customized assay statistical tool (CAST) for assessing cut points of ADA assays. The tool provides an intuitive web interface that allows users to import experimental data generated from a standardized experimental design, select the assay factors, run the standardized analysis algorithms, and generate tables, figures, and listings (TFL). It allows bioanalytical scientists to perform complex statistical analysis at a click of the button to produce reliable assay parameters in support of immunogenicity studies. Copyright © 2015 Elsevier B.V. All rights reserved.
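
    The abstract does not detail CAST's algorithms, but a common parametric approach to a screening cut point targets roughly a 5% false-positive rate by taking mean + 1.645·SD of log-transformed drug-naive responses. A generic sketch of that approach, not the validated CAST implementation:

    ```python
    # A common parametric way to set a screening cut point for an ADA assay:
    # log-transform drug-naive sample responses, exclude gross outliers, and take
    # mean + 1.645*SD (targeting ~5 % false positives). Generic illustration only;
    # not the validated algorithm implemented in CAST.
    import numpy as np

    def screening_cut_point(responses):
        x = np.log(np.asarray(responses, dtype=float))
        keep = np.abs(x - x.mean()) <= 3 * x.std(ddof=1)   # simple 3-SD outlier exclusion
        x = x[keep]
        return float(np.exp(x.mean() + 1.645 * x.std(ddof=1)))

    # cut = screening_cut_point(negative_control_signals)
    ```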

  1. Creating a mobile subject guide to improve access to point-of-care resources for medical students: a case study

    PubMed Central

    Boruff, Jill T; Bilodeau, Edward

    2012-01-01

    Question: Can a mobile optimized subject guide facilitate medical student access to mobile point-of-care tools? Setting: The guide was created at a library at a research-intensive university with six teaching hospital sites. Objectives: The team created a guide facilitating medical student access to point-of-care tools directly on mobile devices to provide information allowing them to access and set up resources with little assistance. Methods: Two librarians designed a mobile optimized subject guide for medicine and conducted a survey to test its usefulness. Results: Web analytics and survey results demonstrate that the guide is used and the students are satisfied. Conclusion: The library will continue to use the subject guide as its primary means of supporting mobile devices. It remains to be seen if the mobile guide facilitates access for those who do not need assistance and want direct access to the resources. Internet access in the hospitals remains an issue. PMID:22272160

  2. Creating a mobile subject guide to improve access to point-of-care resources for medical students: a case study.

    PubMed

    Boruff, Jill T; Bilodeau, Edward

    2012-01-01

    Can a mobile optimized subject guide facilitate medical student access to mobile point-of-care tools? The guide was created at a library at a research-intensive university with six teaching hospital sites. The team created a guide facilitating medical student access to point-of-care tools directly on mobile devices to provide information allowing them to access and set up resources with little assistance. Two librarians designed a mobile optimized subject guide for medicine and conducted a survey to test its usefulness. Web analytics and survey results demonstrate that the guide is used and the students are satisfied. The library will continue to use the subject guide as its primary means of supporting mobile devices. It remains to be seen if the mobile guide facilitates access for those who do not need assistance and want direct access to the resources. Internet access in the hospitals remains an issue.

  3. Method for providing an arbitrary three-dimensional microstructure in silicon using an anisotropic deep etch

    DOEpatents

    Morales, Alfredo M.; Gonzales, Marcela

    2004-06-15

    The present invention describes a method for fabricating an embossing tool or an x-ray mask tool, providing microstructures that smoothly vary in height from point to point in etched substrates, i.e., structures that can vary in all three dimensions. The process uses a lithographic technique to transfer an image pattern into the surface of a silicon wafer by exposing and developing the resist and then etching the silicon substrate. Importantly, the photoresist is variably exposed so that when developed some of the resist layer remains. The remaining undeveloped resist acts as an etchant barrier to the reactive plasma used to etch the silicon substrate and therefore provides the ability to etch structures of variable depths.

  4. Applications of low altitude photogrammetry for morphometry, displacements, and landform modeling

    NASA Astrophysics Data System (ADS)

    Gomez, F. G.; Polun, S. G.; Hickcox, K.; Miles, C.; Delisle, C.; Beem, J. R.

    2016-12-01

    Low-altitude aerial surveying is emerging as a tool that greatly improves the ease and efficiency of measuring landforms for quantitative geomorphic analyses. High-resolution, close-range photogrammetry produces dense, 3-dimensional point clouds that facilitate the construction of digital surface models, as well as a potential means of classifying ground targets using spatial structure. This study presents results from recent applications of UAS-based photogrammetry, including high resolution surface morphometry of a lava flow, repeat-pass applications to mass movements, and fault scarp degradation modeling. Depending upon the desired photographic resolution and the platform/payload flown, aerial photos are typically acquired at altitudes of 40 - 100 meters above the ground surface. In all cases, high-precision ground control points are key for accurate (and repeatable) orientation - relying on low-precision GPS coordinates (whether on the ground or geotags in the aerial photos) typically results in substantial rotations (tilt) of the reference frame. Using common ground control points between repeat surveys results in matching point clouds with RMS residuals better than 10 cm. In arid regions, the point cloud is used to assess lava flow surface roughness using multi-scale measurements of point cloud dimensionality. For the landslide study, the point cloud provides a basis for assessing possible displacements. In addition, the high resolution orthophotos facilitate mapping of fractures and their growth. For neotectonic applications, we compare fault scarp modeling results from UAV-derived point clouds versus field-based surveys (kinematic GPS and electronic distance measurements). In summary, there is a wide ranging toolbox of low-altitude aerial platforms becoming available for field geoscientists. In many instances, these tools will present convenience and reduced cost compared with the effort and expense to contract acquisitions of aerial imagery.
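
    The repeat-survey matching quoted above (RMS residuals better than 10 cm on common ground control points) is a straightforward computation once surveyed and model coordinates are paired. A minimal sketch, with all inputs assumed already co-registered:

    ```python
    # RMS residual between surveyed ground control point coordinates and the same
    # points located in the photogrammetric model, assuming the two sets are
    # already paired and in the same coordinate system. Purely illustrative.
    import numpy as np

    def rms_residual(surveyed_xyz, model_xyz):
        d = np.linalg.norm(np.asarray(surveyed_xyz) - np.asarray(model_xyz), axis=1)
        return float(np.sqrt(np.mean(d ** 2)))

    # e.g. an RMS under ~0.10 m would be consistent with the repeat-survey
    # matching reported above:
    # print(rms_residual(gcp_field, gcp_model))
    ```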

  5. The CHESS score: a simple tool for early prediction of shunt dependency after aneurysmal subarachnoid hemorrhage.

    PubMed

    Jabbarli, R; Bohrer, A-M; Pierscianek, D; Müller, D; Wrede, K H; Dammann, P; El Hindy, N; Özkan, N; Sure, U; Müller, O

    2016-05-01

    Acute hydrocephalus is an early and common complication of aneurysmal subarachnoid hemorrhage (SAH). However, considerably fewer patients develop chronic hydrocephalus requiring shunt placement. Our aim was to develop a risk score for early identification of patients with shunt dependency after SAH. Two hundred and forty-two SAH individuals who were treated in our institution between January 2008 and December 2013 and survived the initial impact were retrospectively analyzed. Clinical parameters within 72 h after the ictus were correlated with shunt dependency. Independent predictors were summarized into a new risk score which was validated in a subsequent SAH cohort treated between January and December 2014. Seventy-five patients (31%) underwent shunt placement. Of 23 evaluated variables, only the following five showed independent associations with shunt dependency and were subsequently used to establish the Chronic Hydrocephalus Ensuing from SAH Score (CHESS, 0-8 points): Hunt and Hess grade ≥IV (1 point), location of the ruptured aneurysm in the posterior circulation (1 point), acute hydrocephalus (4 points), the presence of intraventricular hemorrhage (1 point) and early cerebral infarction on follow-up computed tomography scan (1 point). The CHESS showed strong correlation with shunt dependency (P = 0.0007) and could be successfully validated in both internal SAH cohorts tested. Patients scoring ≥6 CHESS points had significantly higher risk of shunt dependency (P < 0.0001) than other patients. The CHESS may become a valuable diagnostic tool for early estimation of shunt dependency after SAH. Further evaluation and external validation will be required in prospective studies. © 2016 EAN.
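
    Because the abstract states the scoring rubric explicitly, the CHESS can be transcribed directly. An illustrative sketch only, not a validated clinical implementation:

    ```python
    # Direct transcription of the CHESS items as listed in the abstract (0-8 points).
    # Illustrative only; not a validated clinical implementation.
    def chess_score(hunt_hess_grade, posterior_circulation, acute_hydrocephalus,
                    intraventricular_hemorrhage, early_infarction):
        score = 0
        score += 1 if hunt_hess_grade >= 4 else 0       # Hunt and Hess grade >= IV
        score += 1 if posterior_circulation else 0      # ruptured aneurysm in posterior circulation
        score += 4 if acute_hydrocephalus else 0        # acute hydrocephalus
        score += 1 if intraventricular_hemorrhage else 0
        score += 1 if early_infarction else 0           # early cerebral infarction on follow-up CT
        return score

    s = chess_score(4, False, True, True, False)        # -> 6
    high_risk = s >= 6                                  # >= 6 points: markedly higher shunt risk
    ```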

  6. Discriminating topology in galaxy distributions using network analysis

    NASA Astrophysics Data System (ADS)

    Hong, Sungryong; Coutinho, Bruno C.; Dey, Arjun; Barabási, Albert-L.; Vogelsberger, Mark; Hernquist, Lars; Gebhardt, Karl

    2016-07-01

    The large-scale distribution of galaxies is generally analysed using the two-point correlation function. However, this statistic does not capture the topology of the distribution, and it is necessary to resort to higher order correlations to break degeneracies. We demonstrate that an alternate approach using network analysis can discriminate between topologically different distributions that have similar two-point correlations. We investigate two galaxy point distributions, one produced by a cosmological simulation and the other by a Lévy walk. For the cosmological simulation, we adopt the redshift z = 0.58 slice from Illustris and select galaxies with stellar masses greater than 10^8 M⊙. The two-point correlation function of these simulated galaxies follows a single power law, ξ(r) ∼ r^-1.5. Then, we generate Lévy walks matching the correlation function and abundance with the simulated galaxies. We find that, while the two simulated galaxy point distributions have the same abundance and two-point correlation function, their spatial distributions are very different; most prominently, filamentary structures are absent in the Lévy fractals. To quantify these missing topologies, we adopt network analysis tools and measure diameter, giant component, and transitivity from networks built by a conventional friends-of-friends recipe with various linking lengths. Unlike the abundance and two-point correlation function, these network quantities reveal a clear separation between the two simulated distributions; therefore, the galaxy distribution simulated by Illustris is not a Lévy fractal quantitatively. We find that the described network quantities offer an efficient tool for discriminating topologies and for comparing observed and theoretical distributions.
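
    The friends-of-friends network construction and the three statistics named above (transitivity, giant component, diameter) can be sketched with standard tools; the point set and linking length below are toys, not the Illustris or Lévy-walk catalogues:

    ```python
    # Friends-of-friends network from a 3-D point set: link every pair of points
    # closer than the linking length, then measure the statistics named above.
    # Points and linking length are toy values, not the simulated catalogues.
    import numpy as np
    import networkx as nx
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    points = rng.uniform(0.0, 100.0, size=(500, 3))   # toy galaxy positions
    linking_length = 8.0

    tree = cKDTree(points)
    G = nx.Graph()
    G.add_nodes_from(range(len(points)))
    G.add_edges_from(tree.query_pairs(r=linking_length))

    giant = max(nx.connected_components(G), key=len)
    print("transitivity     :", nx.transitivity(G))
    print("giant component  :", len(giant) / len(points))
    print("diameter (giant) :", nx.diameter(G.subgraph(giant)))
    ```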

  7. NASA Data for Water Resources Applications

    NASA Technical Reports Server (NTRS)

    Toll, David; Houser, Paul; Arsenault, Kristi; Entin, Jared

    2004-01-01

    Water Management Applications is one of twelve elements in the Earth Science Enterprise National Applications Program. NASA Goddard Space Flight Center is supporting the Applications Program through partnering with other organizations to use NASA project results, such as those from satellite instruments and Earth system models, to enhance the organizations' critical needs. The focus thus far has been: 1) estimating water storage, including snowpack and soil moisture; 2) modeling and predicting water fluxes such as evapotranspiration (ET), precipitation and river runoff; and 3) remote sensing of water quality, including both point source (e.g., turbidity and productivity) and non-point source (e.g., land cover conversion such as forest to agriculture yielding higher nutrient runoff). The objectives of the partnering cover three steps: 1) Evaluation, 2) Verification and Validation, and 3) Benchmark Report. We are working with U.S. federal agencies including the Environmental Protection Agency (EPA), the Bureau of Reclamation (USBR) and the Department of Agriculture (USDA). We are using several of their Decision Support Systems (DSS) tools, including BASINS used by EPA, Riverware and the AWARDS ET ToolBox used by USBR, and SWAT used by USDA and EPA. Regional application sites using NASA data across the US are currently being evaluated for the DSS tools. The NASA data emphasized thus far are from the Land Data Assimilation Systems (LDAS) and MODIS satellite products. We are currently in the first two steps of evaluation and verification and validation.

  8. The Comparison of Point Data Models for the Output of WRF Hydro Model in the IDV

    NASA Astrophysics Data System (ADS)

    Ho, Y.; Weber, J.

    2017-12-01

    WRF Hydro netCDF output files contain streamflow, flow depth, longitude, latitude, altitude and stream order values for each forecast point. However, the data are not CF compliant. The total number of forecast points for the US CONUS is approximately 2.7 million, which is a big challenge for any visualization and analysis tool. The IDV point cloud display shows point data as a set of points colored by parameter. This display is very efficient compared to a standard point type display for rendering a large number of points. One remaining problem is that data I/O can become a bottleneck when dealing with a large collection of point input files. In this presentation, we will experiment with different point data models and their APIs to access the same WRF Hydro model output. The results will help us construct a CF compliant netCDF point data format for the community.
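
    A minimal sketch of reading such forecast-point output with the netCDF4 library; the file name and variable names follow the quantities listed above but are assumptions about the actual file layout:

    ```python
    # Reading WRF-Hydro-style forecast-point output with the netCDF4 library.
    # The variable names follow the quantities listed in the abstract (streamflow,
    # latitude, longitude) but are assumptions about the file layout; the file
    # name is hypothetical.
    from netCDF4 import Dataset

    with Dataset("frxst_pts_out.nc") as ds:
        streamflow = ds.variables["streamflow"][:]
        lat = ds.variables["latitude"][:]
        lon = ds.variables["longitude"][:]
        print(f"{len(streamflow)} forecast points, "
              f"max streamflow {streamflow.max():.1f}")
    ```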

  9. Smoother Scribing of Silicon Wafers

    NASA Technical Reports Server (NTRS)

    Danyluk, S.

    1986-01-01

    Proposed new tool used to scribe silicon wafers into chips more smoothly than before. New scriber produces surface that appears ductile. Scribed groove cuts have relatively smooth walls. Scriber consists of diamond pyramid point on rigid shaft. Ethanol flows through shaft and around point, like ink in ballpoint pen. Ethanol has significantly different effect for scribing silicon than water, used in conventional diamond scribers.

  10. Development of an interpretive simulation tool for the proton radiography technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levy, M. C., E-mail: levymc@stanford.edu; Lawrence Livermore National Laboratory, Livermore, California 94551; Ryutov, D. D.

    2015-03-15

    Proton radiography is a useful diagnostic of high energy density (HED) plasmas under active theoretical and experimental development. In this paper, we describe a new simulation tool that interacts realistic laser-driven point-like proton sources with three dimensional electromagnetic fields of arbitrary strength and structure and synthesizes the associated high resolution proton radiograph. The present tool's numerical approach captures all relevant physics effects, including effects related to the formation of caustics. Electromagnetic fields can be imported from particle-in-cell or hydrodynamic codes in a streamlined fashion, and a library of electromagnetic field "primitives" is also provided. This latter capability allows users to add a primitive, modify the field strength, rotate a primitive, and so on, while quickly generating a high resolution radiograph at each step. In this way, our tool enables the user to deconstruct features in a radiograph and interpret them in connection to specific underlying electromagnetic field elements. We show an example application of the tool in connection to experimental observations of the Weibel instability in counterstreaming plasmas, using ∼10^8 particles generated from a realistic laser-driven point-like proton source, imaging fields which cover volumes of ∼10 mm^3. Insights derived from this application show that the tool can support understanding of HED plasmas.

  11. Application of the ATLAS score for evaluating the severity of Clostridium difficile infection in teaching hospitals in Mexico.

    PubMed

    Hernández-García, Raúl; Garza-González, Elvira; Miller, Mark; Arteaga-Muller, Giovanna; Galván-de los Santos, Alejandra María; Camacho-Ortiz, Adrián

    2015-01-01

    For clinicians, a practical bedside tool for severity assessment and prognosis of patients with Clostridium difficile infection is a highly desirable unmet medical need. Two general teaching hospitals in northeast Mexico. Adult patients with C. difficile infection. Prospective observational study. Patients included had a median of 48 years of age, 54% of male gender and an average of 24.3 days length of hospital stay. Third generation cephalosporins were the antibiotics most commonly used prior to C. difficile infection diagnosis. Patients diagnosed with C. difficile infection had a median ATLAS score of 4 and 56.7% of the subjects had a score between 4 and 7 points. Patients with a score of 8 through 10 points had 100% mortality. The ATLAS score is a potentially useful tool for the routine evaluation of patients at the time of C. difficile infection diagnosis. At 30 days post-diagnosis, patients with a score of ≤3 points had 100% survival while all of those with scores ≥8 died. Patients with scores between 4 and 7 points had a greater probability of colectomy with an overall cure rate of 70.1%. Copyright © 2015 Elsevier Editora Ltda. All rights reserved.

  12. Identification of design features to enhance utilization and acceptance of systems for Internet-based decision support at the point of care.

    PubMed

    Gadd, C S; Baskaran, P; Lobach, D F

    1998-01-01

    Extensive utilization of point-of-care decision support systems will be largely dependent on the development of user interaction capabilities that make them effective clinical tools in patient care settings. This research identified critical design features of point-of-care decision support systems that are preferred by physicians, through a multi-method formative evaluation of an evolving prototype of an Internet-based clinical decision support system. Clinicians used four versions of the system--each highlighting a different functionality. Surveys and qualitative evaluation methodologies assessed clinicians' perceptions regarding system usability and usefulness. Our analyses identified features that improve perceived usability, such as telegraphic representations of guideline-related information, facile navigation, and a forgiving, flexible interface. Users also preferred features that enhance usefulness and motivate use, such as an encounter documentation tool and the availability of physician instruction and patient education materials. In addition to identifying design features that are relevant to efforts to develop clinical systems for point-of-care decision support, this study demonstrates the value of combining quantitative and qualitative methods of formative evaluation with an iterative system development strategy to implement new information technology in complex clinical settings.

  13. A CLIPS-based tool for aircraft pilot-vehicle interface design

    NASA Technical Reports Server (NTRS)

    Fowler, Thomas D.; Rogers, Steven P.

    1991-01-01

    The Pilot-Vehicle Interface of modern aircraft is the cognitive, sensory, and psychomotor link between the pilot, the avionics modules, and all other systems on board the aircraft. To assist pilot-vehicle interface designers, a C Language Integrated Production System (CLIPS) based tool was developed that allows design information to be stored in a table that can be modified by rules representing design knowledge. Developed for the Apple Macintosh, the tool allows users without any CLIPS programming experience to form simple rules using a point and click interface.

  14. Benchmarking of software tools for optical proximity correction

    NASA Astrophysics Data System (ADS)

    Jungmann, Angelika; Thiele, Joerg; Friedrich, Christoph M.; Pforr, Rainer; Maurer, Wilhelm

    1998-06-01

    The point when optical proximity correction (OPC) will become a routine procedure for every design is not far away. For such daily use, the requirements for an OPC tool go far beyond the principal functionality of OPC that was proven by a number of approaches and is well documented in the literature. In this paper we first discuss the requirements for a productive OPC tool. Against these requirements a benchmarking was performed with three different OPC tools available on the market (OPRX from TVT, OPTISSIMO from aiss and PROTEUS from TMA). Each of these tools uses a different approach to perform the correction (rules, simulation or model). To assess the accuracy of the correction, a test chip was fabricated, which contains corrections done by each software tool. The advantages and weaknesses of the several solutions are discussed.

  15. An image guidance system for positioning robotic cochlear implant insertion tools

    NASA Astrophysics Data System (ADS)

    Bruns, Trevor L.; Webster, Robert J.

    2017-03-01

    Cochlear implants must be inserted carefully to avoid damaging the delicate anatomical structures of the inner ear. This has motivated several approaches to improve the safety and efficacy of electrode array insertion by automating the process with specialized robotic or manual insertion tools. When such tools are used, they must be positioned at the entry point to the cochlea and aligned with the desired entry vector. This paper presents an image guidance system capable of accurately positioning a cochlear implant insertion tool. An optical tracking system localizes the insertion tool in physical space while a graphical user interface incorporates this with patient-specific anatomical data to provide error information to the surgeon in real-time. Guided by this interface, novice users successfully aligned the tool with a mean accuracy of 0.31 mm.
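
    The underlying geometry check can be sketched in a few lines, assuming the tracker reports the tool tip position and axis and the plan supplies the desired cochlear entry point and entry vector in a common patient coordinate frame; the names and the example numbers are hypothetical.

```python
import numpy as np

def alignment_error(tool_tip, tool_axis, entry_point, entry_vector):
    """Return (positional error in mm, angular error in degrees).

    tool_tip / entry_point: 3-vectors in the same patient frame (mm).
    tool_axis / entry_vector: direction vectors of the tool and planned trajectory.
    """
    tool_axis = np.asarray(tool_axis, float) / np.linalg.norm(tool_axis)
    entry_vector = np.asarray(entry_vector, float) / np.linalg.norm(entry_vector)
    pos_err = np.linalg.norm(np.asarray(tool_tip, float) - np.asarray(entry_point, float))
    ang_err = np.degrees(np.arccos(np.clip(np.dot(tool_axis, entry_vector), -1.0, 1.0)))
    return pos_err, ang_err

# Hypothetical example: a 0.31 mm tip offset with a slight angular tilt.
print(alignment_error([10.0, 5.0, 2.31], [0, 0, 1], [10.0, 5.0, 2.0], [0.0, 0.02, 1.0]))
```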

  16. Tailoring implementation strategies for evidence-based recommendations using computerised clinical decision support systems: protocol for the development of the GUIDES tools.

    PubMed

    Van de Velde, Stijn; Roshanov, Pavel; Kortteisto, Tiina; Kunnamo, Ilkka; Aertgeerts, Bert; Vandvik, Per Olav; Flottorp, Signe

    2016-03-05

    A computerised clinical decision support system (CCDSS) is a technology that uses patient-specific data to provide relevant medical knowledge at the point of care. It is considered to be an important quality improvement intervention, and the implementation of CCDSS is growing substantially. However, the significant investments do not consistently result in value for money due to content, context, system and implementation issues. The Guideline Implementation with Decision Support (GUIDES) project aims to improve the impact of CCDSS through optimised implementation based on high-quality evidence-based recommendations. To achieve this, we will develop tools that address the factors that determine successful CCDSS implementation. We will develop the GUIDES tools in four steps, using the methods and results of the Tailored Implementation for Chronic Diseases (TICD) project as a starting point: (1) a review of research evidence and frameworks on the determinants of implementing recommendations using CCDSS; (2) a synthesis of a comprehensive framework for the identified determinants; (3) the development of tools for use of the framework and (4) pilot testing the utility of the tools through the development of a tailored CCDSS intervention in Norway, Belgium and Finland. We selected the conservative management of knee osteoarthritis as a prototype condition for the pilot. During the process, the authors will collaborate with an international expert group to provide input and feedback on the tools. This project will provide guidance and tools on methods of identifying implementation determinants and selecting strategies to implement evidence-based recommendations through CCDSS. We will make the GUIDES tools available to CCDSS developers, implementers, researchers, funders, clinicians, managers, educators, and policymakers internationally. The tools and recommendations will be generic, which makes them scalable to a large spectrum of conditions. Ultimately, the better implementation of CCDSS may lead to better-informed decisions and improved care and patient outcomes for a wide range of conditions. PROSPERO, CRD42016033738.

  17. Advanced repair solution of clear defects on HTPSM by using nanomachining tool

    NASA Astrophysics Data System (ADS)

    Lee, Hyemi; Kim, Munsik; Jung, Hoyong; Kim, Sangpyo; Yim, Donggyu

    2015-10-01

    As the mask specifications become tighter for low k1 lithography, more aggressive repair accuracy is required below the sub-20nm tech. node. To meet tight defect specifications, many maskshops select effective repair tools according to defect types. Normally, pattern defects are repaired by the e-beam repair tool and soft defects such as particles are repaired by the nanomachining tool. It is difficult for an e-beam repair tool to remove particle defects because it uses a chemical reaction between gas and electrons, and a nanomachining tool, which uses a physical reaction between a nano-tip and defects, cannot be applied for repairing clear defects. Generally, a film deposition process is widely used for repairing clear defects. However, the deposited film has weak cleaning durability, so it is easily removed by the accumulated cleaning process. Although the deposited film is strongly attached to the MoSiN (or Qz) film, the adhesive strength between the deposited Cr film and the MoSiN (or Qz) film becomes weaker and weaker with the energy accumulated when masks are exposed in a scanner tool, due to the different coefficients of thermal expansion of the materials. Therefore, whenever a re-pellicle process is needed for a mask, all deposited repair points have to be checked to confirm whether the deposited films are damaged or not. If a deposition point is damaged, the repair process has to be performed again, which makes the overall process longer and more complex. In this paper, the basic theory and the principle of recovering clear defects by using a nanomachining tool are introduced, and the evaluated results are reviewed for dense line (L/S) patterns and contact hole (C/H) patterns. Also, the results using nanomachining were compared with those using an e-beam repair tool, including the cleaning durability evaluated by the accumulated cleaning process. Besides, we discuss the phase shift issue and the solution for the image placement error caused by phase error.

  18. The development and pilot testing of a rapid assessment tool to improve local public health system capacity in Australia.

    PubMed

    Bagley, Prue; Lin, Vivian

    2009-11-15

    To operate effectively the public health system requires infrastructure and the capacity to act. Public health's ability to attract funding for infrastructure and capacity development would be enhanced if it was able to demonstrate what level of capacity was required to ensure a high performing system. Australia's public health activities are undertaken within a complex organizational framework that involves three levels of government and a diverse range of other organizations. The question of appropriate levels of infrastructure and capacity is critical at each level. Comparatively little is known about infrastructure and capacity at the local level. In-depth interviews were conducted with senior managers in two Australian states with different frameworks for health administration. They were asked to reflect on the critical components of infrastructure and capacity required at the local level. The interviews were analyzed to identify the major themes. Workshops with public health experts explored this data further. The information generated was used to develop a tool, designed to be used by groups of organizations within discrete geographical locations to assess local public health capacity. Local actors in these two different systems pointed to similar areas for inclusion for the development of an instrument to map public health capacity at the local level. The tool asks respondents to consider resources, programs and the cultural environment within their organization. It also asks about the policy environment - recognizing that the broader environment within which organizations operate impacts on their capacity to act. Pilot testing of the tool pointed to some of the challenges involved in such an exercise, particularly if the tool were to be adopted as policy. This research indicates that it is possible to develop a tool for the systematic assessment of public health capacity at the local level. Piloting the tool revealed some concerns amongst participants, particularly about how the tool would be used. However there was also recognition that the areas covered by the tool were those considered relevant.

  19. Relative importance of modularity and other morphological attributes on different types of lithic point weapons: assessing functional variations.

    PubMed

    González-José, Rolando; Charlin, Judith

    2012-01-01

    The specific use of different prehistoric weapons is mainly determined by their physical properties, which provide a relative advantage or disadvantage in performing a given, particular function. Since these physical properties are integrated to accomplish that function, examining design variables and their pattern of integration or modularity is of interest to estimate the past function of a point. Here we analyze a composite sample of lithic points from southern Patagonia likely formed by arrows, thrown spears and hand-held points to test if they can be viewed as a two-module system formed by the blade and the stem, and to evaluate the degree to which shape, size, asymmetry, blade:stem length ratio, and tip angle explain the observed variance and differentiation among points supposedly aimed to accomplish different functions. To do so we performed a geometric morphometric analysis on 118 lithic points, based on 24 two-dimensional landmarks and semilandmarks placed on the point's contour. Klingenberg's covariational modularity tests were used to evaluate different modularity hypotheses, and a composite PCA including shape, size, asymmetry, blade:stem length ratio, and tip angle was used to estimate the importance of each attribute in explaining variation patterns. Results show that the blade and the stem can be seen as "near decomposable units" in the points integrating the studied sample. However, this modular pattern changes after removing the effects of reduction. Indeed, a resharpened point tends to show a tip/rest of the point modular pattern. The composite PCA analyses evidenced three different patterns of morphometric attributes compatible with arrows, thrown spears, and hand-held tools. Interestingly, when analyzed independently, these groups show differences in their modular organization. Our results indicate that stone tools can be approached as flexible designs, characterized by a composite set of interacting morphometric attributes, and evolving in a modular way.

  20. Relative Importance of Modularity and Other Morphological Attributes on Different Types of Lithic Point Weapons: Assessing Functional Variations

    PubMed Central

    González-José, Rolando; Charlin, Judith

    2012-01-01

    The specific use of different prehistoric weapons is mainly determined by their physical properties, which provide a relative advantage or disadvantage in performing a given, particular function. Since these physical properties are integrated to accomplish that function, examining design variables and their pattern of integration or modularity is of interest to estimate the past function of a point. Here we analyze a composite sample of lithic points from southern Patagonia likely formed by arrows, thrown spears and hand-held points to test if they can be viewed as a two-module system formed by the blade and the stem, and to evaluate the degree to which shape, size, asymmetry, blade:stem length ratio, and tip angle explain the observed variance and differentiation among points supposedly aimed to accomplish different functions. To do so we performed a geometric morphometric analysis on 118 lithic points, based on 24 two-dimensional landmarks and semilandmarks placed on the point's contour. Klingenberg's covariational modularity tests were used to evaluate different modularity hypotheses, and a composite PCA including shape, size, asymmetry, blade:stem length ratio, and tip angle was used to estimate the importance of each attribute in explaining variation patterns. Results show that the blade and the stem can be seen as “near decomposable units” in the points integrating the studied sample. However, this modular pattern changes after removing the effects of reduction. Indeed, a resharpened point tends to show a tip/rest of the point modular pattern. The composite PCA analyses evidenced three different patterns of morphometric attributes compatible with arrows, thrown spears, and hand-held tools. Interestingly, when analyzed independently, these groups show differences in their modular organization. Our results indicate that stone tools can be approached as flexible designs, characterized by a composite set of interacting morphometric attributes, and evolving in a modular way. PMID:23094104

  1. Development of Waypoint Planning Tool in Response to NASA Field Campaign Challenges

    NASA Technical Reports Server (NTRS)

    He, Matt; Hardin, Danny; Conover, Helen; Graves, Sara; Meyer, Paul; Blakeslee, Richard; Goodman, Michael

    2012-01-01

    Airborne real time observations are a major component of NASA's Earth Science research and satellite ground validation studies. For mission scientists, planning a research aircraft mission within the context of meeting the science objectives is a complex task because it requires real time situational awareness of the weather conditions that affect the aircraft track. Multiple aircraft are often involved in NASA field campaigns. The coordination of the aircraft with satellite overpasses, other airplanes and the constantly evolving, dynamic weather conditions often determines the success of the campaign. A flight planning tool is needed to provide situational awareness information to the mission scientists, and help them plan and modify the flight tracks. Scientists at the University of Alabama-Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool, an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints) with point-and-click mouse capabilities on a digital map filled with real time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analysis during and after each campaign helped identify both issues and new requirements, and initiated the next wave of development. Currently the Waypoint Planning Tool has gone through three rounds of development and analysis processes. The development of this waypoint tool is directly affected by advances in GIS/mapping technologies. From the standalone Google Earth application and simple KML functionalities, to Google Earth Plugin and Java Web Start/Applet on web platform, and to the rising open source GIS tools with new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. The newly innovated, cross-platform, modularly designed, JavaScript-controlled Waypoint Tool is planned to be integrated with NASA Airborne Science Mission Tool Suite. Adapting new technologies for the Waypoint Planning Tool ensures its success in helping scientists reach their mission objectives. This presentation will discuss the development processes of the Waypoint Planning Tool in responding to field campaign challenges, identify new information technologies, and describe the capabilities and features of the Waypoint Planning Tool with the real time aspect, interactive nature, and the resultant benefits to the airborne science community.

  2. Flow in the Proximity of the Pin-Tool in Friction Stir Welding and Its Relation to Weld Homogeneity

    NASA Technical Reports Server (NTRS)

    Nunes, Arthur C., Jr.

    2000-01-01

    In the Friction Stir Welding (FSW) process a rotating pin inserted into a seam literally stirs the metal from each side of the seam together. It is proposed that the flow in the vicinity of the pin-tool comprises a primary rapid shear over a cylindrical envelope covering the pin-tool and a relatively slow secondary flow taking the form of a ring vortex about the tool circumference. This model is consistent with a plastic characterization of metal flow, where discontinuities in shear flow are allowed but not viscous effects. It is consistent with experiments employing several different kinds of tracer: atomic markers, shot, and wire. If a rotating disc with angular velocity omega is superposed on a translating continuum with linear velocity v, the trajectories of tracer points become circular arcs centered upon a point displaced laterally a distance v/omega from the center of rotation of the disc in the direction of the advancing side of the disc. In the present model a stream of metal approaching the tool (taken as the coordinate system of observation) is sheared at the slip surface, rapidly rotated around the tool, sheared again on the opposite side of the tool, and deposited in the wake of the tool. Local shearing rates are high, comparable to metal cutting in this model. The flow patterns in the vicinity of the pin-tool determine the level of homogenization and dispersal of contaminants that occurs in the FSW process. The approaching metal streams enfold one another as they are rotated around the tool. Neglecting mixing, they return to the same lateral position in the wake of the tool, preserving lateral tracer positions as if the metal had flowed past the tool like an extrusion instead of being rotated around it. (The seam is, however, obliterated.) The metal stream of thickness approximately that of the tool diameter D is wiped past the tool at elevated temperatures, drawn out to a thickness of v/(2 omega) in the wiping zone. Mixing distances in the wiping zone are multiplied in the unfolded metal. Inhomogeneities on a smaller scale than the mixing length are obliterated, but structure on a larger scale may be transmitted to the wake of a FSW weld.
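
    The kinematic statement above (a rigid rotation superposed on a uniform translation gives circular trajectories about a center offset by v/omega) is easy to check numerically. The sketch below integrates a tracer point through that combined velocity field with illustrative, assumed values of v and omega and verifies that the path radius about the offset center stays essentially constant.

```python
# Sketch: superpose a rigid rotation (omega) about the tool axis on a uniform
# translation (v) and integrate a tracer point; its path is a circular arc
# centered a distance v/omega from the rotation axis. Values are illustrative.
import numpy as np

v = 2.0e-3                        # traverse speed, m/s (assumed)
omega = 2 * np.pi * 400 / 60.0    # 400 rpm tool rotation, rad/s (assumed)

def velocity(p):
    x, y = p
    # uniform translation along +x plus rotation about the origin (z-axis)
    return np.array([v - omega * y, omega * x])

p = np.array([0.003, 0.0])        # tracer starts 3 mm from the rotation axis
dt = 1e-5
path = [p.copy()]
for _ in range(20000):            # forward-Euler integration of the tracer
    p = p + velocity(p) * dt
    path.append(p.copy())
path = np.asarray(path)

center = np.array([0.0, v / omega])   # instantaneous center of the combined motion
radii = np.linalg.norm(path - center, axis=1)
print("center offset v/omega = %.3e m" % (v / omega))
print("radius spread along the path: %.2e m" % (radii.max() - radii.min()))
```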

  3. The Way Point Planning Tool: Real Time Flight Planning for Airborne Science

    NASA Technical Reports Server (NTRS)

    He, Yubin; Blakeslee, Richard; Goodman, Michael; Hall, John

    2012-01-01

    Airborne real time observations are a major component of NASA's Earth Science research and satellite ground validation studies. For mission scientists, planning a research aircraft mission within the context of meeting the science objectives is a complex task because it requires real time situational awareness of the weather conditions that affect the aircraft track. Multiple aircraft are often involved in NASA field campaigns; the coordination of the aircraft with satellite overpasses, other airplanes and the constantly evolving dynamic weather conditions often determines the success of the campaign. A flight planning tool is needed to provide situational awareness information to the mission scientists and help them plan and modify the flight tracks successfully. Scientists at the University of Alabama Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool (WPT), an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints), with point-and-click mouse capabilities on a digital map filled with real time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analyses during and after each campaign helped identify both issues and new requirements, initiating the next wave of development. Currently the Waypoint Planning Tool has gone through three rounds of development and analysis processes. The development of this waypoint tool is directly affected by advances in GIS/mapping technologies. From the standalone Google Earth application and simple KML functionalities to the Google Earth Plugin and Java Web Start/Applet on the web platform, as well as to the rising open source GIS tools with new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. The newly innovated, cross-platform, modularly designed, JavaScript-controlled Waypoint tool is planned to be integrated with the NASA Airborne Science Mission Tool Suite. Adapting new technologies for the Waypoint Planning Tool ensures its success in helping scientists reach their mission objectives. This presentation will discuss the development process of the Waypoint Planning Tool in responding to field campaign challenges, identifying new information technologies, and describing the capabilities and features of the Waypoint Planning Tool with the real time aspect, interactive nature, and the resultant benefits to the airborne science community.

  4. Wear-Induced Changes in FSW Tool Pin Profile: Effect of Process Parameters

    NASA Astrophysics Data System (ADS)

    Sahlot, Pankaj; Jha, Kaushal; Dey, G. K.; Arora, Amit

    2018-06-01

    Friction stir welding (FSW) of high melting point metallic (HMPM) materials has limited application due to tool wear and relatively short tool life. Tool wear changes the profile of the tool pin and adversely affects weld properties. A quantitative understanding of tool wear and tool pin profile is crucial to develop the process for joining of HMPM materials. Here we present a quantitative wear study of H13 steel tool pin profile for FSW of CuCrZr alloy. The tool pin profile is analyzed at multiple traverse distances for welding with various tool rotational and traverse speeds. The results indicate that measured wear depth is small near the pin root and significantly increases towards the tip. Near the pin tip, wear depth increases with increase in tool rotational speed. However, change in wear depth near the pin root is minimal. Wear depth also increases with decrease in tool traverse speeds. Tool pin wear from the bottom results in pin length reduction, which is greater for higher tool rotational speeds, and longer traverse distances. The pin profile changes due to wear and result in root defect for long traverse distance. This quantitative understanding of tool wear would be helpful to estimate tool wear, optimize process parameters, and tool pin shape during FSW of HMPM materials.

  5. Innovative Digital Tools and Surveillance Systems for the Timely Detection of Adverse Events at the Point of Care: A Proof-of-Concept Study.

    PubMed

    Hoppe, Christian; Obermeier, Patrick; Muehlhans, Susann; Alchikh, Maren; Seeber, Lea; Tief, Franziska; Karsch, Katharina; Chen, Xi; Boettcher, Sindy; Diedrich, Sabine; Conrad, Tim; Kisler, Bron; Rath, Barbara

    2016-10-01

    Regulatory authorities often receive poorly structured safety reports requiring considerable effort to investigate potential adverse events post hoc. Automated question-and-answer systems may help to improve the overall quality of safety information transmitted to pharmacovigilance agencies. This paper explores the use of the VACC-Tool (ViVI Automated Case Classification Tool) 2.0, a mobile application enabling physicians to classify clinical cases according to 14 pre-defined case definitions for neuroinflammatory adverse events (NIAE) and in full compliance with data standards issued by the Clinical Data Interchange Standards Consortium. The validation of the VACC-Tool 2.0 (beta-version) was conducted in the context of a unique quality management program for children with suspected NIAE in collaboration with the Robert Koch Institute in Berlin, Germany. The VACC-Tool was used for instant case classification and for longitudinal follow-up throughout the course of hospitalization. Results were compared to International Classification of Diseases , Tenth Revision (ICD-10) codes assigned in the emergency department (ED). From 07/2013 to 10/2014, a total of 34,368 patients were seen in the ED, and 5243 patients were hospitalized; 243 of these were admitted for suspected NIAE (mean age: 8.5 years), thus participating in the quality management program. Using the VACC-Tool in the ED, 209 cases were classified successfully, 69 % of which had been missed or miscoded in the ED reports. Longitudinal follow-up with the VACC-Tool identified additional NIAE. Mobile applications are taking data standards to the point of care, enabling clinicians to ascertain potential adverse events in the ED setting and during inpatient follow-up. Compliance with Clinical Data Interchange Standards Consortium (CDISC) data standards facilitates data interoperability according to regulatory requirements.

  6. Part 2: pressure ulcer assessment: implementation and revision of CALCULATE.

    PubMed

    Richardson, Annette; Straughan, Christine

    2015-11-01

    Critically ill patients are a vulnerable group at very high risk of developing pressure ulcers, and the incidence varies within critical care. A number of strategies were used to implement the pressure ulcer assessment tool CALCULATE across four adult critical care units. Strategies included nursing leadership, the provision of definitions for each risk factor, information laid out on posters at each patient's bedside, changes to pre-printed nursing documentation and a 30-min focused training package. Two local audits were conducted to measure the number and types of risk factors occurring in patients with pressure ulcers, and to assess the frequency of assessments and gain feedback on the usability of the tool in practice. Critical care-acquired pressure ulcer incidence was 3.4%. The two most commonly occurring risk factors were impaired circulation (82%) and mechanical ventilation (75%). Patients had a mean score of 4, and 65% had 4 or more reported risk factors. Feedback on the usability of the tool was mainly positive. The CALCULATE tool was relatively straightforward to implement, likely owing to its design and the various change strategies used to introduce the new approach. The seven-point tool was revised to an eight-point score based on nurses' clinical feedback. Research is required to further enhance and develop pressure ulcer assessment. Meanwhile CALCULATE offers an easy-to-use and appropriate tool to assist in the identification of patients at an elevated risk of pressure ulcer damage. Careful choice of change management strategies is needed when implementing a new assessment tool. CALCULATE should be considered for use in critical care for pressure ulcer assessment, but used alongside nurses' clinical judgement and observations of skin. © 2015 British Association of Critical Care Nurses.

  7. ChemEd X Data: Exposing Students to Open Scientific Data for Higher-Order Thinking and Self-Regulated Learning

    ERIC Educational Resources Information Center

    Eklund, Brandon; Prat-Resina, Xavier

    2014-01-01

    ChemEd X Data is an open web tool that collects and curates physical and chemical data of hundreds of substances. This tool allows students to navigate, select, and graphically represent data such as boiling and melting points, enthalpies of combustion, and heat capacities for hundreds of molecules. By doing so, students can independently identify…

  8. The Measurement of Students' Achievement in Teaching Primary School Fifth Year Mathematics Classes

    ERIC Educational Resources Information Center

    Doganay, Ahmet; Bal, Ayten Pinar

    2010-01-01

    The aim of this study was to investigate students' and teachers' point of views about preparing measurement tools used in mathematics classes, the level of learning that these tools are intended to measure, how often they are used and how they are scored in terms of assessing 5th grade primary school mathematic courses. The population of the study…

  9. Assessing the Efficacy of the Measure of Understanding of Macroevolution as a Valid Tool for Undergraduate Non-Science Majors

    ERIC Educational Resources Information Center

    Romine, William Lee; Walter, Emily Marie

    2014-01-01

    Efficacy of the Measure of Understanding of Macroevolution (MUM) as a measurement tool has been a point of contention among scholars needing a valid measure for knowledge of macroevolution. We explored the structure and construct validity of the MUM using Rasch methodologies in the context of a general education biology course designed with an…

  10. Material Teaching Aids: Enhancement Tool for Teaching Essay Writing in Secondary Schools

    ERIC Educational Resources Information Center

    Fidelia, Okonkwo Adaobi

    2015-01-01

    The purpose of this study is to investigate the use of material teaching aids as enhancement tool for teaching essay writing in secondary schools in Ebonyi State. A 4-point Likert-scale questionnaire was used as the instrument. A trial test was conducted and tested for reliability and a value of 0.75 was obtained from the test. The instrument was…

  11. NASA Research to Support the Airlines

    NASA Technical Reports Server (NTRS)

    Mogford, Richard

    2016-01-01

    This is a PowerPoint presentation that was a review of NASA projects that support airline operations. It covered NASA tasks that have provided new tools to the airline operations center and flight deck including the Flight Awareness Collaboration Tool, Dynamic Weather Routes, Traffic Aware Strategic Aircrew Requests, and Airplane State Awareness and Prediction Technologies. This material is very similar to other previously approved presentations with the same title.

  12. Semantically Grounded Briefings

    DTIC Science & Technology

    2005-12-01

    cascading interface, mirroring the class inheritance of the ontologies. Clicking on one of these tools, like PowerPoint’s native autoshape tools...connections are their graphic templates. This determines the appearance of an instance of that concept. Any of PowerPoint’s native autoshapes, formatted...which can be any PowerPoint autoshape, group shape, or image • Identification of a modulated component of C’s graphic template. If C’s graphic

  13. Development of Gis Tool for the Solution of Minimum Spanning Tree Problem using Prim's Algorithm

    NASA Astrophysics Data System (ADS)

    Dutta, S.; Patra, D.; Shankar, H.; Alok Verma, P.

    2014-11-01

    A minimum spanning tree (MST) of a connected, undirected and weighted network is a tree of that network consisting of all its nodes such that the sum of the weights of its edges is minimum among all possible spanning trees of the same network. In this study, we have developed a new GIS tool using the most commonly known rudimentary algorithm, Prim's algorithm, to construct the minimum spanning tree of a connected, undirected and weighted road network. This algorithm is based on the weight (adjacency) matrix of a weighted network and helps to solve complex network MST problems easily, efficiently and effectively. The selection of an appropriate algorithm is essential, otherwise it will be very hard to obtain an optimal result. In the case of a road transportation network, it is essential to find optimal results by considering all the necessary points based on a cost factor (time or distance). This paper addresses the minimum spanning tree (MST) problem of a road network by finding its minimum span while considering all the important network junction points. GIS technology is usually used to solve network-related problems such as the optimal path problem, the travelling salesman problem, vehicle routing problems, location-allocation problems, etc. Therefore, in this study we have developed a customized GIS tool, using a Python script in ArcGIS software, for solving the MST problem for the road transportation network of Dehradun city, with distance and time as the impedance (cost) factors. It has a number of advantages: users do not need deep knowledge of the subject, as the tool is user-friendly and provides access to varied information adapted to their needs. This GIS tool for MST can be applied to a nationwide plan called Prime Minister Gram Sadak Yojana in India to provide optimal all-weather road connectivity to unconnected villages (points). This tool is also useful for constructing highways or railways spanning several cities optimally, or for connecting all cities with minimum total road length.
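
    For readers unfamiliar with the algorithm underlying the tool, a compact Prim's algorithm over a weight (adjacency) matrix is sketched below. The road-network data structures and the ArcGIS/arcpy integration of the actual tool are not shown, and the example graph is made up.

```python
import heapq

def prim_mst(adj):
    """Prim's algorithm on a dense weight (adjacency) matrix.

    adj[i][j] is the edge weight (cost, e.g. travel time or distance) between
    nodes i and j, or None if no road connects them.
    Returns (total weight, list of MST edges as (parent, node, weight)).
    """
    n = len(adj)
    in_tree = [False] * n
    edges, total = [], 0.0
    heap = [(0.0, 0, -1)]                 # (edge weight, node, parent)
    while heap:
        w, u, parent = heapq.heappop(heap)
        if in_tree[u]:
            continue                      # already connected by a cheaper edge
        in_tree[u] = True
        if parent >= 0:
            edges.append((parent, u, w))
            total += w
        for v in range(n):
            if adj[u][v] is not None and not in_tree[v]:
                heapq.heappush(heap, (adj[u][v], v, u))
    return total, edges

# Tiny illustrative network of 4 junctions with travel times as weights.
N = None
network = [[N, 4, 1, N],
           [4, N, 2, 5],
           [1, 2, N, 8],
           [N, 5, 8, N]]
print(prim_mst(network))   # -> (8.0, [(0, 2, 1), (2, 1, 2), (1, 3, 5)])
```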

  14. Development of PAOT tool kit for work improvements in clinical nursing.

    PubMed

    Jung, Moon-Hee

    2014-01-01

    The aim of this study was to develop an action checklist for educational training of clinical nurses. The study used qualitative and quantitative methods. Questionnaire items were extracted through in-depth interviews and a questionnaire survey. PASW version 19 and AMOS version 19 were used for data analyses. Reliability and validity were tested with both exploratory and confirmative factor analysis. The levels of the indicators related to goodness-of-fit were acceptable. Thus, a model kit of work improvements in clinical nursing was developed. It comprises 5 domains (16 action points): health promotion (5 action points), work management (3 action points), ergonomic work methods (3 action points), managerial policies and mutual support among staff members (3 action points), and welfare in the work area (2 action points).

  15. High-fidelity modeling and impact footprint prediction for vehicle breakup analysis

    NASA Astrophysics Data System (ADS)

    Ling, Lisa

    For decades, vehicle breakup analysis had been performed for space missions that used nuclear heater or power units in order to assess aerospace nuclear safety for potential launch failures leading to inadvertent atmospheric reentry. Such pre-launch risk analysis is imperative to assess possible environmental impacts, obtain launch approval, and for launch contingency planning. In order to accurately perform a vehicle breakup analysis, the analysis tool should include a trajectory propagation algorithm coupled with thermal and structural analyses and influences. Since such a software tool was not available commercially or in the public domain, a basic analysis tool was developed by Dr. Angus McRonald prior to this study. This legacy software consisted of low-fidelity modeling and had the capability to predict vehicle breakup, but did not predict the surface impact point of the nuclear component. Thus the main thrust of this study was to develop and verify the additional dynamics modeling and capabilities for the analysis tool with the objectives to (1) have the capability to predict impact point and footprint, (2) increase the fidelity in the prediction of vehicle breakup, and (3) reduce the effort and time required to complete an analysis. The new functions developed for predicting the impact point and footprint included 3-degrees-of-freedom trajectory propagation, the generation of non-arbitrary entry conditions, sensitivity analysis, and the calculation of impact footprint. The functions to increase the fidelity in the prediction of vehicle breakup included a panel code to calculate the hypersonic aerodynamic coefficients for an arbitrary-shaped body and the modeling of local winds. The function to reduce the effort and time required to complete an analysis included the calculation of node failure criteria. The derivation and development of these new functions are presented in this dissertation, and examples are given to demonstrate the new capabilities and the improvements made, with comparisons between the results obtained from the upgraded analysis tool and the legacy software wherever applicable.

  16. Predictors of Recurrent Falls in People with Parkinson's Disease and Proposal for a Predictive Tool.

    PubMed

    Almeida, Lorena R S; Valenca, Guilherme T; Negreiros, Nádja N; Pinto, Elen B; Oliveira-Filho, Jamary

    2017-01-01

    Falls are a debilitating problem for people with Parkinson's disease (PD). To compare clinical and functional characteristics of non-fallers, single and recurrent fallers (≥2 falls); to determine predictors of time to second fall; and to develop a predictive tool for identifying people with PD at different categories of falls risk. Participants (n = 229) were assessed by disease-specific, self-report and balance measures and followed up for 12 months. Area under the receiver operating characteristic curves (AUC), Kaplan-Meier curves and log-rank test were performed. Selected predictors with p < 0.10 in univariate analysis were chosen to be entered into the Cox regression model. Eighty-four (37%) participants had ≥2 falls during the follow-up. Recurrent fallers significantly differed from single fallers. The final Cox model included history of ≥2 falls in the past year (Hazard Ratio [HR] = 3.94; 95% confidence interval [CI] 2.26-6.86), motor fluctuations (HR = 1.91; 95% CI 1.12-3.26), UPDRS activities of daily living (ADL) (HR = 1.10 per 1 point increase; 95% CI 1.06-1.14) and levodopa equivalent dose (LED) (HR = 1.09 per 100 mg increase; 95% CI 1.02-1.16). A 3-predictor tool included history of ≥2 falls in the past year, motor fluctuations and UPDRS ADL >12 points (AUC = 0.84; 95% CI 0.78-0.90). By adding LED >700 mg/day and Berg balance scale ≤49 points, a 5-predictor tool was developed (AUC = 0.86; 95% CI 0.81-0.92). Two predictive tools with moderate-to-high accuracy may identify people with PD at low, medium and high risk of falling recurrently within the next year. However, future studies to address external validation are required.
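
    The abstract names the three predictors of the simpler tool (history of ≥2 falls in the past year, motor fluctuations, UPDRS ADL > 12 points) but not its exact scoring. The sketch below simply counts predictors and maps the count to a risk band; that mapping is an assumption made only for illustration and is not the published instrument.

```python
def recurrent_fall_risk(falls_last_year: int, motor_fluctuations: bool,
                        updrs_adl: int) -> str:
    """Illustrative 3-predictor screen built from the variables in the abstract.

    Predictors: >=2 falls in the past year, presence of motor fluctuations,
    and UPDRS activities-of-daily-living score > 12. Counting one point per
    predictor and the low/medium/high mapping are assumptions for illustration.
    """
    points = sum([falls_last_year >= 2, motor_fluctuations, updrs_adl > 12])
    return {0: "low", 1: "medium", 2: "high", 3: "high"}[points]

print(recurrent_fall_risk(falls_last_year=3, motor_fluctuations=True, updrs_adl=15))
```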

  17. 41 CFR 50-204.5 - Machine guarding.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... usually require point of operation guarding: Guillotine cutters. Shears. Alligator shears. Power presses. Milling machines. Power saws. Jointers. Portable power tools. Forming rolls and calenders. (d) Revolving...

  18. Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis

    NASA Technical Reports Server (NTRS)

    Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.

    2017-01-01

    This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
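
    The idea of propagating round-off estimates through an expression can be conveyed with a much cruder, first-order, concrete-valued version: each operation contributes at most a relative error of the unit roundoff u = 2^-53 for IEEE double precision, and the bound is carried alongside the value. This toy sketch is not PRECiSA and has none of its symbolic machinery or formal guarantees.

```python
# Toy forward propagation of first-order round-off error bounds for doubles.
# Each operation result r satisfies fl(r) = r * (1 + d) with |d| <= U.
U = 2.0 ** -53   # unit roundoff for IEEE 754 binary64

class Bounded:
    """A value together with an upper bound on its accumulated absolute error."""
    def __init__(self, value, err=0.0):
        self.value, self.err = value, err

    def __add__(self, other):
        v = self.value + other.value
        # propagated input errors plus the rounding of the new result
        return Bounded(v, self.err + other.err + abs(v) * U)

    def __mul__(self, other):
        v = self.value * other.value
        prop = abs(self.value) * other.err + abs(other.value) * self.err
        return Bounded(v, prop + abs(v) * U)

# Bound the error of x*y + z, treating the inputs themselves as exact.
x, y, z = Bounded(0.1), Bounded(3.0), Bounded(-0.29)
r = x * y + z
print(r.value, "error bound <=", r.err)
```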

  19. Hough transform as a tool support building roof detection. (Polish Title: Transformata Hough'a jako narzędzie wspomagające wykrywanie dachów budynków)

    NASA Astrophysics Data System (ADS)

    Borowiec, N.

    2013-12-01

    Gathering information about the roof shapes of buildings is still a current issue. One of the many sources from which we can obtain information about buildings is airborne laser scanning. However, automatically extracting information about building roofs from a cloud of points is still a complex task. The task can be performed with the help of additional information from other sources, or based only on Lidar data. This article describes how to detect building roofs from a point cloud alone. Defining the shape of the roof is carried out in three steps. The first step is to find the location of the building, the second is the precise definition of the edges, while the third is the identification of the roof planes. The first step is based on grid analysis, and the next two steps are based on the Hough transformation. The Hough transformation is a method of detecting collinear points, so it is a perfect match for determining the lines describing a roof. To properly determine the shape of the roof, the edges alone are not enough; it is also necessary to identify the roof planes. Thus, in this study the Hough transform also served as a tool for detection of roof planes. The only difference is that the tool used in this case is three-dimensional.
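
    For readers unfamiliar with the method, a minimal 2D Hough transform for collinear-point detection is sketched below: each point votes in a (rho, theta) accumulator, and collinear points pile their votes into the same cell. The 3D variant used for roof planes adds a second angle but follows the same voting idea. The point set and discretization here are illustrative.

```python
import numpy as np

def hough_lines(points, n_theta=180, n_rho=200):
    """Vote points (x, y) into a (rho, theta) accumulator and return the peak line.

    Each point votes for every line rho = x*cos(theta) + y*sin(theta) passing
    through it; collinear points accumulate votes in (roughly) the same cell.
    """
    pts = np.asarray(points, float)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rho_max = np.linalg.norm(pts, axis=1).max()
    rhos = np.linspace(-rho_max, rho_max, n_rho)
    acc = np.zeros((n_rho, n_theta), dtype=int)
    for x, y in pts:
        r = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.round((r + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        acc[idx, np.arange(n_theta)] += 1
    i, j = np.unravel_index(acc.argmax(), acc.shape)
    return rhos[i], thetas[j], acc[i, j]

# Points roughly along a roof edge y = 0.5*x + 1, plus a few off-line points.
edge = [(x, 0.5 * x + 1) for x in range(0, 20)]
noise = [(3, 15), (12, 2), (7, 9)]
print(hough_lines(edge + noise))   # peak (rho, theta) corresponds to the edge line
```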

  20. Evaluation of the Predictive Index for Osteoporosis as a Clinical Tool to Identify the Risk of Osteoporosis in Korean Men by Using the Korea National Health and Nutrition Examination Survey Data.

    PubMed

    Moon, Ji Hyun; Kim, Lee Oh; Kim, Hyeon Ju; Kong, Mi Hee

    2016-11-01

    We previously proposed the Predictive Index for Osteoporosis as a new index to identify men who require bone mineral density measurement. However, the previous study had limitations such as a single-center design and small sample size. Here, we evaluated the usefulness of the Predictive Index for Osteoporosis using the nationally representative data of the Korea National Health and Nutrition Examination Survey. Participants underwent bone mineral density measurements via dual energy X-ray absorptiometry, and the Predictive Index for Osteoporosis and Osteoporosis Self-Assessment Tool for Asians were assessed. Receiver operating characteristic analysis was used to obtain optimal cut-off points for the Predictive Index for Osteoporosis and Osteoporosis Self-Assessment Tool for Asians, and the predictability of osteoporosis for the 2 indices was compared. Both indices were useful clinical tools for identifying osteoporosis risk in Korean men. The optimal cut-off value for the Predictive Index for Osteoporosis was 1.07 (sensitivity, 67.6%; specificity, 72.7%; area under the curve, 0.743). When using a cut-off point of 0.5 for the Osteoporosis Self-Assessment Tool for Asians, the sensitivity and specificity were 71.9% and 64.0%, respectively, and the area under the curve was 0.737. The Predictive Index for Osteoporosis was as useful as the Osteoporosis Self-Assessment Tool for Asians as a screening index to identify candidates for dual energy X-ray absorptiometry among men aged 50-69 years.
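
    For orientation, the OSTA index used as the comparator here is commonly computed as 0.2 × (body weight in kg − age in years); that formula is the widely cited definition rather than something stated in this abstract, so treat the sketch below as an assumption-laden illustration. The 0.5 default mirrors the cut-off point evaluated in this study.

```python
def osta_index(weight_kg: float, age_years: float) -> float:
    """Osteoporosis Self-Assessment Tool for Asians (OSTA).

    Commonly computed as 0.2 * (weight in kg - age in years); this formula is
    the widely cited definition and is not given in the abstract itself.
    """
    return 0.2 * (weight_kg - age_years)

def flag_for_dxa(weight_kg: float, age_years: float, cutoff: float = 0.5) -> bool:
    """Flag a man for DXA referral when his OSTA falls at or below the cut-off.

    The 0.5 default reflects the cut-off point evaluated in this study; other
    populations and studies use different thresholds.
    """
    return osta_index(weight_kg, age_years) <= cutoff

print(osta_index(62, 66), flag_for_dxa(62, 66))   # e.g. -0.8 -> flagged
```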

  1. Acceptance and practicability of a visual communication tool in smoking cessation counselling: a randomised controlled trial.

    PubMed

    Neuner-Jehle, Stefan; Knecht, Marianne I; Stey-Steurer, Claudia; Senn, Oliver

    2013-12-01

    Smoking cessation advice is important for reducing the worldwide burden of disease resulting from tobacco smoking. Appropriate risk communication formats improve the success of counselling interventions in primary care. To test the feasibility and acceptance of a smoking cessation counselling tool with different cardiovascular risk communication formats including graphs, in comparison with the International Primary Care Respiratory Group (IPCRG) 'quit smoking assistance' tool. GPs were randomised into an intervention group (using our communication tool in addition to the IPCRG sheet) and a control group (using the IPCRG sheet only). We asked participants for socioeconomic data, smoking patterns, understanding of information, motivation, acceptance and feasibility, and measured the duration and frequency of counselling sessions. Twenty-five GPs performed 2.8 counselling sessions per month in the intervention group and 1.7 in the control group (p=0.3) with 114 patients. The median duration of a session was 10 mins (control group 11 mins, p=0.09 for difference). Median patients' motivation for smoking cessation was 7 on a 10-point visual analogue scale with no significant difference before and after the intervention (p=0.2) or between groups (p=0.73 before and p=0.15 after the intervention). Median patients' ratings of motivation, selfconfidence, understanding of information, and satisfaction with the counselling were 3-5 on a 5-point Likert scale, similar to GPs' ratings of acceptance and feasibility, with no significant difference between groups. Among Swiss GPs and patients, both our innovative communication tool and the IPCRG tool were well accepted and both merit further dissemination and application in research.

  2. Methods to Predict Stresses in Cutting Inserts Brazed Using Iron-Carbon Brazing Alloy

    NASA Astrophysics Data System (ADS)

    Konovodov, V. V.; Valentov, A. V.; Retuynskiy, O. Yu; Esekuev, Sh B.

    2016-04-01

    This work describes a method for predicting residual and operating stresses in a flat-form tool insert made of tungsten free carbides brazed using iron-carbon alloy. According to the studies’ results it is concluded that the recommendations relating to the limitation of a melting point of tool brazing alloys (950-1100°C according to different data) are connected with a negative impact on tools as a composite made of dissimilar materials rather than on hard alloys as a tool material. Due to the cooling process stresses inevitably occur in the brazed joint of dissimilar materials, and these stresses increase with the higher solidification temperature of the brazing alloy.

  3. U.S. Geological Survey ArcMap Sediment Classification tool

    USGS Publications Warehouse

    O'Malley, John

    2007-01-01

    The U.S. Geological Survey (USGS) ArcMap Sediment Classification tool is a custom toolbar that extends the Environmental Systems Research Institute, Inc. (ESRI) ArcGIS 9.2 Desktop application to aid in the analysis of seabed sediment classification. The tool uses as input either a point data layer with field attributes containing percentage of gravel, sand, silt, and clay or four raster data layers representing a percentage of sediment (0-100%) for the various sediment grain size analyses: sand, gravel, silt and clay. This tool is designed to analyze the percent of sediment at a given location and classify the sediments according to either the Folk (1954, 1974) or the Shepard (1954), as modified by Schlee (1973), classification schemes. The sediment analysis tool is based upon the USGS SEDCLASS program (Poppe et al., 2004).
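
    The general flow of such a classification (normalize the four percentages, then look up a class from the grain-size proportions) can be sketched as below. The class boundaries used here are deliberately simplified placeholders and are not the Folk or Shepard/Schlee ternary boundaries implemented in SEDCLASS.

```python
def classify_sediment(gravel, sand, silt, clay):
    """Very simplified grain-size classifier (illustrative only).

    Real Folk (1954, 1974) and Shepard (1954, as modified by Schlee, 1973)
    schemes use ternary-diagram boundaries; the thresholds below are
    placeholders that only mimic the overall structure of such a lookup.
    """
    total = gravel + sand + silt + clay
    if total <= 0:
        raise ValueError("percentages must sum to a positive value")
    gravel, sand, silt, clay = (100.0 * v / total for v in (gravel, sand, silt, clay))
    if gravel >= 30:                       # placeholder gravel cut-off
        return "gravelly sediment"
    fractions = {"sand": sand, "silt": silt, "clay": clay}
    dominant = max(fractions, key=fractions.get)
    if fractions[dominant] >= 75:          # placeholder "pure class" cut-off
        return dominant
    second = sorted(fractions, key=fractions.get)[-2]
    adjective = {"sand": "sandy", "silt": "silty", "clay": "clayey"}[second]
    return f"{adjective} {dominant}"       # e.g. "silty sand"

print(classify_sediment(gravel=2, sand=70, silt=20, clay=8))
```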

  4. COMETBOARDS Can Optimize the Performance of a Wave-Rotor-Topped Gas Turbine Engine

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.

    1997-01-01

    A wave rotor, which acts as a high-technology topping spool in gas turbine engines, can increase the effective pressure ratio as well as the turbine inlet temperature in such engines. The wave rotor topping, in other words, may significantly enhance engine performance by increasing shaft horse power while reducing specific fuel consumption. This performance enhancement requires optimum selection of the wave rotor's adjustable parameters for speed, surge margin, and temperature constraints specified on different engine components. To examine the benefit of the wave rotor concept in engine design, researchers soft coupled NASA Lewis Research Center's multidisciplinary optimization tool COMETBOARDS and the NASA Engine Performance Program (NEPP) analyzer. The COMETBOARDS-NEPP combined design tool has been successfully used to optimize wave-rotor-topped engines. For illustration, the design of a subsonic gas turbine wave-rotor-enhanced engine with four ports for 47 mission points (which are specified by Mach number, altitude, and power-setting combinations) is considered. The engine performance analysis, constraints, and objective formulations were carried out through NEPP, and COMETBOARDS was used for the design optimization. So that the benefits that accrue from wave rotor enhancement could be examined, most baseline variables and constraints were declared to be passive, whereas important parameters directly associated with the wave rotor were considered to be active for the design optimization. The engine thrust was considered as the merit function. The wave rotor engine design, which became a sequence of 47 optimization subproblems, was solved successfully by using a cascade strategy available in COMETBOARDS. The graph depicts the optimum COMETBOARDS solutions for the 47 mission points, which were normalized with respect to standard results. As shown, the combined tool produced higher thrust for all mission points than did the other solution, with maximum benefits around mission points 11, 25, and 31. Such improvements can become critical, especially when engines are sized for these specific mission points.

  5. Determining the cut-off point of osteoporosis based on the osteoporosis self-assessment tool, body mass index and weight in Taiwanese young adult women.

    PubMed

    Chang, Shu Fang; Yang, Rong Sen

    2014-09-01

    To examine the cut-off point of the osteoporosis self-assessment tool, age, weight and body mass index for osteoporosis among young adult Taiwanese women, using a large-scale health examination database containing bone mineral density tests. The cut-off points of osteoporosis risk factors identified earlier focus on menopausal or senior Caucasian and Asian women. However, young adult Asian women have seldom been identified. A retrospective historical cohort study. Using the 2009-2011 health examination database of a large-scale medical centre in northern Taiwan, this study investigated young adult Asian women (i.e. range in age from 30-49 years) in Taiwan who had received dual-energy X-ray absorptiometry test. This study also explored the cut-off point, sensitivity, specificity and diagnostic accuracy of receiver operating characteristics of osteoporosis among young adult females in Taiwan. This study collected 2454 young adult Asian women in Taiwan. Cochran-Armitage analysis results indicated that the prevalence of osteoporosis increased with decreasing weight, body mass index and osteoporosis self-assessment method quartiles. According to the results of receiver operating characteristics, weight, body mass index and osteoporosis self-assessment tool approaches can generally be used as indicators to predict osteoporosis among young adult Asian women. Results of this study demonstrate that Taiwanese women contracting osteoporosis tend to be young and underweight, as well as having a low body mass index and osteoporosis self-assessment scores. Those results further suggest that the assessment indicators for cut-off points are appropriately suitable for young adult women in Taiwan. Early detection is the only available means of preventing osteoporosis. Professional nurses should apply convenient and accurate assessment procedures to help young adult women to adopt preventive strategies against osteoporosis early, thus eliminating the probability of osteoporotic fracture. © 2013 John Wiley & Sons Ltd.

  6. Mapping with Small UAS: A Point Cloud Accuracy Assessment

    NASA Astrophysics Data System (ADS)

    Toth, Charles; Jozkow, Grzegorz; Grejner-Brzezinska, Dorota

    2015-12-01

    Interest in using inexpensive Unmanned Aerial System (UAS) technology for topographic mapping has recently significantly increased. Small UAS platforms equipped with consumer grade cameras can easily acquire high-resolution aerial imagery allowing for dense point cloud generation, followed by surface model creation and orthophoto production. In contrast to conventional airborne mapping systems, UAS has limited ground coverage due to low flying height and limited flying time, yet it offers an attractive alternative to high performance airborne systems, as the cost of the sensors and platform, and the flight logistics, is relatively low. In addition, UAS is better suited for small area data acquisitions and to acquire data in difficult to access areas, such as urban canyons or densely built-up environments. The main question with respect to the use of UAS is whether the inexpensive consumer sensors installed in UAS platforms can provide the geospatial data quality comparable to that provided by conventional systems. This study aims at the performance evaluation of the current practice of UAS-based topographic mapping by reviewing the practical aspects of sensor configuration, georeferencing and point cloud generation, including comparisons between sensor types and processing tools. The main objective is to provide accuracy characterization and practical information for selecting and using UAS solutions in general mapping applications. The analysis is based on statistical evaluation as well as visual examination of experimental data acquired by a Bergen octocopter with three different image sensor configurations, including a GoPro HERO3+ Black Edition, a Nikon D800 DSLR and a Velodyne HDL-32. In addition, georeferencing data of varying quality were acquired and evaluated. The optical imagery was processed by using three commercial point cloud generation tools. Comparing point clouds created by active and passive sensors by using different quality sensors, and finally, by different commercial software tools, provides essential information for the performance validation of UAS technology.
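
    A common first-pass ingredient of such an accuracy characterization is a cloud-to-cloud comparison against a higher-grade reference survey. The sketch below (not the processing chain used in the study) computes nearest-neighbor distances with a k-d tree and summarizes them; the synthetic data and noise level are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_stats(test_cloud, reference_cloud):
    """Nearest-neighbor distances from each test point to a reference cloud.

    Returns RMSE, mean, and 95th-percentile distance -- a simple first-pass
    accuracy summary when a higher-grade reference survey is available.
    """
    tree = cKDTree(np.asarray(reference_cloud, float))
    d, _ = tree.query(np.asarray(test_cloud, float), k=1)
    return {"rmse": float(np.sqrt(np.mean(d ** 2))),
            "mean": float(d.mean()),
            "p95": float(np.percentile(d, 95))}

# Synthetic example: a UAS-derived cloud with ~3 cm noise against a flat reference.
xy = np.random.rand(5000, 2) * 50.0
reference = np.column_stack([xy, np.zeros(len(xy))])
test = reference + np.random.normal(scale=0.03, size=reference.shape)
print(cloud_to_cloud_stats(test, reference))
```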

  7. Modeling forest bird species' likelihood of occurrence in Utah with Forest Inventory and Analysis and Landfire map products and ecologically based pseudo-absence points

    Treesearch

    Phoebe L. Zarnetske; Thomas C., Jr. Edwards; Gretchen G. Moisen

    2007-01-01

    Estimating species likelihood of occurrence across extensive landscapes is a powerful management tool. Unfortunately, available occurrence data for landscape-scale modeling is often lacking and usually only in the form of observed presences. Ecologically based pseudo-absence points were generated from within habitat envelopes to accompany presence-only data in habitat...

  8. Marketing: The roots of your business

    Treesearch

    Susan S. Franko

    2008-01-01

    These tools will help you turn the features of your products and services into benefits. A feature is defined from your point of view; a benefit is defined from the customer's point of view. The potential customer has to be helped to understand why you are the right choice for him or her. In this way, you lead them to the decision you want them to make, that is,...

  9. Mind Mirror Projects: A Tool for Integrating Critical Thinking into the English Language Classroom

    ERIC Educational Resources Information Center

    Tully, Matthew M.

    2009-01-01

    Identifying a point of view can be a complex task in any language. By analyzing what characters say, think, and do throughout a story, readers can observe how points of view tend to change over time. Easier said than done, this ability to climb inside the mind of a character can help students as they analyze personalities found in literature,…

  10. A Comparison of Methods for Estimating Relationships in the Change between Two Time Points for Latent Variables

    ERIC Educational Resources Information Center

    Finch, W. Holmes; Shim, Sungok Serena

    2018-01-01

    Collection and analysis of longitudinal data is an important tool in understanding growth and development over time in a whole range of human endeavors. Ideally, researchers working in the longitudinal framework are able to collect data at more than two points in time, as this will provide them with the potential for a deeper understanding of the…

  11. Technology in College Classrooms: An Action Research Examining the Use of Powerpoint in ELL Classrooms

    ERIC Educational Resources Information Center

    Zhang, Weiwei

    2012-01-01

    This research looks at the use of PowerPoint as an instructional tool for teaching English language learners (ELL) who studied in a language program at a state university in the Pacific Northwest. The purpose of the research was to discover and to explore the perceptions of PowerPoint supported teaching and learning that were held by the students,…

  12. Different Social Motives in the Gestural Communication of Chimpanzees and Human Children

    ERIC Educational Resources Information Center

    Bullinger, Anke F.; Zimmermann, Felizitas; Kaminski, Juliane; Tomasello, Michael

    2011-01-01

    Both chimpanzees and human infants use the pointing gesture with human adults, but it is not clear if they are doing so for the same social motives. In two studies, we presented chimpanzees and human 25-month-olds with the opportunity to point for a hidden tool (in the presence of a non-functional distractor). In one condition it was clear that…

  13. MLS data segmentation using Point Cloud Library procedures. (Polish Title: Segmentacja danych MLS z użyciem procedur Point Cloud Library)

    NASA Astrophysics Data System (ADS)

    Grochocka, M.

    2013-12-01

    Mobile laser scanning (MLS) is a dynamically developing measurement technology that is becoming increasingly widespread for acquiring three-dimensional spatial information. Continuous technical progress, based on new tools and on better use of existing resources, opens up new areas of extensive use of MLS technology. A mobile laser scanning system is usually used for mapping linear objects, in particular for the inventory of roads, railways, bridges, shorelines, shafts, tunnels, and even geometrically complex urban spaces. The measurement is made from the perspective in which the object is used and does not interfere with movement or ongoing work. This paper presents the initial results of segmenting data acquired by MLS. The data used in this work were obtained as part of an inventory measurement of railway line infrastructure; the point clouds were captured with profile scanners installed on a railway platform. To process the data, the open-source Point Cloud Library (PCL) was used. PCL is an open, independent, large-scale project for 2D/3D image and point cloud processing, provided as C++ template libraries and released under the terms of the BSD license (Berkeley Software Distribution License), which makes it free for commercial and research use. The article presents a number of issues related to the use of this software and its capabilities. Segmentation of the data is based on the pcl_segmentation library, which contains algorithms for separating clusters. These algorithms are best suited to processing point clouds consisting of a number of spatially isolated regions. The library extracts clusters by fitting various parametric models (planes, cylinders, spheres, lines, etc.) with the sample consensus method. Most of the mathematical operations are carried out using the Eigen library, a set of templates for linear algebra.
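
    A minimal sketch of the segmentation idea (model fitting by sample consensus, then cluster extraction). The paper uses the C++ Point Cloud Library; Open3D is used here only as an analogous open-source illustration, and the synthetic scene and parameters are assumptions.

    ```python
    import numpy as np
    import open3d as o3d

    rng = np.random.default_rng(0)
    # Synthetic scene: a flat "track bed" plus two vertical pole-like clusters.
    plane = np.c_[rng.uniform(0, 10, (2000, 2)), rng.normal(0, 0.01, 2000)]
    poles = np.concatenate([
        np.c_[rng.normal(2, 0.05, 300), rng.normal(2, 0.05, 300), rng.uniform(0, 3, 300)],
        np.c_[rng.normal(7, 0.05, 300), rng.normal(6, 0.05, 300), rng.uniform(0, 3, 300)],
    ])
    pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(np.vstack([plane, poles])))

    # 1. Fit the dominant plane by RANSAC and remove its inliers.
    model, inliers = pcd.segment_plane(distance_threshold=0.05, ransac_n=3, num_iterations=500)
    rest = pcd.select_by_index(inliers, invert=True)

    # 2. Split the remaining points into spatially isolated clusters.
    labels = np.array(rest.cluster_dbscan(eps=0.3, min_points=20))
    print("plane model:", np.round(model, 2), "clusters found:", labels.max() + 1)
    ```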

  14. From Particles and Point Clouds to Voxel Models: High Resolution Modeling of Dynamic Landscapes in Open Source GIS

    NASA Astrophysics Data System (ADS)

    Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.

    2012-12-01

    Multitemporal data acquired by modern mapping technologies provide unique insights into the processes driving land surface dynamics. These high-resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss the development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to representation of evolving surfaces as series of raster map layers or voxel models. For multitemporal lidar data we present workflows that combine open-source point cloud processing tools with GRASS GIS and custom Python scripts to model and analyze the dynamics of coastal topography (Figure 1), and we outline the development of a coastal analysis toolbox. The simulations focus on a particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to the water and sediment transport models already implemented in GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies will be used to illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS and highlight their interoperability. Figure 1: Isosurfaces representing the evolution of the shoreline and the z = 4.5 m contour between the years 1997-2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.
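
    A minimal sketch of the basic point-sampling step behind such workflows: binning lidar returns into a raster DEM grid. This is a plain NumPy/SciPy stand-in, not the GRASS GIS modules themselves; the synthetic survey and 1 m resolution are assumptions.

    ```python
    import numpy as np
    from scipy.stats import binned_statistic_2d

    rng = np.random.default_rng(2)
    x = rng.uniform(0, 200, 100_000)
    y = rng.uniform(0, 200, 100_000)
    # Synthetic elevations: a dune-like bump plus measurement noise.
    z = 2.0 + 3.0 * np.exp(-((x - 120) ** 2 + (y - 80) ** 2) / 800.0) + rng.normal(0, 0.05, x.size)

    res = 1.0                                   # 1 m cells
    x_edges = np.arange(0, 200 + res, res)
    y_edges = np.arange(0, 200 + res, res)
    dem, _, _, _ = binned_statistic_2d(x, y, z, statistic="mean", bins=[x_edges, y_edges])
    print("DEM cells:", dem.shape, "empty cells:", int(np.isnan(dem).sum()))
    ```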

  15. Machinability of titanium metal matrix composites (Ti-MMCs)

    NASA Astrophysics Data System (ADS)

    Aramesh, Maryam

    Titanium metal matrix composites (Ti-MMCs), as a new generation of materials, have various potential applications in aerospace and automotive industries. The presence of ceramic particles enhances the physical and mechanical properties of the alloy matrix. However, the hard and abrasive nature of these particles causes various issues in the field of their machinability. Severe tool wear and short tool life are the most important drawbacks of machining this class of materials. There is very limited work in the literature regarding the machinability of this class of materials, especially in the area of tool life estimation and tool wear. By far, polycrystalline diamond (PCD) tools appear to be the best choice for machining MMCs from researchers' point of view. However, due to their high cost, economical alternatives are sought. Cubic boron nitride (CBN) inserts, as the second hardest available tools, show superior characteristics such as great wear resistance, high hardness at elevated temperatures, a low coefficient of friction and a high melting point. Yet, so far CBN tools have not been studied during machining of Ti-MMCs. In this work, a comprehensive study has been performed to explore the tool wear mechanisms of CBN inserts during turning of Ti-MMCs. The unique morphology of the worn faces of the tools was investigated for the first time, which led to new insights in the identification of chemical wear mechanisms during machining of Ti-MMCs. Utilizing the full tool life capacity of cutting tools is also crucial, due to the considerable costs associated with suboptimal replacement of tools. This strongly motivates development of a reliable model for tool life estimation under any cutting conditions. In this study, a novel model based on the survival analysis methodology is developed to estimate the progressive states of tool wear under any cutting conditions during machining of Ti-MMCs. This statistical model takes into account the machining time in addition to the effect of cutting parameters. Promising results were obtained, showing very good agreement with the experimental results. Moreover, a more advanced model was constructed by adding the tool wear as another variable to the previous model. Therefore, a new model was proposed for estimating the remaining life of worn inserts under different cutting conditions, using the current tool wear data as an input. The results of this model were validated with the experimental results. The estimated results were consistent with the results obtained from the experiments.
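
    A minimal sketch of a survival-style tool-life model in the spirit described above (not the author's model): cutting time to a wear criterion regressed on cutting parameters with a Cox proportional-hazards fit. The data and parameter names below are invented.

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical turning trials: time to the wear limit (or censored), speed and feed.
    df = pd.DataFrame({
        "minutes": [12.0, 9.5, 18.2, 7.1, 15.4, 11.3, 6.0, 20.5, 10.2, 14.8],
        "worn":    [1, 1, 1, 1, 0, 1, 1, 0, 1, 1],          # 1 = wear limit reached
        "speed":   [60, 80, 40, 90, 50, 70, 95, 35, 75, 55],  # m/min
        "feed":    [0.10, 0.15, 0.10, 0.20, 0.12, 0.15, 0.20, 0.08, 0.12, 0.10],  # mm/rev
    })

    cph = CoxPHFitter(penalizer=0.1)   # small penalty keeps the toy fit stable
    cph.fit(df, duration_col="minutes", event_col="worn")
    cph.print_summary()                # hazard ratios: how speed/feed shorten expected tool life
    ```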

  16. CIM at GE's factory of the future

    NASA Astrophysics Data System (ADS)

    Waldman, H.

    Functional features of a highly automated aircraft component batch processing factory are described. The system has processing, working, and methodology components. A rotating parts operation installed 20 years ago features a high density of numerically controlled machines and is connected to a hierarchical network of data communications and apparatus for moving the rotating parts and tools of engines. Designs produced at one location in the country are sent by telephone link to other sites for development of manufacturing plans, tooling, numerical control programs, and process instructions for the rotating parts. Direct numerical control is implemented at the work stations, which have instructions stored on tape for back-up in case the host computer goes down. Each machine is automatically monitored at 48 points and notice of failure can originate from any point in the system.

  17. Program Correctness, Verification and Testing for Exascale (Corvette)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Koushik; Iancu, Costin; Demmel, James W

    The goal of this project is to provide tools to assess the correctness of parallel programs written using hybrid parallelism. There is a dire lack of both theoretical and engineering know-how in the area of finding bugs in hybrid or large scale parallel programs, which our research aims to change. In the project we have demonstrated novel approaches in several areas: 1. Low overhead automated and precise detection of concurrency bugs at scale. 2. Using low overhead bug detection tools to guide speculative program transformations for performance. 3. Techniques to reduce the concurrency required to reproduce a bug using partial program restart/replay. 4. Techniques to provide reproducible execution of floating point programs. 5. Techniques for tuning the floating point precision used in codes.

  18. Stress analysis and design considerations for Shuttle pointed autonomous research tool for astronomy /SPARTAN/

    NASA Technical Reports Server (NTRS)

    Ferragut, N. J.

    1982-01-01

    The Shuttle Pointed Autonomous Research Tool for Astronomy (SPARTAN) family of spacecraft is intended to operate with minimum interfaces with the U.S. Space Shuttle in order to increase flight opportunities. The SPARTAN I spacecraft was designed to enhance structural capabilities and increase reliability. The approach followed results from work experience gained on sounding rocket projects. Structural models were developed to perform the analyses necessary to satisfy safety requirements for Shuttle hardware. A loads analysis must also be performed, and stress analysis calculations are performed on the main structural elements and subcomponents. Attention is given to design considerations and program definition, the schematic representation of a finite element model used for the SPARTAN I spacecraft, details of the loads analysis, the stress analysis, and fracture mechanics plan implications.

  19. OpenEIS. Developer Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lutes, Robert G.; Neubauer, Casey C.; Haack, Jereme N.

    2015-03-31

    The Department of Energy’s (DOE’s) Building Technologies Office (BTO) is supporting the development of an open-source software tool for analyzing building energy and operational data: OpenEIS (open energy information system). This tool addresses the problems of both owners of building data and developers of tools to analyze this data. Building owners and managers have data but lack the tools to analyze it, while tool developers lack data in a common format to ease development of reusable data analysis tools. This document is intended for developers of applications and explains the mechanisms for building analysis applications, accessing data, and displaying data using a visualization from the included library. A brief introduction to the visualizations can be used as a jumping-off point for developers familiar with JavaScript to produce their own. Several example applications are included which can be used along with this document to implement algorithms for performing energy data analysis.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janjusic, Tommy; Kartsaklis, Christos

    Application analysis is facilitated through a number of program profiling tools. The tools vary in their complexity, ease of deployment, design, and profiling detail. Understanding, analyzing, and optimizing application behavior is of particular importance for scientific applications, where minor changes in code paths and data-structure layout can have profound effects. Understanding how intricate data-structures are accessed and how a given memory system responds is a complex task. In this paper we describe a trace profiling tool, Glprof, specifically aimed at lessening the burden on the programmer to pin-point heavily involved data-structures during an application's run-time and to understand data-structure run-time usage. Moreover, we showcase the tool's modularity using additional cache simulation components. We elaborate on the tool's design and features. Finally, we demonstrate the application of our tool in the context of SPEC benchmarks using the Glprof profiler and two concurrently running cache simulators, PPC440 and AMD Interlagos.

  1. Attribute Studies of Points, Perforators, Knives, and Lithic Caches from Ayn Abū Nukhayla

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nowell, April; Gutzeit, Jennifer L.; Bell, Colleen

    This is an in-depth study of two distinct tool types recovered from the Early Neolithic site of Ayn Abu Nukhayla, located in southern Jordan. The occupation dates to 9,500 to 7,500 BP in the Pre-Pottery Neolithic B and comprises regionally varied settlements reflecting a range of economic adaptations. The two tool types of concern are Nahal Hemar knives (so named for their resemblance to a similar tool found at the Israeli site of Nahal Hemar Cave) and Nukhayla perforators. The analyses focus on creating an overall description of the tool assemblages themselves while also attempting to identify changes in tool morphology through space and time. The tools are compared with similar types from across the Neolithic Levant in an attempt to draw comparisons between the assemblages found at Ayn Abu Nukhayla and other sites from the same period.

  2. Feature Geo Analytics and Big Data Processing: Hybrid Approaches for Earth Science and Real-Time Decision Support

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Raad, M.; Hoel, E.; Park, M.; Mollenkopf, A.; Trujillo, R.

    2016-12-01

    Introduced is a new approach for processing spatiotemporal big data by leveraging distributed analytics and storage. A suite of temporally-aware analysis tools summarizes data nearby or within variable windows, aggregates points (e.g., for various sensor observations or vessel positions), reconstructs time-enabled points into tracks (e.g., for mapping and visualizing storm tracks), joins features (e.g., to find associations between features based on attributes, spatial relationships, temporal relationships or all three simultaneously), calculates point densities, finds hot spots (e.g., in species distributions), and creates space-time slices and cubes (e.g., in microweather applications with temperature, humidity, and pressure, or within human mobility studies). These "feature geo analytics" tools run in both batch and streaming spatial analysis mode as distributed computations across a cluster of servers on typical "big" data sets, where static data exist in traditional geospatial formats (e.g., shapefile) locally on a disk or file share, attached as static spatiotemporal big data stores, or streamed in near-real-time. In other words, the approach registers large datasets or data stores with ArcGIS Server, then distributes analysis across a cluster of machines for parallel processing. Several brief use cases will be highlighted based on a 16-node server cluster with 14 GB RAM per node, allowing, for example, the buffering of over 8 million points or thousands of polygons in 1 minute. The approach is "hybrid" in that ArcGIS Server integrates open-source big data frameworks such as Apache Hadoop and Apache Spark on the cluster in order to run the analytics. In addition, the user may devise and connect custom open-source interfaces and tools developed in Python or Python Notebooks; the common denominator being the familiar REST API.
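
    A minimal single-machine sketch of the point-aggregation idea (a space-time cube of binned observations), using plain pandas rather than the distributed ArcGIS Server / Spark tooling described above; the data and bin sizes are invented.

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    obs = pd.DataFrame({
        "lon":  rng.uniform(-75.2, -74.8, 10_000),
        "lat":  rng.uniform(39.8, 40.2, 10_000),
        "time": pd.to_datetime("2016-06-01") + pd.to_timedelta(rng.integers(0, 72, 10_000), unit="h"),
        "temp": rng.normal(22, 3, 10_000),
    })

    # Space-time cube: 0.05-degree cells x 6-hour slices, count and mean temperature per bin.
    cube = (obs.assign(gx=(obs.lon // 0.05) * 0.05,
                       gy=(obs.lat // 0.05) * 0.05,
                       slice=obs.time.dt.floor("6h"))
               .groupby(["gx", "gy", "slice"])["temp"]
               .agg(["count", "mean"]))
    print(cube.head())
    ```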

  3. Performance of Compiler-Assisted Memory Safety Checking

    DTIC Science & Technology

    2014-08-01

    The software developer has in mind a particular object to which the pointer should point, the intended referent; a memory access error occurs when an access involves memory outside the intended referent. This technical note (CMU/SEI-2014-TN, August 2014, by David Keaton and Robert C. Seacord) discusses compiler-based memory safety checking tools and the performance that can be achieved with two such tools whose source code is freely available.

  4. Distributed Automated Medical Robotics to Improve Medical Field Operations

    DTIC Science & Technology

    2010-04-01

    Robotic trauma diagnosis and intervention is performed using instruments and tools mounted on the end of a robotic manipulator. A key challenge is for the manipulator to respond quickly enough to accommodate motion, owing to high inertia and to inaccuracies caused by low stiffness at the tool point. The research program was licensed to Intuitive Surgical, Inc. and subsequently evolved into the da Vinci surgical system, which has since been widely applied.

  5. BMDExpress Data Viewer: A Visualization Tool to Analyze ...

    EPA Pesticide Factsheets

    Regulatory agencies increasingly apply benchmark dose (BMD) modeling to determine points of departure in human risk assessments. BMDExpress applies BMD modeling to transcriptomics datasets and groups genes into biological processes and pathways for rapid assessment of doses at which biological perturbations occur. However, graphing and analytical capabilities within BMDExpress are limited, and the analysis of output files is challenging. We developed a web-based application, BMDExpress Data Viewer, for visualization and graphical analyses of BMDExpress output files. The software application consists of two main components: ‘Summary Visualization Tools’ and ‘Dataset Exploratory Tools’. We demonstrate through two case studies that the ‘Summary Visualization Tools’ can be used to examine and assess the distributions of probe and pathway BMD outputs, as well as derive a potential regulatory BMD through the modes or means of the distributions. The ‘Functional Enrichment Analysis’ tool presents biological processes in a two-dimensional bubble chart view. By applying filters of pathway enrichment p-value and minimum number of significant genes, we showed that the Functional Enrichment Analysis tool can be applied to select pathways that are potentially sensitive to chemical perturbations. The ‘Multiple Dataset Comparison’ tool enables comparison of BMDs across multiple experiments (e.g., across time points, tissues, or organisms, etc.). The ‘BMDL-BM
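
    A minimal sketch of the pathway-filtering step mentioned above, written as generic pandas code rather than BMDExpress Data Viewer itself; the column names, values and thresholds are assumptions for illustration.

    ```python
    import pandas as pd

    # Hypothetical pathway-level summary of the kind exported by BMD analysis software.
    pathways = pd.DataFrame({
        "pathway": ["fatty acid metabolism", "oxidative stress", "cell cycle", "apoptosis"],
        "enrichment_pvalue": [0.003, 0.02, 0.30, 0.04],
        "significant_genes": [12, 5, 2, 3],
        "bmd_median_mg_kg": [4.1, 7.9, 15.2, 6.3],
    })

    # Keep pathways with enrichment p < 0.05 and at least 3 significant genes,
    # then rank them by median benchmark dose.
    selected = pathways[(pathways["enrichment_pvalue"] < 0.05) &
                        (pathways["significant_genes"] >= 3)].sort_values("bmd_median_mg_kg")
    print(selected)
    ```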

  6. Aspects of ultra-high-precision diamond machining of RSA 443 optical aluminium

    NASA Astrophysics Data System (ADS)

    Mkoko, Z.; Abou-El-Hossein, K.

    2015-08-01

    Optical aluminium alloys such as 6061-T6 are traditionally used in ultra-high precision manufacturing for making optical mirrors for aerospace and other applications. However, the optics industry has recently witnessed the development of more advanced optical aluminium grades that are capable of addressing some of the issues encountered when turning with single-point natural monocrystalline diamond cutters. The advent of rapidly solidified aluminium (RSA) grades has generally opened up new possibilities for ultra-high precision manufacturing of optical components. In this study, experiments were conducted with single-point diamond cutters on rapidly solidified aluminium RSA 443 material. The objective of this study is to observe the effects of depth of cut and feed rate, at a fixed rotational speed, on the tool wear rate and resulting surface roughness of diamond-turned specimens. This is done to gain further understanding of the rate of wear on the diamond cutters versus the surface texture generated on the RSA 443 material. The diamond machining experiments yielded machined surfaces which are less reflective but with consistent surface roughness values. Cutting tools were observed for wear through scanning microscopy; a relatively low wear pattern was evident on the diamond tool edge. The highest tool wear was obtained at a higher depth of cut and an increased feed rate.

  7. High-frequency video capture and a computer program with frame-by-frame angle determination functionality as tools that support judging in artistic gymnastics.

    PubMed

    Omorczyk, Jarosław; Nosiadek, Leszek; Ambroży, Tadeusz; Nosiadek, Andrzej

    2015-01-01

    The main aim of this study was to verify the usefulness of selected simple methods of recording and fast biomechanical analysis performed by judges of artistic gymnastics in assessing a gymnast's movement technique. The study participants comprised six artistic gymnastics judges, who assessed back handsprings using two methods: a real-time observation method and a frame-by-frame video analysis method. They also determined flexion angles of the knee and hip joints using a computer program. In the case of the real-time observation method, the judges gave a total of 5.8 error points with an arithmetic mean of 0.16 points for the flexion of the knee joints. In the high-speed video analysis method, the total amounted to 8.6 error points and the mean value amounted to 0.24 error points. For the excessive flexion of hip joints, the sum of the error values was 2.2 error points and the arithmetic mean was 0.06 error points during real-time observation. The sum obtained using the frame-by-frame analysis method equaled 10.8 and the mean equaled 0.30 error points. Error values obtained through the frame-by-frame video analysis of movement technique were higher than those obtained through the real-time observation method. The judges were able to indicate the number of the frame in which the maximal joint flexion occurred with good accuracy. The real-time observation method, as well as high-speed video analysis performed without determining the exact angles, was found to be an insufficient tool for improving the quality of judging.
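
    A minimal sketch of the frame-by-frame angle determination described above: the flexion angle at a joint computed from three digitized 2D points in one video frame. The pixel coordinates are invented for illustration.

    ```python
    import numpy as np

    def joint_angle(a, b, c):
        """Angle in degrees at point b, formed by the segments b->a and b->c."""
        a, b, c = map(np.asarray, (a, b, c))
        u, v = a - b, c - b
        cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

    # Hypothetical digitized landmarks (pixels) from a single frame.
    hip, knee, ankle = (412, 233), (468, 301), (455, 388)
    knee_angle = joint_angle(hip, knee, ankle)
    print(f"knee angle: {knee_angle:.1f} deg, flexion: {180 - knee_angle:.1f} deg")
    ```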

  8. Pairwise contact energy statistical potentials can help to find probability of point mutations.

    PubMed

    Saravanan, K M; Suvaithenamudhan, S; Parthasarathy, S; Selvaraj, S

    2017-01-01

    To adopt a particular fold, a protein requires several interactions between its amino acid residues. The energetic contribution of these residue-residue interactions can be approximated by extracting statistical potentials from known high resolution structures. Several methods based on statistical potentials extracted from unrelated proteins are found to make a better prediction of the probability of point mutations. We postulate that statistical potentials extracted from known structures of similar folds with varying sequence identity can be a powerful tool to examine the probability of point mutation. Keeping this in mind, we have derived pairwise residue and atomic contact energy potentials for the different functional families that adopt the (α/β)8 TIM-barrel fold. We carried out computational point mutations at various conserved residue positions in the yeast triosephosphate isomerase enzyme, for which experimental results are already reported. We have also performed molecular dynamics simulations on a subset of point mutants to make a comparative study. The difference in pairwise residue and atomic contact energy between the wildtype and various point mutations reveals the probability of mutation at a particular position. Interestingly, we found that our computational prediction agrees with the experimental studies of Silverman et al. (Proc Natl Acad Sci 2001;98:3092-3097) and performs better than I-Mutant and the Cologne University Protein Stability Analysis Tool. The present work thus suggests that deriving pairwise contact energy potentials and performing molecular dynamics simulations of functionally important folds could help to predict the probability of point mutations, which may ultimately reduce the time and cost of mutation experiments. Proteins 2016; 85:54-64. © 2016 Wiley Periodicals, Inc.
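
    A minimal sketch of a quasi-chemical pairwise contact potential of the kind discussed above. This is a generic formulation, not the authors' exact derivation; the coordinates, residue types and reference state below are invented, and real work would use C-alpha coordinates from a set of solved structures.

    ```python
    import numpy as np
    from collections import Counter

    def contact_counts(coords, restypes, cutoff=6.5, min_sep=3):
        """Count residue-type pairs whose C-alpha atoms lie within `cutoff` angstroms."""
        counts = Counter()
        n = len(coords)
        for i in range(n):
            for j in range(i + min_sep, n):          # skip near-sequence neighbours
                if np.linalg.norm(coords[i] - coords[j]) < cutoff:
                    counts[tuple(sorted((restypes[i], restypes[j])))] += 1
        return counts

    def contact_potential(obs, ref):
        """e_ab = -ln(f_obs(a,b) / f_ref(a,b)); positive values mean disfavoured contacts."""
        tot_o, tot_r = sum(obs.values()), sum(ref.values())
        return {p: -np.log((obs[p] / tot_o) / (ref[p] / tot_r)) for p in obs if p in ref}

    # Toy demonstration: a random-walk "backbone" with random residue types,
    # compared against a crude shuffled-type reference state.
    rng = np.random.default_rng(0)
    coords = np.cumsum(rng.normal(0, 2.0, (120, 3)), axis=0)
    types = rng.choice(list("ACDEFGHIKLMNPQRSTVWY"), 120)
    obs = contact_counts(coords, types)
    ref = contact_counts(coords, rng.permutation(types))
    print(list(contact_potential(obs, ref).items())[:5])
    ```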

  9. Can We Predict Daily Adherence to Warfarin?

    PubMed Central

    Platt, Alec B.; Localio, A. Russell; Brensinger, Colleen M.; Cruess, Dean G.; Christie, Jason D.; Gross, Robert; Parker, Catherine S.; Price, Maureen; Metlay, Joshua P.; Cohen, Abigail; Newcomb, Craig W.; Strom, Brian L.; Laskin, Mitchell S.

    2010-01-01

    Background: Warfarin is the primary therapy to prevent stroke and venous thromboembolism. Significant periods of nonadherence frequently go unreported by patients and undetected by providers. Currently, no comprehensive screening tool exists to help providers assess the risk of nonadherence at the time of initiation of warfarin therapy. Methods: This article reports on a prospective cohort study of adults initiating warfarin therapy at two anticoagulation clinics (university- and Veterans Affairs-affiliated). Nonadherence, defined by failure to record a correct daily pill bottle opening, was measured daily by electronic pill cap monitoring. A multivariable logistic regression model was used to develop a point system to predict daily nonadherence to warfarin. Results: We followed 114 subjects for a median of 141 days. Median nonadherence of the participants was 14.4% (interquartile range [IQR], 5.8-33.8). A point system, based on nine demographic, clinical, and psychosocial factors, distinguished those demonstrating low vs high levels of nonadherence: four points or fewer, median nonadherence 5.8% (IQR, 2.3-14.1); five points, 9.1% (IQR, 5.9-28.6); six points, 14.5% (IQR, 7.1-24.1); seven points, 14.7% (IQR, 7.0-34.7); and eight points or more, 29.3% (IQR, 15.5-41.9). The model produces a c-statistic of 0.66 (95% CI, 0.61-0.71), suggesting modest discriminating ability to predict day-level warfarin nonadherence. Conclusions: Poor adherence to warfarin is common. A screening tool based on nine demographic, clinical, and psychosocial factors, if further validated in other patient populations, may help to identify groups of patients at lower risk for nonadherence so that intensified efforts at increased monitoring and intervention can be focused on higher-risk patients. PMID:19903973
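
    A minimal sketch of how a multivariable logistic model can be turned into a simple point score, as described above. This is a generic recipe with invented coefficients and factor names, not the study's actual nine-factor model.

    ```python
    import numpy as np

    # Hypothetical fitted log-odds coefficients for daily nonadherence.
    betas = {"age_under_50": 0.62, "lives_alone": 0.55, "depression_score_high": 0.81,
             "prior_missed_doses": 1.10, "low_health_literacy": 0.58}

    # Assign one point per `base` log-odds units (the smallest coefficient).
    base = min(betas.values())
    points = {k: int(round(b / base)) for k, b in betas.items()}
    print("points per factor:", points)

    # Score one hypothetical patient by summing the points of the factors present.
    patient = {"age_under_50": 1, "lives_alone": 0, "depression_score_high": 1,
               "prior_missed_doses": 1, "low_health_literacy": 0}
    score = sum(points[k] * v for k, v in patient.items())
    print("risk score:", score)     # higher scores -> consider intensified monitoring
    ```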

  10. Reducing tool wear by partial cladding of critical zones in hot form tool by laser metal deposition

    NASA Astrophysics Data System (ADS)

    Vollmer, Robert; Sommitsch, Christof

    2017-10-01

    This paper points out a production method to reduce tool wear in hot stamping applications. Tool wear is usually observed in locally highly stressed areas where there is sliding movement between the blank and the tool surface. The solution shown is based on partial laser cladding of the tool surface with a wear-resistant coating to increase the lifespan of tool inserts. Preliminary studies showed good results applying a material combination of tungsten carbide particles embedded in a metallic matrix. Different nickel-based alloys welded onto hot work tool steel (1.2343) were tested mechanically in the interface zone. The material with the best bonding characteristic is chosen and reinforced with spherical tungsten carbide particles in a second laser welding step. Since the machining of tungsten carbides is very elaborate, a special manufacturing strategy is developed to reduce the milling effort as much as possible. Milling tests are carried out on special test specimens to prove machinability. As an outlook, a tool insert of a B-pillar is coated to perform real hot forming tests.

  11. Battery Storage Evaluation Tool, version 1.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-02

    The battery storage evaluation tool developed at Pacific Northwest National Laboratory is used to run a one-year simulation to evaluate the benefits of battery storage for multiple grid applications, including energy arbitrage, balancing service, capacity value, distribution system equipment deferral, and outage mitigation. This tool is based on optimal control strategies for capturing multiple services from a single energy storage device. In this control strategy, at each hour a lookahead optimization is first formulated and solved to determine the battery base operating point. A minute-by-minute simulation is then performed to simulate the actual battery operation.
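
    A minimal sketch of an hour-ahead lookahead step of the kind described above: a small linear program that picks charge/discharge set-points against a price forecast. This is a generic energy-arbitrage formulation with invented parameters, not the PNNL tool's actual model.

    ```python
    import cvxpy as cp
    import numpy as np

    price = np.array([22, 20, 19, 25, 38, 55, 61, 47, 35, 30, 28, 33.0])  # $/MWh forecast
    T, dt = len(price), 1.0                  # hourly steps
    cap, p_max, eta = 4.0, 1.0, 0.9          # MWh capacity, MW power limit, one-way efficiency

    ch = cp.Variable(T, nonneg=True)         # charging power (MW)
    dis = cp.Variable(T, nonneg=True)        # discharging power (MW)
    soc = cp.Variable(T + 1)                 # state of charge (MWh)

    cons = [soc[0] == 0.5 * cap, soc <= cap, soc >= 0, ch <= p_max, dis <= p_max]
    cons += [soc[t + 1] == soc[t] + dt * (eta * ch[t] - dis[t] / eta) for t in range(T)]

    cost = cp.sum(cp.multiply(price, ch - dis)) * dt   # buy when charging, earn when discharging
    cp.Problem(cp.Minimize(cost), cons).solve()
    print("base operating points (MW, +discharge):", np.round(dis.value - ch.value, 2))
    ```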

  12. Toward a new information infrastructure in health technology assessment: communication, design, process, and results.

    PubMed

    Neikter, Susanna Allgurin; Rehnqvist, Nina; Rosén, Måns; Dahlgren, Helena

    2009-12-01

    The aim of this study was to facilitate effective internal and external communication of an international network and to explore how to support communication and work processes in health technology assessment (HTA). STRUCTURE AND METHODS: The European network for Health Technology Assessment (EUnetHTA) connected sixty-four HTA Partner organizations from thirty-three countries. User needs in the different steps of the HTA process were the starting point for developing an information system. A step-wise, interdisciplinary, creative approach was used in developing practical tools. An Information Platform facilitated the exchange of scientific information between Partners and with external target groups. More than 200 virtual meetings were set up during the project using an e-meeting tool. A Clearinghouse prototype was developed with the intent of offering a single point of access to HTA-relevant information. This evolved into a next step not planned from the outset: developing a running HTA Information System including several Web-based tools to support communication and daily HTA processes. A communication strategy guided the communication effort, focusing on practical tools, creating added value, involving stakeholders, and avoiding duplication of effort. Modern technology enables a new information infrastructure for HTA. The potential of information and communication technology was used as a strategic tool. Several target groups were represented among the Partners, which supported collaboration and made it easier to identify user needs. A distinctive visual identity made it easier to gain and maintain visibility on a limited budget.

  13. Optimization technique for rolled edge control process based on the acentric tool influence functions.

    PubMed

    Du, Hang; Song, Ci; Li, Shengyi; Xu, Mingjin; Peng, Xiaoqiang

    2017-05-20

    In the process of computer controlled optical surfacing (CCOS), the uncontrollable rolled edge restricts further improvements of the machining accuracy and efficiency. Two reasons are responsible for the rolled edge problem during small tool polishing. One is that the edge areas cannot be processed because of the orbit movement. The other is that the changing tool influence function (TIF) is difficult to compensate for in algorithms, since a pressure step appears in the local pressure distribution at the surface edge. In this paper, an acentric tool influence function (A-TIF) is designed to remove the rolled edge after CCOS polishing. The model of the A-TIF is analyzed theoretically, and a control point translation dwell time algorithm is used to verify that the full aperture of the workpiece can be covered by the peak removal point of the tool influence functions. Thus, surface residual error in the full aperture can be effectively corrected. Finally, experiments are carried out. Two fused silica glass samples of 100 mm × 100 mm are polished by traditional CCOS and the A-TIF method, respectively. The rolled edge was clearly produced in the sample polished by the traditional CCOS, while the sample polished by the A-TIF method does not show this problem. Therefore, the rolled edge caused by the traditional CCOS process is successfully suppressed during the A-TIF process. The ability of the designed A-TIF to suppress the rolled edge has been confirmed.
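
    A minimal sketch of the dwell-time idea behind such processes: material removal is the convolution of the tool influence function with the dwell time, so a non-negative least-squares solve yields a dwell map. This is a 1-D toy example with an invented Gaussian-like TIF and target, not the authors' control point translation algorithm.

    ```python
    import numpy as np
    from scipy.optimize import nnls
    from scipy.linalg import convolution_matrix

    x = np.linspace(0, 99, 100)                              # positions along one scan line (mm)
    tif = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)     # Gaussian-like TIF (removal per unit dwell)
    target = 5 + 3 * np.sin(x / 12.0)                        # desired removal depth (nm)

    A = convolution_matrix(tif, len(x), mode="same")         # removal = A @ dwell
    dwell, residual = nnls(A, target)                        # dwell times constrained to be non-negative
    print(f"total dwell {dwell.sum():.1f} s, residual norm {residual:.2f} nm")
    ```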

  14. A Novel Mobile Testing Equipment for Rock Cuttability Assessment: Vertical Rock Cutting Rig (VRCR)

    NASA Astrophysics Data System (ADS)

    Yasar, Serdar; Yilmaz, Ali Osman

    2017-04-01

    In this study, a new mobile rock cutting test apparatus for rock cuttability assessment, called the vertical rock cutting rig (VRCR), was designed and produced. It was designed specially to fit the hydraulic press testing equipment available in almost every rock mechanics laboratory. Rock cutting trials were initiated just after the production of the VRCR, along with calibration of the measuring load cell against an external load cell to validate the recorded force data. Then, controlled rock cutting tests in both relieved and unrelieved cutting modes were carried out on five different volcanic rock samples with a standard simple-shaped wedge tool. Additionally, the core cutting test, an important approach for roadheader performance prediction, was simulated with the VRCR. Mini disc cutters and point attack tools were used for the experimental trials. Results clearly showed that the rock cutting tests were successfully realized and that the measuring system is sensitive to rock strength, cutting depth and other variables. The core cutting test was successfully simulated, and rock cutting tests with mini disc cutters and point attack tools were also shown to be successful with the VRCR.

  15. Assessing primary care in Austria: room for improvement.

    PubMed

    Stigler, Florian L; Starfield, Barbara; Sprenger, Martin; Salzer, Helmut J F; Campbell, Stephen M

    2013-04-01

    There is emerging evidence that strong primary care achieves better health at lower costs. Although primary care can be measured, in many countries, including Austria, there is little understanding of primary care development. The aim was to assess primary care development in Austria. A primary care assessment tool developed by Barbara Starfield in 1998 was implemented in Austria. This tool defines 15 primary care characteristics and distinguishes between system and practice characteristics. Each characteristic was evaluated by six Austrian primary care experts and rated with 2 (high), 1 (intermediate) or 0 (low) points according to its primary care strength (maximum score: 30). Austria received 7 out of 30 points; no characteristic was rated as '2' but eight were rated as '0'. Compared with the 13 previously assessed countries, Austria ranks 10th of 14 countries and is classified as a 'low primary care' country. This study provides the first evidence concerning primary care in Austria, benchmarking it as weak and in need of development. The practicable application of an existing assessment tool may encourage other countries to generate evidence about their primary care systems as well.

  16. The development procedures and tools applied for the attitude control software of the Italian satellite SAX

    NASA Astrophysics Data System (ADS)

    Hameetman, G. J.; Dekker, G. J.

    1993-11-01

    The Italian satellite SAX (with a large Dutch contribution) is a scientific satellite whose mission is to study X-ray (roentgen) sources. One main requirement for the Attitude and Orbit Control Subsystem (AOCS) is to achieve and maintain a stable pointing accuracy, with a limit cycle of less than 90 arcsec, during pointings of up to 28 hours. The main SAX instrument, the Narrow Field Instrument, is highly sensitive to (indirect) radiation coming from the Sun. This sensitivity leads to another main requirement: under no circumstances may the safe attitude domain be left. The paper describes the application software in relation to the overall SAX AOCS subsystem, the CASE tools that were used during development, some advantages and disadvantages of the use of these tools, the measures taken to meet the somewhat conflicting requirements of reliability and flexibility, and the lessons learned during development. The quality of the development approach was proven by the (separately executed) hardware/software integration tests. During these tests, a negligible number of software errors was detected in the application software.

  17. Optical tools and techniques for aligning solar payloads with the SPARCS control system. [Solar Pointing Aerobee Rocket Control System

    NASA Technical Reports Server (NTRS)

    Thomas, N. L.; Chisel, D. M.

    1976-01-01

    The success of a rocket-borne experiment depends not only on the pointing of the attitude control system, but also on the alignment of the attitude control system to the payload. To ensure proper alignment, special optical tools and alignment techniques are required. Those that were used in the SPARCS program are described and discussed herein. These tools include theodolites, autocollimators, a 38-cm diameter solar simulator, a high-performance 1-m heliostat to provide a stable solar source during the integration of the rocket payload, a portable 75-cm sun tracker for use at the launch site, and an innovation called the Solar Alignment Prism. Using the real sun as the primary reference under field conditions, the Solar Alignment Prism facilitates the coalignment of the attitude sun sensor with the payload. The alignment techniques were developed to ensure the precise alignment of the solar payloads to the SPARCS attitude sensors during payload integration and to verify the required alignment under field conditions just prior to launch.

  18. A literature review of the cardiovascular risk-assessment tools: applicability among Asian population.

    PubMed

    Liau, Siow Yen; Mohamed Izham, M I; Hassali, M A; Shafie, A A

    2010-01-01

    Cardiovascular diseases, the main causes of hospitalisations and death globally, have put an enormous economic burden on the healthcare system. Several risk factors are associated with the occurrence of cardiovascular events. At the heart of efficient prevention of cardiovascular disease is the concept of risk assessment. This paper aims to review the available cardiovascular risk-assessment tools and its applicability in predicting cardiovascular risk among Asian populations. A systematic search was performed using keywords as MeSH and Boolean terms. A total of 25 risk-assessment tools were identified. Of these, only two risk-assessment tools (8%) were derived from an Asian population. These risk-assessment tools differ in various ways, including characteristics of the derivation sample, type of study, time frame of follow-up, end points, statistical analysis and risk factors included. Very few cardiovascular risk-assessment tools were developed in Asian populations. In order to accurately predict the cardiovascular risk of our population, there is a need to develop a risk-assessment tool based on local epidemiological data.

  19. Effect of multiple forming tools on geometrical and mechanical properties in incremental sheet forming

    NASA Astrophysics Data System (ADS)

    Wernicke, S.; Dang, T.; Gies, S.; Tekkaya, A. E.

    2018-05-01

    The trend toward a higher variety of products requires economical manufacturing processes suitable for the production of prototypes and small batches. In the case of complex hollow-shaped parts, single point incremental forming (SPIF) represents a highly flexible process. The flexibility of this process, however, comes at the cost of a very long process time. To decrease the process time, a new incremental forming approach with multiple forming tools is investigated. The influence of two incremental forming tools on the resulting mechanical and geometrical component properties, compared to SPIF, is presented. Sheets made of EN AW-1050A were formed into pyramid frustums using different tool-path strategies, and several variations of the tool-path strategy are analyzed. A time saving between 40% and 60% was observed, depending on the tool-path and the radii of the forming tools, while the mechanical properties remained unchanged. This knowledge can increase the cost efficiency of incremental forming processes.

  20. Joint classification and contour extraction of large 3D point clouds

    NASA Astrophysics Data System (ADS)

    Hackel, Timo; Wegner, Jan D.; Schindler, Konrad

    2017-08-01

    We present an effective and efficient method for point-wise semantic classification and extraction of object contours of large-scale 3D point clouds. What makes point cloud interpretation challenging is the sheer size of several millions of points per scan and the non-grid, sparse, and uneven distribution of points. Standard image processing tools like texture filters, for example, cannot handle such data efficiently, which calls for dedicated point cloud labeling methods. It turns out that one of the major drivers for efficient computation and handling of strong variations in point density is a careful formulation of per-point neighborhoods at multiple scales. This allows one both to define an expressive feature set and to extract topologically meaningful object contours. Semantic classification and contour extraction are interlaced problems. Point-wise semantic classification enables extracting a meaningful candidate set of contour points, while contours help generating a rich feature representation that benefits point-wise classification. These methods are tailored to have fast run time and a small memory footprint for processing large-scale, unstructured, and inhomogeneous point clouds, while still achieving high classification accuracy. We evaluate our methods on the semantic3d.net benchmark for terrestrial laser scans with >10⁹ points.
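
    A minimal sketch of multi-scale neighborhood features of the kind mentioned above: covariance eigenvalues of each point's neighborhood at several radii, yielding linearity/planarity/scattering descriptors that can feed any point-wise classifier. This is a generic illustration with synthetic points and chosen radii, not the authors' implementation or feature set.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def eigen_features(points, radii=(0.25, 0.5, 1.0)):
        """Per-point dimensionality features from covariance eigenvalues at multiple scales."""
        tree = cKDTree(points)
        feats = []
        for r in radii:
            per_scale = []
            for p in points:
                nb = points[tree.query_ball_point(p, r)]
                if len(nb) < 3:
                    per_scale.append([0.0, 0.0, 0.0])
                    continue
                w = np.linalg.eigvalsh(np.cov(nb.T))          # ascending eigenvalues
                w = np.clip(w, 0, None) / max(w.sum(), 1e-12)
                # crude linearity, planarity, scattering descriptors
                per_scale.append([w[2] - w[1], w[1] - w[0], w[0]])
            feats.append(per_scale)
        return np.hstack(feats)                                # shape: (n_points, 3 * len(radii))

    pts = np.random.default_rng(0).uniform(0, 5, (2000, 3))    # stand-in for a laser scan
    X = eigen_features(pts)                                    # features for a point-wise classifier
    print(X.shape)
    ```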
