ERIC Educational Resources Information Center
Kauffman, James M.; Birnbrauer, Jay S.
The final report of a project on teaching and management techniques with severely disturbed and/or retarded children presents an analysis of single-subject research using contingent imitation of the child as an intervention technique. The effects of this technique were examined on the following behaviors: toy play and reciprocal imitation, self…
ERIC Educational Resources Information Center
Foley, John P., Jr.
A study was conducted to refine and coordinate occupational analysis, job performance aids, and elements of the instructional systems development process for task specific Air Force maintenance training. Techniques for task identification and analysis (TI & A) and data gathering techniques for occupational analysis were related. While TI &…
Fostering multiple repertoires in undergraduate behavior analysis students
Polson, David A. D.
1995-01-01
Eight techniques used by the author in teaching an introductory applied behavior analysis course are described: (a) a detailed study guide, (b) frequent tests, (c) composition of practice test questions, (d) in-class study groups, (e) fluency building with a computerized flash-card program, (f) bonus marks for participation during question-and-answer sessions, (g) student presentations that summarize and analyze recently published research, and (h) in-class behavior analysis of comic strips. Together, these techniques require an extensive amount of work by students. Nevertheless, students overwhelmingly prefer this approach to the traditional lecture-midterm-final format, and most earn an A as their final course grade. PMID:22478226
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1977-06-01
The mixed-strategy analysis was a tradeoff analysis between energy-conservation methods and an alternative energy source (solar) considering technical and economic benefits. The objective of the analysis was to develop guidelines for: reducing energy requirements; reducing conventional fuel use; and identifying economic alternatives for building owners. The analysis was done with a solar system in place. This makes the study unique in that it is determining the interaction of energy conservation with a solar system. The study, therefore, established guidelines as to how to minimize capital investment while reducing the conventional fuel consumption through either a larger solar system or an energy-conserving technique. To focus the scope of energy-conservation techniques and alternative energy sources considered, five building types (house, apartment buildings, commercial buildings, schools, and office buildings) were selected. Finally, the lists of energy-conservation techniques and alternative energy sources were reduced to lists of manageable size by using technical attributes to select the best candidates for further study. The resultant energy-conservation techniques were described in detail and installed costs determined. The alternative energy source reduced to solar. Building construction characteristics were defined for each building for each of four geographic regions of the country. A mixed strategy consisting of an energy-conservation technique and solar heating/hot water/cooling system was analyzed, using computer simulation to determine the interaction between energy conservation and the solar system. Finally, using FEA fuel-price scenarios and installed costs for the solar system and energy conservation techniques, an economic analysis was performed to determine the cost effectiveness of the combination. (MCW)
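To make the kind of tradeoff concrete, here is a toy cost-effectiveness comparison in Python (all numbers invented; the study's FEA price scenarios and installed costs are not reproduced):

```python
# Toy life-cycle comparison of the tradeoff analyzed above: spend capital on
# a larger solar system or on an energy-conservation measure, then compare
# fuel-cost savings over the analysis period. All figures are invented.
options = {
    "larger solar system": {"capital": 6000.0, "fuel_saved_mmbtu_per_yr": 45.0},
    "added insulation":    {"capital": 2500.0, "fuel_saved_mmbtu_per_yr": 30.0},
}
fuel_price = 8.0   # $/MMBtu, a stand-in for one fuel-price scenario
years = 20

for name, opt in options.items():
    net = years * opt["fuel_saved_mmbtu_per_yr"] * fuel_price - opt["capital"]
    print(f"{name}: net {years}-year saving ${net:,.0f}")
```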
Solid State Audio/Speech Processor Analysis.
1980-03-01
techniques. The techniques were demonstrated to be worthwhile in an efficient real-time AWR system. Finally, microprocessor architectures were designed to...do not include custom chip development, detailed hardware design, construction or testing. ITTDCD is very encouraged by the results obtained in this...California, Berkeley, was responsible for furnishing the simulation data of OD speech analysis techniques and for the design and development of the hardware OD
NASA Technical Reports Server (NTRS)
Garvin, J. B.; Mouginis-Mark, P. J.; Head, J. W.
1981-01-01
A data collection and analysis scheme developed for the interpretation of rock morphology from lander images is reviewed with emphasis on rock population characterization techniques. Data analysis techniques are also discussed in the context of identifying key characteristics of a rock that place it in a single category with similar rocks. Actual rock characteristics observed from Viking and Venera lander imagery are summarized. Finally, some speculations regarding the block fields on Mars and Venus are presented.
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.
Mulware, Stephen Juma
2015-01-01
The properties of many biological materials often depend on the spatial distribution and concentration of the trace elements present in a matrix. Scientists have over the years tried various techniques, including classical physical and chemical analysis methods, each with a relative level of accuracy. However, with the development of spatially sensitive submicron beams, the nuclear microprobe techniques using focused proton beams for the elemental analysis of biological materials have yielded significant success. In this paper, the basic principles of the commonly used microprobe techniques of STIM, RBS, and PIXE for trace elemental analysis are discussed, along with the details of sample preparation, detection, and data collection and analysis. Finally, an application of the techniques to the analysis of corn roots for elemental distribution and concentration is presented.
The Recoverability of P-Technique Factor Analysis
ERIC Educational Resources Information Center
Molenaar, Peter C. M.; Nesselroade, John R.
2009-01-01
It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…
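For orientation, P-technique is ordinary factor analysis applied to a single individual's occasions-by-variables time series. A minimal sketch (synthetic data, not the simulations reported above):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# P-technique in miniature: factor-analyze one person's multivariate time
# series (occasions x variables). Note the fit ignores lagged relations --
# the limitation that dynamic factor models were designed to remove.
rng = np.random.default_rng(6)
T = 200                                        # measurement occasions
factor = rng.normal(size=(T, 1))               # one latent series
loadings = np.array([[0.9, 0.8, 0.7, 0.2]])    # four observed variables
X = factor @ loadings + 0.5 * rng.normal(size=(T, 4))

fa = FactorAnalysis(n_components=1).fit(X)
print(fa.components_.round(2))                 # recovered loading pattern
```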
Mino, Takuya; Maekawa, Kenji; Ueda, Akihiro; Higuchi, Shizuo; Sejima, Junichi; Takeuchi, Tetsuo; Hara, Emilio Satoshi; Kimura-Ono, Aya; Sonoyama, Wataru; Kuboki, Takuo
2015-04-01
The aim of this article was to investigate the accuracy of reproducing full-arch implant provisional restorations in final restorations, comparing a 3D Scan/CAD/CAM technique with the conventional method. We fabricated two final restorations for rehabilitation of maxillary and mandibular complete edentulous areas and performed a computer-based comparative analysis of the accuracy of reproduction between a 3D scanning and CAD/CAM (Scan/CAD/CAM) technique and the conventional silicone-mold transfer technique. Final restorations fabricated either by the conventional or the Scan/CAD/CAM method were successfully installed in the patient. The total concave/convex volume discrepancy observed with the Scan/CAD/CAM technique was 503.50 mm³ and 338.15 mm³ for maxillary and mandibular implant-supported prostheses (ISPs), respectively. In contrast, the total concave/convex volume discrepancy observed with the conventional method was markedly higher (1106.84 mm³ and 771.23 mm³ for maxillary and mandibular ISPs, respectively). The results of the present report suggest that the Scan/CAD/CAM method enables a more precise and accurate transfer of provisional restorations to final restorations compared with the conventional method. Copyright © 2014 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Li, Xinyi; Bao, Jingfu; Huang, Yulin; Zhang, Benfeng; Omori, Tatsuya; Hashimoto, Ken-ya
2018-07-01
In this paper, we propose the use of the hierarchical cascading technique (HCT) for the finite element method (FEM) analysis of bulk acoustic wave (BAW) devices. First, the implementation of this technique is presented for the FEM analysis of BAW devices. It is shown that the traveling-wave excitation sources proposed by the authors are fully compatible with the HCT. Furthermore, an HCT-based absorbing mechanism is also proposed to replace the perfectly matched layer (PML). Finally, it is demonstrated that the technique is much more efficient in terms of memory consumption and execution time than the full FEM analysis.
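The essence of hierarchical cascading is that interior degrees of freedom are eliminated level by level, so a chain of 2^k identical cells costs only k condensation steps. A NumPy sketch of the doubling step (our illustration, not the authors' FEM code; K_cell is a hypothetical stand-in for a unit cell's matrix):

```python
import numpy as np

def cascade(KA, KB, n):
    """Combine two condensed cell matrices (2n x 2n, DOFs ordered
    [left, right]) into one super-cell, eliminating the shared
    interface DOFs via a Schur complement."""
    K = np.zeros((3 * n, 3 * n))
    K[:2 * n, :2 * n] += KA            # cell A spans [left | interface]
    K[n:, n:] += KB                    # cell B spans [interface | right]
    ext = np.r_[0:n, 2 * n:3 * n]      # exterior DOFs kept
    mid = np.r_[n:2 * n]               # interface DOFs eliminated
    Kmm = K[np.ix_(mid, mid)]
    Kem = K[np.ix_(ext, mid)]
    Kme = K[np.ix_(mid, ext)]
    return K[np.ix_(ext, ext)] - Kem @ np.linalg.solve(Kmm, Kme)

# A chain of 2**10 = 1024 identical cells in only 10 condensation steps.
n = 4                                  # interface size (hypothetical)
rng = np.random.default_rng(0)
K_cell = rng.normal(size=(2 * n, 2 * n)) + 10 * np.eye(2 * n)  # stand-in
K = K_cell
for _ in range(10):
    K = cascade(K, K, n)               # doubles the cell count each step
```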
Flood frequency analysis using optimization techniques : final report.
DOT National Transportation Integrated Search
1992-10-01
This study consists of three parts. In the first part, a comprehensive investigation was made to find an improved estimation method for the log-Pearson type 3 (LP3) distribution by using optimization techniques. Ninety sets of observed Louisiana floo...
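A minimal sketch of what LP3 estimation by optimization can look like, using SciPy maximum likelihood (the peak-flow values below are invented placeholders, not the Louisiana records analyzed in the study):

```python
import numpy as np
from scipy import stats, optimize

q = np.array([12400., 8900., 15100., 7300., 22000.,
              9800., 11200., 13500.])          # hypothetical annual peaks
y = np.log10(q)                                # LP3: log-flows ~ Pearson III

def neg_loglik(theta):
    skew, loc, scale = theta
    if scale <= 0:
        return np.inf
    ll = stats.pearson3.logpdf(y, skew, loc=loc, scale=scale)
    return -ll.sum() if np.all(np.isfinite(ll)) else np.inf

x0 = (stats.skew(y), y.mean(), y.std(ddof=1))  # method-of-moments start
res = optimize.minimize(neg_loglik, x0, method="Nelder-Mead")
skew, loc, scale = res.x

# 100-year flood: the quantile at annual exceedance probability 0.01.
q100 = 10 ** stats.pearson3.ppf(0.99, skew, loc=loc, scale=scale)
print(q100)
```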
Strain-energy release rate analysis of a laminate with a postbuckled delamination
NASA Technical Reports Server (NTRS)
Whitcomb, John D.; Shivakumar, K. N.
1987-01-01
The objectives are to present the derivation of the new virtual crack closure technique, evaluate the accuracy of the technique, and finally to present the results of a limited parametric study of laminates with a postbuckled delamination. Although the new virtual crack closure technique is general, only homogeneous, isotropic laminates were analyzed. This was to eliminate the variation of flexural stiffness with orientation, which occurs even for quasi-isotropic laminates, and made it easier to identify the effect of geometrical parameters on G. The new virtual crack closure technique is derived. Then the specimen configurations are described. Next, the stress analysis is discussed. Finally, the virtual crack closure technique is evaluated and then used to calculate the distribution of G along the delamination front of several laminates with a postbuckled delamination.
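For orientation, virtual crack closure estimates the strain-energy release rate from the nodal forces at the crack tip and the relative displacements of the node pair just behind it. A common textbook form for the mode I and mode II components (a generic statement, not necessarily the exact discretization derived in the report) is

$$ G_I = \frac{F_y\,\Delta v}{2\,b\,\Delta a}, \qquad G_{II} = \frac{F_x\,\Delta u}{2\,b\,\Delta a}, $$

where $F_x$, $F_y$ are the crack-tip nodal forces, $\Delta u$, $\Delta v$ the relative sliding and opening displacements one element behind the tip, $\Delta a$ the virtual crack extension (element length), and $b$ the width associated with the node.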
DOT National Transportation Integrated Search
1998-09-14
A methodology for assessing the effectiveness of access management techniques on suburban arterial highways is developed. The methodology is described as a seven-step process as follows: (1) establish the purpose of the analysis (2) establish the mea...
Mathematics Competency for Beginning Chemistry Students Through Dimensional Analysis.
Pursell, David P; Forlemu, Neville Y; Anagho, Leonard E
2017-01-01
Mathematics competency in nursing education and practice may be addressed by an instructional variation of the traditional dimensional analysis technique typically presented in beginning chemistry courses. The authors studied 73 beginning chemistry students using the typical dimensional analysis technique and the variation technique. Student quantitative problem-solving performance was evaluated. Students using the variation technique scored significantly better (18.3 of 20 points, p < .0001) on the final examination quantitative titration problem than those who used the typical technique (10.9 of 20 points). American Chemical Society examination scores and in-house assessment indicate that better performing beginning chemistry students were more likely to use the variation technique rather than the typical technique. The variation technique may be useful as an alternative instructional approach to enhance beginning chemistry students' mathematics competency and problem-solving ability in both education and practice. [J Nurs Educ. 2017;56(1):22-26.]. Copyright 2017, SLACK Incorporated.
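For readers unfamiliar with the technique, dimensional analysis strings conversion factors together so that unwanted units cancel until only the target unit remains; a generic titration chain (illustrative, not an item from the study's examination) is

$$ 25.0\ \text{mL HCl} \times \frac{1\ \text{L}}{1000\ \text{mL}} \times \frac{0.100\ \text{mol HCl}}{1\ \text{L HCl}} \times \frac{1\ \text{mol NaOH}}{1\ \text{mol HCl}} \times \frac{1}{0.0500\ \text{L NaOH}} = 0.0500\ \text{M NaOH}. $$

Each factor equals one dimensionally, so the arithmetic and the unit bookkeeping are carried out in a single pass.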
Mathematical analysis techniques for modeling the space network activities
NASA Technical Reports Server (NTRS)
Foster, Lisa M.
1992-01-01
The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.
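To give a flavor of the linear-programming formulation, the toy model below (entirely invented; the report's actual variables and constraints are not reproduced) allocates a single relay antenna's daily contact hours among three user satellites:

```python
import numpy as np
from scipy.optimize import linprog

weights = np.array([3.0, 2.0, 1.0])   # mission priorities (assumed)
c = -weights                           # linprog minimizes, so negate
A_ub = [[1.0, 1.0, 1.0]]               # one antenna: at most 24 h/day total
b_ub = [24.0]
bounds = [(2.0, 12.0)] * 3             # each user needs 2-12 h/day

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, -res.fun)                 # optimal hours and weighted coverage
```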
Evaluation of Uranium-235 Measurement Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaspar, Tiffany C.; Lavender, Curt A.; Dibert, Mark W.
2017-05-23
Monolithic U-Mo fuel plates are rolled to final fuel element form from the original cast ingot, and thus any inhomogeneities in 235U distribution present in the cast ingot are maintained, and potentially exaggerated, in the final fuel foil. The tolerance for inhomogeneities in the 235U concentration in the final fuel element foil is very low. A near-real-time, nondestructive technique to evaluate the 235U distribution in the cast ingot is required in order to provide feedback to the casting process. Based on the technical analysis herein, gamma spectroscopy has been recommended to provide a near-real-time measure of the 235U distribution in U-Mo cast plates.
Critical evaluation of sample pretreatment techniques.
Hyötyläinen, Tuulia
2009-06-01
Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.
Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.
Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam
2018-01-01
During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, support explicit communication of progress and uncertainty with annotation, and implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, help increase an awareness of prior investigative process and insights, as well as improve final investigative outcomes.
Investigating cardiorespiratory interaction by cross-spectral analysis of event series
NASA Astrophysics Data System (ADS)
Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen
2000-02-01
The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.
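The general idea — treat heartbeats as a point-like event series and examine its cross-spectral relation to respiration — can be sketched with ordinary coherence from SciPy standing in for the authors' event-series estimator (all signals synthetic):

```python
import numpy as np
from scipy import signal

fs = 4.0                                     # analysis sampling rate, Hz
t = np.arange(0, 300, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)          # 0.25 Hz respiration (synthetic)

# Heartbeats as events: ~1 Hz rhythm frequency-modulated by respiration.
phase = np.cumsum(2 * np.pi * (1.0 + 0.1 * resp) / fs)
beats = (np.diff(np.floor(phase / (2 * np.pi))) > 0).astype(float)

f, Cxy = signal.coherence(beats, resp[1:], fs=fs, nperseg=256)
print(f[np.argmax(Cxy)])                     # expect a peak near 0.25 Hz
```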
A comparative analysis of business process modelling techniques
NASA Astrophysics Data System (ADS)
Tangkawarow, I. R. H. T.; Waworuntu, J.
2016-04-01
Many business process modelling techniques are in use today. This article reports research on the differences among them, explaining the definition and structure of each technique. The paper presents a comparative analysis of several popular business process modelling techniques; the comparative framework is based on two criteria: notation and how each technique works when implemented for Somerleyton Animal Park. The advantages and disadvantages of each technique are summarized, and the final conclusion recommends business process modelling techniques that are easy to use and can serve as the basis for evaluating further modelling techniques.
Determination of shell content by activation analysis : final report.
DOT National Transportation Integrated Search
1978-08-01
The objective of this study is to determine if the neutron activation analysis technique, developed under Research Project 70-1ST, can be used to determine the shell content of a sand-shell mixture. In order to accomplish this objective, samples of san...
Magnetic separation techniques in sample preparation for biological analysis: a review.
He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke
2014-12-01
Sample preparation is a fundamental and essential step in almost all analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with the advantages of superparamagnetic properties, good biocompatibility, and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advances in magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis are reviewed. The strategy of magnetic separation techniques is summarized. The synthesis, stabilization, and bio-functionalization of magnetic nanoparticles are reviewed in detail, and the characterization of magnetic materials is also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of proteins, nucleic acids, cells, and bioactive compounds, and for the immobilization of enzymes, are described. Finally, the existing problems and possible future trends of magnetic separation techniques for biological analysis are proposed. Copyright © 2014 Elsevier B.V. All rights reserved.
Estimation for bilinear stochastic systems
NASA Technical Reports Server (NTRS)
Willsky, A. S.; Marcus, S. I.
1974-01-01
Three techniques for the solution of bilinear estimation problems are presented. First, finite dimensional optimal nonlinear estimators are presented for certain bilinear systems evolving on solvable and nilpotent lie groups. Then the use of harmonic analysis for estimation problems evolving on spheres and other compact manifolds is investigated. Finally, an approximate estimation technique utilizing cumulants is discussed.
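For context, "bilinear" here means the drift is jointly linear in the state and in the inputs or noise; a common continuous-time form (a generic textbook sketch, not the paper's exact notation) is

$$ dx_t = \Big( A_0 + \sum_{i=1}^{m} u_i(t)\, A_i \Big) x_t \, dt + \sum_{j=1}^{r} B_j x_t \, dw_j(t), $$

where the state multiplies both the controls $u_i$ and the Wiener increments $dw_j$ — exactly the coupling that pushes the estimation problem outside the standard linear-Gaussian (Kalman) setting.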
Enamel paint techniques in archaeology and their identification using XRF and micro-XRF
NASA Astrophysics Data System (ADS)
Hložek, M.; Trojek, T.; Komoróczy, B.; Prokeš, R.
2017-08-01
This investigation focuses in detail on the analysis of discoveries in South Moravia - important sites from the Roman period in Pasohlávky and Mušov. Using X-ray fluorescence analysis and micro-analysis, we help identify the techniques of enamel paint and give a thorough chemical analysis of details that would not be possible to determine by means of macroscopic examination. We thus address the influence of elemental composition on the final colour of the enamel paint and describe the less known technique of combining enamel with millefiori. The material analyses of the metal artefacts decorated with enamel paint significantly contribute to our knowledge of the technology used during the Roman period.
ERIC Educational Resources Information Center
Crawshaw, Robert
An examination of the principles and techniques of oral testing in British university-level final examinations in modern languages discusses: (1) the shortcomings of present oral testing procedures; (2) the theoretical controversy surrounding the design and value of oral proficiency tests, arising from research in English as a second language…
Boosting Higgs pair production in the bb̄bb̄ final state with multivariate techniques.
Behr, J Katharina; Bortoletto, Daniela; Frost, James A; Hartland, Nathan P; Issever, Cigdem; Rojo, Juan
2016-01-01
The measurement of Higgs pair production will be a cornerstone of the LHC program in the coming years. Double Higgs production provides a crucial window upon the mechanism of electroweak symmetry breaking and has a unique sensitivity to the Higgs trilinear coupling. We study the feasibility of a measurement of Higgs pair production in the bb̄bb̄ final state at the LHC. Our analysis is based on a combination of traditional cut-based methods with state-of-the-art multivariate techniques. We account for all relevant backgrounds, including the contributions from light and charm jet mis-identification, which are ultimately comparable in size to the irreducible 4b QCD background. We demonstrate the robustness of our analysis strategy in a high-pileup environment. For an integrated luminosity of 3 ab⁻¹, a signal significance of [Formula: see text] is obtained, indicating that the bb̄bb̄ final state alone could allow for the observation of double Higgs production at the High Luminosity LHC.
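For orientation, a standard counting-experiment estimate of signal significance (the Asimov approximation; the numbers below are illustrative, not the paper's yields) is straightforward to compute:

```python
import math

def asimov_significance(s, b):
    """Median discovery significance for s signal over b background events."""
    return math.sqrt(2 * ((s + b) * math.log(1 + s / b) - s))

print(asimov_significance(50, 1000))   # ~1.57 sigma for these toy yields
```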
Preliminary assessment of aerial photography techniques for canvasback population analysis
Munro, R.E.; Trauger, D.L.
1976-01-01
Recent intensive research on the canvasback has focused attention on the need for more precise estimates of population parameters. During the 1972-75 period, various types of aerial photographing equipment were evaluated to determine the problems and potentials for employing these techniques in appraisals of canvasback populations. The equipment and procedures available for automated analysis of aerial photographic imagery were also investigated. Serious technical problems remain to be resolved, but some promising results were obtained. Final conclusions about the feasibility of operational implementation await a more rigorous analysis of the data collected.
GLO-STIX: Graph-Level Operations for Specifying Techniques and Interactive eXploration
Stolper, Charles D.; Kahng, Minsuk; Lin, Zhiyuan; Foerster, Florian; Goel, Aakash; Stasko, John; Chau, Duen Horng
2015-01-01
The field of graph visualization has produced a wealth of visualization techniques for accomplishing a variety of analysis tasks. Therefore analysts often rely on a suite of different techniques, and visual graph analysis application builders strive to provide this breadth of techniques. To provide a holistic model for specifying network visualization techniques (as opposed to considering each technique in isolation) we present the Graph-Level Operations (GLO) model. We describe a method for identifying GLOs and apply it to identify five classes of GLOs, which can be flexibly combined to re-create six canonical graph visualization techniques. We discuss advantages of the GLO model, including potentially discovering new, effective network visualization techniques and easing the engineering challenges of building multi-technique graph visualization applications. Finally, we implement the GLOs that we identified into the GLO-STIX prototype system that enables an analyst to interactively explore a graph by applying GLOs. PMID:26005315
Web-Based Trainer for Electrical Circuit Analysis
ERIC Educational Resources Information Center
Weyten, L.; Rombouts, P.; De Maeyer, J.
2009-01-01
A Web-based system for training electric circuit analysis is presented in this paper. It is centered on symbolic analysis techniques and it not only verifies the student's final answer, but it also tracks and coaches him/her through all steps of his/her reasoning path. The system mimics homework assignments, enhanced by immediate personalized…
An overview of data acquisition, signal coding and data analysis techniques for MST radars
NASA Technical Reports Server (NTRS)
Rastogi, P. K.
1986-01-01
An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high power MST/ST (mesosphere-stratosphere-troposphere/stratosphere-troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, which are characterized on the basis of their disparate time scales. Signal coding is then discussed, with a brief description of frequently used codes and their limitations. Finally, several aspects of statistical data processing, such as signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques, are discussed.
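As a small illustration of the signal-coding step, the sketch below applies binary phase coding with a 13-bit Barker sequence and matched-filter decoding — a generic pulse-compression example, not any specific MST radar implementation:

```python
import numpy as np

barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

rng = np.random.default_rng(0)
echo = np.zeros(200)
echo[80:80 + 13] = 0.5 * barker13      # weak coded echo starting at gate 80
echo += rng.normal(0, 0.3, echo.size)  # receiver noise

compressed = np.correlate(echo, barker13, mode="same")
print(np.argmax(np.abs(compressed)))   # ~86: center of the compressed pulse
```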
Wing download reduction using vortex trapping plates
NASA Technical Reports Server (NTRS)
Light, Jeffrey S.; Stremel, Paul M.; Bilanin, Alan J.
1994-01-01
A download reduction technique using spanwise plates on the upper and lower wing surfaces has been examined. Experimental and analytical techniques were used to determine the download reduction obtained using this technique. Simple two-dimensional wind tunnel testing confirmed the validity of the technique for reducing two-dimensional airfoil drag. Computations using a two-dimensional Navier-Stokes analysis provided insight into the mechanism causing the drag reduction. Finally, the download reduction technique was tested using a rotor and wing to determine the benefits for a semispan configuration representative of a tilt rotor aircraft.
A pilot modeling technique for handling-qualities research
NASA Technical Reports Server (NTRS)
Hess, R. A.
1980-01-01
A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail and a novel approach to the a priori selection of pertinent model parameters is discussed. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.
NASA Technical Reports Server (NTRS)
Migneault, Gerard E.
1987-01-01
Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.
The combined use of order tracking techniques for enhanced Fourier analysis of order components
NASA Astrophysics Data System (ADS)
Wang, K. S.; Heyns, P. S.
2011-04-01
Order tracking is one of the most important vibration analysis techniques for diagnosing faults in rotating machinery. It can be performed in many different ways, each of these with distinct advantages and disadvantages. However, in the end the analyst will often use Fourier analysis to transform the data from a time series to frequency or order spectra. It is therefore surprising that the study of the Fourier analysis of order-tracked systems seems to have been largely ignored in the literature. This paper considers the frequently used Vold-Kalman filter-based order tracking and computed order tracking techniques. The main pros and cons of each technique for Fourier analysis are discussed and the sequential use of Vold-Kalman filtering and computed order tracking is proposed as a novel idea to enhance the results of Fourier analysis for determining the order components. The advantages of the combined use of these order tracking techniques are demonstrated numerically on an SDOF rotor simulation model. Finally, the approach is also demonstrated on experimental data from a real rotating machine.
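A minimal computed-order-tracking sketch (synthetic run-up signal; our illustration, not the paper's implementation) shows the key step of resampling from the time domain to the shaft-angle domain before applying the Fourier transform:

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 10, 1 / fs)
rpm = 600 + 60 * t                        # linear run-up (synthetic)
revs_t = np.cumsum(rpm / 60 / fs)         # shaft revolutions vs. time
vib = np.sin(2 * np.pi * 3 * revs_t)      # a pure 3rd-order component

spr = 64                                  # samples per revolution
revs = np.arange(0, revs_t[-1], 1 / spr)
vib_angle = np.interp(revs, revs_t, vib)  # angular-domain resampling

spectrum = np.abs(np.fft.rfft(vib_angle))
orders = np.fft.rfftfreq(len(vib_angle), d=1 / spr)  # axis in orders
print(orders[np.argmax(spectrum)])        # ~3.0: the 3rd order, now stationary
```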
Automated analysis and classification of melanocytic tumor on skin whole slide images.
Xu, Hongming; Lu, Cheng; Berendt, Richard; Jha, Naresh; Mandal, Mrinal
2018-06-01
This paper presents a computer-aided technique for automated analysis and classification of melanocytic tumor on skin whole slide biopsy images. The proposed technique consists of four main modules. First, skin epidermis and dermis regions are segmented by a multi-resolution framework. Next, epidermis analysis is performed, where a set of epidermis features reflecting nuclear morphologies and spatial distributions is computed. In parallel with epidermis analysis, dermis analysis is also performed, where dermal cell nuclei are segmented and a set of textural and cytological features are computed. Finally, the skin melanocytic image is classified into different categories such as melanoma, nevus or normal tissue by using a multi-class support vector machine (mSVM) with extracted epidermis and dermis features. Experimental results on 66 skin whole slide images indicate that the proposed technique achieves more than 95% classification accuracy, which suggests that the technique has the potential to be used for assisting pathologists on skin biopsy image analysis and classification. Copyright © 2018 Elsevier Ltd. All rights reserved.
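The final classification stage maps naturally onto scikit-learn; the sketch below uses a random stand-in for the extracted epidermis/dermis feature matrix, so the score is meaningful only as a structural demonstration, not a reproduction of the 95% result:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(66, 24))        # 66 slides x 24 features (placeholder)
y = rng.integers(0, 3, size=66)      # 0 = normal, 1 = nevus, 2 = melanoma

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))  # multi-class SVM
print(cross_val_score(clf, X, y, cv=5).mean())
```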
Column-coupling strategies for multidimensional electrophoretic separation techniques.
Kler, Pablo A; Sydes, Daniel; Huhn, Carolin
2015-01-01
Multidimensional electrophoretic separations represent one of the most common strategies for dealing with the analysis of complex samples. In recent years we have been witnessing the explosive growth of separation techniques for the analysis of complex samples in applications ranging from the life sciences to industry. In this sense, electrophoretic separations offer several strategic advantages, such as excellent separation efficiency, different methods with a broad range of separation mechanisms, and low liquid consumption, generating fewer waste effluents and lower costs per analysis, among others. Despite their impressive separation efficiency, multidimensional electrophoretic separations present some drawbacks that have delayed their extensive use: the volumes of the columns, and consequently of the injected sample, are significantly smaller compared to other analytical techniques, so the coupling interfaces between two separation components must be very efficient in terms of providing geometrical precision with low dead volume. Likewise, very sensitive detection systems are required. Additionally, in electrophoretic separation techniques, the surface properties of the columns play a fundamental role for electroosmosis as well as for the unwanted adsorption of proteins or other complex biomolecules. In this sense, the requirements for an efficient coupling of electrophoretic separation techniques involve several aspects related to microfluidics and to the physicochemical interactions of the electrolyte solutions with the solid capillary walls. It is interesting to see how these multidimensional electrophoretic separation techniques have been used jointly with different detection techniques, for intermediate detection as well as for final identification and quantification, particularly important in the case of mass spectrometry. In this work we present a critical review of the different strategies for coupling two or more electrophoretic separation techniques and of the different intermediate and final detection methods implemented for such separations.
Sonic Fatigue Design Techniques for Advanced Composite Aircraft Structures
1980-04-01
AFWAL-TR-80-3019 (AD-A090553), Final Report, Ian Holehouse, Rohr Industries. Contents include general sonic fatigue theory, composite laminate analysis, and preliminary sonic fatigue...overall sonic fatigue design guides. These existing design methods have been developed for metal structures. However, recent advanced composite
ERIC Educational Resources Information Center
Rimoldi, Horacio J. A.
The study of problem solving is made through the analysis of the process that leads to the final answer. The type of information obtained through the study of the process is compared with the information obtained by studying the final answer. The experimental technique used permits identification of the sequence of questions (tactics) that subjects ask…
ERIC Educational Resources Information Center
McKey, Ruth Hubbell; And Others
Including all Head Start research (both published and unpublished) and using, when possible, the statistical technique of meta-analysis, this final report of the Head Start Evaluation, Synthesis, and Utilization Project presents findings on the impact of Head Start on children's cognitive and socioemotional development, on child health and health…
SEM Analysis Techniques for LSI Microcircuits. Volume 2
1980-08-01
4~, 1 v’ ’ RADC-TR80-250, Vol 11 (of two), Final Technical -Report, Augut1980 SEM, ANALYSIS TECHNIQUES, FOR LSI MICROCIRCUITS: ’Martin...Bit Static ’RAM.. Volume II - 1024 Bit Stat’i RAM, 4096 Bit Dynamic RAM (SiGATE WOS,)., 4096 Bit -Dynamic RAM ( 1 2 L Bipolar)., ,Summary. RADC-TR-80-250...States, ithout.irst obtani an export nse, is a violation t Internatio 1 Tr ffic in A . eguiations. Such violation is subject o penalty of to 2 years impr
Sandra, Koen; Moshir, Mahan; D'hondt, Filip; Tuytten, Robin; Verleysen, Katleen; Kas, Koen; François, Isabelle; Sandra, Pat
2009-04-15
Multidimensional liquid-based separation techniques are described for maximizing the resolution of the enormous number of peptides generated upon tryptic digestion of proteomes, and hence, reduce the spatial and temporal complexity of the sample to a level that allows successful mass spectrometric analysis. This review complements the previous contribution on unidimensional high performance liquid chromatography (HPLC). Both chromatography and electrophoresis will be discussed albeit with reversed-phase HPLC (RPLC) as the final separation dimension prior to MS analysis.
Battery Test Manual For 48 Volt Mild Hybrid Electric Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Lee Kenneth
2017-03-01
This manual details the U.S. Advanced Battery Consortium and U.S. Department of Energy Vehicle Technologies Program goals, test methods, and analysis techniques for a 48 Volt Mild Hybrid Electric Vehicle system. The test methods are outlined starting with characterization tests, followed by life tests. The final section details standardized analysis techniques for 48 V systems that allow for the comparison of different programs that use this manual. An example test plan is included, along with guidance for filling in gap table numbers.
Flow analysis techniques for phosphorus: an overview.
Estela, José Manuel; Cerdà, Víctor
2005-04-15
A bibliographical review on the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also given. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are classified according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall scope. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, together with their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soil extracts and cyanobacterial biofilms.
Post-test navigation data analysis techniques for the shuttle ALT
NASA Technical Reports Server (NTRS)
1975-01-01
Postflight test analysis data processing techniques for shuttle approach and landing tests (ALT) navigation data are defined. Postflight test processor requirements are described along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described based on the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented. Conclusions are presented.
Thomas, Anchu Rachel; Velmurugan, Natanasabapathy; Smita, Surendran; Jothilatha, Sundaramurthy
2014-10-01
The purpose of this study was to evaluate the canal isthmus debridement efficacy of a new modified EndoVac (Discus Dental, Culver City, CA) irrigation protocol in comparison with the EndoVac, passive ultrasonic irrigation (PUI), and conventional needle irrigation in mesial roots of mandibular molars. The mesial roots of 64 extracted mandibular molars, mounted in resin using Kuttler's endodontic cube and sectioned at 2 and 4 mm from the working length, were randomly divided into 4 groups (n = 16): group 1: Max-I-Probe (Dentsply Tulsa Dental, York, PA), group 2: EndoVac (EVI), group 3: modified EndoVac, and group 4: PUI. The specimens were reassembled and instrumented. A standard irrigation protocol was used during cleaning and shaping, followed by final irrigation with the 4 irrigation/agitation techniques. Images of the isthmus region were taken before and after cleaning and shaping and after final irrigation. The percentage reduction of debris in the isthmus region was calculated using the software program ImageJ (v1.43; National Institutes of Health, Bethesda, MD). Intergroup analysis was performed using the Kruskal-Wallis and Mann-Whitney U tests. Intragroup analysis was performed using Friedman and Wilcoxon signed rank tests. The level of significance was set at P < .05. Intragroup analysis revealed a statistically significant difference in the percentage reduction of debris after cleaning and shaping and after the final irrigation protocol in all the groups (P < .001). The final irrigation protocol produced significantly cleaner canal isthmuses in all the groups (P < .001). On intergroup analysis, the modified EVI group performed significantly better than the other groups. The EVI and PUI groups performed better than the Max-I-Probe group. There was no statistically significant difference between the EVI and PUI groups. Canal isthmuses were significantly cleaner with the modified EndoVac irrigation technique than with the other irrigation systems. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
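The nonparametric tests named above map directly onto SciPy; the sketch below uses invented percent-debris-reduction scores (n = 16 per group), not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
groups = {
    "Max-I-Probe":      rng.normal(55, 10, 16),
    "EndoVac":          rng.normal(70, 10, 16),
    "modified EndoVac": rng.normal(85, 8, 16),
    "PUI":              rng.normal(68, 10, 16),
}

h, p = stats.kruskal(*groups.values())        # omnibus intergroup test
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.4f}")
if p < 0.05:                                  # pairwise follow-up
    u, p2 = stats.mannwhitneyu(groups["modified EndoVac"], groups["EndoVac"])
    print(f"modified EndoVac vs EndoVac: U = {u:.1f}, p = {p2:.4f}")
```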
ERIC Educational Resources Information Center
Silverman, Mitchell
Reported are the first phase activities of a longitudinal project designed to evaluate the effectiveness of Guided Group Interaction (GGI) technique as a meaningful approach in the field of corrections. The main findings relate to the establishment of reliability for the main components of the Revised Behavior Scores System developed to assess the…
NASA Technical Reports Server (NTRS)
Balas, Gary J.
1996-01-01
This final report summarizes the research results under NASA Contract NAG-1-1254 from May 1991 to April 1995. The main contributions of this research are in the areas of control of flexible structures, model validation, optimal control analysis and synthesis techniques, and the use of shape memory alloys for structural damping.
Comprehension-Driven Program Analysis (CPA) for Malware Detection in Android Phones
2015-07-01
Comprehension-Driven Program Analysis (CPA) for Malware Detection in Android Phones. Iowa State University, July 2015, Final Technical Report. ...machine analysis system to detect novel, sophisticated Android malware. (c) An innovative library summarization technique and its incorporation in
In vivo stationary flux analysis by 13C labeling experiments.
Wiechert, W; de Graaf, A A
1996-01-01
Stationary flux analysis is an invaluable tool for metabolic engineering. In recent years the metabolite balancing technique has become well established in the bioengineering community. On the other hand, metabolic tracer experiments using 13C isotopes have long been used for intracellular flux determination. Only recently have both techniques been fully combined to form a considerably more powerful flux analysis method. This paper concentrates on modeling and data analysis for the evaluation of such stationary 13C labeling experiments. After reviewing recent experimental developments, the basic equations for modeling carbon labeling in metabolic systems, i.e. metabolite, carbon label and isotopomer balances, are introduced and discussed in some detail. Then the basics of flux estimation from measured extracellular fluxes combined with carbon labeling data are presented and, finally, this method is illustrated by using an example from C. glutamicum. The main emphasis is on the investigation of the extra information that can be obtained with tracer experiments compared with the metabolite balancing technique alone. As a principal result it is shown that the combined flux analysis method can dispense with some rather doubtful assumptions on energy balancing and that the forward and backward flux rates of bidirectional reaction steps can be simultaneously determined in certain situations. Finally, it is demonstrated that the variant of fractional isotopomer measurement is even more powerful than fractional labeling measurement but requires much higher numerical effort to solve the balance equations.
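To make the metabolite-balancing baseline concrete, here is a least-squares flux estimate for a hypothetical two-metabolite, four-flux network (a toy of ours; the paper's 13C label and isotopomer balances add information on top of exactly this kind of system):

```python
import numpy as np

# Steady state: S @ v = 0 for intracellular metabolites A and B, plus two
# measured exchange fluxes. Columns are fluxes v1..v4 (hypothetical network:
# uptake -> A via v1, A -> B via v2, A -> out via v3, B -> out via v4).
S = np.array([[1., -1., -1.,  0.],    # balance around A
              [0.,  1.,  0., -1.]])   # balance around B
measured = {0: 10.0, 3: 4.0}          # v1 and v4 measured

rows, rhs = list(S), [0.0, 0.0]
for j, val in measured.items():       # append measurement constraints
    e = np.zeros(4)
    e[j] = 1.0
    rows.append(e)
    rhs.append(val)

v, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
print(v)                              # [10.  4.  6.  4.]
```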
Computed tomography for non-destructive evaluation of composites: Applications and correlations
NASA Technical Reports Server (NTRS)
Goldberg, B.; Hediger, L.; Noel, E.
1985-01-01
The state-of-the-art fabrication techniques for composite materials are such that stringent species-specific acceptance criteria must be generated to ensure product reliability. Non-destructive evaluation techniques including computed tomography (CT), X-ray radiography (RT), and ultrasonic scanning (UT) are investigated and compared to determine their applicability and limitations for graphite-epoxy, carbon-carbon, and carbon-phenolic materials. While the techniques appear complementary, CT is shown to provide significant, heretofore unattainable data. Finally, a correlation of NDE techniques to destructive analysis is presented.
NASA Astrophysics Data System (ADS)
Taylor, Stephen R.; Simon, Joseph; Sampson, Laura
2017-01-01
The final parsec of supermassive black-hole binary evolution is subject to the complex interplay of stellar loss-cone scattering, circumbinary disk accretion, and gravitational-wave emission, with binary eccentricity affected by all of these. The strain spectrum of gravitational-waves in the pulsar-timing band thus encodes rich information about the binary population's response to these various environmental mechanisms. Current spectral models have heretofore followed basic analytic prescriptions, and attempt to investigate these final-parsec mechanisms in an indirect fashion. Here we describe a new technique to directly probe the environmental properties of supermassive black-hole binaries through "Bayesian model-emulation". We perform black-hole binary population synthesis simulations at a restricted set of environmental parameter combinations, compute the strain spectra from these, then train a Gaussian process to learn the shape of the spectrum at any point in parameter space. We describe this technique, demonstrate its efficacy with a program of simulated datasets, then illustrate its power by directly constraining final-parsec physics in a Bayesian analysis of the NANOGrav 5-year dataset. The technique is fast, flexible, and robust.
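A compact sketch of the emulation idea — train a Gaussian process on spectra computed at a sparse set of environment parameters, then predict the spectrum anywhere in the space — might look as follows (the "simulator" here is a toy stand-in, not the authors' population-synthesis code):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(3)
theta = rng.uniform(0.0, 1.0, size=(30, 2))  # e.g. density, eccentricity
freqs = np.logspace(-9, -7, 20)              # pulsar-timing band, Hz

def run_population_synthesis(th):
    """Stand-in simulator: f^(-2/3) spectrum with an environment-driven bend."""
    f_bend = 10 ** (-8.5 + 0.5 * th[0])
    return (-2 / 3) * np.log10(freqs) - np.log10(1 + (f_bend / freqs) ** (1 + th[1]))

Y = np.array([run_population_synthesis(th) for th in theta])

gp = GaussianProcessRegressor(ConstantKernel() * RBF([0.2, 0.2]), normalize_y=True)
gp.fit(theta, Y)                             # one GP over all frequency bins
print(gp.predict(np.array([[0.5, 0.5]])))    # emulated log-spectrum anywhere
```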
Huang, Junfeng; Wang, Fangjun; Ye, Mingliang; Zou, Hanfa
2014-11-06
Comprehensive analysis of the post-translational modifications (PTMs) on proteins at the proteome level is crucial to elucidate the regulatory mechanisms of various biological processes. In the past decades, thanks to the development of specific PTM enrichment techniques and efficient multidimensional liquid chromatography (LC) separation strategies, the identification of protein PTMs has made tremendous progress. A huge number of modification sites for some major protein PTMs have been identified by proteomics analysis. In this review, we first introduced the recent progress of PTM enrichment methods for the analysis of several major PTMs including phosphorylation, glycosylation, ubiquitination, acetylation, methylation, and oxidation/reduction status. We then briefly summarized the challenges for PTM enrichment. Finally, we introduced the fractionation and separation techniques for efficient separation of PTM peptides in large-scale PTM analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
An Approach Based on Social Network Analysis Applied to a Collaborative Learning Experience
ERIC Educational Resources Information Center
Claros, Iván; Cobos, Ruth; Collazos, César A.
2016-01-01
The Social Network Analysis (SNA) techniques allow modelling and analysing the interaction among individuals based on their attributes and relationships. This approach has been used by several researchers in order to measure the social processes in collaborative learning experiences. But oftentimes such measures were calculated at the final state…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhl, D.E.
1976-08-05
During the thirteen year duration of this contract the goal has been to develop and apply computer based analysis of radionuclide scan data so as to make available improved diagnostic information based on a knowledge of localized quantitative estimates of radionuclide concentration. Results are summarized. (CH)
A Meta-Analysis of Previous Research on the Treatment of Hyperactivity. Final Report.
ERIC Educational Resources Information Center
White, Karl R.; And Others
Using meta-analysis techniques, the study sought to identify, integrate, and synthesize the literature from 61 articles which review the efficacy of various treatments for hyperactive children. The major objectives were to determine if drugs can be used effectively with hyperactive children, what child and intervention characteristics covary with…
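For readers unfamiliar with the machinery, the core meta-analytic step is pooling per-study effect sizes with inverse-variance weights; a fixed-effect sketch with invented numbers:

```python
import numpy as np

d = np.array([0.45, 0.30, 0.60, 0.10])    # per-study standardized mean diffs
var = np.array([0.02, 0.05, 0.04, 0.03])  # their sampling variances

w = 1 / var                                # inverse-variance weights
d_pooled = np.sum(w * d) / np.sum(w)
se = np.sqrt(1 / np.sum(w))
print(d_pooled, (d_pooled - 1.96 * se, d_pooled + 1.96 * se))
```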
Sereshti, Hassan; Poursorkh, Zahra; Aliakbarzadeh, Ghazaleh; Zarre, Shahin; Ataolahi, Sahar
2018-01-15
The quality of saffron, a valuable food additive, could considerably affect consumers' health. In this work, a novel preprocessing strategy for image analysis of saffron thin layer chromatographic (TLC) patterns was introduced. This includes performing a series of image pre-processing techniques on TLC images such as compression, inversion, elimination of the general baseline (using asymmetric least squares (AsLS)), removal of spot shifts and concavity (by correlation optimized warping (COW)), and finally conversion to RGB chromatograms. Subsequently, an unsupervised multivariate data analysis including principal component analysis (PCA) and k-means clustering was utilized to investigate the soil salinity effect, as a cultivation parameter, on saffron TLC patterns. This method was used as a rapid and simple technique to obtain the chemical fingerprints of saffron TLC images. Finally, the separated TLC spots were chemically identified using high-performance liquid chromatography-diode array detection (HPLC-DAD). Accordingly, the saffron quality from different areas of Iran was evaluated and classified. Copyright © 2017 Elsevier Ltd. All rights reserved.
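Of the preprocessing steps listed, the AsLS baseline removal is the most algorithmic; a compact implementation in the style of Eilers and Boelens (our sketch, with illustrative smoothing parameters) is:

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline: smooth curve that hugs the signal
    from below, so subtracting it removes the general baseline drift."""
    L = len(y)
    D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(L - 2, L))
    w = np.ones(L)
    z = y
    for _ in range(n_iter):
        W = sparse.diags(w)
        z = spsolve(sparse.csc_matrix(W + lam * D.T @ D), w * y)
        w = np.where(y > z, p, 1 - p)      # asymmetric weighting
    return z

x = np.linspace(0, 1, 500)
trace = np.exp(-((x - 0.4) / 0.02) ** 2) + 0.5 * x   # spot + drifting baseline
corrected = trace - asls_baseline(trace)
```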
Bullying in Virtual Learning Communities.
Nikiforos, Stefanos; Tzanavaris, Spyros; Kermanidis, Katia Lida
2017-01-01
Bullying through the internet has been investigated and analyzed mainly in the field of social media. In this paper, we attempt to analyze bullying in Virtual Learning Communities using Natural Language Processing (NLP) techniques, mainly in the context of sociocultural learning theories. To this end, four case studies were conducted. We aim to apply NLP techniques to speech analysis on communication data of online communities. Emphasis is given to qualitative data, taking into account the subjectivity of the collaborative activity. Finally, this is the first time such an analysis has been attempted on Greek data.
NASA/ASEE Summer Faculty Fellowship Program, 1990, Volume 1
NASA Technical Reports Server (NTRS)
Bannerot, Richard B. (Editor); Goldstein, Stanley H. (Editor)
1990-01-01
The 1990 Johnson Space Center (JSC) NASA/American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program was conducted by the University of Houston-University Park and JSC. A compilation of the final reports on the research projects are presented. The topics covered include: the Space Station; the Space Shuttle; exobiology; cell biology; culture techniques; control systems design; laser induced fluorescence; spacecraft reliability analysis; reduced gravity; biotechnology; microgravity applications; regenerative life support systems; imaging techniques; cardiovascular system; physiological effects; extravehicular mobility units; mathematical models; bioreactors; computerized simulation; microgravity simulation; and dynamic structural analysis.
NASA Technical Reports Server (NTRS)
Wheeler, D. R.
1978-01-01
The principles of ESCA (electron spectroscopy for chemical analysis) are described by comparison with other spectroscopic techniques. The advantages and disadvantages of ESCA as compared to other surface sensitive analytical techniques are evaluated. The use of ESCA is illustrated by actual applications to oxidation of steel and Rene 41, the chemistry of lubricant additives on steel, and the composition of sputter deposited hard coatings. Finally, a bibliography of material that is useful for further study of ESCA is presented and commented upon.
Evaluation of methods for rapid determination of freezing point of aviation fuels
NASA Technical Reports Server (NTRS)
Mathiprakasam, B.
1982-01-01
Methods for identification of the more promising concepts for the development of a portable instrument to rapidly determine the freezing point of aviation fuels are described. The evaluation process consisted of: (1) collection of information on techniques previously used for the determination of the freezing point, (2) screening and selection of these techniques for further evaluation of their suitability in a portable unit for rapid measurement, and (3) an extensive experimental evaluation of the selected techniques and a final selection of the most promising technique. Test apparatuses employing differential thermal analysis and the change in optical transparency during phase change were evaluated and tested. A technique similar to differential thermal analysis using no reference fuel was investigated. In this method, the freezing point was obtained by digitizing the data and locating the point of inflection. Results obtained using this technique compare well with those obtained elsewhere using different techniques. A conceptual design of a portable instrument incorporating this technique is presented.
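The "point of inflection" step is easy to reproduce numerically: differentiate the digitized cooling curve and locate the extremum of the first derivative (where the second derivative crosses zero). A sketch on a synthetic curve:

```python
import numpy as np

t = np.linspace(0, 120, 600)                         # time, s
T = 20 - 0.5 * t + 8 / (1 + np.exp(-(t - 60) / 3))   # cooling curve + arrest

dT = np.gradient(T, t)
idx = np.argmax(dT)      # inflection: extremum of dT/dt, i.e. d2T/dt2 = 0
print(t[idx])            # ~60 s: the estimated freezing event
```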
Page layout analysis and classification for complex scanned documents
NASA Astrophysics Data System (ADS)
Erkilinc, M. Sezer; Jaber, Mustafa; Saber, Eli; Bauer, Peter; Depalov, Dejan
2011-09-01
A framework for region/zone classification in color and gray-scale scanned documents is proposed in this paper. The algorithm includes modules for extracting text, photo, and strong edge/line regions. First, a text detection module based on wavelet analysis and the Run Length Encoding (RLE) technique is employed. Local and global energy maps in high-frequency bands of the wavelet domain are generated and used as initial text maps, and further analysis using RLE yields a final text map. The second module detects image/photo and pictorial regions in the input document: a block-based classifier using basis vector projections identifies photo candidate regions, and a final photo map is obtained by applying a probabilistic model based on Markov random field (MRF) maximum a posteriori (MAP) optimization with iterated conditional modes (ICM). The final module detects lines and strong edges using the Hough transform and edge-linkage analysis, respectively. The text, photo, and strong edge/line maps are combined to generate a page layout classification of the scanned target document. Experimental results and objective evaluation show that the proposed technique performs very effectively on a variety of simple and complex scanned document types obtained from the MediaTeam Oulu document database. The proposed page layout classifier can be used in systems for efficient document storage, content-based document retrieval, optical character recognition, mobile phone imagery, and augmented reality.
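The initial text-map idea — high local energy in the wavelet high-frequency bands flags text-like regions — can be sketched in a few lines with PyWavelets (the threshold is illustrative; the paper's local/global maps and RLE refinement are not reproduced):

```python
import numpy as np
import pywt

page = np.full((256, 256), 0.9)                      # blank gray page
page[40:80, 20:200] = (np.indices((40, 180)).sum(0) % 6 < 3).astype(float)
# ^ striped block standing in for a line of printed text

_, (LH, HL, HH) = pywt.dwt2(page, "haar")            # one-level 2-D DWT
energy = LH**2 + HL**2 + HH**2                       # high-frequency energy
text_map = energy > np.percentile(energy, 90)        # candidate text blocks
print(text_map[20:40, 10:100].mean())                # mostly True in the block
```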
The Utility of Template Analysis in Qualitative Psychology Research.
Brooks, Joanna; McCluskey, Serena; Turley, Emma; King, Nigel
2015-04-03
Thematic analysis is widely used in qualitative psychology research, and in this article, we present a particular style of thematic analysis known as Template Analysis. We outline the technique and consider its epistemological position, then describe three case studies of research projects which employed Template Analysis to illustrate the diverse ways it can be used. Our first case study illustrates how the technique was employed in data analysis undertaken by a team of researchers in a large-scale qualitative research project. Our second example demonstrates how a qualitative study that set out to build on mainstream theory made use of the a priori themes (themes determined in advance of coding) permitted in Template Analysis. Our final case study shows how Template Analysis can be used from an interpretative phenomenological stance. We highlight the distinctive features of this style of thematic analysis, discuss the kind of research where it may be particularly appropriate, and consider possible limitations of the technique. We conclude that Template Analysis is a flexible form of thematic analysis with real utility in qualitative psychology research.
A study of data coding technology developments in the 1980-1985 time frame, volume 2
NASA Technical Reports Server (NTRS)
Ingels, F. M.; Shahsavari, M. M.
1978-01-01
The source parameters of digitized analog data are discussed. Different data compression schemes are outlined, and analyses of their implementation are presented. Finally, bandwidth compression techniques for video signals are given.
NASA Technical Reports Server (NTRS)
Behbehani, K.
1980-01-01
A new sensor/actuator failure analysis technique for turbofan jet engines was developed. Three phases of failure analysis, namely detection, isolation, and accommodation, are considered. Failure detection and isolation techniques are developed by utilizing the concept of Generalized Likelihood Ratio (GLR) tests. These techniques are applicable to both time-varying and time-invariant systems. Three GLR detectors are developed for: (1) hard-over sensor failure; (2) hard-over actuator failure; and (3) brief disturbances in the actuators. The probability distribution of the GLR detectors and the detectability of sensor/actuator failures are established. Failure type is determined by the maximum of the GLR detectors. Failure accommodation is accomplished by extending the Multivariable Nyquist Array (MNA) control design techniques to nonsquare system designs. The performance and effectiveness of the failure analysis technique are studied by applying the technique to a turbofan jet engine, namely the Quiet Clean Short Haul Experimental Engine (QCSEE). Single and multiple sensor/actuator failures in the QCSEE are simulated and analyzed, and the effects of model degradation are studied.
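As a sketch of the detection idea only (not the report's QCSEE implementation), the fragment below computes a windowed GLR statistic for a constant-bias, hard-over style failure on a residual sequence with known noise level; the window length, noise sigma, and threshold are illustrative assumptions.

    import numpy as np

    def glr_bias_test(residuals, sigma, window=20, threshold=10.8):
        """Windowed GLR statistic for a constant bias in Gaussian residuals.
        Under H0 the statistic (sum r)^2 / (n sigma^2) is chi-square(1),
        so the threshold sets the false-alarm rate (10.8 ~ 0.1% for 1 dof)."""
        r = np.asarray(residuals, dtype=float)
        stats = np.full(r.size, np.nan)
        for k in range(window, r.size + 1):
            s = r[k - window:k].sum()
            stats[k - 1] = s * s / (window * sigma**2)
        return stats, stats > threshold

    rng = np.random.default_rng(0)
    res = rng.normal(0.0, 1.0, 200)
    res[120:] += 2.5                     # simulated hard-over bias after sample 120
    stat, alarm = glr_bias_test(res, sigma=1.0)
    print("first alarm at sample", int(np.argmax(alarm)))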
Analysis of Some Potential Manpower Policies for the All-Volunteer Navy. Final Report.
ERIC Educational Resources Information Center
Battelle, R. Bard; And Others
This report describes an analysis of Navy personnel as a subsystem of the Navy, functioning with the overall objective of maintaining Fleet readiness within the constraints of budget and manpower supply limitations. Manpower utilization and management techniques and options were examined and evaluated for their usefulness to an all volunteer Navy…
ERIC Educational Resources Information Center
Freeman, Robert R.; And Others
The main results of the survey-and-analysis stage include a substantial collection of preliminary data on the language-sciences information user community, its professional specialties and information channels, its indexing tools, and its terminologies. The prospects and techniques for the development of a modern, discipline-based information…
2016-06-01
…characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency… methodology. Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA… rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira…
Grindlay, Guillermo; Mora, Juan; Gras, Luis; de Loos-Vollebregt, Margaretha T C
2011-04-08
The analysis of wine is of great importance since a wine's components strongly determine its stability and its organoleptic and nutritional characteristics. In addition, wine analysis is also important to prevent fraud and to assess toxicological issues. Among the different analytical techniques described in the literature, atomic spectrometry has traditionally been employed for elemental wine analysis due to its simplicity and good analytical figures of merit. The scope of this review is to summarize the main advantages and drawbacks of various atomic spectrometry techniques for elemental wine analysis. Special attention is paid to interferences (i.e., matrix effects) affecting the analysis as well as the strategies available to mitigate them. Finally, the latest studies on wine speciation are briefly discussed. Copyright © 2011 Elsevier B.V. All rights reserved.
A comparative analysis of soft computing techniques for gene prediction.
Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand
2013-07-01
The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. This is followed by a description of different soft computing techniques and their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.
A systematic comparison of the closed shoulder reduction techniques.
Alkaduhimi, H; van der Linde, J A; Willigenburg, N W; van Deurzen, D F P; van den Bekerom, M P J
2017-05-01
To identify the optimal technique for closed reduction for shoulder instability, based on success rates, reduction time, complication risks, and pain level. A PubMed and EMBASE query was performed, screening all relevant literature on closed reduction techniques that mentioned success rates and was written in English, Dutch, German, or Arabic. Studies with a fracture dislocation or lacking information on success rates for closed reduction techniques were excluded. We used the modified Coleman Methodology Score (CMS) to assess the quality of included studies and excluded studies with poor methodological quality (CMS < 50). Finally, a meta-analysis was performed on the data from all studies combined. 2099 studies were screened by title and abstract, of which 217 were screened in full text; 13 studies were finally included. These studies comprised 9 randomized controlled trials, 2 retrospective comparative studies, and 2 prospective non-randomized comparative studies. A combined analysis revealed that scapular manipulation is the most successful (97%), fastest (1.75 min), and least painful reduction technique (VAS 1.47); the "Fast, Reliable, and Safe" (FARES) method also scores high in terms of successful reduction (92%), reduction time (2.24 min), and intra-reduction pain (VAS 1.59); the traction-countertraction technique is highly successful (95%), but slower (6.05 min) and more painful (VAS 4.75). For closed reduction of anterior shoulder dislocations, the combined data from the selected studies indicate that scapular manipulation is the most successful and fastest technique, with the shortest mean hospital stay and least pain during reduction. The FARES method seems the best alternative.
A study of space shuttle structural integrity test and assessment. Part 1
NASA Technical Reports Server (NTRS)
Anderson, R. E.; Poe, R. G.
1972-01-01
The ultrasonics technique for assessing the structural integrity of the primary surface of the space shuttle vehicles is discussed and evaluated. Analyses were made of transducers, transducer coupling, test structure fabrication, flaws, and ultrasonic testing. Graphs of microphone response curves from the initial noise tests, accelerometer response curves from the final noise tests, and microphone curves from the final noise tests are included, along with a glossary, bibliography, and results.
Study of fault tolerant software technology for dynamic systems
NASA Technical Reports Server (NTRS)
Caglayan, A. K.; Zacharias, G. L.
1985-01-01
The major aim of this study is to investigate the feasibility of using systems-based failure detection, isolation, and compensation (FDIC) techniques in building fault-tolerant software and extending them, whenever possible, to the domain of software fault tolerance. First, it is shown that systems-based FDIC methods can be extended to develop software error detection techniques by using system models for software modules. In particular, it is demonstrated that systems-based FDIC techniques can yield consistency checks that are easier to implement than acceptance tests based on software specifications. Next, it is shown that systems-based failure compensation techniques can be generalized to the domain of software fault tolerance in developing software error recovery procedures. Finally, the feasibility of using fault-tolerant software in flight software is investigated. In particular, possible system and version instabilities and functional performance degradation that may occur in N-Version programming applications to flight software are illustrated, and a comparative analysis of N-Version and recovery block techniques in the context of generic blocks in flight software is presented.
Application of an enriched FEM technique in thermo-mechanical contact problems
NASA Astrophysics Data System (ADS)
Khoei, A. R.; Bahmani, B.
2018-02-01
In this paper, an enriched FEM technique is employed for the thermo-mechanical contact problem based on the extended finite element method. A fully coupled thermo-mechanical contact formulation is presented in the framework of the X-FEM technique that takes into account the deformable continuum mechanics and the transient heat transfer analysis. The Coulomb frictional law is applied for the mechanical contact problem, and a pressure-dependent thermal contact model is employed through an explicit formulation in the weak form of the X-FEM method. The equilibrium equations are discretized by the Newmark time splitting method, and the final set of nonlinear equations is solved by the Newton-Raphson method using a staggered algorithm. Finally, to illustrate the capability of the proposed computational model, several numerical examples are solved and the results are compared with those reported in the literature.
NASA Technical Reports Server (NTRS)
Bown, R. L.; Christofferson, A.; Lardas, M.; Flanders, H.
1980-01-01
A lambda matrix solution technique is being developed to perform an open loop frequency analysis of a high order dynamic system. The procedure evaluates the right and left latent vectors corresponding to the respective latent roots. The latent vectors are used to evaluate the partial fraction expansion formulation required to compute the flexible body open loop feedback gains for the Space Shuttle Digital Ascent Flight Control System. The algorithm is in the final stages of development and will be used to ensure that the feedback gains meet the design specification.
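The report's lambda-matrix algorithm is not reproduced here, but the underlying computation can be sketched with a standard companion linearization of a quadratic lambda-matrix L(s) = M s^2 + C s + K: the generalized eigenproblem below yields the latent roots along with the right and left latent vectors that feed a partial fraction expansion. The 2-DOF matrices are illustrative, and this is only one of several possible linearizations.

    import numpy as np
    from scipy.linalg import eig

    # Illustrative 2-DOF quadratic lambda-matrix  L(s) = M s^2 + C s + K
    M = np.eye(2)
    C = np.array([[0.2, 0.0], [0.0, 0.1]])
    K = np.array([[4.0, -1.0], [-1.0, 2.0]])

    n = M.shape[0]
    # Companion linearization  A z = s B z  with z = [x, s x]
    A = np.block([[np.zeros((n, n)), np.eye(n)], [-K, -C]])
    B = np.block([[np.eye(n), np.zeros((n, n))], [np.zeros((n, n)), M]])

    roots, vl, vr = eig(A, B, left=True, right=True)
    right_latent = vr[:n, :]   # upper block of z recovers the right latent vectors
    left_latent = vl[n:, :]    # lower block of the left eigenvectors gives the left latent vectors

    for s, x in zip(roots[:2], right_latent.T[:2]):
        res = np.linalg.norm((M * s**2 + C * s + K) @ x)
        print(f"latent root {s:.3f}, residual {res:.2e}")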
Parametric Robust Control and System Identification: Unified Approach
NASA Technical Reports Server (NTRS)
Keel, L. H.
1996-01-01
During the period of this support, a new control system design and analysis method has been studied. This approach deals with control systems containing uncertainties that are represented in terms of their transfer function parameters. Such a representation of the control system is common, and many physical parameter variations fall into this type of uncertainty. Techniques developed here are capable of providing nonconservative analysis of such control systems with parameter variations. We have also developed techniques to deal with control systems when their state-space representations are given rather than transfer functions. In this case, the plant parameters appear as entries of the state-space matrices. Finally, a system modeling technique to construct such systems from raw input-output frequency domain data has been developed.
Structure identification methods for atomistic simulations of crystalline materials
Stukowski, Alexander
2012-05-28
Here, we discuss existing and new computational analysis techniques to classify local atomic arrangements in large-scale atomistic computer simulations of crystalline solids. This article includes a performance comparison of typical analysis algorithms such as common neighbor analysis (CNA), centrosymmetry analysis, bond angle analysis, bond order analysis and Voronoi analysis. In addition we propose a simple extension to the CNA method that makes it suitable for multi-phase systems. Finally, we introduce a new structure identification algorithm, the neighbor distance analysis, which is designed to identify atomic structure units in grain boundaries.
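To make one of the compared techniques concrete, here is a minimal centrosymmetry-analysis sketch; the greedy opposite-pair matching and the 12-neighbor FCC test lattice are simplifying assumptions, not a production implementation of any of the algorithms above.

    import numpy as np
    from scipy.spatial import cKDTree

    def centrosymmetry(positions, n_neighbors=12):
        """Pair each atom's N nearest neighbors into the most nearly opposite
        couples and sum |r_i + r_j|^2; ~0 at locally centrosymmetric sites."""
        tree = cKDTree(positions)
        _, idx = tree.query(positions, k=n_neighbors + 1)   # index 0: the atom itself
        csp = np.empty(len(positions))
        for a, neigh in enumerate(idx):
            vec = positions[neigh[1:]] - positions[a]
            used, total = set(), 0.0
            for i in range(n_neighbors):
                if i in used:
                    continue
                j = min((k for k in range(n_neighbors) if k != i and k not in used),
                        key=lambda k: np.sum((vec[i] + vec[k]) ** 2))
                used.update((i, j))
                total += np.sum((vec[i] + vec[j]) ** 2)
            csp[a] = total
        return csp

    # Perfect FCC lattice: interior sites are centrosymmetric, so CSP = 0 there
    base = np.array([[0, 0, 0], [0, .5, .5], [.5, 0, .5], [.5, .5, 0]])
    cells = np.array([[i, j, k] for i in range(6) for j in range(6) for k in range(6)])
    pos = (cells[:, None, :] + base[None, :, :]).reshape(-1, 3).astype(float)
    csp = centrosymmetry(pos)
    print("centrosymmetric sites:", int((csp < 1e-9).sum()), "of", len(pos))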
Forensic Analysis of Digital Image Tampering
2004-12-01
…analysis of when each method fails, which Chapter 4 discusses. Finally, a test image containing an invisible watermark using LSB steganography is… [List-of-figures fragments: example of an invisible watermark using the steganography software F5; example of copy-move image forgery; algorithm for the JPEG block technique; "forged" image with result.]
Ambiguities in model-independent partial-wave analysis
NASA Astrophysics Data System (ADS)
Krinner, F.; Greenwald, D.; Ryabchikov, D.; Grube, B.; Paul, S.
2018-06-01
Partial-wave analysis is an important tool for analyzing large data sets in hadronic decays of light and heavy mesons. It commonly relies on the isobar model, which assumes multihadron final states originate from successive two-body decays of well-known undisturbed intermediate states. Recently, analyses of heavy-meson decays and diffractively produced states have attempted to overcome the strong model dependences of the isobar model. These analyses have overlooked that model-independent, or freed-isobar, partial-wave analysis can introduce mathematical ambiguities in results. We show how these ambiguities arise and present general techniques for identifying their presence and for correcting for them. We demonstrate these techniques with specific examples in both heavy-meson decay and pion-proton scattering.
NASA Technical Reports Server (NTRS)
Barnett, Alan R.; Widrick, Timothy W.; Ludwiczak, Damian R.
1995-01-01
Solving for the displacements of free-free coupled systems acted upon by static loads is commonly performed throughout the aerospace industry. Many times, these problems are solved using static analysis with inertia relief. This solution technique allows for a free-free static analysis by balancing the applied loads with inertia loads generated by the applied loads. For some engineering applications, the displacements of the free-free coupled system induce additional static loads. Hence, the applied loads are equal to the original loads plus displacement-dependent loads. Solving for the final displacements of such systems is commonly performed using iterative solution techniques. Unfortunately, these techniques can be time-consuming and labor-intensive. Since the coupled system equations for free-free systems with displacement-dependent loads can be written in closed-form, it is advantageous to solve for the displacements in this manner. Implementing closed-form equations in static analysis with inertia relief is analogous to implementing transfer functions in dynamic analysis. Using a MSC/NASTRAN DMAP Alter, displacement-dependent loads have been included in static analysis with inertia relief. Such an Alter has been used successfully to solve efficiently a common aerospace problem typically solved using an iterative technique.
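As a toy numerical sketch of the closed-form idea (with the inertia-relief bookkeeping omitted for brevity): if the total applied load is F0 + S u, where S maps displacements to the displacement-dependent loads, then the iterative update u <- K^-1 (F0 + S u) and the single closed-form solve (K - S) u = F0 reach the same displacements. K, S, and F0 below are invented for illustration.

    import numpy as np

    K = np.array([[5.0, -2.0], [-2.0, 3.0]])   # stiffness (illustrative)
    S = np.array([[0.5, 0.0], [0.0, 0.3]])     # displacement-to-load coupling (illustrative)
    F0 = np.array([1.0, 2.0])                  # original applied loads

    # Iterative technique: repeat u <- K^-1 (F0 + S u) until converged
    u = np.zeros(2)
    for _ in range(100):
        u = np.linalg.solve(K, F0 + S @ u)

    # Closed-form technique: one solve of (K - S) u = F0
    u_closed = np.linalg.solve(K - S, F0)
    print(np.allclose(u, u_closed))            # True: same answer, no iteration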
Summit Station Skiway Cost Analysis
2016-07-01
…Laboratory (CRREL), U.S. Army Engineer Research and Development Center (ERDC), Hanover, NH. Final report. …cargo loads. To explore further skiway improvement and cost-saving techniques, this report reviews alternative maintenance and construction options…
Liver Procurement for Orthotopic Transplantation: An Analysis of the Pittsburgh Experience
Van Thiel, David H.; Schade, Robert R.; Hakala, Thomas R.; Starzl, Thomas E.; Denny, Donald
2010-01-01
The incidence of prospective organ donors in the United States and the techniques used to guarantee their optimal use after identification are analyzed. Attitudes of the public and health professionals toward organ donation are discussed. The organization of the Pittsburgh Organ Procurement Agency and its relationship to other such agencies is described. Finally, the presently used techniques of liver salvaging and preservation are outlined. PMID:6363261
NASA Technical Reports Server (NTRS)
Landgrebe, D.
1974-01-01
A broad study is described to evaluate a set of machine analysis and processing techniques applied to ERTS-1 data. Based on the analysis results in urban land use analysis and soil association mapping, together with previously reported results in general earth surface feature identification and crop species classification, a profile of the general applicability of this procedure is beginning to emerge. In the hands of a user who knows the information needed from the data and is familiar with the region to be analyzed, these methods appear capable of generating significantly useful information. When supported by preprocessing techniques such as geometric correction and temporal registration, final products readily usable by user agencies appear possible. In parallel with application, there is much potential for further development of these techniques through research, both to provide higher performance and to address new situations not yet studied.
Development and application of the maximum entropy method and other spectral estimation techniques
NASA Astrophysics Data System (ADS)
King, W. R.
1980-09-01
This summary report is a collection of four separate progress reports prepared under three contracts, all sponsored by the Office of Naval Research in Arlington, Virginia. This report contains the results of investigations into the application of the maximum entropy method (MEM), a high-resolution frequency and wavenumber estimation technique. The report also contains a description of two new, stable, high-resolution spectral estimation techniques, provided in the final report section. Many examples of wavenumber spectral patterns for all investigated techniques are included throughout the report. The maximum entropy method is also known as the maximum entropy spectral analysis (MESA) technique, and both names are used in the report. Many MEM wavenumber spectral patterns are demonstrated using both simulated and measured radar signal and noise data. Methods for obtaining stable MEM wavenumber spectra are discussed, broadband signal detection using the MEM prediction error transform (PET) is discussed, and Doppler radar narrowband signal detection is demonstrated using the MEM technique. It is also shown that MEM cannot be applied to randomly sampled data. The two new, stable, high-resolution spectral estimation techniques discussed in the final report section are named the Wiener-King and the Fourier spectral estimation techniques. The two techniques have a similar derivation based upon the Wiener prediction filter but are otherwise quite different. Further development of the techniques and measurement of their spectral characteristics are recommended for subsequent investigation.
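As a compact illustration of the MEM/MESA idea (Burg's autoregressive formulation, not the report's radar processing chain), the sketch below fits AR coefficients with Burg's recursion and evaluates the corresponding maximum entropy spectrum on a synthetic two-tone signal; the model order and test signal are arbitrary choices.

    import numpy as np

    def burg(x, order):
        """Burg recursion: AR coefficients and error power for an MEM spectrum."""
        x = np.asarray(x, dtype=float)
        f, b = x[1:].copy(), x[:-1].copy()     # forward / backward prediction errors
        a = np.array([1.0])
        e = np.dot(x, x) / x.size
        for _ in range(order):
            k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))
            a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
            e *= 1.0 - k * k
            f, b = f[1:] + k * b[1:], b[:-1] + k * f[:-1]
        return a, e

    def mem_spectrum(a, e, nfreq=512):
        """Evaluate P(w) = e / |A(e^{jw})|^2 on [0, pi]."""
        w = np.linspace(0.0, np.pi, nfreq)
        A = np.exp(-1j * np.outer(w, np.arange(a.size))) @ a
        return w, e / np.abs(A) ** 2

    n = np.arange(256)
    rng = np.random.default_rng(1)
    x = np.sin(0.3 * np.pi * n) + 0.5 * np.sin(0.46 * np.pi * n) \
        + 0.1 * rng.standard_normal(n.size)
    a, e = burg(x, order=12)
    w, P = mem_spectrum(a, e)
    print(f"strongest peak near w/pi = {w[np.argmax(P)] / np.pi:.2f}")  # ~0.30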
A Bio Medical Waste Identification and Classification Algorithm Using Mltrp and Rvm.
Achuthan, Aravindan; Ayyallu Madangopal, Vasumathi
2016-10-01
We aimed to extract histogram features for texture analysis and to classify types of Bio Medical Waste (BMW) for garbage disposal and management. The BMW image was preprocessed using a median filtering technique that efficiently reduced image noise. The histogram features of the filtered image were then extracted with the proposed Modified Local Tetra Pattern (MLTrP) technique. Finally, a Relevance Vector Machine (RVM) was used to classify the BMW into human body parts, plastics, cotton, and liquids. The BMW images were collected from a garbage image dataset for analysis. The performance of the proposed BMW identification and classification system was evaluated in terms of sensitivity, specificity, classification rate, and accuracy with the help of MATLAB. Compared to existing techniques, the proposed techniques provided better results. This work proposes a new texture analysis and classification technique for BMW management and disposal, which can be used in many real-time applications such as hospital and healthcare management systems for proper BMW disposal.
Ambient ionisation mass spectrometry for in situ analysis of intact proteins
Kocurek, Klaudia I.; Griffiths, Rian L.
2018-01-01
Abstract Ambient surface mass spectrometry is an emerging field which shows great promise for the analysis of biomolecules directly from their biological substrate. In this article, we describe ambient ionisation mass spectrometry techniques for the in situ analysis of intact proteins. As a broad approach, the analysis of intact proteins offers unique advantages for the determination of primary sequence variations and posttranslational modifications, as well as interrogation of tertiary and quaternary structure and protein‐protein/ligand interactions. In situ analysis of intact proteins offers the potential to couple these advantages with information relating to their biological environment, for example, their spatial distributions within healthy and diseased tissues. Here, we describe the techniques most commonly applied to in situ protein analysis (liquid extraction surface analysis, continuous flow liquid microjunction surface sampling, nano desorption electrospray ionisation, and desorption electrospray ionisation), their advantages, and limitations and describe their applications to date. We also discuss the incorporation of ion mobility spectrometry techniques (high field asymmetric waveform ion mobility spectrometry and travelling wave ion mobility spectrometry) into ambient workflows. Finally, future directions for the field are discussed. PMID:29607564
Schrader, I; Wilk, D; Jansen, O; Riedel, C
2013-09-01
To evaluate how accurately the final infarct volume in acute ischemic stroke can be predicted with perfusion CT (PCT) using a 64-MDCT unit and the toggling-table technique. Retrospective analysis of 89 patients with acute ischemic stroke who underwent CCT, CT angiography (CTA), and PCT using the toggling-table technique within the first three hours after symptom onset. In patients with successful thrombolytic therapy (n = 48) and in those without effective thrombolytic therapy (n = 41), the infarct volume and the volume of the penumbra on PCT were compared to the infarct size on follow-up images (CT or MRI) performed within 8 days. The feasibility of complete infarct volume prediction with 8 cm of cranio-caudal coverage was evaluated. The correlation between the volume of hypoperfusion on PCT, defined by cerebral blood volume reduction, and final infarct volume was strongest in patients with successful thrombolytic therapy, with underestimation of the definite infarct volume by 8.5 ml on average. The CBV map had the greatest prognostic value. In patients without successful thrombolytic therapy, the final infarct volume was overestimated by 12.1 ml compared to the MTT map on PCT. All infarcts were detected completely; there were no false-positive or false-negative results. Using PCT and the toggling-table technique in acute stroke patients is helpful for rapid and accurate quantification of the minimal final infarct and is therefore a prognostic parameter whose impact on therapeutic decisions has to be evaluated in further studies. ▶ Using PCT and the toggling-table technique allows accurate quantification of the infarct core and penumbra. ▶ Dynamic perfusion parameters of almost the entire supratentorial brain volume can be recorded quickly and easily on a 64-slice MDCT unit. ▶ The technique allows identification of those patients who could profit from thrombolytic therapy outside the established time intervals. © Georg Thieme Verlag KG Stuttgart · New York.
Analysis of 3D printing parameters of gears for hybrid manufacturing
NASA Astrophysics Data System (ADS)
Budzik, Grzegorz; Przeszlowski, Łukasz; Wieczorowski, Michal; Rzucidlo, Arkadiusz; Gapinski, Bartosz; Krolczyk, Grzegorz
2018-05-01
The paper deals with the analysis and selection of parameters for rapid prototyping of gears by selective sintering of metal powders. The results show the wide spectrum of applications of RP systems in manufacturing processes for machine elements, based on an analysis of the market in terms of the application of additive manufacturing technology in different sectors of industry; considerable growth of these methods over the past years can be observed. The characteristic errors of the printed model with respect to the ideal one are pointed out for each technique. Special attention is paid to the method of preparing the numerical data (CAD/STL/RP). Moreover, an analysis of the manufacturing processes of gear-type elements is presented. The tested gears were modeled with different allowances for final machining and made by DMLS. Metallographic analysis and strength tests on prepared specimens were performed and used to compare the real properties of the material with the nominal ones. To improve surface quality after sintering, the gears were subjected to final machining, and the geometry of the gears after this hybrid manufacturing method was analyzed (Fig. 1). The manufacturing process was defined in a traditional way as well as with the aid of modern manufacturing techniques. The methodology and results obtained can be applied to machine elements other than gears and constitute a general approach to production processes using rapid prototyping methods as well as to the design and implementation of production.
Nuclear analytical techniques in medicine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cesareo, R.
1988-01-01
This book acquaints one with the fundamental principles and the instrumentation relevant to analytical techniques based on atomic and nuclear physics, as well as present and future biomedical applications. Besides providing a theoretical description of the physical phenomena, a large part of the book is devoted to applications in the medical and biological field, particularly in hematology, forensic medicine, and environmental science. This volume reviews methods such as the possibility of carrying out rapid multi-element analysis of trace elements on biomedical samples, in vitro and in vivo, by XRF analysis; the ability of the PIXE microprobe to analyze in detail and to map trace elements in fragments of biomedical samples or inside cells; and the potential of in vivo nuclear activation analysis for diagnostic purposes. Finally, techniques are described such as radiation scattering (elastic and inelastic) and attenuation measurements, which will undoubtedly see great development in the immediate future.
Detection of Gunshot Residues Using Mass Spectrometry
Blanes, Lucas; Cole, Nerida; Doble, Philip; Roux, Claude
2014-01-01
In recent years, forensic scientists have become increasingly interested in the detection and interpretation of organic gunshot residues (OGSR) due to the increasing use of lead- and heavy metal-free ammunition. This has also been prompted by the identification of gunshot residue- (GSR-) like particles in environmental and occupational samples. Various techniques have been investigated for their ability to detect OGSR. Mass spectrometry (MS) coupled to a chromatographic system is a powerful tool due to its high selectivity and sensitivity. Further, modern MS instruments can detect and identify a number of explosives and additives which may require different ionization techniques. Finally, MS has been applied to the analysis of both OGSR and inorganic gunshot residue (IGSR), although the “gold standard” for analysis is scanning electron microscopy with energy dispersive X-ray microscopy (SEM-EDX). This review presents an overview of the technical attributes of currently available MS and ionization techniques and their reported applications to GSR analysis. PMID:24977168
Sisco, Edward; Demoranville, Leonard T; Gillen, Greg
2013-09-10
The feasibility of using C60(+) cluster primary ion bombardment secondary ion mass spectrometry (C60(+) SIMS) for analyzing the chemical composition of fingerprints is evaluated. It was found that C60(+) SIMS can be used to detect and image the spatial localization of a number of sebaceous and eccrine components in fingerprints. These analyses were also found not to be hindered by the use of common latent print powder development techniques. Finally, monitoring the depth distribution of fingerprint constituents was found to be possible, a capability which has not been shown using other chemical imaging techniques. This paper illustrates a number of strengths and potential weaknesses of C60(+) SIMS as an additional or complementary technique for the chemical analysis of fingerprints. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Distributed intelligent data analysis in diabetic patient management.
Bellazzi, R.; Larizza, C.; Riva, A.; Mira, A.; Fiocchi, S.; Stefanelli, M.
1996-01-01
This paper outlines the methodologies that can be used to perform an intelligent analysis of diabetic patients' data, realized in a distributed management context. We present a decision-support system architecture based on two modules, a Patient Unit and a Medical Unit, connected by telecommunication services. We stress the necessity to resort to temporal abstraction techniques, combined with time series analysis, in order to provide useful advice to patients; finally, we outline how data analysis and interpretation can be cooperatively performed by the two modules. PMID:8947655
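A minimal sketch of the temporal-abstraction step (illustrative thresholds and data only, not the system's knowledge base): raw blood-glucose samples are mapped to qualitative states, then merged into episodes that a rule-based advice module could reason over.

    import numpy as np

    def abstract_states(values, low=70.0, high=180.0):
        """Map raw glucose values (mg/dL) to qualitative states."""
        return ["LOW" if v < low else "HIGH" if v > high else "NORMAL" for v in values]

    def episodes(times, states):
        """Collapse consecutive identical states into (state, start, end) episodes."""
        out, start = [], 0
        for i in range(1, len(states) + 1):
            if i == len(states) or states[i] != states[start]:
                out.append((states[start], times[start], times[i - 1]))
                start = i
        return out

    t = np.arange(0, 24, 2)                        # sampling times (hours)
    g = [95, 110, 220, 240, 190, 130, 65, 60, 90, 120, 150, 100]
    for state, t0, t1 in episodes(t, abstract_states(g)):
        print(f"{state:6s} from {t0:2d}h to {t1:2d}h")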
NASA Astrophysics Data System (ADS)
Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.
2013-07-01
The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in order to guarantee its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in order to evaluate the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques to this aim. However, quantitative assessment of the morphological parameters directly from the reconstructed images is a non-trivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. The first technique produces scaffolds with random, non-regular, rounded pore geometry; the AM technique instead produces scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted, and the resulting model can be used to validate the applied imaging and image analysis protocols. We report here an SR μ-CT image analysis approach that effectively and accurately reveals the differences in the pore- and throat-size distributions as well as the connectivity of both AM and SCPL scaffolds.
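A minimal sketch of one image-analysis step of this kind (Python/SciPy), using a random 2-D phantom in place of a segmented μ-CT slice; a real analysis would operate on the full 3-D reconstruction with careful segmentation and calibration to physical units.

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)
    img = rng.random((200, 200)) < 0.4                   # binary phantom: pore = True
    img = ndimage.binary_opening(img, np.ones((3, 3)))   # drop 1-pixel speckle

    labels, n_pores = ndimage.label(img)                 # connected pore regions
    sizes = ndimage.sum(img, labels, np.arange(1, n_pores + 1))
    equiv_diam = 2.0 * np.sqrt(sizes / np.pi)            # circle-equivalent diameters (px)

    print(f"{n_pores} pores, median equivalent diameter {np.median(equiv_diam):.1f} px")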
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1979-01-01
Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.
Escobar Galindo, Ramón; Gago, Raul; Duday, David; Palacio, Carlos
2010-04-01
An increasing amount of effort is currently being directed towards the development of new functionalized nanostructured materials (i.e., multilayers and nanocomposites). Using an appropriate combination of composition and microstructure, it is possible to optimize and tailor the final properties of the material to its final application. The analytical characterization of these new complex nanostructures requires high-resolution analytical techniques that are able to provide information about surface and depth composition at the nanometric level. In this work, we comparatively review the state of the art in four different depth-profiling characterization techniques: Rutherford backscattering spectroscopy (RBS), secondary ion mass spectrometry (SIMS), X-ray photoelectron spectroscopy (XPS) and glow discharge optical emission spectroscopy (GDOES). In addition, we predict future trends in these techniques regarding improvements in their depth resolutions. Subnanometric resolution can now be achieved in RBS using magnetic spectrometry systems. In SIMS, the use of rotating sample holders and oxygen flooding during analysis as well as the optimization of floating low-energy ion guns to lower the impact energy of the primary ions improves the depth resolution of the technique. Angle-resolved XPS provides a very powerful and nondestructive technique for obtaining depth profiling and chemical information within the range of a few monolayers. Finally, the application of mathematical tools (deconvolution algorithms and a depth-profiling model), pulsed sources and surface plasma cleaning procedures is expected to greatly improve GDOES depth resolution.
Nondestructive analysis and development
NASA Technical Reports Server (NTRS)
Moslehy, Faissal A.
1993-01-01
This final report summarizes the achievements of project #4 of the NASA/UCF Cooperative Agreement from January 1990 to December 1992. The objectives of this project were to review NASA's NDE program at Kennedy Space Center (KSC) and recommend means for enhancing the present testing capabilities through the use of improved or new technologies. During the project, extensive development of a reliable, nondestructive, non-contact vibration technique to determine and quantify the bond condition of the thermal protection system (TPS) tiles of the Space Shuttle Orbiter was undertaken. Experimental modal analysis (EMA) is used as a non-destructive technique for the evaluation of TPS tile bond integrity. Finite element (FE) models for tile systems were developed and used to generate their vibration characteristics (i.e., natural frequencies and mode shapes). Various TPS tile assembly configurations as well as different bond conditions were analyzed. Results of the finite element analyses demonstrated a drop in natural frequencies and a change in mode shapes that correlate with both the size and location of a disbond. Results of experimental testing of tile panels correlated with the FE results and demonstrated the feasibility of EMA as a viable technique for tile bond verification. Finally, testing performed on the Space Shuttle Columbia using a laser Doppler velocimeter demonstrated the application of EMA, when combined with FE modeling, as a non-contact, non-destructive bond evaluation technique.
Software Safety Analysis of a Flight Guidance System
NASA Technical Reports Server (NTRS)
Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.
2004-01-01
This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.
Tavares, Ana P M; Silva, Rui P; Amaral, António L; Ferreira, Eugénio C; Xavier, Ana M R B
2014-02-01
An image analysis technique was applied to identify morphological changes of pellets from the white-rot fungus Trametes versicolor in agitated submerged cultures during the production of exopolysaccharide (EPS) or ligninolytic enzymes. Batch tests with four different experimental conditions were carried out. Two different culture media were used, namely yeast medium and Trametes defined medium, and the addition of ligninolytic inducers such as xylidine or pulp and paper industrial effluent was evaluated. Laccase activity, EPS production, and final biomass contents were determined for the batch assays, and pellet morphology was assessed by image analysis techniques. The data obtained made it possible to establish which metabolic pathway was favored under each set of experimental conditions: laccase production in the Trametes defined medium, or EPS production in the rich yeast medium experiments. Furthermore, the image processing and analysis methodology allowed a better understanding of the physiological phenomena with respect to the corresponding morphological stages of the pellets.
An Algebra-Based Introductory Computational Neuroscience Course with Lab.
Fink, Christian G
2017-01-01
A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.
Project Sell, Title VII: Final Evaluation 1970-1971.
ERIC Educational Resources Information Center
Condon, Elaine C.; And Others
This evaluative report consists of two parts. The first is a narrative report which represents a summary by the evaluation team and recommendations regarding project activities; the second part provides a statistical analysis of project achievements. Details are provided on evaluation techniques, staff, management, instructional materials,…
Models of railroad passenger-car requirements in the northeast corridor : volume II user's guide
DOT National Transportation Integrated Search
1976-09-30
Models and techniques for determining passenger-car requirements in railroad service were developed and applied by a research project of which this is the final report. The report is published in two volumes. The solution and analysis of the Northeas...
NASA Technical Reports Server (NTRS)
Eigen, D. J.; Fromm, F. R.; Northouse, R. A.
1974-01-01
A new clustering algorithm is presented that is based on dimensional information. The algorithm includes an inherent feature selection criterion, which is discussed. Further, a heuristic method for choosing the proper number of intervals for a frequency distribution histogram, a feature necessary for the algorithm, is presented. The algorithm, although usable as a stand-alone clustering technique, is then utilized as a global approximator. Local clustering techniques and configuration of a global-local scheme are discussed, and finally the complete global-local and feature selector configuration is shown in application to a real-time adaptive classification scheme for the analysis of remote sensed multispectral scanner data.
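The abstract does not spell out the paper's interval heuristic, so the sketch below substitutes Sturges' rule as a stand-in and shows the flavor of clustering from dimensional (histogram) information: runs of above-average bins along one feature become candidate cluster intervals. Every choice here is illustrative.

    import numpy as np

    def sturges_bins(n):
        """Stand-in bin-count heuristic (Sturges' rule), not the paper's own."""
        return int(np.ceil(np.log2(n) + 1))

    def dimension_intervals(feature):
        """Candidate cluster intervals along one dimension: runs of dense bins."""
        counts, edges = np.histogram(feature, bins=sturges_bins(feature.size))
        dense = counts > counts.mean()
        intervals, start = [], None
        for i, d in enumerate(dense):
            if d and start is None:
                start = i
            elif not d and start is not None:
                intervals.append((edges[start], edges[i]))
                start = None
        if start is not None:
            intervals.append((edges[start], edges[-1]))
        return intervals

    rng = np.random.default_rng(2)
    x = np.concatenate([rng.normal(-3, 0.5, 300), rng.normal(2, 0.5, 300)])
    print(dimension_intervals(x))   # two dense intervals, one per cluster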
Inglis, Jeremy D.; Maassen, Joel; Kara, Azim; ...
2017-04-28
This study presents a total evaporation method for the analysis of sub-picogram quantities of Pu, utilizing an array of multiple ion counters. Data from three standards are presented to assess the utility of the technique. An external precision of 1.5% RSD (2σ) was achieved on aliquots approaching 100 fg for the minor 240Pu isotope. Accurate analysis of <1 femtogram of 240Pu is achievable, with an external reproducibility of better than 10% RSD (2σ). Finally, this new technique represents a significant advance in the total evaporation method and will allow routine measurement of femtogram-sized Pu samples by thermal ionization mass spectrometry.
2014-01-01
This review aims to highlight the recent advances and methodological improvements in instrumental techniques applied for the analysis of different brominated flame retardants (BFRs). The literature search strategy was based on the recent analytical reviews published on BFRs. The main selection criteria involved the successful development and application of analytical methods for determination of the target compounds in various environmental matrices. Different factors affecting chromatographic separation and mass spectrometric detection of brominated analytes were evaluated and discussed. Techniques using advanced instrumentation to achieve outstanding results in quantification of different BFRs and their metabolites/degradation products were highlighted. Finally, research gaps in the field of BFR analysis were identified and recommendations for future research were proposed. PMID:27433482
Improved motors for utility applications: Volume 6, Squirrel-cage rotor analysis: Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffith, J.W.; McCoy, R.M.
1986-11-01
An analysis of squirrel cage induction motor rotors was undertaken in response to an Industry Assessment Study finding 10% of motor failures to be rotor related. The analysis focuses on evaluating rotor design life. The evaluation combines state-of-the-art electromagnetic, thermal, and structural solution techniques into an integrated analysis and presents a simple summary. Finite element techniques are central tools in the analysis. The analysis is applied to a specific forced-draft fan drive design. Fans as a category of application have a higher failure rate than other categories of power station auxiliary motor applications. Forced-draft fan drives are one of the major fan drives which accelerate a relatively high value of rotor load inertia. Various starting and operating conditions are studied for this forced-draft fan drive motor, including a representative application duty cycle.
NASA Astrophysics Data System (ADS)
Pan, Xingchen; Liu, Cheng; Zhu, Jianqiang
2018-02-01
Coherent modulation imaging, which provides fast convergence and high resolution from a single diffraction pattern, is a promising technique for satisfying the urgent demand for on-line multi-parameter diagnostics with a single setup in high power laser facilities (HPLF). However, the influence of noise on the final calculated parameters of interest has not yet been investigated. Based on a series of simulations with twenty different sampling beams generated from the practical parameters and performance of HPLF, a quantitative analysis based on statistical results was carried out for five different error sources. We found that detector background noise and high quantization error seriously affect the final accuracy, and that different parameters have different sensitivities to different noise sources. The simulation results and the corresponding analysis indicate potential directions for further improving the final accuracy of parameter diagnostics, which is critically important for its formal application in the daily routines of HPLF.
The use of cluster analysis techniques in spaceflight project cost risk estimation
NASA Technical Reports Server (NTRS)
Fox, G.; Ebbeler, D.; Jorgensen, E.
2003-01-01
Project cost risk is the uncertainty in final project cost, contingent on initial budget, requirements and schedule. For a proposed mission, a dynamic simulation model relying for some of its input on a simple risk elicitation is used to identify and quantify systemic cost risk.
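The dynamic simulation model itself is not described in the abstract; as a generic illustration of elicitation-driven cost-risk analysis, the Monte Carlo sketch below samples each cost element from a triangular distribution elicited as (low, most likely, high) and reads the risk off percentiles of the total. All names and figures are invented.

    import numpy as np

    elements = {                       # $M: (low, most likely, high), invented
        "payload":    (40.0, 55.0,  90.0),
        "spacecraft": (60.0, 70.0, 120.0),
        "operations": (20.0, 25.0,  45.0),
    }

    rng = np.random.default_rng(7)
    n = 100_000
    total = sum(rng.triangular(lo, mode, hi, n) for lo, mode, hi in elements.values())

    baseline = sum(mode for _, mode, _ in elements.values())
    p50, p80 = np.percentile(total, [50, 80])
    print(f"baseline {baseline:.0f} $M, median {p50:.0f} $M, 80th percentile {p80:.0f} $M")
    print(f"reserve for 80% confidence: {p80 - baseline:.0f} $M")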
Development of an FAA-EUROCONTROL technique for the analysis of human error in ATM : final report.
DOT National Transportation Integrated Search
2002-07-01
Human error has been identified as a dominant risk factor in safety-oriented industries such as air traffic control (ATC). However, little is known about the factors leading to human errors in current air traffic management (ATM) systems. The first s...
DiBartolomeis, Susan M.
2011-01-01
Several reports on science education suggest that students at all levels learn better if they are immersed in a project that is long term, yielding results that require analysis and interpretation. I describe a 12-wk laboratory project suitable for upper-level undergraduates and first-year graduate students, in which the students molecularly locate and map a gene from Drosophila melanogaster called dusky and one of dusky's mutant alleles. The mapping strategy uses restriction fragment length polymorphism analysis; hence, students perform most of the basic techniques of molecular biology (DNA isolation, restriction enzyme digestion and mapping, plasmid vector subcloning, agarose and polyacrylamide gel electrophoresis, DNA labeling, and Southern hybridization) toward the single goal of characterizing dusky and the mutant allele dusky73. Students work as individuals, pairs, or in groups of up to four students. Some exercises require multitasking and collaboration between groups. Finally, results from everyone in the class are required for the final analysis. Results of pre- and postquizzes and surveys indicate that student knowledge of appropriate topics and skills increased significantly, students felt more confident in the laboratory, and students found the laboratory project interesting and challenging. Former students report that the lab was useful in their careers. PMID:21364104
Lourenço, Vera; Herdling, Thorsten; Reich, Gabriele; Menezes, José C; Lochmann, Dirk
2011-08-01
A set of 192 industrial-scale fluid bed granulation batches was monitored in-line using microwave resonance technology (MRT) to determine the moisture, temperature, and density of the granules. Multivariate data analysis techniques such as multiway partial least squares (PLS), multiway principal component analysis (PCA), and multivariate batch control charts were applied to the collected batch data sets. The combination of all these techniques, along with off-line particle size measurements, led to significantly increased process understanding. A seasonality effect could be put into evidence that impacted further processing through its influence on the final granule size. Moreover, it was demonstrated by means of a PLS model that a quantitative relation between the particle size and the MRT measurements can be defined, highlighting the potential ability of the MRT sensor to predict the final granule size. This study has contributed to improving a fluid bed granulation process, and the process knowledge obtained shows that product quality can be built in at the process design stage, following Quality by Design (QbD) and Process Analytical Technology (PAT) principles. Copyright © 2011. Published by Elsevier B.V.
The application of visible absorption spectroscopy to the analysis of uranium in aqueous solutions
Colletti, Lisa Michelle; Copping, Roy; Garduno, Katherine; ...
2017-07-18
Through assay analysis into an excess of 1 M H2SO4 at fixed temperature, a technique has been developed for uranium concentration analysis by visible absorption spectroscopy over an assay concentration range of 1.8-13.4 mgU/g. Once implemented for a particular spectrophotometer and set of spectroscopic cells, this technique promises to provide more rapid results than a classical method such as Davies-Gray (DG) titration analysis. While not as accurate and precise as the DG method, a comparative analysis study reveals that the spectroscopic method can analyze for uranium in well-characterized uranyl(VI) solution samples to within 0.3% of the DG results. For unknown uranium solutions in which sample purity is less well defined, agreement between the developed spectroscopic method and DG analysis is within 0.5%. The technique can also be used to detect the presence of impurities that impact the colorimetric analysis, as confirmed through the analysis of ruthenium contamination. Finally, extending the technique to other assay solutions, 1 M HNO3, HCl, and Na2CO3, has also been shown to be viable. Of the four aqueous media, the carbonate solution yields the largest molar absorptivity value at the most intensely absorbing band, with the least impact of temperature.
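A minimal sketch of the underlying colorimetric calibration (a Beer-Lambert linear fit and its inversion); the standards and absorbance values below are invented for illustration and are not data from the study.

    import numpy as np

    conc = np.array([1.8, 4.0, 7.0, 10.0, 13.4])           # mgU/g standards
    absb = np.array([0.062, 0.139, 0.244, 0.351, 0.468])   # measured absorbance (invented)

    slope, intercept = np.polyfit(conc, absb, 1)           # A = slope * c + intercept
    a_unknown = 0.295
    c_unknown = (a_unknown - intercept) / slope            # invert the calibration
    print(f"unknown assay: {c_unknown:.2f} mgU/g")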
NASA Astrophysics Data System (ADS)
Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.
2016-05-01
Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.
A homotopy analysis method for the nonlinear partial differential equations arising in engineering
NASA Astrophysics Data System (ADS)
Hariharan, G.
2017-05-01
In this article, we apply the homotopy analysis method (HAM) to solve a few partial differential equations arising in engineering. This technique provides solutions as rapidly convergent series with computable terms, even for problems whose governing differential equations contain strongly nonlinear terms. The convergence analysis of the proposed method is also discussed. Finally, we give some illustrative examples to demonstrate the validity and applicability of the proposed method.
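For reference, the method rests on Liao's zeroth-order deformation equation, written here in standard notation (auxiliary linear operator $\mathcal{L}$, nonlinear operator $\mathcal{N}$, convergence-control parameter $c_0$, embedding parameter $q$):

\[
(1-q)\,\mathcal{L}\bigl[\phi(x;q)-u_0(x)\bigr] = q\,c_0\,\mathcal{N}\bigl[\phi(x;q)\bigr],\qquad q\in[0,1],
\]

so that $\phi(x;0)=u_0(x)$ and $\phi(x;1)=u(x)$; the solution is recovered as the series

\[
u(x)=u_0(x)+\sum_{m=1}^{\infty}u_m(x),\qquad
u_m(x)=\frac{1}{m!}\left.\frac{\partial^m\phi(x;q)}{\partial q^m}\right|_{q=0},
\]

whose region of convergence is tuned through $c_0$.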
Subsynchronous instability of a geared centrifugal compressor of overhung design
NASA Technical Reports Server (NTRS)
Hudson, J. H.; Wittman, L. J.
1980-01-01
The original design analysis and shop test data are presented for a three-stage (booster) air compressor with impellers mounted on the extensions of a twin pinion gear, driven by an 8000 hp synchronous motor. Also included are field test data, subsequent rotor dynamics analysis, modifications, and final rotor behavior. A subsynchronous instability existed on the geared, overhung rotor. State-of-the-art rotor dynamics analysis techniques provided a reasonable analytical model of the rotor. A bearing modification, arrived at analytically, eliminated the instability.
Rapid analysis of controlled substances using desorption electrospray ionization mass spectrometry.
Rodriguez-Cruz, Sandra E
2006-01-01
The recently developed technique of desorption electrospray ionization (DESI) has been applied to the rapid analysis of controlled substances. Experiments have been performed using a commercial ThermoFinnigan LCQ Advantage MAX ion-trap mass spectrometer with limited modifications. Results from the ambient sampling of licit and illicit tablets demonstrate the ability of the DESI technique to detect the main active ingredient(s) or controlled substance(s), even in the presence of other higher-concentration components. Full-scan mass spectrometry data provide preliminary identification by molecular weight determination, while rapid analysis using the tandem mass spectrometry (MS/MS) mode provides fragmentation data which, when compared to the laboratory-generated ESI-MS/MS spectral library, provide structural information and final identification of the active ingredient(s). The consecutive analysis of tablets containing different active components indicates there is no cross-contamination or interference from tablet to tablet, demonstrating the reliability of the DESI technique for rapid sampling (one tablet/min or better). Active ingredients have been detected for tablets in which the active component represents less than 1% of the total tablet weight, demonstrating the sensitivity of the technique. The real-time sampling of cannabis plant material is also presented.
Quantification of rare earth elements using laser-induced breakdown spectroscopy
Martin, Madhavi; Martin, Rodger C.; Allman, Steve; ...
2015-10-21
In this paper, a study of the optical emission as a function of concentration of laser-ablated yttrium (Y) and of six rare earth elements, europium (Eu), gadolinium (Gd), lanthanum (La), praseodymium (Pr), neodymium (Nd), and samarium (Sm), has been carried out using the laser-induced breakdown spectroscopy (LIBS) technique. Statistical methodology using multivariate analysis has been used to obtain the sampling errors, coefficient of regression, calibration, and cross-validation of measurements as they relate to the LIBS analysis of graphite-matrix pellets that were doped with the elements at several concentrations. Each element (in oxide form) was mixed in the graphite matrix in percentages ranging from 1% to 50% by weight, and LIBS spectra were obtained for each composition as well as for pure oxide samples. Finally, a single pellet was mixed with all the elements in equal oxide masses to determine whether the elemental peaks can be identified in a mixed pellet. This dataset is relevant for future application to studies of fission product content and distribution in irradiated nuclear fuels. These results demonstrate that the LIBS technique is inherently well suited for the future challenge of in situ analysis of nuclear materials. Finally, these studies also show that LIBS spectral analysis using statistical methodology can provide quantitative results and suggest an approach to the far more challenging future multielemental analysis of ~20 primary elements in high-burnup nuclear reactor fuel.
Analysis techniques for tracer studies of oxidation. M. S. Thesis Final Report
NASA Technical Reports Server (NTRS)
Basu, S. N.
1984-01-01
Analysis techniques to obtain quantitative diffusion data from tracer concentration profiles were developed. Mass balance ideas were applied to determine the mechanism of oxide growth and to separate the fractions of inward and outward growth of oxide scales. The process of inward oxygen diffusion with exchange was theoretically modelled, and the effects of lattice diffusivity, grain boundary diffusivity and grain size on the tracer concentration profile were studied. The development of the tracer concentration profile in a growing oxide scale was simulated. The double oxidation technique was applied to a FeCrAl-Zr alloy using O-18 as a tracer. SIMS was used to obtain the tracer concentration profile. The formation of lacey oxide on the alloy was discussed. Careful consideration was given to the quality of data required to obtain quantitative information.
NASA Technical Reports Server (NTRS)
1996-01-01
Solving for the displacements of free-free coupled systems acted upon by static loads is a common task in the aerospace industry. Often, these problems are solved by static analysis with inertia relief. This technique allows for a free-free static analysis by balancing the applied loads with the inertia loads generated by the applied loads. For some engineering applications, the displacements of the free-free coupled system induce additional static loads. Hence, the applied loads are equal to the original loads plus the displacement-dependent loads. A launch vehicle being acted upon by an aerodynamic loading can have such applied loads. The final displacements of such systems are commonly determined with iterative solution techniques. Unfortunately, these techniques can be time consuming and labor intensive. Because the coupled system equations for free-free systems with displacement-dependent loads can be written in closed form, it is advantageous to solve for the displacements in this manner. Implementing closed-form equations in static analysis with inertia relief is analogous to implementing transfer functions in dynamic analysis. An MSC/NASTRAN (MacNeal-Schwendler Corporation/NASA Structural Analysis) DMAP (Direct Matrix Abstraction Program) Alter was used to include displacement-dependent loads in static analysis with inertia relief. It efficiently solved a common aerospace problem that typically has been solved with an iterative technique.
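A minimal numerical analogue of the closed-form approach described above, using a free-free three-DOF spring-mass chain in place of a NASTRAN model; the inertia-relief projector is a standard construction, but the displacement-dependent load gradient A and all values are illustrative assumptions, and the DMAP Alter itself is not reproduced (Python):

import numpy as np

# Free-free 3-DOF spring-mass chain standing in for the coupled system
k = 1000.0
K = np.array([[  k,  -k, 0.0],
              [ -k, 2*k,  -k],
              [0.0,  -k,   k]])
M = np.diag([1.0, 2.0, 1.0])
R = np.ones((3, 1))                    # rigid-body (translation) mode
F0 = np.array([10.0, 0.0, -4.0])       # applied static load
A = 0.05 * np.eye(3)                   # hypothetical displacement-dependent load gradient

# Inertia-relief projector: balances the load resultant with inertial forces
P = np.eye(3) - M @ R @ np.linalg.solve(R.T @ M @ R, R.T)

# Closed form: K u = P (F0 + A u)  =>  (K - P A) u = P F0,
# solved with one DOF fixed (a support) to remove the rigid-body singularity
lhs = K - P @ A
rhs = P @ F0
free = [1, 2]                           # DOF 0 acts as the support point
u = np.zeros(3)
u[free] = np.linalg.solve(lhs[np.ix_(free, free)], rhs[free])
print(u)                                # displacements relative to the support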
Investigation of safety analysis methods using computer vision techniques
NASA Astrophysics Data System (ADS)
Shirazi, Mohammad Shokrolah; Morris, Brendan Tran
2017-09-01
This work investigates safety analysis methods using computer vision techniques. A vision-based tracking system is developed to provide the trajectories of road users, including vehicles and pedestrians. Safety analysis methods are developed to estimate time to collision (TTC) and postencroachment time (PET), two important safety measurements. The corresponding algorithms are presented, and their advantages and drawbacks are shown through their success in capturing conflict events in real time. The performance of the tracking system is evaluated first, and probability density estimates of TTC and PET are shown for 1-h monitoring of a Las Vegas intersection. Finally, the idea of an intersection safety map is introduced, and TTC values of two different intersections are estimated for 1 day from 8:00 a.m. to 6:00 p.m.
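As a sketch of the TTC measure discussed above, the following computes a constant-velocity time to collision for two point road users; the conflict radius and trajectories are hypothetical, and the paper's exact formulation may differ:

import numpy as np

def time_to_collision(p1, v1, p2, v2, radius=2.0):
    # Constant-velocity point model: TTC is the time of closest approach,
    # counted only if the users are converging to within the conflict radius
    r = np.asarray(p2, float) - np.asarray(p1, float)   # relative position
    v = np.asarray(v2, float) - np.asarray(v1, float)   # relative velocity
    if np.dot(r, v) >= 0:                  # not closing
        return np.inf
    t_star = -np.dot(r, v) / np.dot(v, v)  # time of closest approach
    d_min = np.linalg.norm(r + t_star * v)
    return t_star if d_min < radius else np.inf

# Vehicle heading east, pedestrian heading north toward its path
print(time_to_collision([0, 0], [10, 0], [40, -4], [0, 1]))  # -> 4.0 s

PET, in contrast, is computed from observed event times, the gap between the first user leaving the conflict area and the second entering it, so it needs no velocity extrapolation.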
Investigation on a Roman copper alloy artefact from Pompeii (Italy).
Baraldi, Pietro; Baraldi, Cecilia; Ferrari, Giorgia; Foca, Giorgia; Marchetti, Andrea; Tassi, Lorenzo
2006-01-01
A selection of samples, obtained from a particular copper-alloy domestic artefact of Roman style from Pompeii, has been analysed using different techniques (IR, Raman, SEM-EDX, FAAS) in order to investigate the chemical nature and composition of the metals used in such manufactured pieces. The surface analysis of the bright red metallic microfragments, conducted by different analytical techniques, emphasises the presence of pure unalloyed copper and confirms the absence of other metallic species in the upper layers. On the contrary, the mapping analysis of the section of the laminar metal of the investigated sample shows a consistent enrichment in tin content. Finally, destructive analysis by FAAS confirms that the artefact is made of a bronze alloy, with a mean Sn content of about 6.5%.
RLV Turbine Performance Optimization
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Dorney, Daniel J.
2001-01-01
A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (R-LV) are presented. Analytical techniques for obtaining the results are also discussed.
Analysis of aircraft longitudinal handling qualities
NASA Technical Reports Server (NTRS)
Hess, R. A.
1981-01-01
The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot induced oscillations (PIO) is formulated. Finally, a model-based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.
Mincholé, Ana; Martínez, Juan Pablo; Laguna, Pablo; Rodriguez, Blanca
2018-01-01
Widely developed for clinical screening, electrocardiogram (ECG) recordings capture the cardiac electrical activity from the body surface. ECG analysis can therefore be a crucial first step to help diagnose, understand and predict cardiovascular disorders, responsible for 30% of deaths worldwide. Computational techniques, and more specifically machine learning techniques and computational modelling, are powerful tools for classification, clustering and simulation, and they have recently been applied to address the analysis of medical data, especially ECG data. This review describes the computational methods in use for ECG analysis, with a focus on machine learning and 3D computer simulations, as well as their accuracy, clinical implications and contributions to medical advances. The first section focuses on heartbeat classification and the techniques developed to extract and classify abnormal from regular beats. The second section focuses on patient diagnosis from whole recordings, applied to different diseases. The third section presents real-time diagnosis and applications to wearable devices. The fourth section highlights the recent field of personalized ECG computer simulations and their interpretation. Finally, the discussion section outlines the challenges of ECG analysis and provides a critical assessment of the methods presented. The computational methods reported in this review are a strong asset for medical discoveries, and their translation to the clinical world may lead to promising advances. PMID:29321268
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
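To make the first phase concrete, here is a minimal sketch of AHP criteria weighting via the principal eigenvector of a pairwise-comparison matrix; the three factors and their comparison values are hypothetical, not those of the study:

import numpy as np

# Hypothetical pairwise comparisons (Saaty 1-9 scale) for three criteria,
# e.g. slope, lithology, land cover; reciprocal by construction
C = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

vals, vecs = np.linalg.eig(C)
i = np.argmax(vals.real)
w = np.abs(vecs[:, i].real)
w /= w.sum()                          # AHP criteria weights

n = C.shape[0]
CI = (vals.real[i] - n) / (n - 1)     # consistency index
CR = CI / 0.58                        # random index RI = 0.58 for n = 3
print(w, CR)                          # CR < 0.1 is conventionally acceptable

In the study's Monte Carlo phase, such weights would then be perturbed to propagate their uncertainty into the susceptibility map.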
2017-03-01
Expected profiles can incorporate a level of overdesign. Finally, the Design Integrity (DI) measuring techniques are applied to five test article designs inserted into a test system, and the results of the analysis are presented for each design. Based on the analysis, the DI metric shows measurable differentiation between all five test articles.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Korte, Andrew R
This thesis presents efforts to improve the methodology of matrix-assisted laser desorption ionization-mass spectrometry imaging (MALDI-MSI) as a method for analysis of metabolites from plant tissue samples. The first chapter consists of a general introduction to the technique of MALDI-MSI, and the sixth and final chapter provides a brief summary and an outlook on future work.
Hernández, Carla Navarro; Martín-Yerga, Daniel; González-García, María Begoña; Hernández-Santos, David; Fanjul-Bolado, Pablo
2018-02-01
Naratriptan, an active pharmaceutical ingredient with antimigraine activity, was electrochemically detected on untreated screen-printed carbon electrodes (SPCEs). Cyclic voltammetry and differential pulse voltammetry were used to carry out quantitative analysis of this molecule (in a Britton-Robinson buffer solution at pH 3.0) through its irreversible, diffusion-controlled oxidation at a potential of +0.75 V (vs. the Ag pseudoreference electrode). The naratriptan oxidation product is an indole-based dimer with a yellowish colour (maximum absorption at 320 nm), so UV-VIS spectroelectrochemistry was used for the very first time as an in situ characterization and quantification technique for this molecule. A reflection configuration approach allowed its measurement on the untreated carbon-based electrode. Finally, time-resolved Raman spectroelectrochemistry is shown to be a powerful technique for qualitative and quantitative analysis of naratriptan. Electrochemically treated silver screen-printed electrodes are shown to be easy-to-use and cost-effective SERS substrates for the analysis of naratriptan. Copyright © 2017 Elsevier B.V. All rights reserved.
Zhang, Zhaowei; Li, Peiwu; Hu, Xiaofeng; Zhang, Qi; Ding, Xiaoxia; Zhang, Wen
2012-01-01
Chemical contaminants in food have caused serious health issues in both humans and animals. Microarray technology is an advanced technique suitable for the analysis of chemical contaminants. In particular, the immuno-microarray approach is one of the most promising methods for chemical contaminant analysis. The use of microarrays for the analysis of chemical contaminants is the subject of this review. Fabrication strategies and detection methods for chemical contaminants are discussed in detail. Application to the analysis of mycotoxins, biotoxins, pesticide residues, and pharmaceutical residues is also described. Finally, future challenges and opportunities are discussed.
Fire-protection research for energy technology: Fy 80 year end report
NASA Astrophysics Data System (ADS)
Hasegawa, H. K.; Alvares, N. J.; Lipska, A. E.; Ford, H.; Priante, S.; Beason, D. G.
1981-05-01
This continuing research program was initiated in order to advance fire protection strategies for Fusion Energy Experiments (FEE). The program expanded to encompass other forms of energy research. Accomplishments for fiscal year 1980 were: finalization of the fault-tree analysis of the Shiva fire management system; development of a second-generation fire-growth analysis using an alternate model and new LLNL combustion dynamics data; improvements of techniques for chemical smoke aerosol analysis; development and testing of a simple method to assess the corrosive potential of smoke aerosols; development of an initial aerosol dilution system; completion of primary small-scale tests for measurements of the dynamics of cable fires; finalization of a primary survey format for non-LLNL energy technology facilities; and studies of fire dynamics and aerosol production from electrical insulation and computer tape cassettes.
Determining association constants from titration experiments in supramolecular chemistry.
Thordarson, Pall
2011-03-01
The most common approach for quantifying interactions in supramolecular chemistry is a titration of the guest into a solution of the host, noting the changes in some physical property through NMR, UV-Vis, fluorescence or other techniques. Despite the apparent simplicity of this approach, there are several issues that need to be carefully addressed to ensure that the final results are reliable. These include the use of non-linear rather than linear regression methods, careful choice of stoichiometric binding model, the choice of method (e.g., NMR vs. UV-Vis) and concentration of host, the application of advanced data analysis methods such as global analysis, and finally the estimation of uncertainties and confidence intervals for the results obtained. This tutorial review gives a systematic overview of all these issues, highlighting some of the key messages with simulated data analysis examples.
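A minimal sketch of the recommended non-linear regression step for a 1:1 host-guest system, fitting the association constant K to synthetic titration data; the concentrations and shift values are invented for illustration (Python, scipy):

import numpy as np
from scipy.optimize import curve_fit

H0 = 1e-3  # fixed host concentration (M); illustrative value

def shift_1to1(G0, K, d_free, d_bound):
    # Exact 1:1 isotherm: [HG] from the quadratic mass-balance equation
    s = H0 + G0 + 1.0 / K
    HG = 0.5 * (s - np.sqrt(s**2 - 4.0 * H0 * G0))
    return d_free + (d_bound - d_free) * HG / H0

G0 = np.linspace(0.0, 5e-3, 12)                  # guest titration points (M)
obs = shift_1to1(G0, 800.0, 7.10, 7.65)          # synthetic "observed shifts"
obs += np.random.default_rng(1).normal(0.0, 0.003, G0.size)

popt, pcov = curve_fit(shift_1to1, G0, obs, p0=[500.0, 7.0, 7.8])
K_fit, K_sigma = popt[0], np.sqrt(np.diag(pcov))[0]
print(K_fit, K_sigma)   # association constant and a rough 1-sigma uncertainty

Fitting the exact quadratic isotherm rather than a linearized form is precisely the non-linear-regression point the review makes; the covariance diagonal gives the kind of uncertainty estimate it calls for.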
Real-Time Onboard Global Nonlinear Aerodynamic Modeling from Flight Data
NASA Technical Reports Server (NTRS)
Brandon, Jay M.; Morelli, Eugene A.
2014-01-01
Flight test and modeling techniques were developed to accurately identify global nonlinear aerodynamic models onboard an aircraft. The techniques were developed and demonstrated during piloted flight testing of an Aermacchi MB-326M Impala jet aircraft. Advanced piloting techniques and nonlinear modeling techniques based on fuzzy logic and multivariate orthogonal function methods were implemented with efficient onboard calculations and flight operations to achieve real-time maneuver monitoring and analysis, and near-real-time global nonlinear aerodynamic modeling and prediction validation testing in flight. Results demonstrated that global nonlinear aerodynamic models for a large portion of the flight envelope were identified rapidly and accurately using piloted flight test maneuvers during a single flight, with the final identified and validated models available before the aircraft landed.
O'Rourke, Matthew B; Padula, Matthew P
2016-01-01
Since emerging in the late 19th century, formaldehyde fixation has become a standard method for preservation of tissues from clinical samples. The advantage of formaldehyde fixation is that fixed tissues can be stored at room temperature for decades without concern for degradation. This has led to the generation of huge tissue banks containing thousands of clinically significant samples. Here we review techniques for proteomic analysis of formalin-fixed, paraffin-embedded (FFPE) tissue samples, with a specific focus on the methods used to extract and break formaldehyde crosslinks. We also discuss an error of interpretation associated with the technique known as "antigen retrieval." We have discovered that this term has been mistakenly applied to two disparate molecular techniques; therefore, we argue that a terminology change is needed to ensure accurate reporting of experimental results. Finally, we suggest that more investigation is required to fully understand the process of formaldehyde fixation and its subsequent reversal.
FT-IR spectroscopic, thermal analysis of human urinary stones and their characterization
NASA Astrophysics Data System (ADS)
Selvaraju, R.; Raja, A.; Thiruppathi, G.
2015-02-01
In the present study, FT-IR, XRD and TGA-DTA spectral methods have been used to investigate the chemical compositions of urinary calculi. Multi-component urinary calculi containing calcium oxalate, hydroxyapatite, struvite and uric acid have been studied. The chemical compounds were identified by the FT-IR spectroscopic technique. The mineral identification was confirmed by powder X-ray diffraction patterns compared against JCPDS reference values. Thermal analysis techniques are considered the best techniques for the characterization and detection of endothermic and exothermic behaviors of the urinary stones. The two hydrates (COM and COD) are present together, in the presence of MAPH or UA. Finally, the present study suggests that urolithiasis is a significant health problem in children and is very common in some parts of the world, especially in India; the findings should therefore be useful to the scientific community for the identification of such health problems and their remedies using spectroscopic techniques.
Rare cell isolation and analysis in microfluidics
Chen, Yuchao; Li, Peng; Huang, Po-Hsun; Xie, Yuliang; Mai, John D.; Wang, Lin; Nguyen, Nam-Trung; Huang, Tony Jun
2014-01-01
Rare cells are low-abundance cells in a much larger population of background cells. Conventional benchtop techniques have limited capabilities to isolate and analyze rare cells because of their generally low selectivity and significant sample loss. Recent rapid advances in microfluidics have been providing robust solutions to the challenges in the isolation and analysis of rare cells. In addition to the apparent performance enhancements resulting in higher efficiencies and sensitivity levels, microfluidics provides other advanced features such as simpler handling of small sample volumes and multiplexing capabilities for high-throughput processing. All of these advantages make microfluidics an excellent platform to deal with the transport, isolation, and analysis of rare cells. Various cellular biomarkers, including physical properties, dielectric properties, as well as immunoaffinities, have been explored for isolating rare cells. In this Focus article, we discuss the design considerations of representative microfluidic devices for rare cell isolation and analysis. Examples from recently published works are discussed to highlight the advantages and limitations of the different techniques. Various applications of these techniques are then introduced. Finally, a perspective on the development trends and promising research directions in this field are proposed. PMID:24406985
Topçuoğlu, Hüseyin Sinan; Tuncay, Öznur; Demirbuga, Sezer; Dinçer, Asiye Nur; Arslan, Hakan
2014-06-01
The aim of this study was to evaluate whether or not different final irrigation activation techniques affect the bond strength of an epoxy resin-based endodontic sealer (AH Plus; Dentsply DeTrey, Konstanz, Germany) to the root canal walls of different root thirds. Eighty single-rooted human mandibular premolars were prepared by using the ProTaper system (Dentsply Maillefer, Ballaigues, Switzerland) to size F4, and a final irrigation regimen using 3% sodium hypochlorite and 17% EDTA was performed. The specimens were randomly divided into 4 groups (n = 20) according to the final irrigation activation technique used as follows: no activation (control), manual dynamic activation (MDA), CanalBrush (Coltene Whaledent, Altstätten, Switzerland) activation, and ultrasonic activation. Five specimens from each group were prepared for scanning electron microscopic observation to assess the smear layer removal after the final irrigation procedures. All remaining roots were then obturated with gutta-percha and AH Plus sealer. A push-out test was used to measure the bond strength between the root canal dentin and AH Plus sealer. The data obtained from the push-out test were analyzed using 2-way analysis of variance and Tukey post hoc tests. The bond strength values mostly decreased in the coronoapical direction (P < .001). In the coronal and middle thirds, ultrasonic activation showed a higher bond strength than the other groups (P < .05). In the apical third, MDA displayed the highest bond strength to root dentin (P < .05). The majority of specimens exhibited cohesive failures. The bond strength of AH Plus sealer to root canal dentin may improve with ultrasonic activation in the coronal and middle thirds and MDA in the apical third. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Bagur, M G; Morales, S; López-Chicano, M
2009-11-15
Unsupervised and supervised pattern recognition techniques such as hierarchical cluster analysis, principal component analysis, factor analysis and linear discriminant analysis have been applied to water samples collected in the Rodalquilar mining district (Southern Spain) in order to identify different sources of environmental pollution caused by the abandoned mining industry. The effect of the mining activity on waters was monitored by determining the concentration of eleven elements (Mn, Ba, Co, Cu, Zn, As, Cd, Sb, Hg, Au and Pb) by inductively coupled plasma mass spectrometry (ICP-MS). The Box-Cox transformation has been used to bring the data set to normal form in order to minimize the non-normal distribution of the geochemical data. The environmental impact is driven mainly by the mining activity developed in the zone, the acid drainage and, finally, the chemical treatment used for the beneficiation of gold.
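A minimal sketch of the Box-Cox-then-multivariate-analysis workflow on synthetic positive-valued concentration data; the matrix dimensions and values are invented, and PCA stands in for the fuller suite of techniques used in the study:

import numpy as np
from scipy.stats import boxcox

rng = np.random.default_rng(0)
# 40 hypothetical water samples x 4 elements, log-normal-ish concentrations
X = rng.lognormal(mean=2.0, sigma=0.8, size=(40, 4))

# Box-Cox each column toward normality (data must be strictly positive)
Xt = np.column_stack([boxcox(X[:, j])[0] for j in range(X.shape[1])])

# PCA on the standardized, transformed data via SVD
Z = (Xt - Xt.mean(axis=0)) / Xt.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt.T                      # sample scores, e.g. for clustering
print(S**2 / np.sum(S**2))             # variance share per component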
Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia
2017-01-01
This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide high detail qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
Vibration Signature Analysis of a Faulted Gear Transmission System
NASA Technical Reports Server (NTRS)
Choy, F. K.; Huang, S.; Zakrajsek, J. J.; Handschuh, R. F.; Townsend, D. P.
1994-01-01
A comprehensive procedure for predicting faults in gear transmission systems under normal operating conditions is presented. Experimental data were obtained from a spiral bevel gear fatigue test rig at NASA Lewis Research Center. Time-synchronous-averaged vibration data were recorded throughout the test as the fault progressed from a small single pit to severe pitting over several teeth, and finally tooth fracture. A numerical procedure based on the Wigner-Ville distribution was used to examine the time-averaged vibration data. Results from the Wigner-Ville procedure are compared to results from a variety of signal analysis techniques, including time domain and frequency domain analysis methods. Using photographs of the gear tooth at various stages of damage, the limitations and accuracy of the various techniques are compared and discussed. Conclusions are drawn from the comparison of the different approaches as well as the applicability of the Wigner-Ville method in predicting gear faults.
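A compact sketch of a discrete Wigner-Ville distribution of the kind used above, computed on the analytic signal to suppress cross-terms; this is a standard textbook formulation, not the authors' implementation:

import numpy as np
from scipy.signal import hilbert

def wigner_ville(x):
    # Discrete WVD on the analytic signal; columns are time, rows frequency bins
    z = hilbert(x)
    N = len(z)
    W = np.zeros((N, N))
    for n in range(N):
        lag = min(n, N - 1 - n)             # largest symmetric lag at time n
        tau = np.arange(-lag, lag + 1)
        kern = np.zeros(N, dtype=complex)
        kern[tau % N] = z[n + tau] * np.conj(z[n - tau])
        W[:, n] = np.fft.fft(kern).real
    return W

# Chirp test: the energy ridge should track the rising instantaneous frequency
t = np.linspace(0.0, 1.0, 256, endpoint=False)
x = np.cos(2 * np.pi * (20 * t + 40 * t**2))
W = wigner_ville(x)
print(W.shape)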
Bilek, Maciej; Namieśnik, Jacek
2016-01-01
For a long time, chromatographic techniques and techniques related to them have stimulated the development of new procedures in the field of pharmaceutical analysis. The newly developed methods, characterized by improved metrological parameters, allow for more accurate testing of, among others, the composition of raw materials, intermediates and final products. The chromatographic techniques also enable studies on waste generated in research laboratories and factories producing pharmaceuticals and parapharmaceuticals. Based on a review of reports published in Polish pharmaceutical journals, we assessed the impact of chromatographic techniques on the development of pharmaceutical analysis. The first chromatographic technique used in pharmaceutical analysis was so-called capillary analysis. It was applied in the 1930s to control the identity of pharmaceutical formulations. In the 1940s and 1950s, the chromatographic techniques were mostly a subject of review publications, while their use in experimental work was rare. Paper chromatography and thin layer chromatography were introduced in the 1960s and 1970s, respectively. These new analytical tools have contributed to the intensive development of research in the field of phytochemistry and the analysis of herbal medicines. The development of column chromatography-based techniques, i.e., gas chromatography and high performance liquid chromatography, took place at the end of the 20th century. Both aforementioned techniques were widely applied in pharmaceutical analysis, for example, to assess the stability of drugs, test for impurities and degradation products, and in pharmacokinetics studies. The first decade of the 21st century was the time of new detection methods in gas and liquid chromatography. The information sources used to write this article were Polish pharmaceutical journals, both professional and scientific, originating from the interwar and post-war period, i.e., "Kronika Farmaceutyczna", "Farmacja Współczesna", "Wiadomości Farmaceutyczne", "Acta Poloniae Pharmaceutica", "Farmacja Polska", "Dissertationes Pharmaceuticae", "Annales UMCS sectio DDD Pharmacia". The number of published works using various chromatography techniques was assessed based on the content descriptions of individual issues of the journal "Acta Poloniae Pharmaceutica".
An electrooptic probe to determine internal electric fields in a piezoelectric transformer.
Norgard, Peter; Kovaleski, Scott
2012-02-01
A technique using the electrooptic effect to determine the output voltage of an optically clear LiNbO(3) piezoelectric transformer was developed and explored. A brief mathematical description of the solution is provided, as well as experimental data demonstrating a linear response under ac resonant operating conditions. A technique to calibrate the diagnostic was developed and is described. Finally, a sensitivity analysis of the electrooptic response to variations in angular alignment between the LiNbO(3) transformer and the laser probe is discussed.
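For reference, the standard Pockels relation that such a probe exploits in lithium niobate (with field along the optic axis) is, in its usual textbook form rather than as quoted from this paper:

\[
\Delta n_e \;=\; -\tfrac{1}{2}\, n_e^{3}\, r_{33}\, E,
\]

where \(n_e\) is the extraordinary refractive index, \(r_{33}\) the relevant electrooptic coefficient, and \(E\) the internal electric field; measuring the optically induced phase shift gives \(\Delta n_e\), from which \(E\) and hence the transformer output voltage can be inferred.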
A community assessment of privacy preserving techniques for human genomes.
Jiang, Xiaoqian; Zhao, Yongan; Wang, Xiaofeng; Malin, Bradley; Wang, Shuang; Ohno-Machado, Lucila; Tang, Haixu
2014-01-01
To answer the need for the rigorous protection of biomedical data, we organized the Critical Assessment of Data Privacy and Protection initiative as a community effort to evaluate privacy-preserving dissemination techniques for biomedical data. We focused on the challenge of sharing aggregate human genomic data (e.g., allele frequencies) in a way that preserves the privacy of the data donors, without undermining the utility of genome-wide association studies (GWAS) or impeding their dissemination. Specifically, we designed two problems for disseminating the raw data and the analysis outcome, respectively, based on publicly available data from HapMap and from the Personal Genome Project. A total of six teams participated in the challenges. The final results were presented at a workshop of the iDASH (integrating Data for Analysis, 'anonymization,' and SHaring) National Center for Biomedical Computing. We report the results of the challenge and our findings about the current genome privacy protection techniques.
Early Detection of Severe Apnoea through Voice Analysis and Automatic Speaker Recognition Techniques
NASA Astrophysics Data System (ADS)
Fernández, Ruben; Blanco, Jose Luis; Díaz, David; Hernández, Luis A.; López, Eduardo; Alcázar, José
This study is part of an on-going collaborative effort between the medical and the signal processing communities to promote research on applying voice analysis and Automatic Speaker Recognition techniques (ASR) for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based diagnosis could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we present and discuss the possibilities of using generative Gaussian Mixture Models (GMMs), generally used in ASR systems, to model distinctive apnoea voice characteristics (i.e. abnormal nasalization). Finally, we present experimental findings regarding the discriminative power of speaker recognition techniques applied to severe apnoea detection. We have achieved an 81.25 % correct classification rate, which is very promising and underpins the interest in this line of inquiry.
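A minimal sketch of the generative GMM classification scheme described above, using a log-likelihood ratio between class models; the two-dimensional "voice features" are synthetic stand-ins for the study's acoustic measurements (Python, scikit-learn):

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic 2-D "voice features" for the two populations
healthy = rng.normal([0.0, 0.0], 0.8, size=(200, 2))
apnoea = rng.normal([1.5, 1.0], 0.8, size=(200, 2))

gmm_h = GaussianMixture(n_components=4, random_state=0).fit(healthy)
gmm_a = GaussianMixture(n_components=4, random_state=0).fit(apnoea)

def classify(x):
    # Log-likelihood ratio test between the two generative models
    x = np.atleast_2d(x)
    return "apnoea" if gmm_a.score(x) > gmm_h.score(x) else "healthy"

print(classify([1.4, 0.9]), classify([-0.2, 0.1]))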
Air-to-air radar flight testing
NASA Astrophysics Data System (ADS)
Scott, Randall E.
1988-06-01
This volume in the AGARD Flight Test Techniques Series describes flight test techniques, flight test instrumentation, ground simulation, data reduction and analysis methods used to determine the performance characteristics of a modern air-to-air (a/a) radar system. Following a general coverage of specification requirements, test plans, support requirements, development and operational testing, and management information systems, the report goes into more detailed flight test techniques covering a/a radar capabilities of: detection, manual acquisition, automatic acquisition, tracking a single target, and detection and tracking of multiple targets. There follows a section on additional flight test considerations such as electromagnetic compatibility, electronic countermeasures, displays and controls, degraded and backup modes, radome effects, environmental considerations, and use of testbeds. Other sections cover ground simulation, flight test instrumentation, and data reduction and analysis. The final sections deal with reporting and a discussion of considerations for the future and how they may affect radar flight testing.
NASA Technical Reports Server (NTRS)
Winitz, M.; Graff, J. (Inventor)
1974-01-01
The process and apparatus for qualitative and quantitative analysis of the amino acid content of a biological sample are presented. The sample is deposited on a cation exchange resin and then is washed with suitable solvents. The amino acids and various cations and organic material with a basic function remain on the resin. The resin is eluted with an acid eluant, and the eluate containing the amino acids is transferred to a reaction vessel where the eluant is removed. Final analysis of the purified acylated amino acid esters is accomplished by gas-liquid chromatographic techniques.
Cognitive task analysis of network analysts and managers for network situational awareness
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn
2010-01-01
The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understanding what their needs truly are. This paper discusses the cognitive task analysis methodology we followed to acquire feedback from the analysts. This paper also provides the details we acquired from the analysts on their processes, goals, concerns, etc. A final result we describe is the generation of a task-flow diagram.
Alonso, J F; Mañanas, M A; Hoyer, D; Topor, Z L; Bruce, E N
2004-01-01
Analysis of respiratory muscle activity is a promising technique for the study of pulmonary diseases such as obstructive sleep apnea syndrome (OSAS). Evaluation of interactions between muscles is very useful in order to determine the muscular pattern during an exercise. These interactions have already been assessed by means of different linear techniques like the cross-spectrum, magnitude squared coherence or cross-correlation. The aim of this work is to evaluate interactions between respiratory and myographic signals through nonlinear analysis by means of the cross mutual information function (CMIF), and to find out what information can be extracted from it. Some parameters are defined and calculated from the CMIF between ventilatory and myographic signals of three respiratory muscles. Finally, differences in certain parameters were obtained between OSAS patients and healthy subjects, indicating different respiratory muscle couplings.
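A minimal histogram-based sketch of a cross mutual information function; the binning, lag range, and test signals are illustrative choices rather than the authors' estimator:

import numpy as np

def mutual_information(x, y, bins=16):
    # Histogram estimate of MI between two samples, in nats
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

def cmif(x, y, max_lag=50):
    # MI of x against y delayed by 0..max_lag samples
    return np.array([mutual_information(x[:len(x) - k], y[k:])
                     for k in range(max_lag + 1)])

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = np.abs(np.roll(x, 10)) + 0.1 * rng.normal(size=2000)  # nonlinear, delayed copy
print(cmif(x, y, 20).argmax())    # coupling peaks near lag 10

Because MI captures any statistical dependence, the CMIF detects the nonlinear coupling in this example that cross-correlation would largely miss.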
Prosa, T J; Alvis, R; Tsakalakos, L; Smentkowski, V S
2010-08-01
Three-dimensional quantitative compositional analysis of nanowires is a challenge for standard techniques such as secondary ion mass spectrometry because of specimen size and geometry considerations; however, it is precisely the size and geometry of nanowires that makes them attractive candidates for analysis via atom probe tomography. The resulting boron composition of various trimethylboron vapour-liquid-solid grown silicon nanowires were measured both with time-of-flight secondary ion mass spectrometry and pulsed-laser atom probe tomography. Both characterization techniques yielded similar results for relative composition. Specialized specimen preparation for pulsed-laser atom probe tomography was utilized and is described in detail whereby individual silicon nanowires are first protected, then lifted out, trimmed, and finally wet etched to remove the protective layer for subsequent three-dimensional analysis.
Kim, Seokyeon; Jeong, Seongmin; Woo, Insoo; Jang, Yun; Maciejewski, Ross; Ebert, David S
2018-03-01
Geographic visualization research has focused on a variety of techniques to represent and explore spatiotemporal data. The goal of those techniques is to enable users to explore events and interactions over space and time in order to facilitate the discovery of patterns, anomalies and relationships within the data. However, it is difficult to extract and visualize data flow patterns over time for non-directional statistical data without trajectory information. In this work, we develop a novel flow analysis technique to extract, represent, and analyze flow maps of non-directional spatiotemporal data unaccompanied by trajectory information. We estimate a continuous distribution of these events over space and time, and extract flow fields for spatial and temporal changes utilizing a gravity model. Then, we visualize the spatiotemporal patterns in the data by employing flow visualization techniques. The user is presented with temporal trends of geo-referenced discrete events on a map. As such, overall spatiotemporal data flow patterns help users analyze geo-referenced temporal events, such as disease outbreaks, crime patterns, etc. To validate our model, we discard the trajectory information in an origin-destination dataset and apply our technique to the data and compare the derived trajectories and the original. Finally, we present spatiotemporal trend analysis for statistical datasets including twitter data, maritime search and rescue events, and syndromic surveillance.
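As a rough illustration of turning discrete events into a continuous density whose temporal change drives a flow field, the sketch below uses kernel density estimation and a gradient of the density difference; this gradient surrogate and all data are illustrative assumptions, not the paper's gravity model:

import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic geo-referenced events drifting east between two time windows
pts_t0 = rng.normal([0.0, 0.0], 0.5, size=(300, 2))
pts_t1 = rng.normal([0.6, 0.1], 0.5, size=(300, 2))

xs, ys = np.meshgrid(np.linspace(-2, 3, 50), np.linspace(-2, 2, 40))
grid = np.vstack([xs.ravel(), ys.ravel()])

d0 = gaussian_kde(pts_t0.T)(grid).reshape(xs.shape)
d1 = gaussian_kde(pts_t1.T)(grid).reshape(xs.shape)

# Gradient of the density change points from loss toward gain; its vectors
# can feed standard flow-visualization techniques
gy, gx = np.gradient(d1 - d0)
print(gx.shape, gy.shape)   # vector field components for the flow map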
2012-03-01
The approved Statement of Work proposed a project timeline (Table 1). Several running prosthesis designs (Figure 1) were tested for this project, including the 1E90 Sprinter (OttoBock Inc.), Flex-Run (Ossur), Cheetah (Ossur) and Nitro.
Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.
ERIC Educational Resources Information Center
Lindahl, William H.; Gardner, James H.
Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…
ERIC Educational Resources Information Center
Girod, Gerald R.
An experiment was performed to determine the efficiency of simulation teaching techniques in training elementary education teachers to identify and correct classroom management problems. The two presentation modes compared were film and audiotape. Twelve hypotheses were tested via analysis of variance to determine the relative efficiency of these…
DOT National Transportation Integrated Search
1985-03-01
A report is offered on a study of the information activities within the Right-of-Way section of ADOT. The objectives of the study were to adapt and apply techniques to measure user-perceived needs, satisfaction and utility of services provided Right-...
Analyzing coastal environments by means of functional data analysis
NASA Astrophysics Data System (ADS)
Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.
2017-07-01
Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grain size of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension-reduction technique to explore and uncover patterns in grain-size frequency curves. This procedure proved useful to describe variability in the structure of the data set. Moreover, an alternative approach, FCA, was applied to identify clusters and to interpret their spatial distribution. Results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with CA (Cluster Analysis). The first method, the point density function (PDF), was employed after fitting a log-normal distribution to each PSD and summarizing each of the density functions by its mean, sorting, skewness and kurtosis. The second applied a centered log-ratio (clr) transform to the original data. PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, as a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
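A minimal sketch of the second vector approach, the centered log-ratio transform followed by PCA, on synthetic PSD proportions; the dimensions and data are invented for illustration:

import numpy as np

def clr(comp):
    # Centered log-ratio for compositional rows (zeros replaced beforehand)
    logx = np.log(comp)
    return logx - logx.mean(axis=1, keepdims=True)

rng = np.random.default_rng(0)
raw = rng.gamma(2.0, size=(30, 8)) + 1e-6      # 30 samples x 8 grain-size classes
psd = raw / raw.sum(axis=1, keepdims=True)     # rows close to proportions

Z = clr(psd)
Z -= Z.mean(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt.T                  # PC scores feeding the cluster analysis
print(S**2 / (S**2).sum())         # variance explained per component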
Segmental Refinement: A Multigrid Technique for Data Locality
Adams, Mark F.; Brown, Jed; Knepley, Matt; ...
2016-08-04
In this paper, we investigate a domain decomposed multigrid technique, termed segmental refinement, for solving general nonlinear elliptic boundary value problems. We extend the method first proposed in 1994 by analytically and experimentally investigating its complexity. We confirm that communication of traditional parallel multigrid is eliminated on fine grids, with modest amounts of extra work and storage, while maintaining the asymptotic exactness of full multigrid. We observe an accuracy dependence on the segmental refinement subdomain size, which was not considered in the original analysis. Finally, we present a communication complexity analysis that quantifies the communication costs ameliorated by segmental refinement and report performance results with up to 64K cores on a Cray XC30.
Exploring the relationships between free-time management and boredom in leisure.
Wang, Wei-Ching; Wu, Chung-Chi; Wu, Chang-Yang; Huan, Tzung-Cheng
2012-04-01
The purpose of the study was to examine the relations of five dimensions of free-time management (goal setting and evaluating, technique, values, immediate response, and scheduling) with leisure boredom, and whether these factors could predict leisure boredom. A total of 500 undergraduates from a university in southern Taiwan were surveyed, and 403 usable questionnaires were returned. Pearson correlation analysis revealed that the five dimensions of free-time management had significant negative relationships with leisure boredom. Furthermore, the results of stepwise regression analysis revealed that four dimensions of free-time management were significant contributors to leisure boredom. Finally, we suggest that students can avoid boredom by properly planning and organizing leisure time and applying techniques for managing leisure time.
An Optimal Order Nonnested Mixed Multigrid Method for Generalized Stokes Problems
NASA Technical Reports Server (NTRS)
Deng, Qingping
1996-01-01
A multigrid algorithm is developed and analyzed for generalized Stokes problems discretized by various nonnested mixed finite elements within a unified framework. It is abstractly proved by an element-independent analysis that the multigrid algorithm converges with an optimal order if there exists a 'good' prolongation operator. A technique to construct a 'good' prolongation operator for nonnested multilevel finite element spaces is proposed. Its basic idea is to introduce a sequence of auxiliary nested multilevel finite element spaces and define a prolongation operator as a composite of two single-grid-level operators. This not only makes the construction of a prolongation operator much easier (the final explicit forms of such operators are fairly simple) but also simplifies the verification of their approximation properties. Finally, as an application, the framework and technique are applied to seven typical nonnested mixed finite elements.
Design and development of a quad copter (UMAASK) using CAD/CAM/CAE
NASA Astrophysics Data System (ADS)
Manarvi, Irfan Anjum; Aqib, Muhammad; Ajmal, Muhammad; Usman, Muhammad; Khurshid, Saqib; Sikandar, Usman
Micro flying vehicles (MFV) have become a popular area of research due to economy of production, flexibility of launch and variety of applications. A large number of techniques, from pencil sketching to computer-based software, are being used for designing specific geometries and selecting materials to arrive at novel designs for specific requirements. The present research was focused on the development of a suitable design configuration using CAD/CAM/CAE tools and techniques. A number of designs were reviewed for this purpose. Finally, a rotary-wing quadcopter flying vehicle design was considered appropriate for this research. Performance requirements were set as a ceiling of approximately 10 meters, a weight of less than 500 grams, and the ability to take videos and pictures. Parts were designed using Finite Element Analysis, manufactured using CNC machines and assembled to arrive at the final design, named UMAASK. Flight tests were carried out and confirmed that the design requirements were met.
Webster, Victoria A; Nieto, Santiago G; Grosberg, Anna; Akkus, Ozan; Chiel, Hillel J; Quinn, Roger D
2016-10-01
In this study, new techniques for approximating the contractile properties of cells in biohybrid devices using Finite Element Analysis (FEA) have been investigated. Many current techniques for modeling biohybrid devices use individual cell forces to simulate the cellular contraction. However, such techniques result in long simulation runtimes. In this study we investigated the effect of the use of thermal contraction on simulation runtime. The thermal contraction model was significantly faster than models using individual cell forces, making it beneficial for rapidly designing or optimizing devices. Three techniques, Stoney's Approximation, a Modified Stoney's Approximation, and a Thermostat Model, were explored for calibrating thermal expansion/contraction parameters (TECPs) needed to simulate cellular contraction using thermal contraction. The TECP values were calibrated by using published data on the deflections of muscular thin films (MTFs). Using these techniques, TECP values that suitably approximate experimental deflections can be determined by using experimental data obtained from cardiomyocyte MTFs. Furthermore, a sensitivity analysis was performed in order to investigate the contribution of individual variables, such as elastic modulus and layer thickness, to the final calibrated TECP for each calibration technique. Additionally, the TECP values are applicable to other types of biohybrid devices. Two non-MTF models were simulated based on devices reported in the existing literature. Copyright © 2016 Elsevier Ltd. All rights reserved.
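For context, the classical Stoney relation that the first two calibration techniques build on links measured curvature to film stress; the form below is the standard thin-film version (here the contractile cell layer plays the role of the film on an elastomer substrate), not the paper's modified variant:

\[
\sigma_f \;=\; \frac{E_s\, h_s^{2}}{6\,(1-\nu_s)\, h_f}\,\kappa,
\]

where \(E_s\), \(\nu_s\) and \(h_s\) are the substrate modulus, Poisson ratio and thickness, \(h_f\) the film thickness, and \(\kappa\) the curvature inferred from the MTF deflection.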
New Ground Truth Capability from InSAR Time Series Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckley, S; Vincent, P; Yang, D
2005-07-13
We demonstrate that next-generation interferometric synthetic aperture radar (InSAR) processing techniques applied to existing data provide rich InSAR ground truth content for exploitation in seismic source identification. InSAR time series analyses utilize tens of interferograms and can be implemented in different ways. In one such approach, conventional InSAR displacement maps are inverted in a final post-processing step. Alternatively, computationally intensive data reduction can be performed with specialized InSAR processing algorithms. The typical final result of these approaches is a synthesized set of cumulative displacement maps. Examples from our recent work demonstrate that these InSAR processing techniques can provide appealing new ground truth capabilities. We construct movies showing the areal and temporal evolution of deformation associated with previous nuclear tests. In other analyses, we extract time histories of centimeter-scale surface displacement associated with tunneling. The potential exists to identify millimeter per year surface movements when sufficient data exists for InSAR techniques to isolate and remove phase signatures associated with digital elevation model errors and the atmosphere.
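A minimal sketch of the final post-processing inversion mentioned above: least-squares recovery of a per-pixel displacement time series from a small network of interferograms. The date network, noise level, and "truth" series are invented for illustration:

import numpy as np

# Hypothetical network: 5 acquisition dates, 6 interferograms between them
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]
n_dates = 5
G = np.zeros((len(pairs), n_dates - 1))   # date 0 is the fixed reference
for row, (i, j) in enumerate(pairs):      # each row maps dates to one pair
    if j > 0:
        G[row, j - 1] += 1.0
    if i > 0:
        G[row, i - 1] -= 1.0

truth = np.array([2.0, 3.5, 4.0, 6.0])    # mm of cumulative motion at dates 1-4
obs = G @ truth + np.random.default_rng(0).normal(0.0, 0.2, len(pairs))

ts, *_ = np.linalg.lstsq(G, obs, rcond=None)
print(ts)       # recovered displacement history for one pixel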
Chellali, Amine; Schwaitzberg, Steven D.; Jones, Daniel B.; Romanelli, John; Miller, Amie; Rattner, David; Roberts, Kurt E.; Cao, Caroline G.L.
2014-01-01
Background: NOTES is an emerging technique for performing surgical procedures, such as cholecystectomy. Debate about its real benefit over the traditional laparoscopic technique is ongoing. There have been several clinical studies comparing NOTES to conventional laparoscopic surgery. However, no work has been done to compare these techniques from a human factors perspective. This study presents a systematic analysis describing and comparing different existing NOTES methods to laparoscopic cholecystectomy. Methods: Videos of endoscopic/laparoscopic views from fifteen live cholecystectomies were analyzed to conduct a detailed task analysis of the NOTES technique. A hierarchical task analysis of laparoscopic cholecystectomy and several hybrid transvaginal NOTES cholecystectomies was performed and validated by expert surgeons. To identify similarities and differences between these techniques, their hierarchical decomposition trees were compared. Finally, a timeline analysis was conducted to compare the steps and substeps. Results: At least three variations of the NOTES technique were used for cholecystectomy. Differences between the observed techniques at the substep level of the hierarchy and in the instruments being used were found. The timeline analysis showed an increase in time to perform some surgical steps and substeps in NOTES compared to laparoscopic cholecystectomy. Conclusion: As pure NOTES is extremely difficult given the current state of development in instrumentation design, most surgeons utilize hybrid methods, i.e., combinations of endoscopic and laparoscopic instruments/optics. Our hierarchical task analysis identified three different hybrid methods of performing cholecystectomy, with significant variability among them. The varying degrees to which laparoscopic instruments are utilized to assist in NOTES methods appear to introduce different technical issues and additional tasks, leading to an increase in surgical time. The NOTES continuum of invasiveness is proposed here as a classification scheme for these methods, which was used to construct a clear roadmap for training and technology development. PMID:24902811
Umeta, Ricardo S G; Avanzi, Osmar
2011-07-01
Spine fusions can be performed through different techniques and are used to treat a number of vertebral pathologies. However, there seems to be no consensus regarding which fusion technique is best suited to treat each distinct spinal disease or group of diseases. The objective was to study the effectiveness and complications of the different techniques used for spinal fusion in patients with lumbar spondylosis, through a systematic literature review and meta-analysis. Randomized clinical studies comparing the most commonly performed surgical techniques for spine fusion in lumbar-sacral spondylosis, as well as those reporting patient outcome, were selected in order to identify which technique, if any, presents the best clinical, functional, and radiographic outcome. The review was based on scientific articles published and indexed in the following databases: PubMed (1966-2009), Cochrane Collaboration-CENTRAL, EMBASE (1980-2009), and LILACS (1982-2009). The general search strategy focused on the surgical treatment of patients with lumbar-sacral spondylosis. Eight studies met the inclusion criteria and were selected, with a total of 1,136 patients. Meta-analysis showed that patients who underwent interbody fusion presented significantly smaller blood loss (p=.001) and a greater rate of bone fusion (p=.02). Patients submitted to fusion using the posterolateral approach had a significantly shorter operative time (p=.007) and fewer perioperative complications (p=.03). No statistically significant difference was found for the other studied variables (pain, functional impairment, and return to work). The most commonly used techniques for lumbar spine fusion in patients with spondylosis were interbody fusion and the posterolateral approach. Both techniques were comparable in final outcome, but the former presented better fusion rates and the latter fewer complications. Copyright © 2011 Elsevier Inc. All rights reserved.
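For readers unfamiliar with the pooling step, here is a generic fixed-effect (inverse-variance) meta-analysis sketch; the per-study effect sizes and standard errors are made up and do not come from the eight trials in this review.

```python
import numpy as np
from scipy import stats

# Generic fixed-effect (inverse-variance) pooling of per-study effects.
effects = np.array([-0.30, -0.10, -0.45, -0.20])   # e.g. mean differences (invented)
se = np.array([0.15, 0.20, 0.25, 0.18])            # standard errors (invented)

w = 1.0 / se**2                                    # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
z = pooled / pooled_se
p = 2.0 * stats.norm.sf(abs(z))                    # two-sided p-value

print(f"pooled effect = {pooled:.3f} +/- {1.96*pooled_se:.3f} (95% CI), p = {p:.4f}")
```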
NASA Astrophysics Data System (ADS)
Sayar, M.; Ogawa, K.; Shoji, T.
2008-02-01
Thermal barrier coatings (TBCs) have been widely used in gas turbine engines to protect the substrate metal alloy against high temperature and to enhance turbine efficiency. Currently, there are no reliable nondestructive techniques available to monitor TBC integrity over the lifetime of the coating. Hence, to detect the top coating (TC) and thermally grown oxide (TGO) thicknesses, a microwave nondestructive technique that utilizes a rectangular waveguide was developed. The phase of the reflection coefficient at the interface of the TC and the waveguide varies for different TGO and TC thicknesses. Therefore, measuring the phase of the reflection coefficient enables us to accurately calculate these thicknesses. Finally, a theoretical analysis was used to evaluate the reliability of the experimental results.
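The thickness sensitivity comes from how a dielectric layer over a conductor transforms the wave impedance. A simplified normal-incidence transmission-line sketch (ignoring the waveguide mode structure; frequency and material values assumed) shows the reflection phase changing with top-coat thickness.

```python
import numpy as np

# Simplified transmission-line model: dielectric top coat of thickness d over
# a metallic (short-circuit) backing; reflection phase vs. coating thickness.
c0 = 3e8
f = 24e9                      # probe frequency (Hz), assumed
eps_r = 30.0                  # top-coat relative permittivity (assumed, e.g. YSZ)
Z0 = 377.0                    # free-space wave impedance (ohms)
Z1 = Z0 / np.sqrt(eps_r)      # impedance inside the coating
beta = 2 * np.pi * f * np.sqrt(eps_r) / c0

for d in (100e-6, 200e-6, 300e-6):            # coating thicknesses (m)
    Zin = 1j * Z1 * np.tan(beta * d)          # short transformed through layer
    gamma = (Zin - Z0) / (Zin + Z0)           # reflection coefficient
    print(f"d = {d*1e6:5.0f} um -> phase = {np.degrees(np.angle(gamma)):7.2f} deg")
```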
Fluorescence Fluctuation Approaches to the Study of Adhesion and Signaling
Bachir, Alexia I.; Kubow, Kristopher E.; Horwitz, Alan R.
2013-01-01
Cell–matrix adhesions are large, multimolecular complexes through which cells sense and respond to their environment. They also mediate migration by serving as traction points and signaling centers and allow the cell to modify the surrounding tissue. Due to their fundamental role in cell behavior, adhesions are germane to nearly all major human health pathologies. However, adhesions are extremely complex and dynamic structures that include over 100 known interacting proteins and operate over multiple space (nm–µm) and time (ms–min) regimes. Fluorescence fluctuation techniques are well suited for studying adhesions. These methods are sensitive over a large spatiotemporal range and provide a wealth of information including molecular transport dynamics, interactions, and stoichiometry from a single time series. Earlier chapters in this volume have provided the theoretical background, instrumentation, and analysis algorithms for these techniques. In this chapter, we discuss their implementation in living cells to study adhesions in migrating cells. Although each technique and application has its own unique instrumentation and analysis requirements, we provide general guidelines for sample preparation, selection of imaging instrumentation, and optimization of data acquisition and analysis parameters. Finally, we review several recent studies that implement these techniques in the study of adhesions. PMID:23280111
The mechanisms of renal tubule electrolyte and water absorption, 100 years after Carl Ludwig.
Greger, R
1996-01-01
Some 154 years after Carl Ludwig's Habilitationsschrift "Contributions to the theory of the mechanism of urine secretion" renal physiology has come a long way. The mechanisms of urine formation are now understood as the result of glomerular filtration and tubule absorption of most of the filtrate. The detailed understanding of tubule transport processes has become possible with the invention of several refined techniques such as the micropuncture techniques; the microchemical analysis of nanolitre tubule fluid samples; the in vitro perfusion of isolated tubule segments of defined origin; electrophysiological analysis of electrolyte transport including micropuncture and patch-clamp techniques; transport studies in membrane vesicle preparations; recordings of intracellular electrolyte concentrations and cloning techniques of the individual membrane transport proteins. With this wealth of information we are now starting to build an integrative understanding of the function of the individual nephron segments, the regulatory processes, the integrated function of the nephron and hence the formation of the final urine. Like anatomists of previous centuries we still state that the kidney is an "organum mirable" and we recognize that basic research in this area has fertilized the analysis of the function of a large number of other organs and cells.
Barile, Claudia; Casavola, Caterina; Pappalettera, Giovanni; Pappalettere, Carmine
2014-01-01
Hole drilling is the most widespread method for measuring residual stress. It is based on the principle that drilling a hole in the material causes a local stress relaxation; the initial residual stress can be calculated by measuring strain in correspondence with each drill depth. Recently optical techniques were introduced to measure strain; in this case, the accuracy of the final results depends, among other factors, on the proper choice of the area of analysis. Deformations are in fact analyzed within an annulus determined by two parameters: the internal and the external radius. In this paper, the influence of the choice of the area of analysis was analysed. A known stress field was introduced on a Ti grade 5 sample and then the stress was measured in correspondence with different values of the internal and the external radius of analysis; results were finally compared with the expected theoretical value. PMID:25276850
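For context, the stress-from-strain step in hole drilling is usually done with calibration constants relating relieved strains to the original stresses. A minimal uniform-stress sketch in the style of ASTM E837 follows; the rosette strains and calibration constants are invented for illustration, while the elastic constants are typical for Ti grade 5.

```python
import numpy as np

# Uniform-stress hole-drilling evaluation (ASTM E837-style sketch).
# e1, e2, e3: relieved strains from a 0/45/90 deg rosette (invented values).
e1, e2, e3 = -70e-6, -45e-6, -10e-6
E, nu = 110e9, 0.32            # Ti grade 5 elastic constants (approx.)
a_bar, b_bar = 0.12, 0.30      # calibration constants (geometry-dependent; assumed)

p = (e1 + e3) / 2.0            # isotropic strain combination
q = (e3 - e1) / 2.0            # shear strain combinations
t = (e3 + e1 - 2.0 * e2) / 2.0

P = -E * p / (a_bar * (1.0 + nu))   # mean (isotropic) stress
Q = -E * q / b_bar
T = -E * t / b_bar

s_max = P + np.hypot(Q, T)          # principal residual stresses
s_min = P - np.hypot(Q, T)
beta = 0.5 * np.arctan2(T, Q)       # principal direction (sign convention varies)
print(f"sigma_max = {s_max/1e6:.1f} MPa, sigma_min = {s_min/1e6:.1f} MPa, "
      f"beta = {np.degrees(beta):.1f} deg")
```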
Hahn, David W; Omenetto, Nicoló
2012-04-01
The first part of this two-part review focused on the fundamental and diagnostic aspects of laser-induced plasmas, only touching briefly upon concepts such as sensitivity and detection limits and largely omitting any discussion of the vast panorama of practical applications of the technique. Clearly a true LIBS community has emerged, which promises to quicken the pace of LIBS developments, applications, and implementations. This second part takes a more applied flavor; its intended goal is to summarize the current state of the art of analytical LIBS, provide a contemporary snapshot of LIBS applications, and highlight new directions in laser-induced breakdown spectroscopy, such as novel approaches, instrumental developments, and advanced use of chemometric tools. More specifically, we discuss instrumental and analytical approaches (e.g., double- and multi-pulse LIBS to improve sensitivity), calibration-free approaches, hyphenated approaches in which techniques such as Raman and fluorescence are coupled with LIBS to increase sensitivity and information power, resonantly enhanced LIBS approaches, signal processing and optimization (e.g., signal-to-noise analysis), and finally applications. An attempt is made to provide an updated view of the role played by LIBS in the various fields, with emphasis on applications considered to be unique. We finally try to assess where LIBS is going as an analytical field, where in our opinion it should go, and what should still be done to consolidate the technique as a mature method of chemical analysis. © 2012 Society for Applied Spectroscopy
Detection of proteolytic activity by covalent tethering of fluorogenic substrates in zymogram gels.
Deshmukh, Ameya A; Weist, Jessica L; Leight, Jennifer L
2018-05-01
Current zymographic techniques detect only a subset of known proteases due to the limited number of native proteins that have been optimized for incorporation into polyacrylamide gels. To address this limitation, we have developed a technique to covalently incorporate fluorescently labeled, protease-sensitive peptides using an azido-PEG3-maleimide crosslinker. Peptides incorporated into gels enabled measurement of MMP-2, -9, -14, and bacterial collagenase. Sensitivity analysis demonstrated that use of peptide functionalized gels could surpass detection limits of current techniques. Finally, electrophoresis of conditioned media from cultured cells resulted in the appearance of several proteolytic bands, some of which were undetectable by gelatin zymography. Taken together, these results demonstrate that covalent incorporation of fluorescent substrates can greatly expand the library of detectable proteases using zymographic techniques.
Analysis and application of intelligence network based on FTTH
NASA Astrophysics Data System (ADS)
Feng, Xiancheng; Yun, Xiang
2008-12-01
With the continued rapid growth of the Internet, new network services emerge in an endless stream, especially network gaming, video conferencing, and video on demand, and bandwidth requirements increase continuously. Network and optical device technologies are developing rapidly. FTTH supports all present and future services with enormous bandwidth, including traditional telecommunication, data, and TV services as well as future digital TV and VOD. With its huge bandwidth, FTTH is widely seen as the final solution for broadband access and has become the end goal of optical access network development. This paper first introduces the main services FTTH supports and analyzes key technologies such as FTTH system composition, topological structure, multiplexing, optical cables, and devices, focusing on two realization methods: PON and P2P. It then proposes an FTTH solution supporting comprehensive access (services such as broadband data, voice, video, and narrowband private lines). Finally, it presents an engineering application of FTTH in a residential district and building, which brings significant economic and social benefits.
Using sentiment analysis to review patient satisfaction data located on the internet.
Hopper, Anthony M; Uriyo, Maria
2015-01-01
The purpose of this paper is to test the usefulness of sentiment analysis and time-to-next-complaint methods in quantifying text-based information located on the internet. As important, the authors demonstrate how managers can use time-to-next-complaint techniques to organize sentiment analysis derived data into useful information, which can be shared with doctors and other staff. The authors used sentiment analysis to review patient feedback for a select group of gynecologists in Virginia. The authors utilized time-to-next-complaint methods along with other techniques to organize this data into meaningful information. The authors demonstrated that sentiment analysis and time-to-next-complaint techniques might be useful tools for healthcare managers who are interested in transforming web-based text into meaningful, quantifiable information. This study has several limitations. For one thing, neither the data set nor the techniques the authors used to analyze it will account for biases that resulted from selection issues related to gender, income, and culture, as well as from other socio-demographic concerns. Additionally, the authors lacked key data concerning patient volumes for the targeted physicians. Finally, it may be difficult to convince doctors to consider web-based comments as truthful, thereby preventing healthcare managers from using data located on the internet. The report illustrates some of the ways in which healthcare administrators can utilize sentiment analysis, along with time-to-next-complaint techniques, to mine web-based, patient comments for meaningful information. The paper is one of the first to illustrate ways in which administrators at clinics and physicians' offices can utilize sentiment analysis and time-to-next-complaint methods to analyze web-based patient comments.
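A minimal sketch of the two-step idea follows, with a toy lexicon-based sentiment score and inter-arrival ("time-to-next-complaint") statistics. The lexicon, comments, and dates are all invented; a real application would use a trained sentiment model or a published lexicon.

```python
import re
from datetime import date
import numpy as np

# Toy lexicon-based sentiment scoring of dated patient comments (invented data).
NEG = {"rude", "wait", "dismissive", "pain"}
POS = {"caring", "thorough", "friendly", "helpful"}

comments = [
    (date(2014, 1, 5),  "very caring and thorough doctor"),
    (date(2014, 2, 17), "long wait and rude front desk"),
    (date(2014, 4, 2),  "friendly staff, helpful visit"),
    (date(2014, 4, 30), "felt dismissive, still in pain"),
    (date(2014, 7, 22), "rude scheduling, endless wait"),
]

def score(text):
    words = set(re.findall(r"[a-z]+", text.lower()))
    return len(words & POS) - len(words & NEG)

complaints = [d for d, text in comments if score(text) < 0]

# Time-to-next-complaint: gaps (in days) between consecutive negative comments.
gaps = np.diff([d.toordinal() for d in complaints])
print("days between complaints:", gaps, "| mean:", gaps.mean())
```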
NASA Astrophysics Data System (ADS)
Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke
2017-08-01
In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, comprising a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2, respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were proved to be <4.3% and 2.1%, respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved, in the ranges of 95.0-101% and 97.0-104%, respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.
Chapter 14: Electron Microscopy on Thin Films for Solar Cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero, Manuel; Abou-Ras, Daniel; Nichterwitz, Melanie
2016-07-22
This chapter overviews the various techniques applied in scanning electron microscopy (SEM) and transmission electron microscopy (TEM) and highlights their possibilities as well as their limitations. It covers the various imaging and analysis techniques applied on a scanning electron microscope. Imaging is divided into techniques making use of secondary electrons (SEs) and of backscattered electrons (BSEs), resulting in different contrasts in the images and thus providing information on compositions, microstructures, and surface potentials. Whenever aiming for imaging and analyses at scales down to the angstrom range, TEM and its related techniques are the appropriate tools. In many cases, SEM techniques also provide access to various material properties of the individual layers, without requiring specimen preparation as time-consuming as that for TEM. Finally, the chapter is dedicated to cross-sectional specimen preparation for electron microscopy; the preparation indeed determines the quality of imaging and analyses.
ERIC Educational Resources Information Center
Lou, Yu-Chiung; Lin, Hsiao-Fang; Lin, Chin-Wen
2013-01-01
The aims of the study were (a) to develop a scale to measure university students' task value and (b) to use confirmatory factor analytic techniques to investigate the construct validity of the scale. The questionnaire items were developed based on theoretical considerations and the final version contained 38 items divided into 4 subscales.…
ERIC Educational Resources Information Center
McKeag, Janis
To address specific workplace literacy needs within the steel fabrication sector in Manitoba, an organizational needs assessment was conducted and training manuals were developed using literacy task analysis techniques. The organizational needs assessment assessed the general and workplace literacy tasks and demands of hourly workers at Dominion…
Design of an Orbital Inspection Satellite
1986-12-01
…lends itself to the technique of multi-objective analysis. The final step is planning for action. This communicates the entire systems engineering…
Improved Decision Making Through Group Composition
1991-09-01
The design for this research was actually a quasi-experimental design, because equivalent experimental and control groups could not be guaranteed… The Nonequivalent Control Group Design (25:126): O1 X O2 / O3 O4 … control group after the simulation exercise. Final game scores were collected from O2…
Browns Ferry Unit-3 cavity neutron spectral analysis. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, G.C.
1981-08-01
This report describes neutron dosimetry measurements performed in the Browns Ferry Unit-3 reactor cavity using multiple dosimeter and spectrum unfolding techniques to assess radiation-induced degradation of nuclear plant pressure vessels. Test results and conclusions indicating the feasibility of determining neutron flux spectra and the densities in the pressure vessel cavity region via dosimetric measurements are presented.
NASA Astrophysics Data System (ADS)
Varun, Sajja; Reddy, Kalakada Bhargav Bal; Vardhan Reddy, R. R. Vishnu
2016-09-01
In this research work, a multi-response optimization technique was developed using traditional desirability analysis and a non-traditional particle swarm optimization technique (for different customers' priorities) in wire electrical discharge machining (WEDM). Monel 400 was selected as the work material for experimentation. The effects of key process parameters such as pulse-on time (TON), pulse-off time (TOFF), peak current (IP) and wire feed (WF) on material removal rate (MRR) and surface roughness (SR) in the WEDM operation were investigated. Further, the responses MRR and SR were modelled empirically through regression analysis; the developed models can be used by machinists to predict MRR and SR over a wide range of input parameters. The optimization of multiple responses was performed to satisfy the priorities of multiple users using the Taguchi-desirability function method and the particle swarm optimization technique. Analysis of variance (ANOVA) was also applied to investigate the effect of the influential parameters. Finally, confirmation experiments were conducted for the optimal set of machining parameters, and the improvement was verified.
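To illustrate the desirability step, here is a generic composite-desirability calculation for a larger-is-better response (MRR) and a smaller-is-better response (SR) in the Derringer-Suich style; the response values and bounds are invented, not the paper's data.

```python
import numpy as np

# Composite desirability for two responses (invented values and bounds).
def d_larger(y, lo, hi):   # larger-is-better: 0 at lo, 1 at hi
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0)

def d_smaller(y, lo, hi):  # smaller-is-better: 1 at lo, 0 at hi
    return np.clip((hi - y) / (hi - lo), 0.0, 1.0)

# Candidate parameter settings -> predicted (MRR mm^3/min, SR um), invented.
candidates = {"A": (8.2, 2.9), "B": (11.5, 3.8), "C": (9.7, 2.4)}

for name, (mrr, sr) in candidates.items():
    d1 = d_larger(mrr, lo=6.0, hi=13.0)
    d2 = d_smaller(sr, lo=1.5, hi=4.5)
    D = np.sqrt(d1 * d2)   # geometric mean = composite desirability
    print(f"setting {name}: D = {D:.3f}")
```

Equal weights are used here; user priorities can be encoded as exponents on the individual desirabilities before taking the geometric mean.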
NASA Technical Reports Server (NTRS)
Nguyen, Truong X.; Koppen, Sandra V.; Ely, Jay J.; Williams, Reuben A.; Smith, Laura J.; Salud, Maria Theresa P.
2004-01-01
This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model-based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of model checking, the formal methods analysis technique that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.
Spectrum analysis on quality requirements consideration in software design documents.
Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji
2013-12-01
Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
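The abstract does not define how the spectrum is computed; one plausible minimal reading, sketched below with an invented keyword lexicon, counts quality-related terms per quality characteristic and compares the normalized profiles of a requirements document and a design document. Everything here is an illustrative assumption, not the authors' actual procedure.

```python
import re
import numpy as np

# Invented keyword lexicon per quality characteristic (illustration only).
LEXICON = {
    "security":    {"encrypt", "authenticate", "authorization", "audit"},
    "performance": {"latency", "throughput", "response", "load"},
    "reliability": {"failover", "recovery", "retry", "availability"},
}

def spectrum(text):
    words = re.findall(r"[a-z]+", text.lower())
    vec = np.array([sum(w in kws for w in words) for kws in LEXICON.values()],
                   dtype=float)
    return vec / vec.sum() if vec.sum() else vec

req_doc = ("The system shall encrypt data and authenticate users; "
           "response latency under load shall be bounded.")
des_doc = ("The design uses a retry queue and failover replicas; "
           "recovery time is specified.")

s_req, s_des = spectrum(req_doc), spectrum(des_doc)
cos = float(s_req @ s_des / (np.linalg.norm(s_req) * np.linalg.norm(s_des)))
print("req spectrum:", s_req, "design spectrum:", s_des, "cosine:", round(cos, 3))
```

A low cosine similarity between the two spectra would flag quality requirements that the design document does not appear to consider.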
Contamination assessment and control in scientific satellites
NASA Technical Reports Server (NTRS)
Naumann, R. J.
1973-01-01
Techniques for assessment and control of the contamination environment for both particulates and condensible vapors in the vicinity of spacecraft are developed. An analysis of the deposition rate on critical surfaces is made considering sources within the line of sight of the surface in question as well as those obscured from the line of sight. The amount of contamination returned by collision with the surrounding atmosphere is estimated. Scattering and absorption from the induced atmosphere of gases and particulates around the spacecraft are estimated. Finally, design techniques developed for Skylab to reduce the contamination environment to an acceptable level are discussed.
Wallrabe, U; Ruther, P; Schaller, T; Schomburg, W K
1998-03-01
The complexity of modern surgical and analytical methods requires the miniaturisation of many medical devices. The LIGA technique and also mechanical microengineering are well known for the batch fabrication of microsystems. Actuators and sensors are developed based on these techniques. The hydraulic actuation principle is advantageous for medical applications since the energy may be supplied by pressurised balanced salt solution. Some examples are turbines, pumps and valves. In addition, optical sensors and components are useful for analysis and inspection as represented by microspectrometers and spherical lenses. Finally, plastic containers with microporous bottoms allow a 3-dimensional growth of cell culture systems.
NASA Technical Reports Server (NTRS)
1971-01-01
Computational techniques were developed and assimilated for the design optimization. The resulting computer program was then used to perform initial optimization and sensitivity studies on a typical thermal protection system (TPS) to demonstrate its application to the space shuttle TPS design. The program was developed in Fortran IV for the CDC 6400 but was subsequently converted to the Fortran V language to be used on the Univac 1108. The program allows for improvement and update of the performance prediction techniques. The program logic involves subroutines which handle the following basic functions: (1) a driver which calls for input, output, and communication between program and user and between the subroutines themselves; (2) thermodynamic analysis; (3) thermal stress analysis; (4) acoustic fatigue analysis; and (5) weights/cost analysis. In addition, a system total cost is predicted based on system weight and historical cost data of similar systems. Two basic types of input are provided, both of which are based on trajectory data. These are vehicle attitude (altitude, velocity, and angles of attack and sideslip), for external heat and pressure loads calculation, and heating rates and pressure loads as a function of time.
Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette
2014-08-15
The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effect of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three analytical techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations were found between the two spectroscopic techniques and HPLC. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior for technical, economic and environmental objectives, as compared with the other, invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.
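The inter-method agreement step can be reproduced with a non-parametric rank correlation; a sketch with invented paired concentration measurements follows.

```python
import numpy as np
from scipy import stats

# Invented paired concentration measurements (mg/mL) for the same samples.
hplc  = np.array([0.52, 1.05, 1.48, 2.01, 2.55, 3.02, 3.51])
raman = np.array([0.55, 1.01, 1.52, 1.95, 2.60, 2.98, 3.55])

rho, p = stats.spearmanr(hplc, raman)   # non-parametric rank correlation
print(f"Spearman rho = {rho:.3f}, p = {p:.2e}")
```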
GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)
NASA Astrophysics Data System (ADS)
Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza
2017-12-01
Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potency of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and the Dempster-Shafer theory (DST), for analyzing groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by poor zones of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by good and very good potential zones, respectively. The validation of outcomes displayed that the area under the curve (AUC) of the SI and DST techniques is 81.23% and 79.41%, respectively, which shows that the SI method performs slightly better than the DST technique. Therefore, the SI and DST methods are advantageous for analyzing groundwater capacity and scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Finally, these techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
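The statistical index is commonly defined as the log ratio of spring density within a factor class to the overall spring density; a small sketch with invented class areas and spring counts follows.

```python
import numpy as np

# Statistical index (SI) for the classes of one conditioning factor (toy data).
# SI_i = ln( (springs_i / area_i) / (springs_total / area_total) )
area    = np.array([120.0, 340.0, 210.0, 80.0])   # km^2 per class (invented)
springs = np.array([10, 95, 60, 4])               # spring count per class (invented)

dens = springs / area
dens_total = springs.sum() / area.sum()
si = np.log(dens / dens_total)

for i, v in enumerate(si):
    print(f"class {i}: SI = {v:+.3f}")   # positive = above-average potential
```

Summing the SI values of all factor classes at each pixel yields the groundwater potential map.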
NASA Astrophysics Data System (ADS)
Pal, S. K.; Majumdar, T. J.; Bhattacharya, Amit K.
Fusion of optical and synthetic aperture radar data has been attempted in the present study for mapping of various lithologic units over a part of the Singhbhum Shear Zone (SSZ) and its surroundings. ERS-2 SAR data over the study area have been enhanced using a Fast Fourier Transformation (FFT) based filtering approach, and also using the Frost filtering technique. Both enhanced SAR images have then been separately fused with the histogram-equalized IRS-1C LISS III image using the Principal Component Analysis (PCA) technique. Later, the Feature-oriented Principal Components Selection (FPCS) technique has been applied to generate False Color Composite (FCC) images, from which corresponding geological maps have been prepared. Finally, GIS techniques have been successfully used for change detection analysis in the lithological interpretation between the published geological map and the fusion-based geological maps. In general, there is good agreement between these maps over a large portion of the study area. Based on the change detection studies, a few areas could be identified which need attention for further detailed ground-based geological studies.
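A generic PCA image-fusion sketch (not the authors' exact processing chain): the multispectral bands are decomposed with PCA, the first principal component is replaced by the statistics-matched SAR band, and the transform is inverted. All arrays below are synthetic stand-ins for the co-registered imagery.

```python
import numpy as np

# Generic PCA fusion of a 3-band optical image with a co-registered SAR band.
rng = np.random.default_rng(1)
h, w = 64, 64
optical = rng.random((h * w, 3))          # flattened multispectral pixels
sar = rng.random(h * w)                   # enhanced SAR band (synthetic)

mean = optical.mean(axis=0)
X = optical - mean
eigval, eigvec = np.linalg.eigh(np.cov(X, rowvar=False))
eigvec = eigvec[:, ::-1]                  # reorder so PC1 comes first
pcs = X @ eigvec

# Match the SAR band's mean/std to PC1 (a simple histogram match), substitute.
pc1 = pcs[:, 0]
pcs[:, 0] = (sar - sar.mean()) / sar.std() * pc1.std() + pc1.mean()

fused = pcs @ eigvec.T + mean             # back-transform to band space
print("fused image shape:", fused.reshape(h, w, 3).shape)
```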
Optics for Processes, Products and Metrology
NASA Astrophysics Data System (ADS)
Mather, George
1999-04-01
Optical physics has a variety of applications in industry, including process inspection, coatings development, vision instrumentation, spectroscopy, and many others. Optics has been used extensively in the design of solar energy collection systems and coatings, for example. Also, with the availability of good CCD cameras and fast computers, it has become possible to develop real-time inspection and metrology devices that can accommodate the high throughputs encountered in modern production processes. More recently, developments in moiré interferometry show great promise for applications in the basic metals and electronics industries. The talk will illustrate applications of optics by discussing process inspection techniques for defect detection, part dimensioning, birefringence measurement, and the analysis of optical coatings in the automotive, glass, and optical disc industries. In particular, examples of optical techniques for the quality control of CD-R, MO, and CD-RW discs will be presented. In addition, the application of optical concepts to solar energy collector design and to metrology by moiré techniques will be discussed. Finally, some of the modern techniques and instruments used for qualitative and quantitative material analysis will be presented.
Local Guided Wavefield Analysis for Characterization of Delaminations in Composites
NASA Technical Reports Server (NTRS)
Rogge, Matthew D.; Campbell Leckey, Cara A.
2012-01-01
Delaminations in composite laminates resulting from impact events may be accompanied by minimal indication of damage at the surface. As such, inspection techniques are required to ensure defects are within allowable limits. Conventional ultrasonic scanning techniques have been shown to effectively characterize the size and depth of delaminations but require physical contact with the structure. Alternatively, a noncontact scanning laser vibrometer may be used to measure guided wave propagation in the laminate structure. A local Fourier domain analysis method is presented for processing guided wavefield data to estimate spatially dependent wavenumber values, which can be used to determine delamination depth. The technique is applied to simulated wavefields and results are analyzed to determine limitations of the technique with regard to determining defect size and depth. Finally, experimental wavefield data obtained in quasi-isotropic carbon fiber reinforced polymer (CFRP) laminates with impact damage are analyzed, and the wavenumber is measured to an accuracy of 8.5% in the region of shallow delaminations.
Keywords: Ultrasonic wavefield imaging; Windowed Fourier transforms; Guided waves; Structural health monitoring; Nondestructive evaluation
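The local wavenumber idea can be shown in one dimension: window the wavefield, take the FFT of each window, and record the peak wavenumber. The synthetic field below switches wavenumber halfway along the scan to mimic a delaminated region; all parameters are illustrative.

```python
import numpy as np

# 1-D local wavenumber estimation by windowed Fourier analysis (synthetic field).
n, dx = 2048, 0.5e-3                      # samples, spatial step (m)
x = np.arange(n) * dx
k1, k2 = 2*np.pi*400, 2*np.pi*900         # rad/m: pristine vs. delaminated zone
field = np.sin(np.where(x < x[n//2], k1, k2) * x)

win = 256
for start in range(0, n - win + 1, win):
    seg = field[start:start+win] * np.hanning(win)
    spec = np.abs(np.fft.rfft(seg))
    k_axis = np.fft.rfftfreq(win, d=dx)   # cycles per meter
    k_peak = k_axis[np.argmax(spec)]
    print(f"x = {start*dx:5.3f} m: k ~ {k_peak:6.0f} cycles/m")
```

The abrupt jump in estimated wavenumber marks the simulated defect boundary; in 2-D wavefield data the same windowed transform is applied per pixel neighborhood.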
Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti
2017-08-11
In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a steady-state analysis based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, however, slower compared to the recursive technique. The parameters of the equivalent system are utilized to interpret storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided toward the quantitative estimation of material properties.
Comparison of extraction techniques of robenidine from poultry feed samples.
Wilga, Joanna; Kot-Wasik, Agata; Namieśnik, Jacek
2007-10-31
In this paper, the effectiveness of six different commonly applied extraction techniques for the determination of robenidine in poultry feed was compared. The sample preparation techniques included shaking, Soxhlet, Soxtec, ultrasonically assisted extraction, microwave-assisted extraction and accelerated solvent extraction. The techniques were compared with respect to recovery, extraction temperature and time, reproducibility and solvent consumption. Every extract was subjected to clean-up using an aluminium oxide column (a Pasteur pipette filled with 1 g of aluminium oxide), from which robenidine was eluted with 10 ml of methanol. The eluate from the clean-up column was collected in a volumetric flask and finally analysed by HPLC-DAD-MS. In general, all extraction techniques were capable of isolating robenidine from poultry feed, but the recovery obtained using modern extraction techniques was higher than that obtained using conventional techniques. In particular, accelerated solvent extraction was superior to the other techniques, which highlights the advantages of this sample preparation technique. However, in routine analysis, shaking and ultrasonically assisted extraction are still the preferred methods for the isolation of robenidine and other coccidiostatics.
Sudo, Hideki; Ito, Manabu; Abe, Yuichiro; Abumi, Kuniyoshi; Takahata, Masahiko; Nagahama, Ken; Hiratsuka, Shigeto; Kuroki, Kei; Iwasaki, Norimasa
2014-06-15
Retrospective analysis of a prospectively collected, consecutive, nonrandomized series of patients. To assess the surgical outcomes of the simultaneous double-rod rotation technique for treating Lenke 1 thoracic adolescent idiopathic scoliosis (AIS). With the increasing popularity of segmental pedicle screw spinal reconstruction for treating AIS, concerns regarding the limited ability to correct hypokyphosis have also increased. A consecutive series of 32 patients with Lenke 1 main thoracic AIS treated with the simultaneous double-rod rotation technique at our institution was included. Outcome measures included patient demographics, radiographical measurements, and Scoliosis Research Society questionnaire scores. All 32 patients were followed up for a minimum of 2 years (average, 3.6 yr). The average main thoracic Cobb angle correction rate and the correction loss at the final follow-up were 67.8% and 3.3°, respectively. The average preoperative thoracic kyphosis (T5-T12) was 11.9°, which improved significantly to 20.5° (P < 0.0001) at the final follow-up. An increase in thoracic kyphosis was significantly correlated with an increase in lumbar lordosis at the final follow-up (r = 0.42). The average preoperative vertebral rotation angle was 19.7°, which improved significantly after surgery to 14.9° (P = 0.0001). There was no correlation between change in thoracic kyphosis and change in apical vertebral rotation (r = -0.123). The average preoperative total Scoliosis Research Society questionnaire score was 3.0, which significantly improved to 4.4 (P < 0.0001) at the final follow-up. Throughout surgery and after, there were no instrumentation failures, pseudarthrosis, infections of the surgical site, or clinically relevant neurovascular complications. The simultaneous double-rod rotation technique for treating Lenke 1 AIS provides significant sagittal correction of the main thoracic curve while maintaining sagittal profiles and correcting coronal and axial deformities. Level of evidence: 4.
NASA Technical Reports Server (NTRS)
Bukowski, Richard W.
1987-01-01
An overview is given of the basis for an analysis of combustible materials and potential ignition sources in a spacecraft. First, the burning process is discussed in terms of the production of the fire signatures normally associated with detection devices, including convected and radiated thermal energy, particulates, and gases. Second, the transport processes associated with the movement of these signatures from the fire to the detector are described, along with the important phenomena that reduce their levels. Third, the operating characteristics of the individual detector types that influence their response to these signals are presented. Finally, vulnerability analysis using predictive fire modeling techniques is discussed as a means to establish the necessary response of the detection system to provide the level of protection required in the application.
Montaux-Lambert, Antoine; Mercère, Pascal; Primot, Jérôme
2015-11-02
An interferogram conditioning procedure, for subsequent phase retrieval by Fourier demodulation, is presented here as a fast iterative approach aiming at fulfilling the classical boundary conditions imposed by Fourier transform techniques. Interference fringe patterns with typical edge discontinuities were simulated in order to reveal the edge artifacts that classically appear in traditional Fourier analysis, and were consecutively used to demonstrate the correction efficiency of the proposed conditioning technique. Optimization of the algorithm parameters is also presented and discussed. Finally, the procedure was applied to grating-based interferometric measurements performed in the hard X-ray regime. The proposed algorithm enables nearly edge-artifact-free retrieval of the phase derivatives. A similar enhancement of the retrieved absorption and fringe visibility images is also achieved.
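One standard way to meet the periodic boundary conditions assumed by Fourier demodulation is to mirror-pad the interferogram before the FFT and crop afterwards; the sketch below applies this to a synthetic fringe pattern. This illustrates the class of conditioning the paper addresses, not the authors' specific iterative algorithm, and the carrier frequency and band limits are assumptions.

```python
import numpy as np

# Mirror-pad a fringe pattern so its edges wrap smoothly, then Fourier-demodulate.
n, pad = 256, 64
x = np.arange(n)
phase = 2*np.pi*0.08*x + 0.5*np.sin(2*np.pi*x/n)   # carrier + slow phase term
fringes = 1.0 + np.cos(phase)                      # synthetic interferogram row

padded = np.pad(fringes, pad, mode="reflect")      # even (mirror) extension

# Fourier demodulation: isolate the +carrier sideband, return to space domain.
F = np.fft.fft(padded)
freqs = np.fft.fftfreq(padded.size)
side = np.where((freqs > 0.03) & (freqs < 0.13), F, 0)   # band around carrier
analytic = np.fft.ifft(side)
demod_phase = np.unwrap(np.angle(analytic))[pad:-pad]    # crop the padding off

residual = demod_phase - 2*np.pi*0.08*np.arange(n)
print("recovered slow phase term, peak-to-peak:",
      round(float(residual.max() - residual.min()), 3), "rad (expect ~1.0)")
```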
A survey of application: genomics and genetic programming, a new frontier.
Khan, Mohammad Wahab; Alam, Mansaf
2012-08-01
The aim of this paper is to provide an introduction to the rapidly developing field of genetic programming (GP). Particular emphasis is placed on the application of GP to genomics. First, the basic methodology of GP is introduced. This is followed by a review of applications in the areas of gene network inference, gene expression data analysis, SNP analysis, epistasis analysis and gene annotation. Finally, the paper concludes by suggesting potential avenues of future research on genetic programming, opportunities to extend the technique, and areas for possible practical applications. Copyright © 2012 Elsevier Inc. All rights reserved.
Ibrahim, Khaled Z.; Madduri, Kamesh; Williams, Samuel; ...
2013-07-18
The Gyrokinetic Toroidal Code (GTC) uses the particle-in-cell method to efficiently simulate plasma microturbulence. This paper presents novel analysis and optimization techniques to enhance the performance of GTC on large-scale machines. We introduce cell access analysis to better manage locality vs. synchronization tradeoffs on CPU and GPU-based architectures. Finally, our optimized hybrid parallel implementation of GTC uses MPI, OpenMP, and NVIDIA CUDA, achieves up to a 2× speedup over the reference Fortran version on multiple parallel systems, and scales efficiently to tens of thousands of cores.
Search for a supersymmetric partner to the top quark using a multivariate analysis technique
NASA Astrophysics Data System (ADS)
Darmora, Smita
Supersymmetry (SUSY) is an extension to the Standard Model (SM) which introduces supersymmetric partners of the known fermions and bosons. Top squark (stop) searches are a natural extension of inclusive SUSY searches at the LHC. If SUSY solves the naturalness problem, the stop should be light enough to cancel the top loop contribution to the Higgs mass parameter. The 3rd generation squarks may be the first SUSY particles to be discovered at the LHC. The stop can decay into a variety of final states, depending, among other factors, on the hierarchy of the mass eigenstates formed from the linear superposition of the SUSY partners of the Higgs boson and electroweak gauge bosons. In this study the relevant mass eigenstates are the lightest chargino (chi_1^±) and the lightest neutralino (chi_1^0). A search is presented for a heavy SUSY top partner decaying to a lepton, a neutrino and the lightest supersymmetric particle (chi_1^0), via a b-quark and a chargino (chi_1^±), in events with two leptons in the final state. The analysis targets the SUSY top partner by means of a multivariate analysis technique, used to discriminate between the stop signal and the background with a learning algorithm trained on Monte Carlo generated signal and background samples. The analysis uses data corresponding to 20.3 fb^-1 of integrated luminosity at √s = 8 TeV, collected by the ATLAS experiment at the Large Hadron Collider in 2012.
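As a generic illustration of the multivariate step (not necessarily this analysis' actual classifier or input variables), the sketch below trains a boosted-decision-tree classifier on toy "signal" and "background" samples with two invented kinematic-like features and evaluates it with a ROC curve.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Toy MC: two kinematic-like features (shapes and scales are invented).
rng = np.random.default_rng(42)
n = 5000
sig = np.column_stack([rng.normal(180, 40, n), rng.normal(120, 25, n)])
bkg = np.column_stack([rng.normal(120, 40, n), rng.normal(80, 25, n)])
X = np.vstack([sig, bkg])
y = np.concatenate([np.ones(n), np.zeros(n)])   # 1 = signal, 0 = background

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier(n_estimators=200, max_depth=3).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"ROC AUC on held-out toy events: {auc:.3f}")
```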
Automatic welding detection by an intelligent tool pipe inspection
NASA Astrophysics Data System (ADS)
Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.
2015-07-01
This work provides a model based on machine learning techniques for weld recognition, using signals obtained with an in-line inspection tool called a "smart pig" in oil and gas pipelines. The model includes a signal noise-reduction phase by means of pre-processing algorithms and attribute-selection techniques; the noise-reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets, and performance was measured with cross-validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
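A minimal sketch of the classification stage under assumptions (the feature columns standing in for attributes extracted from inspection signals are invented): a standardized support vector machine evaluated with cross-validation, mirroring the validation approach described.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Toy feature matrix: rows = signal windows, cols = extracted attributes
# (e.g. peak amplitude, energy, zero crossings) -- all invented here.
rng = np.random.default_rng(7)
weld = rng.normal([5.0, 2.0, 8.0], 1.0, size=(300, 3))
pipe_body = rng.normal([3.0, 1.0, 5.0], 1.0, size=(300, 3))
X = np.vstack([weld, pipe_body])
y = np.array([1]*300 + [0]*300)   # 1 = weld, 0 = plain pipe

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("cross-validated ROC AUC:", np.round(scores, 3),
      "| mean:", scores.mean().round(3))
```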
An improved switching converter model. Ph.D. Thesis. Final Report
NASA Technical Reports Server (NTRS)
Shortt, D. J.
1982-01-01
The nonlinear modeling and analysis of dc-dc converters in the continuous and discontinuous modes were performed using averaging and discrete sampling techniques. A model was developed by combining these two techniques. This model, the discrete average model, accurately predicts the envelope of the output voltage and is easy to implement in circuit and state-variable forms. The proposed model is shown to depend on the type of duty-cycle control. The proper selection of the power stage model, between average and discrete average, is largely a function of the error processor in the feedback loop. The accuracy of measurement data taken by a conventional technique is affected by the conditions under which the data are collected.
NASA Astrophysics Data System (ADS)
Gu, Chunxing; Shen, Zongbao; Liu, Huixia; Li, Pin; Lu, Mengmeng; Zhao, Yinxin; Wang, Xiao
2013-04-01
This paper describes a precise, non-contact adjustment technique that uses water-confined laser-generated plasma to adjust the curvature of micro-components (micro-mechanical cantilevers). A series of laser shock micro-adjustment experiments was conducted on 0.4 mm-thick Al samples using pulsed Nd:YAG lasers operating at a 1064 nm wavelength to verify technical feasibility. A systematic study was carried out on the effects of various factors on the adjustment results, including laser energy, laser focus position, number of laser shots and confined-regime configuration. The results show that different bending angles and bending directions can be obtained by changing the laser processing parameters and that, for the adjustment process, suitable bending deformation can be generated even without the confined-regime configuration; at larger energies, however, the final surfaces show signs of ablation, resulting in poor surface quality. An analysis procedure, comprising dynamic analysis performed with ANSYS/LS-DYNA and static analysis performed with ANSYS, is presented in detail to simulate laser shock micro-adjustment and predict the final bending deformation. The predicted bending profiles correlate well with the available experimental data, showing that finite element analysis can properly predict the final curvatures of the micro-cantilevers.
NASA Astrophysics Data System (ADS)
Zhang, Tie-Yan; Zhao, Yan; Xie, Xiang-Peng
2012-12-01
This paper is concerned with the problem of stability analysis of nonlinear Roesser-type two-dimensional (2D) systems. Firstly, the fuzzy modeling method for the usual one-dimensional (1D) systems is extended to the 2D case so that the underlying nonlinear 2D system can be represented by a 2D Takagi-Sugeno (TS) fuzzy model, which is convenient for implementing the stability analysis. Secondly, a new kind of fuzzy Lyapunov function, which is homogeneously polynomially parameter-dependent on the fuzzy membership functions, is developed to derive less conservative stability conditions for the TS Roesser-type 2D system. In the process of stability analysis, the obtained stability conditions approach exactness in the sense of convergence by applying some novel relaxation techniques. Moreover, the obtained result is formulated in the form of linear matrix inequalities, which can be easily solved via standard numerical software. Finally, a numerical example is given to demonstrate the effectiveness of the proposed approach.
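The Lyapunov machinery behind such conditions can be illustrated with a much simpler 1-D discrete-time analogue: certify stability by finding P > 0 satisfying AᵀPA - P < 0. The sketch below solves the corresponding Lyapunov equation with SciPy as a stand-in for the LMI feasibility check; the system matrix is an assumed example, and this is not the paper's 2D Roesser/fuzzy condition.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Discrete-time quadratic-stability check (1-D analogue of the LMI condition):
# solve A' P A - P = -Q for a chosen Q > 0, then verify P > 0.
A = np.array([[0.5, 0.2],
              [-0.1, 0.7]])                  # example system matrix (assumed)
Q = np.eye(2)

P = solve_discrete_lyapunov(A.T, Q)          # solves A' P A - P + Q = 0
eigs = np.linalg.eigvalsh(P)
print("P eigenvalues:", np.round(eigs, 4))
print("stable (P > 0):", bool(np.all(eigs > 0)))
```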
Development of MRM-based assays for the absolute quantitation of plasma proteins.
Kuzyk, Michael A; Parker, Carol E; Domanski, Dominik; Borchers, Christoph H
2013-01-01
Multiple reaction monitoring (MRM), sometimes called selected reaction monitoring (SRM), is a directed tandem mass spectrometric technique performed on triple quadrupole mass spectrometers. MRM assays can be used to sensitively and specifically quantify proteins based on peptides that are specific to the target protein. Stable-isotope-labeled standard peptide analogues (SIS peptides) of target peptides are added to enzymatic digests of samples, and quantified along with the native peptides during MRM analysis. Monitoring of the intact peptide and a collision-induced fragment of this peptide (an ion pair) can be used to provide information on the absolute peptide concentration in the sample and, by inference, the concentration of the intact protein. This technique provides high specificity by selecting for biophysical parameters that are unique to the target peptides: (1) the molecular weight of the peptide, (2) the generation of a specific fragment from the peptide, and (3) the HPLC retention time during LC/MRM-MS analysis. MRM is a highly sensitive technique that has been shown to be capable of detecting attomole levels of target peptides in complex samples such as tryptic digests of human plasma. This chapter provides a detailed description of how to develop and use an MRM protein assay. It includes sections on the critical "first step" of selecting the target peptides, as well as optimization of MRM acquisition parameters for maximum sensitivity of the ion pairs that will be used in the final method, and characterization of the final MRM assay.
Multiplex gas chromatography for use in space craft
NASA Technical Reports Server (NTRS)
Valentin, J. R.
1985-01-01
Gas chromatography is a powerful technique for the analysis of gaseous mixtures. Some limitations of the technique can be alleviated with multiplex gas chromatography (MGC). In MGC, rapid multiple sample injections are made into the column without having to wait for one determination to finish before taking a new sample. The resulting data must then be reduced using computational methods such as cross correlation. To efficiently perform multiplex gas chromatography experiments in the laboratory and on board future spacecraft, skills, equipment, and computer software were developed. Three new techniques for modulating, i.e., changing, sample concentrations were demonstrated using desorption, decomposition, and catalytic modulators. In all of them, the need for a separate gas stream as the carrier was avoided by placing the modulator at the head of the column to directly modulate a sample stream. Finally, the analysis of an environmental sample by multiplex chromatography was accomplished by employing silver oxide to catalytically modulate methane in ambient air.
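The data-reduction step can be imitated with a pseudo-random injection sequence: the detector output is the single-injection chromatogram convolved with the injection sequence, and cross-correlation with that sequence recovers the chromatogram. All waveforms below are synthetic, and the maximum-length sequence generator is a generic illustration.

```python
import numpy as np

# Multiplex GC toy model: detector = injection sequence (*) chromatogram,
# recovered by circular cross-correlation with the injection sequence.
def mls(nbits=9, taps=(9, 5)):
    # Maximum-length sequence from a Fibonacci LFSR (x^9 + x^5 + 1, primitive).
    state = [1] * nbits
    out = []
    for _ in range(2**nbits - 1):
        out.append(state[-1])
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [fb] + state[:-1]
    return np.array(out, dtype=float)

inj = mls()                                   # 0/1 injections, period n = 511
n = inj.size
t = np.arange(n)
chrom = np.exp(-0.5*((t-100)/6.0)**2) + 0.6*np.exp(-0.5*((t-260)/10.0)**2)

rng = np.random.default_rng(3)
detector = np.real(np.fft.ifft(np.fft.fft(inj) * np.fft.fft(chrom)))
detector += rng.normal(0, 0.05, n)            # detector noise

b = 2*inj - 1                                 # bipolar copy for correlation
rec = np.real(np.fft.ifft(np.conj(np.fft.fft(b)) * np.fft.fft(detector)))
rec /= (n + 1) / 2                            # m-sequence autocorrelation scale

peaks = [i for i in range(1, n-1)
         if rec[i] > 0.3 and rec[i] >= rec[i-1] and rec[i] > rec[i+1]]
print("true peaks at bins 100, 260; recovered peaks:", peaks)
```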
Carmichael, Mary C.; St. Clair, Candace; Edwards, Andrea M.; Barrett, Peter; McFerrin, Harris; Davenport, Ian; Awad, Mohamed; Kundu, Anup; Ireland, Shubha Kale
2016-01-01
Xavier University of Louisiana leads the nation in awarding BS degrees in the biological sciences to African-American students. In this multiyear study with ∼5500 participants, data-driven interventions were adopted to improve student academic performance in a freshman-level general biology course. The three hour-long exams were common and administered concurrently to all students. New exam questions were developed using Bloom’s taxonomy, and exam results were analyzed statistically with validated assessment tools. All but the comprehensive final exam were returned to students for self-evaluation and remediation. Among other approaches, course rigor was monitored by using an identical set of 60 questions on the final exam across 10 semesters. Analysis of the identical sets of 60 final exam questions revealed that overall averages increased from 72.9% (2010) to 83.5% (2015). Regression analysis demonstrated a statistically significant correlation between high-risk students and their averages on the 60 questions. Additional analysis demonstrated statistically significant improvements for at least one letter grade from midterm to final and a 20% increase in the course pass rates over time, also for the high-risk population. These results support the hypothesis that our data-driven interventions and assessment techniques are successful in improving student retention, particularly for our academically at-risk students. PMID:27543637
[Comparison of 2 lacrimal punctal occlusion methods].
Shalaby, O; Rivas, L; Rivas, A I; Oroza, M A; Murube, J
2001-09-01
To study and compare two methods for canalicular occlusion: cautery and punctal patch. The study included forty patients divided into two groups of 20 patients. The end point was 4 occluded puncta. The first group underwent deep cauterization resulting in occlusion of the full vertical aspect of the canaliculus. The second group underwent the punctal patch technique for canalicular occlusion. The differential parameters were the following: time of intervention, ease of use, risks and precision. Postoperatively, discomfort, subjective and objective improvement in the ocular surface, and the long-term result of each technique were analysed. Time of intervention was longer for the punctal patch compared to cautery. Both methods exhibited similar ease of use and improvement in the ocular surface. Precision was high with the punctal patch technique, showing complete and final occlusion with no punctum needing reopening, while the cautery technique presented a 20% rate of reopening. Postoperative discomfort and irritation were remarkably evident with the punctal patch technique, while minimal with the cautery technique. Survival analysis after one year of follow-up showed a higher rate of advantages for the punctal patch technique over the cautery technique.
Midterm clinical outcomes following arthroscopic transosseous rotator cuff repair
Flanagin, Brody A.; Garofalo, Raffaele; Lo, Eddie Y.; Feher, LeeAnne; Castagna, Alessandro; Qin, Huanying; Krishnan, Sumant G.
2016-01-01
Purpose: Arthroscopic transosseous (TO) rotator cuff repair has recently emerged as a new option for surgical treatment of symptomatic rotator cuff tears. Limited data are available regarding outcomes using this technique. This study evaluated midterm clinical outcomes following a novel arthroscopic TO (anchorless) rotator cuff repair technique. Materials and Methods: A consecutive series of 107 patients (109 shoulders) underwent arthroscopic TO (anchorless) rotator cuff repair for a symptomatic full-thickness tear. Pre- and postoperative range of motion (ROM) was compared at an average of 11.8 months. Postoperative outcome scores were obtained at an average of 38.0 months. Statistical analysis was performed to compare pre- and postoperative ROM data. Univariate analysis was performed using Student's t-test to compare the effect of other clinical characteristics on final outcome. Results: Statistically significant improvements were noted in forward flexion, external rotation, and internal rotation (P < 0.0001). Average postoperative subjective shoulder value was 93.7, simple shoulder test 11.6, and American Shoulder and Elbow Surgeons (ASES) score 94.6. According to ASES scores, results for the 109 shoulders available for final follow-up were excellent in 95 (87.1%), good in 8 (7.3%), fair in 3 (2.8%), and poor in 3 (2.8%). There was no difference in ROM or outcome scores in patients who underwent a concomitant biceps procedure (tenodesis or tenotomy) compared with those who did not, nor was there a significant difference in outcome between patients who underwent biceps tenodesis and those who underwent tenotomy. Age, history of injury preceding the onset of pain, tear size, number of TO tunnels required to perform the repair, and presence of fatty infiltration did not correlate with postoperative ROM or subjective outcome measures at final follow-up. Two complications and four failures were noted. Conclusions: The arthroscopic TO rotator cuff repair technique leads to statistically significant midterm improvement in ROM and satisfactory midterm subjective outcome scores with low complication/failure rates in patients with, on average, medium-sized rotator cuff tears and minimal fatty infiltration. Further work is required to evaluate radiographic healing rates with this technique and to compare outcomes with suture anchor repair. Level of Evidence: Level IV PMID:26980983
Posttest analysis of beta (Na/S) cells from Chloride Silent Power, Limited. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Battles, J.E.; Mrazek, F.C.
Researchers have developed a unique methodology for examining sodium/sulfur cells after testing to learn more about their behavior. The new techniques described in this report allow scientists to discern the physical and chemical states of these high-energy cells and to develop hypotheses about degradation mechanisms. This information may provide a basis for building cells with longer lives.
ERIC Educational Resources Information Center
Schure, Alexander
A computer-based system model for the monitoring and management of the instructional process was conceived, developed and refined through the techniques of systems analysis. This report describes the various aspects and components of this project in a series of independent and self-contained units. The first unit provides an overview of the entire…
DOT National Transportation Integrated Search
1985-03-01
A report is offered on a study of the information activities within the Right-of-Way section of ADOT. The objectives of the study were to adapt and apply techniques to measure user-perceived needs, satisfaction and utility of services provided Right-...
ERIC Educational Resources Information Center
Muir, Carrie
2012-01-01
The purpose of this study was to compare the performance of first year college students with similar high school mathematics backgrounds in two introductory level college mathematics courses, "Fundamentals and Techniques of College Algebra and Quantitative Reasoning and Mathematical Skills," and to compare the performance of students…
Identifying environmental features for land management decisions
NASA Technical Reports Server (NTRS)
1982-01-01
The major accomplishments of the Center for Remote Sensing and Cartography are outlined. The analysis and inventory of the Parker Mountain rangeland and the use of multitemporal data to study aspen succession stages are discussed. New and continuing projects are also described including a Salt Lake County land use study, Wasatch-Cache riparian study, and Humboldt River riparian habitat study. Finally, progress in digital processing techniques is reported.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donohue, Marc; Aranovich, Gregory; Wang, Chao
This project determined the effect of adsorption compression on the rates of catalytic chemical reactions. It was shown that in regions of strong adsorption compression there is a dramatic increase in the rate of catalytic chemical reaction. Experiments focused on the conversion of NO to molecular nitrogen and oxygen. Data analysis techniques were developed to allow interpretation of experimental data and prediction of conditions for optimal reaction rates.
Asadi, Hamed; Kok, Hong Kuan; Looby, Seamus; Brennan, Paul; O'Hare, Alan; Thornton, John
2016-12-01
To identify factors influencing outcome in brain arteriovenous malformations (BAVM) treated with endovascular embolization. We also assessed the feasibility of using machine learning techniques to prognosticate and predict outcome and compared this to conventional statistical analyses. A retrospective study of patients undergoing endovascular treatment of BAVM during a 22-year period in a national neuroscience center was performed. Clinical presentation, imaging, procedural details, complications, and outcome were recorded. The data were analyzed with artificial intelligence techniques to identify predictors of outcome and to assess accuracy in predicting clinical outcome at final follow-up. One hundred ninety-nine patients underwent treatment for BAVM with a mean follow-up duration of 63 months. The commonest clinical presentation was intracranial hemorrhage (56%). During the follow-up period, there were 51 further hemorrhagic events, comprising spontaneous hemorrhage (n = 27) and procedure-related hemorrhage (n = 24). All spontaneous events occurred in previously embolized BAVMs remote from the procedure. Complications included ischemic stroke in 10%, symptomatic hemorrhage in 9.8%, and a mortality rate of 4.7%. A standard regression analysis model had an accuracy of 43% in predicting final outcome (mortality), with the type of treatment complication identified as the most important predictor. The machine learning model showed superior accuracy of 97.5% in predicting outcome and identified the presence or absence of nidal fistulae as the most important factor. BAVMs can be treated successfully by endovascular techniques, alone or combined with surgery and radiosurgery, with an acceptable risk profile. Machine learning techniques can predict final outcome with greater accuracy and may help individualize treatment based on key predicting factors. Copyright © 2016 Elsevier Inc. All rights reserved.
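As a rough illustration of the machine-learning approach described above (not the authors' actual model, which is unspecified beyond "artificial intelligence techniques"), the sketch below trains a random-forest classifier on synthetic patient features and reports test accuracy and feature importances. Every feature, label, and value is invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)

# Synthetic stand-in for the clinical dataset: one row per patient,
# columns are illustrative predictors (e.g., presence of nidal fistulae,
# hemorrhagic presentation, number of embolization sessions).
X = rng.random((199, 3))
y = (X[:, 0] + 0.2 * rng.standard_normal(199) > 0.5).astype(int)  # outcome

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, model.predict(X_te)))
# Feature importances indicate which predictor dominates, analogous to the
# paper identifying nidal fistulae as the most important factor.
print("feature importances:", model.feature_importances_)
```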
Blood volume analysis: a new technique and new clinical interest reinvigorate a classic study.
Manzone, Timothy A; Dam, Hung Q; Soltis, Daniel; Sagar, Vidya V
2007-06-01
Blood volume studies using the indicator dilution technique and radioactive tracers have been performed in nuclear medicine departments for over 50 y. A nuclear medicine study is the gold standard for blood volume measurement, but the classic dual-isotope blood volume study is time-consuming and can be prone to technical errors. Moreover, a lack of normal values and a rubric for interpretation made volume status measurement of limited interest to most clinicians other than some hematologists. A new semiautomated system for blood volume analysis is now available and provides highly accurate results for blood volume analysis within only 90 min. The availability of rapid, accurate blood volume analysis has brought about a surge of clinical interest in using blood volume data for clinical management. Blood volume analysis, long a low-volume nuclear medicine study all but abandoned in some laboratories, is poised to enter the clinical mainstream. This article will first present the fundamental principles of fluid balance and the clinical means of volume status assessment. We will then review the indicator dilution technique and how it is used in nuclear medicine blood volume studies. We will present an overview of the new semiautomated blood volume analysis technique, showing how the study is done, how it works, what results are provided, and how those results are interpreted. Finally, we will look at some of the emerging areas in which data from blood volume analysis can improve patient care. The reader will gain an understanding of the principles underlying blood volume assessment, know how current nuclear medicine blood volume analysis studies are performed, and appreciate their potential clinical impact.
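The indicator dilution technique underlying these studies reduces to a simple relation: a known tracer dose D mixed uniformly into an unknown volume V yields an equilibrium concentration C = D/V, so V = D/C. A toy calculation, with illustrative numbers only:

```python
# Indicator dilution principle: if a known amount of tracer D mixes
# uniformly into an unknown volume V, the equilibrium concentration is
# C = D / V, so V = D / C.  The values below are illustrative only.
injected_activity_uci = 25.0          # tracer dose D (microcuries)
equilibrium_conc_uci_per_ml = 0.005   # measured concentration C after mixing

plasma_volume_ml = injected_activity_uci / equilibrium_conc_uci_per_ml
print(f"estimated plasma volume: {plasma_volume_ml:.0f} mL")  # -> 5000 mL
```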
Space Shuttle/TDRSS communication and tracking systems analysis
NASA Astrophysics Data System (ADS)
Lindsey, W. C.; Chie, C. M.; Cideciyan, R.; Dessouky, K.; Su, Y. T.; Tsang, C. S.
1986-04-01
To identify technical and operational problem areas and provide recommendations, enhancements to the Tracking and Data Relay Satellite System (TDRSS) and Shuttle must be evaluated through simulation and analysis. These enhancement techniques must first be characterized, then modeled mathematically, and finally incorporated into LinCsim (an analytical simulation package). The LinCsim package can then be used as an evaluation tool. Three areas of potential enhancement were identified: Shuttle payload accommodations, TDRSS SSA and KSA services, and the Shuttle tracking system and navigation sensors. Recommendations for each area were discussed.
NASA Technical Reports Server (NTRS)
Wang, T.; Simon, T. W.
1988-01-01
Development of a recent experimental program to investigate the effects of streamwise curvature on boundary layer transition required making a bendable, heated, and instrumented test wall, a rather nonconventional surface. The present paper describes this surface, the design choices made in its development, and how uncertainty analysis was used, beginning early in the test program, to make those design choices. Published uncertainty analysis techniques were found to be of great value, but it became clear that another step, herein called the pre-test analysis, would aid program development. Finally, it is shown how the uncertainty analysis was used to determine whether the test surface was qualified for service.
Advanced technology development multi-color holography
NASA Technical Reports Server (NTRS)
Vikram, Chandra S.
1993-01-01
This is the final report of the Multi-color Holography project. The comprehensive study considers some strategic aspects of multi-color holography. First, available techniques for accurate fringe counting are reviewed: heterodyne interferometry, quasi-heterodyne interferometry, and phase-shifting interferometry. Phase-shifting interferometry was found to be the most suitable for multi-color holography. Details of experiments with a sugar solution are also reported, in which a measurement capability of better than 1/200 of a fringe order was established. A rotating glass-plate phase shifter was used for the experiments. The report then describes the possible role of using more than two wavelengths, with attention to the reference-to-object beam intensity ratio requirements of multi-color holography. Some specific two- and three-color cases are also described in detail. Next, some new methods for analyzing the reconstructed wavefront are considered: deflectometry, speckle metrology, confocal optical signal processing, and phase-shifting-related applications. Finally, design aspects of an experimental breadboard are presented.
NASA Technical Reports Server (NTRS)
Goldstein, J. I.; Williams, D. B.
1992-01-01
This paper reviews and discusses future directions in analytical electron microscopy for microchemical analysis using X-ray and Electron Energy Loss Spectroscopy (EELS). The technique of X-ray microanalysis, using the ratio method and k(sub AB) factors, is outlined. The X-ray absorption correction is the major barrier to the objective of obtaining 1% accuracy and precision in analysis. Spatial resolution and Minimum Detectability Limits (MDL) are considered, with present limitations of spatial resolution in the 2 to 3 microns range and of MDL in the 0.1 to 0.2 wt. % range when a Field Emission Gun (FEG) system is used. Future directions of X-ray analysis include improvement in X-ray spatial resolution to the 1 to 2 microns range and MDL as low as 0.01 wt. %. With these improvements the detection of single atoms in the analysis volume will be possible. Other future improvements include the use of clean-room techniques for thin specimen preparation; quantification at the 1% accuracy and precision level, with light element analysis quantification at better than the 10% accuracy and precision level; the incorporation of a compact wavelength dispersive spectrometer to improve X-ray spectral resolution, light element analysis, and MDL; and instrument improvements including source stability, on-line probe current measurements, stage stability, and computerized stage control. The paper also reviews the EELS technique, recognizing that it has been slow to develop and still remains firmly in research laboratories rather than in applications laboratories. Consideration of microanalysis with core-loss edges is given along with a discussion of limitations such as specimen thickness. Spatial resolution and MDL are considered, recognizing that single atom detection is already possible. Plasmon loss analysis is discussed as well as fine structure analysis. New techniques for energy-loss imaging are also summarized. Future directions in the EELS technique will be the development of new spectrometers and improvements in thin specimen preparation. The microanalysis technique needs to be simplified and software developed so that EELS approaches the relative simplicity of the X-ray technique. Finally, one can expect major improvements in EELS imaging as data storage and processing improve.
Asgari Dastjerdi, Hoori; Khorasani, Elahe; Yarmohammadian, Mohammad Hossein; Ahmadzade, Mahdiye Sadat
2017-01-01
Background: Medical errors are one of the greatest problems in any healthcare system. The best way to prevent such problems is to identify errors and their root causes. The Failure Mode and Effects Analysis (FMEA) technique is a prospective risk analysis method. This study reviews risk analyses using the FMEA technique in different hospital wards and departments. Methods: The available databases were systematically searched. After inclusion and exclusion criteria were defined, related studies were identified in two steps: first, abstracts and titles were screened by the researchers and papers not meeting the inclusion criteria were omitted; 22 papers were finally selected and their full texts thoroughly examined. Results: The examined papers focused mostly on processes, were conducted mainly in pediatric wards and radiology departments, and most participants were nursing staff. Many of these papers attempted to describe almost all steps of model implementation, and after implementing the strategies and interventions, the Risk Priority Number (RPN) was calculated to determine the degree of the technique's effect. However, these papers paid less attention to the identification of risk effects. Conclusions: The review revealed that only a small number of studies failed to demonstrate effects of the FMEA technique. In general, most of the studies recommended this technique and considered it a useful and efficient method for reducing the number of risks and improving service quality. PMID:28039688
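The RPN mentioned above is conventionally the product of three scores, each typically rated on a 1-10 scale: RPN = severity × occurrence × detection. A minimal sketch with invented failure modes and scores:

```python
# Risk Priority Number in FMEA: RPN = severity * occurrence * detection.
# The failure modes and scores below are invented for illustration;
# re-scoring after an intervention shows whether the RPN has dropped.
failure_modes = {
    "wrong patient identification": {"severity": 9, "occurrence": 4, "detection": 3},
    "mislabeled specimen":          {"severity": 8, "occurrence": 3, "detection": 5},
}

def rpn(scores):
    return scores["severity"] * scores["occurrence"] * scores["detection"]

# Rank failure modes by RPN, highest risk first.
for mode, scores in sorted(failure_modes.items(), key=lambda kv: -rpn(kv[1])):
    print(f"{mode}: RPN = {rpn(scores)}")
```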
NASA Astrophysics Data System (ADS)
Gritsan, Andrei V.; Röntsch, Raoul; Schulze, Markus; Xiao, Meng
2016-09-01
In this paper, we investigate anomalous interactions of the Higgs boson with heavy fermions, employing shapes of kinematic distributions. We study the processes pp → tt̄+H, bb̄+H, tq+H, and pp → H → τ+τ- and present applications of event generation, reweighting techniques for fast simulation of anomalous couplings, as well as matrix element techniques for optimal sensitivity. We extend the matrix element likelihood approach (MELA) technique, which proved to be a powerful matrix element tool for Higgs boson discovery and characterization during Run I of the LHC, and implement all analysis tools in the JHU generator framework. A next-to-leading-order QCD description of the pp → tt̄+H process allows us to investigate the performance of the MELA in the presence of extra radiation. Finally, projections for LHC measurements through the end of Run III are presented.
Localized analysis of paint-coat drying using dynamic speckle interferometry
NASA Astrophysics Data System (ADS)
Sierra-Sosa, Daniel; Tebaldi, Myrian; Grumel, Eduardo; Rabal, Hector; Elmaghraby, Adel
2018-07-01
Paint-coating is part of several industrial processes, including automotive manufacturing, architectural coatings, machinery, and appliances. These paint-coatings must comply with high quality standards; for this reason, evaluation techniques for paint-coatings are under constant development. One important factor in the paint-coating process is drying, as it influences the quality of the final result. In this work we present an assessment technique based on optical dynamic speckle interferometry; it allows temporal evaluation of the activity of the paint-coating drying process, providing localized information on drying. This localized information is relevant for addressing drying homogeneity, optimal drying, and quality control. The technique relies on the definition of a new temporal history of the speckle patterns to obtain the local activity; this information is then clustered to provide a convenient indicator of the different drying process stages. The experimental results presented were validated using gravimetric drying curves.
Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors.
Dutton, Neale A W; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K
2016-07-20
SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed.
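As a rough illustration of the PSW idea (a sketch under simplifying assumptions, not the authors' method): fit a comb of Gaussians to a synthetic photon counting histogram and read off the peak separation (the conversion gain) and the common peak width (the read noise). All parameters are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)

# Synthetic DSERN pixel output: Poisson photon counts read out through an
# analogue integrator with Gaussian read noise well below 0.5 e- rms.
gain, sigma_read = 1.0, 0.25
photons = rng.poisson(1.5, size=200_000)
samples = gain * photons + rng.normal(0.0, sigma_read, size=photons.size)

# Photon counting histogram restricted to the 0-, 1-, and 2-photon peaks.
hist, edges = np.histogram(samples, bins=200, range=(-0.8, 2.5))
centers = 0.5 * (edges[:-1] + edges[1:])

def comb(u, a0, a1, a2, sep, width):
    """Comb of three Gaussian single-photon peaks with a common width."""
    g = lambda mu, a: a * np.exp(-0.5 * ((u - mu) / width) ** 2)
    return g(0.0, a0) + g(sep, a1) + g(2 * sep, a2)

p, _ = curve_fit(comb, centers, hist, p0=(1000, 1500, 1200, 1.0, 0.3))
print(f"peak separation (conversion gain) ~ {p[3]:.3f} e-")
print(f"peak width (read noise estimate)  ~ {p[4]:.3f} e- rms")
```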
The Application of Hilbert-Huang Transforms to Meteorological Datasets
NASA Technical Reports Server (NTRS)
Duffy, Dean G.
2003-01-01
Recently a new spectral technique has been developed for the analysis of aperiodic and nonlinear signals: the Hilbert-Huang transform. This paper shows how these transforms can be used to discover synoptic and climatic features. For sea level data, the transforms capture the oceanic tides as well as large, aperiodic river outflows. In the case of solar radiation, we observe variations in the diurnal and seasonal cycles. Finally, from barographic data, the Hilbert-Huang transform reveals the passage of extratropical cyclones, fronts, and troughs. Thus, this technique can flag significant weather events such as a flood or the passage of a squall line.
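For readers unfamiliar with the transform's mechanics, the sketch below shows one simplified building block: a few sifting passes of empirical mode decomposition to approximate the first intrinsic mode function, followed by a Hilbert transform for instantaneous frequency. It omits boundary handling and proper stopping criteria; the signal and parameters are illustrative.

```python
import numpy as np
from scipy.signal import find_peaks, hilbert
from scipy.interpolate import CubicSpline

# Toy nonstationary signal: a slow 5 Hz component plus a fast 40 Hz one.
t = np.linspace(0, 1, 2000)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

def sift_once(sig, t):
    """One sifting pass of empirical mode decomposition: subtract the
    mean of the upper and lower cubic-spline extrema envelopes."""
    maxima, _ = find_peaks(sig)
    minima, _ = find_peaks(-sig)
    upper = CubicSpline(t[maxima], sig[maxima])(t)
    lower = CubicSpline(t[minima], sig[minima])(t)
    return sig - (upper + lower) / 2.0

imf = x.copy()
for _ in range(10):          # a few passes approximate the first IMF
    imf = sift_once(imf, t)

# Hilbert spectral analysis: instantaneous frequency of the extracted IMF.
analytic = hilbert(imf)
inst_freq = np.diff(np.unwrap(np.angle(analytic))) / (2 * np.pi * (t[1] - t[0]))
print("median instantaneous frequency (Hz):", np.median(inst_freq))  # roughly 40
```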
An analytical approach to test and design upper limb prosthesis.
Veer, Karan
2015-01-01
In this work the signal acquisition technique, the analysis models, and the design protocols of the prosthesis are discussed. Different methods for estimating the motion intended by the amputee from surface electromyogram (SEMG) signals, based on time- and frequency-domain parameters, are presented. Experiments showed that these techniques can significantly help discriminate the amputee's motions among four independent activities using a dual-channel setup. Further, based on the experimental results, the design and operation of an artificial arm are covered in two parts: the electronics design and the mechanical assembly. Finally, the developed hand prosthesis allows amputees to perform daily routine activities easily.
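Time-domain SEMG parameters of the kind mentioned above are typically simple window statistics. A minimal sketch follows; the feature set and threshold are common choices in the SEMG literature, not necessarily the paper's exact ones.

```python
import numpy as np

def time_domain_features(emg, threshold=0.01):
    """Common SEMG time-domain features used for motion classification:
    mean absolute value (MAV), waveform length (WL), zero crossings (ZC),
    and slope sign changes (SSC)."""
    d = np.diff(emg)
    mav = np.mean(np.abs(emg))
    wl = np.sum(np.abs(d))
    zc = np.sum((emg[:-1] * emg[1:] < 0) & (np.abs(d) > threshold))
    ssc = np.sum((d[:-1] * d[1:] < 0) &
                 ((np.abs(d[:-1]) > threshold) | (np.abs(d[1:]) > threshold)))
    return mav, wl, zc, ssc

# Illustrative use on one 256-sample window of a dual-channel recording.
rng = np.random.default_rng(2)
window = rng.standard_normal(256)   # stand-in for a real SEMG window
print(time_domain_features(window))
```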
Proteomics: a new approach to the study of disease.
Chambers, G; Lawrie, L; Cash, P; Murray, G I
2000-11-01
The global analysis of cellular proteins has recently been termed proteomics and is a key area of research that is developing in the post-genome era. Proteomics uses a combination of sophisticated techniques including two-dimensional (2D) gel electrophoresis, image analysis, mass spectrometry, amino acid sequencing, and bio-informatics to resolve comprehensively, to quantify, and to characterize proteins. The application of proteomics provides major opportunities to elucidate disease mechanisms and to identify new diagnostic markers and therapeutic targets. This review aims to explain briefly the background to proteomics and then to outline proteomic techniques. Applications to the study of human disease conditions ranging from cancer to infectious diseases are reviewed. Finally, possible future advances are briefly considered, especially those which may lead to faster sample throughput and increased sensitivity for the detection of individual proteins. Copyright 2000 John Wiley & Sons, Ltd.
Sensitivity analysis for oblique incidence reflectometry using Monte Carlo simulations.
Kamran, Faisal; Andersen, Peter E
2015-08-10
Oblique incidence reflectometry has developed into an effective, noncontact, and noninvasive measurement technology for the quantification of both the reduced scattering and absorption coefficients of a sample. The optical properties are deduced by analyzing only the shape of the reflectance profiles. This article presents a sensitivity analysis of the technique in turbid media. Monte Carlo simulations are used to investigate the technique and its potential to distinguish small changes between different levels of scattering. We identify regions of the dynamic range of optical properties in which the system requirements for detecting subtle changes in the structure of the medium, expressed as measured optical properties, differ. Effects of variation in anisotropy are discussed and results presented. Finally, experimental data from milk products with different fat contents are considered as examples for comparison.
The application analysis of the multi-angle polarization technique for ocean color remote sensing
NASA Astrophysics Data System (ADS)
Zhang, Yongchao; Zhu, Jun; Yin, Huan; Zhang, Keli
2017-02-01
The multi-angle polarization technique, which uses the intensity of polarized radiation as the observed quantity, is a new remote sensing means for earth observation. The technique provides not only multi-angle light intensity data but also multi-angle information on polarized radiation, so it may solve problems that could not be solved with traditional remote sensing methods. The multi-angle polarization technique has become one of the hot topics in international quantitative remote sensing research. In this paper, we first introduce the principles of the multi-angle polarization technique, then summarize and analyse the state of basic research and engineering applications in four areas: 1) polarization-based removal of sun glitter; 2) ocean color remote sensing based on polarization; 3) oil spill detection using polarization; and 4) ocean aerosol monitoring based on polarization. Finally, based on the previous work, we briefly present the problems and prospects of the multi-angle polarization technique as applied to China's ocean color remote sensing.
Information granules in image histogram analysis.
Wieclawek, Wojciech
2018-04-01
A concept of granular computing employed in intensity-based image enhancement is discussed. First, a weighted granular computing idea is introduced. Then, the implementation of this concept in the image processing area is presented. Finally, multidimensional granular histogram analysis is introduced. The proposed approach is dedicated to digital images, especially medical images acquired by Computed Tomography (CT). Like the histogram equalization approach, this method is based on image histogram analysis. Yet, unlike the histogram equalization technique, it works on a selected range of pixel intensity and is controlled by two parameters. Performance is tested on anonymized clinical CT series. Copyright © 2017 Elsevier Ltd. All rights reserved.
Analysis of earth rotation solution from Starlette
NASA Technical Reports Server (NTRS)
Schutz, B. E.; Cheng, M. K.; Shum, C. K.; Eanes, R. J.; Tapley, B. D.
1989-01-01
Earth rotation parameter (ERP) solutions were derived from the Starlette orbit analysis during the Main MERIT Campaign, using a consider-covariance analysis technique to assess the effects of errors on the polar motion solutions. The polar motion solution was then improved through the simultaneous adjustment of dynamical parameters representing identified dominant perturbing sources (such as the geopotential and ocean-tide coefficients). Finally, an improved ERP solution was derived using the gravity field model PTCF1, described by Tapley et al. (1986). The accuracy of the Starlette ERP solution was assessed by comparison with the LAGEOS-derived ERP solutions.
Kabytaev, Kuanysh; Durairaj, Anita; Shin, Dmitriy; Rohlfing, Curt L; Connolly, Shawn; Little, Randie R; Stoyanov, Alexander V
2016-02-01
An on-line liquid chromatography-mass spectrometry platform that includes the orthogonal techniques of ion-exchange and reversed-phase chromatography is applied to C-peptide analysis. Additional improvement is achieved by the subsequent application of cation- and anion-exchange purification steps that allow isolation of components whose isoelectric points lie in a narrow pH range before the final reversed-phase mass spectrometry analysis. The utility of this approach for isolating fractions in the desired "pI window" when profiling complex mixtures is discussed. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A Survey of New Trends in Symbolic Execution for Software Testing and Analysis
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Visser, Willem
2009-01-01
Symbolic execution is a well-known program analysis technique which represents values of program inputs with symbolic values instead of concrete (initialized) data and executes the program by manipulating program expressions involving the symbolic values. Symbolic execution was proposed over three decades ago, but it has recently found renewed interest in the research community, due in part to progress in decision procedures, the availability of powerful computers, and new algorithmic developments. We provide a survey of some of the new research trends in symbolic execution, with particular emphasis on applications to test generation and program analysis. We first describe an approach that handles complex programming constructs such as input data structures, arrays, and multi-threading. We follow with a discussion of abstraction techniques that can be used to limit the (possibly infinite) number of symbolic configurations that need to be analyzed for the symbolic execution of looping programs. Furthermore, we describe recent hybrid techniques that combine concrete and symbolic execution to overcome some of the inherent limitations of symbolic execution, such as handling native code or the availability of decision procedures for the application domain. Finally, we give a short survey of interesting new applications, such as predictive testing, invariant inference, program repair, analysis of parallel numerical programs, and differential symbolic execution.
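A toy illustration of the core idea (not any particular tool from the survey): inputs become symbols, each branch contributes a path condition, and test generation finds a concrete witness per path. A real symbolic executor would query an SMT solver instead of the naive search used here.

```python
import sympy as sp

# Symbolic inputs instead of concrete values.
x, y = sp.symbols("x y", integer=True)

# Toy program under test:
#   def f(x, y):
#       z = 2 * x
#       if z > y: return z - y
#       else:     return y - z
z = 2 * x
paths = [
    (sp.Gt(z, y), z - y),   # path condition: 2*x > y
    (sp.Le(z, y), y - z),   # path condition: 2*x <= y
]

# Naive test generation: search small integers for a witness satisfying
# each path condition (a decision procedure would do this in practice).
for cond, result in paths:
    witness = next((xv, yv)
                   for xv in range(-3, 4) for yv in range(-3, 4)
                   if bool(cond.subs({x: xv, y: yv})))
    value = result.subs({x: witness[0], y: witness[1]})
    print(f"path {cond}: input {witness} -> returns {value}")
```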
ERIC Educational Resources Information Center
Mobray, Deborah, Ed.
Papers on local area networks (LANs), modelling techniques, software improvement, capacity planning, software engineering, microcomputers and end user computing, cost accounting and chargeback, configuration and performance management, and benchmarking presented at this conference include: (1) "Theoretical Performance Analysis of Virtual…
US Army Institute of Dental Research Annual Progress Report FY80.
1980-10-01
Findings indicate that use of the laser technique results in increased connective tissue regeneration and improved resolution of periodontal defects. Final analysis of histologic data on the use of the neodymium laser is reported. Related projects include Bone Regeneration in Traumatic Wounds (Pathology, DA OH 6038) and Development of Endodontic Procedures for Military Dentistry (Oral Biology).
Unfurlable satellite antennas - A review
NASA Technical Reports Server (NTRS)
Roederer, Antoine G.; Rahmat-Samii, Yahia
1989-01-01
A review of unfurlable satellite antennas is presented. Typical application requirements for future space missions are first outlined. Then, U.S. and European mesh and inflatable antenna concepts are described. Precision deployables using rigid panels or petals are not included in the survey. RF modeling and performance analysis of gored or faceted mesh reflector antennas are then reviewed. Finally, both on-ground and in-orbit RF test techniques for large unfurlable antennas are discussed.
Direct numerical simulation of the flow around an aerofoil in ramp-up motion
NASA Astrophysics Data System (ADS)
Rosti, Marco E.; Omidyeganeh, Mohammad; Pinelli, Alfredo
2016-02-01
A detailed analysis of the flow around a NACA0020 aerofoil at Re_c = 2 × 10⁴ undergoing a ramp-up motion has been carried out by means of direct numerical simulations. During the manoeuvre, the angle of attack is linearly varied in time between 0° and 20° with a constant rate of change α̇ = 0.12 U∞/c (rad/s). When the angle of incidence reaches its final value, the lift experiences a first overshoot and then suddenly decreases towards the static-stall asymptotic value. The transient instantaneous flow is dominated by the generation and detachment of the dynamic stall vortex, a large-scale structure formed by the merging of smaller-scale vortices generated by an instability originating at the trailing edge. New insights on the vorticity dynamics leading to the lift overshoot, the lift crisis, and the damped oscillatory cycle that gradually matches the steady condition are discussed using a number of post-processing techniques. These include a detailed analysis of the ensemble-averaged flow statistics and coherent-structure identification carried out using the Q-criterion and the finite-time Lyapunov exponent technique. The results are compared with those obtained in a companion simulation considering a static stall condition at the final angle of incidence α = 20°.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-12-15
This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.
Improving the limits of detection of low background alpha emission measurements
NASA Astrophysics Data System (ADS)
McNally, Brendan D.; Coleman, Stuart; Harris, Jack T.; Warburton, William K.
2018-01-01
Alpha particle emission - even at extremely low levels - is a significant issue in the search for rare events (e.g., double beta decay, dark matter detection). Traditional measurement techniques require long counting times to measure low sample rates in the presence of much larger instrumental backgrounds. To address this, a commercially available instrument developed by XIA uses pulse shape analysis to discriminate alpha emissions produced by the sample from those produced by other surfaces of the instrument itself. Experience with this system has uncovered two residual sources of background: cosmogenics and radon emanation from internal components. An R&D program is underway to enhance the system and extend the pulse shape analysis technique further, so that these residual sources can be identified and rejected as well. In this paper, we review the theory of operation and pulse shape analysis techniques used in XIA's alpha counter, and briefly explore data suggesting the origin of the residual background terms. We will then present our approach to enhance the system's ability to identify and reject these terms. Finally, we will describe a prototype system that incorporates our concepts and demonstrates their feasibility.
NASA Astrophysics Data System (ADS)
Sheppard, Adrian; Latham, Shane; Middleton, Jill; Kingston, Andrew; Myers, Glenn; Varslot, Trond; Fogden, Andrew; Sawkins, Tim; Cruikshank, Ron; Saadatfar, Mohammad; Francois, Nicolas; Arns, Christoph; Senden, Tim
2014-04-01
This paper reports on recent advances at the micro-computed tomography facility at the Australian National University. Since 2000 this facility has been a significant centre for developments in imaging hardware and associated software for image reconstruction, image analysis and image-based modelling. In 2010 a new instrument was constructed that utilises theoretically-exact image reconstruction based on helical scanning trajectories, allowing higher cone angles and thus better utilisation of the available X-ray flux. We discuss the technical hurdles that needed to be overcome to allow imaging with cone angles in excess of 60°. We also present dynamic tomography algorithms that enable the changes between one moment and the next to be reconstructed from a sparse set of projections, allowing higher speed imaging of time-varying samples. Researchers at the facility have also created a sizeable distributed-memory image analysis toolkit with capabilities ranging from tomographic image reconstruction to 3D shape characterisation. We show results from image registration and present some of the new imaging and experimental techniques that it enables. Finally, we discuss the crucial question of image segmentation and evaluate some recently proposed techniques for automated segmentation.
Low energy analysis techniques for CUORE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alduino, C.; Alfonso, K.; Artusa, D. R.
CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable for searches for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires improving the energy threshold to 10 keV. In this article, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0.
Brain tumor classification using AFM in combination with data mining techniques.
Huml, Marlene; Silye, René; Zauner, Gerald; Hutterer, Stephan; Schilcher, Kurt
2013-01-01
Although classification of astrocytic tumors is standardized by the WHO grading system, which is mainly based on microscopy-derived, histomorphological features, there is great interobserver variability. The main causes are thought to be the complexity of morphological details varying from tumor to tumor and from patient to patient, variations in technical histopathological procedures such as staining protocols, and finally the individual experience of the diagnosing pathologist. Thus, to raise astrocytoma grading to a more objective standard, this paper proposes a methodology based on atomic force microscopy (AFM) images made from histopathological samples in combination with data mining techniques. By comparing AFM images with corresponding light microscopy images of the same area, the progressive formation of cavities due to cell necrosis was identified as a typical morphological marker for computer-assisted analysis. Using genetic programming as a tool for feature analysis, a best model was created that achieved 94.74% classification accuracy in distinguishing grade II tumors from grade IV ones. Combined with modern image analysis techniques, AFM may become an important tool in astrocytic tumor diagnosis. In this way, patients suffering from grade II tumors, which carry a lower risk of malignant transformation, are identified unambiguously and would benefit from early adjuvant therapies.
A Comparison of Compressed Sensing and Sparse Recovery Algorithms Applied to Simulation Data
Fan, Ya Ju; Kamath, Chandrika
2016-09-01
The move toward exascale computing for scientific simulations is placing new demands on compression techniques. It is expected that the I/O system will not be able to support the volume of data that is expected to be written out. To enable quantitative analysis and scientific discovery, we are interested in techniques that compress high-dimensional simulation data and can provide perfect or near-perfect reconstruction. In this paper, we explore the use of compressed sensing (CS) techniques to reduce the size of the data before they are written out. Using large-scale simulation data, we investigate how the sufficient sparsity condition and the contrast in the data affect the quality of reconstruction and the degree of compression. Also, we provide suggestions for the practical implementation of CS techniques and compare them with other sparse recovery methods. Finally, our results show that despite longer times for reconstruction, compressed sensing techniques can provide near perfect reconstruction over a range of data with varying sparsity.
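A small self-contained illustration of the sparse-recovery side (using orthogonal matching pursuit as a stand-in for whichever recovery algorithms the paper compares): measure a k-sparse signal with a random Gaussian matrix and reconstruct it. Dimensions and sparsity are illustrative.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(3)

# k-sparse signal of length n observed through m << n random projections.
n, m, k = 256, 80, 8
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sensing matrix
y = A @ x                                      # compressed measurements

# Sparse recovery via orthogonal matching pursuit.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
x_hat = omp.fit(A, y).coef_
print("relative reconstruction error:",
      np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```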
Status of Thermal NDT of Space Shuttle Materials at NASA
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott; Winfree, William P.; Hodges, Kenneth; Koshti, Ajay; Ryan, Daniel; Reinhardt, Walter W.
2006-01-01
Since the Space Shuttle Columbia accident, NASA has focused on improving advanced nondestructive evaluation (NDE) techniques for the Reinforced Carbon-Carbon (RCC) panels that comprise the orbiter's wing leading edge and nose cap. Various nondestructive inspection techniques have been used in the examination of the RCC, but thermography has emerged as an effective inspection alternative to more traditional methods. Thermography is a non-contact inspection method as compared to ultrasonic techniques which typically require the use of a coupling medium between the transducer and material. Like radiographic techniques, thermography can inspect large areas, but has the advantage of minimal safety concerns and the ability for single-sided measurements. Details of the analysis technique that has been developed to allow insitu inspection of a majority of shuttle RCC components is discussed. Additionally, validation testing, performed to quantify the performance of the system, will be discussed. Finally, the results of applying this technology to the Space Shuttle Discovery after its return from the STS-114 mission in July 2005 are discussed.
Lü, Fan; Shao, Li-Ming; Zhang, Hua; Fu, Wen-Ding; Feng, Shi-Jin; Zhan, Liang-Tong; Chen, Yun-Min; He, Pin-Jing
2018-01-01
Bio-stability is a key feature for the utilization and final disposal of biowaste-derived residues, such as aerobic compost or vermicompost of food waste, bio-dried waste, anaerobic digestate, or landfilled waste. The present paper reviews conventional methods and advanced techniques used for the assessment of bio-stability. The conventional methods are reclassified into two categories. Advanced techniques, including spectroscopic (fluorescence, ultraviolet-visible, infrared, Raman, nuclear magnetic resonance), thermogravimetric, and thermochemolysis analysis, are emphasized for their application to bio-stability assessment in recent years. Their principles, pros, and cons are critically discussed. These advanced techniques are found to be convenient in sample preparation and to supply diversified information. However, their viability as indicators of bio-stability ultimately depends on establishing their relationship with the conventional methods, especially those based on biotic response. Furthermore, some misuses in data interpretation should be noted. Copyright © 2017 Elsevier Ltd. All rights reserved.
Alduino, C.; Alfonso, K.; Artusa, D. R.; ...
2016-04-25
Here, we describe in detail the methods used to obtain the lower bound on the lifetime of neutrinoless double-beta (0νββ) decay in 130Te and the associated limit on the effective Majorana mass of the neutrino using the CUORE-0 detector. CUORE-0 is a bolometric detector array located at the Laboratori Nazionali del Gran Sasso that was designed to validate the background reduction techniques developed for CUORE, a next-generation experiment scheduled to come online in 2016. CUORE-0 is also a competitive 0νββ decay search in its own right and functions as a platform to further develop the analysis tools and procedures to be used in CUORE. These include data collection, event selection and processing, as well as an evaluation of signal efficiency. In particular, we describe the amplitude evaluation, thermal gain stabilization, energy calibration methods, and the analysis event selection used to create our final 0νββ search spectrum. We define our high level analysis procedures, with emphasis on the new insights gained and challenges encountered. We outline in detail our fitting methods near the hypothesized 0νββ decay peak and catalog the main sources of systematic uncertainty. Finally, we derive the 0νββ decay half-life limits previously reported for CUORE-0, T(0ν)1/2 > 2.7×10^24 yr, and in combination with the Cuoricino limit, T(0ν)1/2 > 4.0×10^24 yr.
Rogge, Matthew D; Leckey, Cara A C
2013-09-01
Delaminations in composite laminates resulting from impact events may be accompanied by minimal indication of damage at the surface. As such, inspections are required to ensure defects are within allowable limits. Conventional ultrasonic scanning techniques have been shown to effectively characterize the size and depth of delaminations but require physical contact with the structure and considerable setup time. Alternatively, a non-contact scanning laser vibrometer may be used to measure guided wave propagation in the laminate structure generated by permanently bonded transducers. A local Fourier domain analysis method is presented for processing guided wavefield data to estimate spatially dependent wavenumber values, which can be used to determine delamination depth. The technique is applied to simulated wavefields and results are analyzed to determine limitations of the technique with regards to determining defect size and depth. Based on simulation results, guidelines for application of the technique are developed. Finally, experimental wavefield data is obtained in quasi-isotropic carbon fiber reinforced polymer (CFRP) laminates with impact damage. The recorded wavefields are analyzed and wavenumber is measured to an accuracy of up to 8.5% in the region of shallow delaminations. These results show the promise of local wavenumber domain analysis to characterize the depth of delamination damage in composite laminates. The technique can find application in automated vehicle health assurance systems with potential for high detection rates and greatly reduced operator effort and setup time. Published by Elsevier B.V.
Gómez-Caravaca, Ana M; Maggio, Rubén M; Cerretani, Lorenzo
2016-03-24
Today virgin and extra-virgin olive oils (VOO and EVOO) are foods subject to a large number of analytical tests intended to ensure their quality and genuineness. Almost all official methods demand heavy use of reagents and manpower, so analytical development in this area is continuously evolving. This review therefore focuses on analytical methods for EVOO/VOO that use fast and smart approaches based on chemometric techniques in order to reduce time of analysis, reagent consumption, expensive equipment, and manpower. Experimental approaches combining chemometrics with fast analytical techniques such as UV-Vis spectroscopy, fluorescence, vibrational spectroscopies (NIR, MIR, and Raman), NMR spectroscopy, and other more complex techniques like chromatography, calorimetry, and electrochemical techniques, applied to EVOO/VOO production and analysis, are discussed throughout this work. The advantages and drawbacks of this association are also highlighted. Chemometrics is shown to be a powerful tool for the oil industry; in fact, it can be implemented along all the steps of EVOO/VOO production: raw material input control, monitoring during processing, and quality control of the final product. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Łazarek, Łukasz; Antończak, Arkadiusz J.; Wójcik, Michał R.; Drzymała, Jan; Abramski, Krzysztof M.
2014-07-01
Laser-induced breakdown spectroscopy (LIBS), like many other spectroscopic techniques, is a comparative method. Typically, in quantitative analysis, a synthetic certified standard with a well-known elemental composition is used to calibrate the system. Nevertheless, in all laser-induced techniques, such calibration can affect the accuracy through differences in the overall composition of the chosen standard. There are also intermediate factors which can cause imprecision in measurements, such as optical absorption, surface structure, and thermal conductivity. In this work, calibration of the LIBS technique was performed using pellets made directly from the tested materials (old, well-characterized samples). This choice produces a considerable improvement in the accuracy of the method. The technique was adopted for the determination of trace elements in industrial copper concentrates, standardized by conventional atomic absorption spectroscopy with a flame atomizer. A series of copper flotation concentrate samples was analyzed for three elements: silver, cobalt, and vanadium. We also propose a method of post-processing the measurement data to minimize matrix effects and permit reliable analysis. It has been shown that the described technique can be used in qualitative and quantitative analyses of complex inorganic materials, such as copper flotation concentrates. It was noted that the final validation of such a methodology is limited mainly by the accuracy of the characterization of the standards.
Computational Fluid Dynamics Analysis Success Stories of X-Plane Design to Flight Test
NASA Technical Reports Server (NTRS)
Cosentino, Gary B.
2008-01-01
Examples of the design and flight test of three true X-planes are described, particularly X-plane design techniques that relied heavily on computational fluid dynamics(CFD) analysis. Three examples are presented: the X-36 Tailless Fighter Agility Research Aircraft, the X-45A Unmanned Combat Air Vehicle, and the X-48B Blended Wing Body Demonstrator Aircraft. An overview is presented of the uses of CFD analysis, comparison and contrast with wind tunnel testing, and information derived from CFD analysis that directly related to successful flight test. Lessons learned on the proper and improper application of CFD analysis are presented. Highlights of the flight-test results of the three example X-planes are presented. This report discusses developing an aircraft shape from early concept and three-dimensional modeling through CFD analysis, wind tunnel testing, further refined CFD analysis, and, finally, flight. An overview of the areas in which CFD analysis does and does not perform well during this process is presented. How wind tunnel testing complements, calibrates, and verifies CFD analysis is discussed. Lessons learned revealing circumstances under which CFD analysis results can be misleading are given. Strengths and weaknesses of the various flow solvers, including panel methods, Euler, and Navier-Stokes techniques, are discussed.
Static analysis of class invariants in Java programs
NASA Astrophysics Data System (ADS)
Bonilla-Quintero, Lidia Dionisia
2011-12-01
This paper presents a technique for the automatic inference of class invariants from Java bytecode. Class invariants are very important both for compiler optimization and as an aid to programmers in their efforts to reduce the number of software defects. We present the original DC-invariant analysis from Adam Webber, discuss its shortcomings, and suggest several different ways to improve it. To apply the DC-invariant analysis to identify DC-invariant assertions, all that one needs is a monotonic method analysis function and a suitable assertion domain. The DC-invariant algorithm is very general; the method analysis, however, can be highly tuned to the problem at hand. For example, one could choose shape analysis as the method analysis function and use the DC-invariant analysis simply to extend it to an analysis that would yield class-wide invariants describing the shapes of linked data structures. We have a prototype implementation: a system we refer to as "the analyzer" that infers DC-invariant unary and binary relations and provides them to the user in a human-readable format. The analyzer uses those relations to identify unnecessary array bounds checks in Java programs and to perform null-reference analysis. It uses Adam Webber's relational constraint technique for the class-invariant binary relations. Early results with the analyzer were very imprecise in the presence of "dirty-called" methods. A dirty-called method is one that is called, either directly or transitively, from any constructor of the class, or from any method of the class at a point at which a disciplined field has been altered. This result was unexpected and forced an extensive search for improved techniques. An important contribution of this paper is the suggestion of several ways to improve the results by changing the way dirty-called methods are handled. The new techniques expand the set of class invariants that can be inferred over Webber's original results. The technique that produces better results uses in-line analysis. Final results are promising: we can infer sound class invariants for full-scale applications, not just toy ones.
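A loose, language-agnostic sketch of the fixed-point flavor of this kind of invariant inference (written in Python for brevity, though the paper targets Java bytecode, and not Webber's algorithm itself): candidate assertions are pruned until every method is shown to preserve the survivors. The `preserves` callback stands in for the monotonic method-analysis function; conservative treatment of dirty-called methods lives inside that oracle. All names are hypothetical.

```python
# Prune candidate assertions to a fixed point: any method that may break an
# assertion removes it from the invariant set.
def infer_invariants(candidates, methods, preserves):
    invariants = set(candidates)
    changed = True
    while changed:                         # iterate until nothing changes
        changed = False
        for m in methods:
            for a in list(invariants):
                if not preserves(m, a):    # one violating method kills a candidate
                    invariants.discard(a)
                    changed = True
    return invariants

breaks = {"reset": {"buf != null"}}        # hypothetical per-method effects
print(infer_invariants({"len >= 0", "buf != null"},
                       ["reset", "append"],
                       lambda m, a: a not in breaks.get(m, set())))
# -> {'len >= 0'}
```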
NASA Astrophysics Data System (ADS)
Esmaeili, Mostafa; Motagh, Mahdi
2016-07-01
Time-series analysis of Synthetic Aperture Radar (SAR) data using the two techniques of Small BAseline Subset (SBAS) and Persistent Scatterer Interferometric SAR (PSInSAR) extends the capability of the conventional interferometry technique for deformation monitoring and mitigates many of its limitations. Using dual/quad-polarized data provides an additional source of information to further improve the capability of InSAR time-series analysis. In this paper we use dual-polarized data and combine the Amplitude Dispersion Index (ADI) optimization of pixels with a phase-stability criterion for PSInSAR analysis. ADI optimization is performed using a Simulated Annealing algorithm to increase the number of Persistent Scatterer Candidates (PSCs). The phase stability of PSCs is then measured using their temporal coherence to select the final set of pixels for deformation analysis. We evaluate the method on a dataset comprising 17 dual-polarization (HH/VV) images acquired by TerraSAR-X from July 2013 to January 2014 over a subsidence area in Iran, and compare the effectiveness of the method for both agricultural and urban regions. The results reveal that using the optimum scattering mechanism decreases the ADI values in urban and non-urban regions. Compared to single-pol data, the use of optimized polarization initially increases the number of PSCs by about three times and improves the final PS density by about 50%, in particular in regions with high rates of deformation, which suffer from loss of phase stability over time. The classification of PS pixels based on their optimum scattering mechanism revealed that the dominant scattering mechanism of the PS pixels in the urban area is double-bounce, while for the non-urban regions (ground surfaces and farmlands) it is mostly the single-bounce mechanism.
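A toy sketch of the ADI-optimization idea: for each pixel, search over a projection of the two polarization channels to minimize ADI = σ_amplitude / μ_amplitude across the time series, which enlarges the candidate pool relative to single-pol selection. A brute-force grid search stands in for the paper's Simulated Annealing, and the random amplitude stacks stand in for real (complex) TerraSAR-X data; the 0.25 threshold is a typical-style assumption.

```python
# Per-pixel ADI minimization over HH/VV projections vs. a single-pol baseline.
import numpy as np

def optimized_adi(hh, vv, n_alpha=91):
    """Per-pixel minimum ADI over projections cos(a)*HH + sin(a)*VV."""
    best = np.full(hh.shape[1], np.inf)
    for a in np.linspace(0.0, np.pi / 2, n_alpha):
        proj = np.cos(a) * hh + np.sin(a) * vv
        best = np.minimum(best, proj.std(axis=0) / proj.mean(axis=0))
    return best

rng = np.random.default_rng(1)
hh = rng.rayleigh(1.0, size=(17, 1000))     # 17 acquisitions, toy amplitude pixels
vv = rng.rayleigh(1.0, size=(17, 1000))
vv[:, :300] += 4.0                          # stable scatterers seen mostly in VV

base = hh.std(axis=0) / hh.mean(axis=0)     # single-pol (HH) ADI
opt = optimized_adi(hh, vv)
print("PSCs with ADI < 0.25: single-pol", (base < 0.25).sum(),
      "-> optimized", (opt < 0.25).sum())
```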
NASA Technical Reports Server (NTRS)
Cosentino, Gary B.
2007-01-01
Several examples from the past decade of success stories involving the design and flight test of three true X-planes will be described: in particular, X-plane design techniques that relied heavily upon computational fluid dynamics (CFD). Three specific examples chosen from the author's personal experience are presented: the X-36 Tailless Fighter Agility Research Aircraft, the X-45A Unmanned Combat Air Vehicle, and, most recently, the X-48B Blended Wing Body Demonstrator Aircraft. An overview will be presented of the uses of CFD analysis, comparisons and contrasts with wind tunnel testing, and information derived from the CFD analysis that related directly to successful flight test. Some lessons learned on the proper application, and misapplication, of CFD are illustrated. Finally, some highlights of the flight-test results of the three example X-planes will be presented. This overview paper will discuss some of the author's experience with taking an aircraft shape from early concept and three-dimensional modeling through CFD analysis, wind tunnel testing, further refined CFD analysis, and, finally, flight. An overview of the key roles in which CFD plays well during this process, and some other roles in which it does not, are discussed. How wind tunnel testing complements, calibrates, and verifies CFD analysis is also covered. Lessons learned on where CFD results can be misleading are also given. Strengths and weaknesses of the various types of flow solvers, including panel methods, Euler, and Navier-Stokes techniques, are discussed. The paper concludes with the three specific examples, including some flight test video footage of the X-36, the X-45A, and the X-48B.
Laser induced breakdown spectroscopy (LIBS) as a rapid tool for material analysis
NASA Astrophysics Data System (ADS)
Hussain, T.; Gondal, M. A.
2013-06-01
Laser induced breakdown spectroscopy (LIBS) is a novel technique for elemental analysis based on laser-generated plasma. In this technique, laser pulses ablate the sample, vaporizing and ionizing it in a hot plasma that is then analyzed by a spectrometer; the elements are identified by their unique spectral signatures. A LIBS system was developed for elemental analysis of solid and liquid samples and applied for qualitative as well as quantitative measurement of the elemental concentrations present in iron slag and open-pit ore samples. The plasma was generated by focusing a pulsed Nd:YAG laser at 1064 nm on test samples to study the capabilities of LIBS as a rapid tool for material analysis. The concentrations of various elements of environmental significance, such as cadmium, calcium, magnesium, chromium, manganese, titanium, barium, phosphorus, copper, iron and zinc, in these samples were determined. Optimal experimental conditions for improving the sensitivity of the developed LIBS system were evaluated through a parametric dependence study. The LIBS results were compared with results obtained using a standard analytical technique, inductively coupled plasma emission spectroscopy (ICP). Limits of detection (LOD) of our LIBS system were also estimated for the above-mentioned elements. This study demonstrates that LIBS could be highly appropriate for rapid online analysis of iron slag and open-pit waste.
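A hedged sketch of one common way such LOD figures are computed, LOD = 3·σ_blank / slope of the calibration curve; the concentrations and signals below are placeholders, not the paper's measurements.

```python
# 3-sigma limit-of-detection estimate from a linear calibration.
import numpy as np

conc = np.array([0.0, 10.0, 20.0, 40.0, 80.0])     # ppm standards (hypothetical)
signal = np.array([0.05, 0.52, 1.01, 2.03, 4.02])  # net line intensities
blank_replicates = np.array([0.04, 0.06, 0.05, 0.07, 0.05])

slope = np.polyfit(conc, signal, 1)[0]
lod = 3 * blank_replicates.std(ddof=1) / slope
print(f"LOD ≈ {lod:.2f} ppm")
```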
Diaby, Vakaramoko; Adunlin, Georges; Montero, Alberto J
2014-02-01
Survival modeling techniques are increasingly being used as part of decision modeling for health economic evaluations. As many models are available, it is imperative for interested readers to know about the steps in selecting and using the most suitable ones. The objective of this paper is to propose a tutorial for the application of appropriate survival modeling techniques to estimate transition probabilities, for use in model-based economic evaluations, in the absence of individual patient data (IPD). An illustration of the use of the tutorial is provided based on the final progression-free survival (PFS) analysis of the BOLERO-2 trial in metastatic breast cancer (mBC). An algorithm was adopted from Guyot and colleagues, and was then run in the statistical package R to reconstruct IPD, based on the final PFS analysis of the BOLERO-2 trial. It should be emphasized that the reconstructed IPD represent an approximation of the original data. Afterwards, we fitted parametric models to the reconstructed IPD in the statistical package Stata. Both statistical and graphical tests were conducted to verify the relative and absolute validity of the findings. Finally, the equations for transition probabilities were derived using the general equation for transition probabilities used in model-based economic evaluations, and the parameters were estimated from fitted distributions. The results of the application of the tutorial suggest that the log-logistic model best fits the reconstructed data from the latest published Kaplan-Meier (KM) curves of the BOLERO-2 trial. Results from the regression analyses were confirmed graphically. An equation for transition probabilities was obtained for each arm of the BOLERO-2 trial. In this paper, a tutorial was proposed and used to estimate the transition probabilities for model-based economic evaluation, based on the results of the final PFS analysis of the BOLERO-2 trial in mBC. The results of our study can serve as a basis for any model (Markov) that needs the parameterization of transition probabilities, and only has summary KM plots available.
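A small sketch of the final step the tutorial describes: once a parametric model is fitted to the reconstructed IPD (the tutorial found log-logistic best for the BOLERO-2 data), per-cycle transition probabilities follow from tp(t) = 1 − S(t)/S(t − u), where u is the model cycle length. The parameter values below are assumed for illustration, not the tutorial's estimates.

```python
# Time-dependent transition probabilities from a fitted log-logistic survival.
import numpy as np

alpha, beta = 9.5, 1.8         # hypothetical log-logistic scale/shape (months)
cycle = 1.0                    # Markov cycle length u, in months

def surv(t):                   # log-logistic survival function S(t)
    return 1.0 / (1.0 + (t / alpha) ** beta)

t = np.arange(cycle, 36.0 + cycle, cycle)
tp = 1.0 - surv(t) / surv(t - cycle)   # probability of progressing in each cycle
print(tp[:5].round(4))
```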
A general numerical model for wave rotor analysis
NASA Technical Reports Server (NTRS)
Paxson, Daniel W.
1992-01-01
Wave rotors represent one of the promising technologies for achieving very high core temperatures and pressures in future gas turbine engines. Their operation depends upon unsteady gas dynamics and as such, their analysis is quite difficult. This report describes a numerical model which has been developed to perform such an analysis. Following a brief introduction, a summary of the wave rotor concept is given. The governing equations are then presented, along with a summary of the assumptions used to obtain them. Next, the numerical integration technique is described. This is an explicit finite volume technique based on the method of Roe. The discussion then focuses on the implementation of appropriate boundary conditions. Following this, some results are presented which first compare the numerical approximation to the governing differential equations and then compare the overall model to an actual wave rotor experiment. Finally, some concluding remarks are presented concerning the limitations of the simplifying assumptions and areas where the model may be improved.
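To illustrate the explicit finite-volume machinery the report describes, here is a deliberately simplified 1-D Euler solver on a Sod-type shock tube standing in for one wave-rotor passage. A local Lax-Friedrichs (Rusanov) flux is used in place of the Roe flux of the actual model, and the boundary cells are simply held fixed; this is a sketch of the method class, not the report's code.

```python
# Explicit finite-volume update of the 1-D Euler equations, U = [rho, rho*u, E].
import numpy as np

g = 1.4                                          # ratio of specific heats

def flux(U):
    rho, m, E = U
    u = m / rho
    p = (g - 1.0) * (E - 0.5 * rho * u * u)
    return np.array([m, m * u + p, (E + p) * u])

def rusanov(UL, UR):                             # simplified stand-in for Roe flux
    def smax(U):
        rho, m, E = U
        u = m / rho
        p = (g - 1.0) * (E - 0.5 * rho * u * u)
        return abs(u) + np.sqrt(g * p / rho)
    s = max(smax(UL), smax(UR))
    return 0.5 * (flux(UL) + flux(UR)) - 0.5 * s * (UR - UL)

n, dx, dt = 200, 1.0 / 200, 0.0005
U = np.zeros((n, 3))
U[: n // 2] = [1.0, 0.0, 1.0 / (g - 1.0)]        # left state (rho, rho*u, E)
U[n // 2 :] = [0.125, 0.0, 0.1 / (g - 1.0)]      # right state

for _ in range(400):
    F = np.array([rusanov(U[i], U[i + 1]) for i in range(n - 1)])
    U[1:-1] -= dt / dx * (F[1:] - F[:-1])        # conservative explicit update
print("density range:", U[:, 0].min().round(3), U[:, 0].max().round(3))
```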
Alignment of an acoustic manipulation device with cepstral analysis of electronic impedance data.
Hughes, D A; Qiu, Y; Démoré, C; Weijer, C J; Cochran, S
2015-02-01
Acoustic particle manipulation is an emerging technology that uses ultrasonic standing waves to position objects with pressure gradients and acoustic radiation forces. To produce strong standing waves, the transducer and the reflector must be properly aligned so that they are parallel to each other. This can be difficult because the ultrasound waves cannot be visualized directly, and as higher frequencies are introduced the alignment requires greater accuracy. In this paper, we present a method for aligning acoustic resonators with cepstral analysis, a simple signal processing technique that requires only the electrical impedance measurement data of the resonator, which is usually recorded during the fabrication process of the device. We first introduce the mathematical basis of cepstral analysis and then demonstrate and validate it using a computer simulation of an acoustic resonator. Finally, the technique is demonstrated experimentally by creating many parallel linear traps for 10 μm fluorescent beads inside an acoustic resonator. Copyright © 2014 Elsevier B.V. All rights reserved.
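The core signal-processing step is the classic real cepstrum, c = IFFT(log|FFT(x)|): a well-aligned cavity behaves like a strong echo, and that echo shows up as a sharp cepstral peak at the round-trip lag. A minimal synthetic demonstration, with the lag and echo strength assumed rather than taken from the paper:

```python
# Real cepstrum of a signal containing a delayed, scaled copy of itself.
import numpy as np

rng = np.random.default_rng(2)
n, lag = 2048, 120                 # samples; cavity round-trip lag (assumed)
s = rng.normal(size=n)             # broadband base signal
x = s + 0.6 * np.roll(s, lag)      # reflection adds a delayed, scaled copy

spectrum = np.abs(np.fft.fft(x))
cepstrum = np.fft.ifft(np.log(spectrum + 1e-12)).real
q = np.argmax(cepstrum[20 : n // 2]) + 20   # skip low quefrencies
print("echo quefrency:", q)                 # ≈ 120 when the echo dominates
```

The height and sharpness of that peak is what an alignment procedure can maximize while adjusting the reflector.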
Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder
2009-12-01
To facilitate an in-depth process understanding and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate the effects of the design factors on manufacturability and final-product critical quality attributes (CQAs), and to establish a design space ensuring the desired CQAs. Two types of analyses were performed to extract maximal information: DOE effect and response-surface analysis, and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time) on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure the desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes and process parameters on manufacturability and final-product CQAs. The integrated multivariate approach exemplifies the application of quality-by-design (QbD) principles and tools to drug product and process development.
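A hedged sketch of the PLS half of such an analysis: relate the three design factors to a response and inspect fit and loadings. The factor ranges, coefficients and noise level are placeholders, not the study's batch data.

```python
# PLS regression of DOE responses on design factors.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
X = rng.uniform(size=(12, 3))            # water amount, wet-massing time, lubrication time
Y = X @ np.array([[0.8], [0.3], [-0.5]]) + 0.05 * rng.normal(size=(12, 1))

pls = PLSRegression(n_components=2).fit(X, Y)
print("R^2:", round(pls.score(X, Y), 3))
print("X loadings:\n", pls.x_loadings_.round(2))
```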
Rodríguez-Arias, Miquel Angel; Rodó, Xavier
2004-03-01
Here we describe a practical, step-by-step primer to scale-dependent correlation (SDC) analysis. The analysis of transitory processes is an important but often neglected topic in ecological studies, because only a few statistical techniques detect temporary features accurately enough. We introduce here the SDC analysis, a statistical and graphical method to study transitory processes at any temporal or spatial scale. SDC analysis, thanks to the combination of conventional procedures and simple, well-known statistical techniques, becomes an improved time-domain analogue of wavelet analysis. We use several simple synthetic series to describe the method, a more complex example, full of transitory features, to compare SDC and wavelet analysis, and finally we analyze some selected ecological series to illustrate the methodology. The SDC analysis of time series of copepod abundances in the North Sea indicates that ENSO is the main climatic driver of short-term changes in population dynamics. SDC also uncovers some long-term, unexpected features in the population. Similarly, the SDC analysis of Nicholson's blowflies data locates where the proposed models fail and provides new insights about the mechanism that drives the apparent vanishing of the population cycle during the second half of the series.
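A toy sketch of the basic SDC operation: correlate two series inside sliding windows of a chosen size s, so that transitory couplings show up as local pockets of high correlation. Real SDC assesses significance with randomization tests and scans lags as well; this stripped-down version, on synthetic series, only illustrates the windowing idea.

```python
# Windowed (scale-dependent) correlation between two series.
import numpy as np

def sdc(x, y, s, min_r=0.8):
    """Return (window start, r) for size-s windows with |r| >= min_r."""
    hits = []
    for i in range(len(x) - s + 1):
        r = np.corrcoef(x[i : i + s], y[i : i + s])[0, 1]
        if abs(r) >= min_r:
            hits.append((i, round(float(r), 2)))
    return hits

t = np.linspace(0, 20, 400)
x = np.sin(t)
noise = np.random.default_rng(4).normal(size=400) * 0.5
y = np.where((t > 8) & (t < 12), np.sin(t), noise)   # coupling only for t in (8, 12)
print(sdc(x, y, s=40)[:3])    # windows inside the coupled interval correlate strongly
```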
Criminal profiling as expert witness evidence: The implications of the profiler validity research.
Kocsis, Richard N; Palermo, George B
The use and development of the investigative tool colloquially known as criminal profiling has steadily increased over the past five decades throughout the world. Coupled with this growth has been a diversification in the suggested range of applications for this technique. Possibly the most notable of these has been the attempted transition of the technique from a tool intended to assist police investigations into a form of expert witness evidence admissible in legal proceedings. Whilst case law in various jurisdictions has considered with mutual disinclination the evidentiary admissibility of criminal profiling, a disjunction has evolved between these judicial examinations and the scientifically vetted research testing the accuracy (i.e., validity) of the technique. This article offers an analysis of the research directly testing the validity of the criminal profiling technique and the extant legal principles considering its evidentiary admissibility. This analysis reveals that research findings concerning the validity of criminal profiling are surprisingly compatible with the extant legal principles. The overall conclusion is that a discrete form of crime behavioural analysis is supported by the profiler validity research and could be regarded as potentially admissible expert witness evidence. Finally, a number of theoretical connections are also identified concerning the skills and qualifications of individuals who may feasibly provide such expert testimony. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Deng, Chengbin; Wu, Changshan
2013-12-01
Urban impervious surface information is essential for urban and environmental applications at the regional/national scales. As a popular image processing technique, spectral mixture analysis (SMA) has rarely been applied to coarse-resolution imagery due to the difficulty of deriving endmember spectra using traditional endmember selection methods, particularly within heterogeneous urban environments. To address this problem, we derived endmember signatures through a least squares solution (LSS) technique with known abundances of sample pixels, and integrated these endmember signatures into SMA for mapping large-scale impervious surface fraction. In addition, with the same sample set, we carried out objective comparative analyses among SMA (i.e. fully constrained and unconstrained SMA) and machine learning (i.e. Cubist regression tree and Random Forests) techniques. Analysis of results suggests three major conclusions. First, with the extrapolated endmember spectra from stratified random training samples, the SMA approaches performed relatively well, as indicated by small MAE values. Second, Random Forests yields more reliable results than Cubist regression tree, and its accuracy is improved with increased sample sizes. Finally, comparative analyses suggest a tentative guide for selecting an optimal approach for large-scale fractional imperviousness estimation: unconstrained SMA might be a favorable option with a small number of samples, while Random Forests might be preferred if a large number of samples are available.
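A minimal sketch of the fully constrained SMA step for one pixel: solve pixel = E·f with f ≥ 0 and Σf = 1, here by appending a weighted sum-to-one row to a nonnegative least-squares problem. The endmember matrix and fractions are placeholders, not the study's LSS-derived signatures.

```python
# Fully constrained linear unmixing of one coarse-resolution pixel.
import numpy as np
from scipy.optimize import nnls

E = np.array([[0.30, 0.10, 0.05],      # bands x endmembers:
              [0.35, 0.15, 0.04],      # impervious, vegetation, soil
              [0.40, 0.45, 0.06],      # (placeholder reflectances)
              [0.42, 0.50, 0.07]])
pixel = 0.6 * E[:, 0] + 0.3 * E[:, 1] + 0.1 * E[:, 2]

w = 10.0                               # weight enforcing the sum-to-one constraint
A = np.vstack([E, w * np.ones(E.shape[1])])
b = np.append(pixel, w)
fractions, _ = nnls(A, b)              # nonnegativity is handled by NNLS itself
print(fractions.round(3))              # ≈ [0.6, 0.3, 0.1]
```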
Ajmera, Puneeta
2017-10-09
Purpose: Organizations have to evaluate their internal and external environments in this highly competitive world. Strengths, weaknesses, opportunities and threats (SWOT) analysis is a very useful technique that analyzes the strengths, weaknesses, opportunities and threats of an organization for taking strategic decisions, and it also provides a foundation for the formulation of strategies. The drawback of SWOT analysis, however, is that it does not quantify the importance of the individual factors affecting the organization; the individual factors are described briefly, without weighting. For this reason, SWOT analysis can be integrated with a multiple attribute decision-making (MADM) technique such as the technique for order preference by similarity to ideal solution (TOPSIS) or the analytical hierarchy process to evaluate the best alternative among the available strategic alternatives. The paper aims to discuss these issues. Design/methodology/approach: In this study, SWOT analysis is integrated with the multicriteria decision-making technique TOPSIS to rank different strategies for Indian medical tourism in order of priority. Findings: The SO strategy (providing the best facilitation and care to medical tourists, at par with developed countries) is the best strategy; it matches the four elements of S, W, O and T of the SWOT matrix and 35 strategic indicators. Practical implications: This paper proposes a solution based on a combined SWOT analysis and TOPSIS approach to help organizations evaluate and select strategies. Originality/value: Creating a new technology or administering a new strategy always meets some degree of resistance from employees. To minimize resistance, the author has used TOPSIS, as it involves group thinking, requiring every manager of the organization to analyze and evaluate different alternatives and the average measure of each parameter in the final decision matrix.
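The TOPSIS ranking step itself is compact enough to sketch in full: normalize the decision matrix, weight it, measure each alternative's distance to the ideal and anti-ideal solutions, and rank by relative closeness. The matrix values and weights below are placeholders, not the paper's 35 strategic indicators.

```python
# Minimal TOPSIS for benefit criteria only.
import numpy as np

X = np.array([[7.0, 5.0, 8.0],        # strategies (rows) x criteria (columns)
              [6.0, 8.0, 6.0],
              [8.0, 6.0, 5.0]])
w = np.array([0.5, 0.3, 0.2])         # criterion weights (assumed)

R = X / np.linalg.norm(X, axis=0)     # vector normalization per criterion
V = R * w                             # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)   # relative closeness to the ideal
print("ranking (best first):", np.argsort(-closeness))
```

Cost criteria would simply swap max and min when forming the ideal and anti-ideal points.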
Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Chang Jae; Han, Seung; Yun, Jae Hee
2015-07-01
Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached at which protective action must begin. In keeping with nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during a design basis event, whether an anticipated operational occurrence or a postulated accident. Various studies on setpoint determination methodology for safety-related instrumentation have been performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both design basis events and beyond-design-basis events. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are hard to find. In the cases of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has the drawback that the completeness of the timing test is difficult to demonstrate; the analysis technique has the demerit of yielding extreme times that are not actually possible. Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter; it demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test; it demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers the instrument channel from the sensor to the final actuation device.
When the total channel is not tested in a single test, separate tests on groups of components or single components covering the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential-pressure transmitters, and the step-function test technique is applied to the signal processing equipment and the final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the methodology proposed in this paper plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
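An illustrative sketch of the bookkeeping behind the analysis half of this methodology: allocate a response time to each component on the critical signal path and check the sum against the analytical requirement. The component names mirror the paper's channel description, but every number is a placeholder.

```python
# Response-time budget along the critical signal path.
components_ms = {
    "pressure transmitter": 300,     # sensor (ramp-tested in practice)
    "signal processing": 150,        # step-function-tested equipment
    "trip logic": 50,
    "final actuation device": 200,
}
requirement_ms = 900                 # assumed analytical response time

total = sum(components_ms.values())
print(f"total {total} ms vs requirement {requirement_ms} ms, "
      f"margin {requirement_ms - total} ms")
assert total <= requirement_ms, "allocation exceeds the analytical limit"
```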
Lee, Jeong Wan
2008-01-01
This paper proposes a field calibration technique for aligning a wind direction sensor to true north. The proposed technique uses synchronized measurements of images captured by a camera and the output voltage of the wind direction sensor. The true wind direction was evaluated in the least-squares sense through image processing of the captured pictures of the sensor, and the evaluated true value was then compared with the measured output voltage of the sensor. This technique solves the discordance problem of the wind direction sensor that arises when installing a meteorological mast. Uncertainty analyses for the proposed technique are presented and the calibration accuracy is discussed. Finally, the proposed technique was applied to the real meteorological mast at the Daegwanryung test site, and statistical analysis of the experimental testing estimated the stable misalignment value and the uncertainty level. In a strict sense, it is confirmed that the error range of the misalignment from true north can be expected to decrease within the credibility level. PMID:27873957
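A small sketch of the least-squares comparison step: with image-derived directions and sensor readings in hand, the north offset is the (circular) mean of their differences, which handles wraparound at 0°/360°. The angle values are invented for illustration.

```python
# Circular least-squares estimate of a wind-vane north offset.
import numpy as np

theta_image = np.array([12.0, 95.0, 181.0, 270.0, 355.0])    # from photos (deg)
theta_sensor = np.array([7.5, 90.2, 176.8, 265.1, 350.9])    # from voltage map (deg)

d = np.deg2rad(theta_image - theta_sensor)
offset = np.rad2deg(np.arctan2(np.sin(d).mean(), np.cos(d).mean()))
print(f"misalignment ≈ {offset:.2f}° from true north")
```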
Nuclear risk analysis of the Ulysses mission
NASA Astrophysics Data System (ADS)
Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.
1991-01-01
The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source-term probability distributions provided by the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor is determined, and the functional relationship among all the factors is established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDF) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
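A minimal sketch of how a CCDF is built from Monte Carlo draws: sample the factor distributions, combine them through the assumed functional relationship, sort, and read off P(consequence ≥ x). The two lognormal factors and their parameters are placeholders, not the report's source terms.

```python
# Monte Carlo construction of a complementary cumulative distribution function.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
release_fraction = rng.lognormal(mean=-6.0, sigma=1.0, size=n)  # assumed factor
dose_factor = rng.lognormal(mean=0.0, sigma=0.5, size=n)        # assumed factor
consequences = release_fraction * dose_factor                   # combined effect

x = np.sort(consequences)
ccdf = 1.0 - np.arange(1, n + 1) / n      # P(C >= x[i]) at each sorted value
for level in (1e-3, 5e-3, 2e-2):
    print(f"P(C >= {level:.0e}) = {ccdf[np.searchsorted(x, level)]:.3f}")
```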
Post-coronagraphic tip-tilt sensing for vortex phase masks: The QACITS technique
NASA Astrophysics Data System (ADS)
Huby, E.; Baudoz, P.; Mawet, D.; Absil, O.
2015-12-01
Context. Small inner working angle coronagraphs, such as the vortex phase mask, are essential to exploit the full potential of ground-based telescopes in the context of exoplanet detection and characterization. However, the drawback of this attractive feature is a high sensitivity to pointing errors, which degrades the performance of the coronagraph. Aims: We propose a tip-tilt retrieval technique based on the analysis of the final coronagraphic image, hereafter called Quadrant Analysis of Coronagraphic Images for Tip-tilt Sensing (QACITS). Methods: Under the assumption of small phase aberrations, we show that the behavior of the vortex phase mask can be simply described from the entrance pupil to the Lyot stop plane with Zernike polynomials. This convenient formalism is used to establish the theoretical basis of the QACITS technique. We performed simulations to demonstrate the validity and limits of the technique, including the case of a centrally obstructed pupil. Results: The QACITS technique principle is validated with experimental results in the case of an unobstructed circular aperture, as well as with simulations in the presence of a central obstruction. The typical configuration of the Keck telescope (24% central obstruction) has been simulated with additional high-order aberrations. In these conditions, our simulations show that the QACITS technique is still adapted to centrally obstructed pupils and performs tip-tilt retrieval with a precision of 5 × 10⁻² λ/D when wavefront errors amount to λ/14 rms, and 10⁻² λ/D for λ/70 rms errors (with λ the wavelength and D the pupil diameter). Conclusions: We have developed and demonstrated a tip-tilt sensing technique for vortex coronagraphs. The implementation of the QACITS technique is based on the analysis of the scientific image and does not require any modification of the original setup. Current facilities equipped with a vortex phase mask can thus directly benefit from this technique to improve the contrast performance close to the axis.
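A toy sketch of the measurement at the heart of a quadrant-analysis estimator: flux differences between image halves, normalized by total flux, give signals that grow monotonically with small tip-tilt offsets. The Gaussian blob standing in for the coronagraphic PSF and the injected shift are illustrative assumptions; the real QACITS calibration maps these signals to tip-tilt via the vortex model.

```python
# Differential-intensity tip-tilt signal from image halves.
import numpy as np

def quadrant_signal(img):
    """Return (dx, dy) half-image flux differences, normalized by total flux."""
    ny, nx = img.shape
    total = img.sum()
    dx = (img[:, nx // 2 :].sum() - img[:, : nx // 2].sum()) / total
    dy = (img[ny // 2 :, :].sum() - img[: ny // 2, :].sum()) / total
    return dx, dy

yy, xx = np.mgrid[-32:32, -32:32]
img = np.exp(-((xx - 1.5) ** 2 + (yy + 0.8) ** 2) / 50.0)   # shifted toy PSF
print(quadrant_signal(img))   # signs and sizes track the injected (+1.5, -0.8) shift
```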
Convergence behavior of delayed discrete cellular neural network without periodic coefficients.
Wang, Jinling; Jiang, Haijun; Hu, Cheng; Ma, Tianlong
2014-05-01
In this paper, we study the convergence behavior of delayed discrete cellular neural networks without periodic coefficients. By applying mathematical analysis techniques and the properties of inequalities, sufficient conditions are derived to ensure that all solutions of such networks converge to a periodic function. Finally, some examples showing the effectiveness of the provided criterion are given. Copyright © 2014 Elsevier Ltd. All rights reserved.
Flow Field Analysis of Fully Coupled Computations of a Flexible Wing undergoing Stall Flutter
2016-01-01
…unsteady aerodynamic loads due to structural displacements. In terms of actuation, most, if not all, active flutter suppression techniques have used conventional trailing-edge flap actuators with a bandwidth of 10 Hz. Interestingly, the frequencies associated… the influence of the flow features on the aeroelastic instability is quantified. Finally, the influence of actuation through a blowing port at 75% span is…
2015-03-10
AFRL-OSR-VA-TR-2015-0080: Biosensing and Bioprocessing Devices in Living Cells. Domitilla Del Vecchio, Massachusetts Institute of Technology (grant FA9550-12-1-0129). …research is to develop quantitative techniques for the de novo design and fabrication of biosensing devices in living cells. Such devices will be entirely…
NASA Astrophysics Data System (ADS)
Ctvrtnickova, T.; Mateo, M. P.; Yañez, A.; Nicolas, G.
2011-04-01
The presented work reports results of Laser-Induced Breakdown Spectroscopy (LIBS) and Thermo-Mechanical Analysis (TMA) of coals and coal blends used in coal-fired power plants all over Spain. Several coal specimens, their blends, and the corresponding laboratory ash were analyzed by these techniques, and the results were compared to standard laboratory methods. The slagging indices, which predict the tendency of coal ash to deposit on boiler walls, were determined by means of standard chemical analysis, LIBS and TMA. The optimal coal to be blended with the problematic national lignite was suggested in order to diminish slagging problems. The techniques were evaluated on the basis of precision, acquisition time, and the extension and quality of the information they can provide. Finally, the applicability of LIBS and TMA to the successful calculation of slagging indices is discussed, and their substitution for time-consuming and instrumentally demanding standard methods is considered.
Lorenzo-Seva, Urbano; Ferrando, Pere J
2011-03-01
We provide an SPSS program that implements currently recommended techniques and recent developments for selecting variables in multiple linear regression analysis via the relative importance of predictors. The approach consists of: (1) optimally splitting the data for cross-validation, (2) selecting the final set of predictors to be retained in the regression equation, and (3) assessing the behavior of the chosen model using standard indices and procedures. The SPSS syntax, a short manual, and data files related to this article are available as supplemental materials from brm.psychonomic-journals.org/content/supplemental.
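A rough Python analogue of that three-step workflow (the original is SPSS syntax): split the data, rank predictors by a relative-importance proxy, keep the top ones, and assess the resulting model on the holdout. Standardized coefficients are used here as a crude stand-in for the more refined relative-importance metrics the program implements; all data are synthetic.

```python
# Split -> rank predictors -> refit on the retained set -> assess on holdout.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
X = rng.normal(size=(200, 6))
y = X @ np.array([2.0, 1.0, 0.0, 0.0, 0.5, 0.0]) + rng.normal(size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
Z = (X_tr - X_tr.mean(0)) / X_tr.std(0)          # standardize for ranking only
beta = LinearRegression().fit(Z, y_tr).coef_
keep = np.argsort(-np.abs(beta))[:3]             # retain the top predictors
model = LinearRegression().fit(X_tr[:, keep], y_tr)
print("kept:", keep, "holdout R^2:", round(model.score(X_te[:, keep], y_te), 3))
```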
Cortijo, Sandra; Charoensawan, Varodom; Roudier, François; Wigge, Philip A
2018-01-01
Chromatin immunoprecipitation combined with next-generation sequencing (ChIP-seq) is a powerful technique to investigate in vivo transcription factor (TF) binding to DNA, as well as chromatin marks. Here we provide a detailed protocol for all the key steps needed to perform ChIP-seq in Arabidopsis thaliana roots, which also works on other A. thaliana tissues and in most non-ligneous plants. We detail all steps from material collection, fixation, chromatin preparation, immunoprecipitation, and library preparation to the final computational analysis based on a combination of publicly available tools.
A relativistic analysis of clock synchronization
NASA Technical Reports Server (NTRS)
Thomas, J. B.
1974-01-01
The relativistic conversion between coordinate time and atomic time is reformulated to allow simpler time calculations relating analysis in solar-system barycentric coordinates (using coordinate time) with earth-fixed observations (measuring earth-bound proper time, or atomic time). After an interpretation of terms, this simplified formulation, which has a rate accuracy of about 10⁻¹⁵, is used to explain the conventions required in the synchronization of a worldwide clock network and to analyze two synchronization techniques: portable clocks and radio interferometry. Finally, pertinent experimental tests of relativity are briefly discussed in terms of the reformulated time conversion.
System data communication structures for active-control transport aircraft, volume 1
NASA Technical Reports Server (NTRS)
Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.
1981-01-01
Candidate data communication techniques are identified, including dedicated links, local buses, broadcast buses, multiplex buses, and mesh networks. The design methodology for mesh networks is then discussed, including network topology and node architecture. Several concepts of power distribution are reviewed, including current limiting and mesh networks for power. The technology issues of packaging, transmission media, and lightning are addressed, and, finally, the analysis tools developed to aid in the communication design process are described. There are special tools to analyze the reliability and connectivity of networks and more general reliability analysis tools for all types of systems.
Antennas in matter: Fundamentals, theory, and applications
NASA Technical Reports Server (NTRS)
King, R. W. P.; Smith, G. S.; Owens, M.; Wu, T. T.
1981-01-01
The volume provides an introduction to antennas and probes embedded within or near material bodies such as the earth, the ocean, or a living organism. After a fundamental analysis of insulated and bare antennas, an advanced treatment of antennas in various media is presented, including a detailed study of the electromagnetic equations in homogeneous isotropic media, the complete theory of the bare dipole in a general medium, and a rigorous analysis of the insulated antenna as well as bare and insulated loop antennas. Finally, experimental models and measuring techniques related to antennas and probes in a general dissipative or dielectric medium are examined.
Respiratory protective device design using control system techniques
NASA Technical Reports Server (NTRS)
Burgess, W. A.; Yankovich, D.
1972-01-01
The feasibility of a control system analysis approach to provide a design base for respiratory protective devices is considered. A system design approach requires that all functions and components of the system be mathematically identified in a model of the RPD. The mathematical notations describe the operation of the components as closely as possible. The individual component mathematical descriptions are then combined to describe the complete RPD. Finally, analysis of the mathematical notation by control system theory is used to derive compensating component values that force the system to operate in a stable and predictable manner.
Risk Management of NASA Projects
NASA Technical Reports Server (NTRS)
Sarper, Hueseyin
1997-01-01
Various NASA Langley Research Center and other-center projects were examined in an attempt to obtain historical data comparing the pre-phase A study with the final outcome of each project. This attempt, however, was abandoned once it became clear that very little documentation was available. Next, an extensive literature search was conducted on the role of risk and reliability concepts in project management. Probabilistic risk assessment (PRA) techniques are being used with increasing regularity both in and outside of NASA. The value and usage of PRA techniques were reviewed for large projects. It was found that both the civilian and military branches of the space industry have traditionally refrained from using PRA, which was developed and expanded by the nuclear industry. Although much has changed with the end of the cold war and the Challenger disaster, the ingrained anti-PRA culture has proved hard to change. Examples of skepticism toward risk management and assessment techniques were found both in the literature and in conversations with some technical staff. Program and project managers need to be convinced that the applicability and use of risk management and risk assessment techniques is much broader than the traditional safety-related areas of application. The time has come to begin applying these techniques uniformly. A risk-based approach can maximize the 'return on investment' that the public demands. Also, it would be very useful if all project documents of NASA Langley Research Center, from pre-phase A through the final report, were carefully stored in a central repository, preferably in electronic format.
NASA Astrophysics Data System (ADS)
Niccolini, Gianni; Manuello, Amedeo; Marchis, Elena; Carpinteri, Alberto
2017-07-01
The stability of an arch as a structural element in the thermal bath of King Charles Albert (Carlo Alberto) in the Royal Castle of Racconigi (on the UNESCO World Heritage List since 1997) was assessed by the acoustic emission (AE) monitoring technique with application of classical inversion methods to recorded AE data. First, damage source location by means of triangulation techniques and signal frequency analysis were carried out. Then, the recently introduced method of natural-time analysis was preliminarily applied to the AE time series in order to reveal a possible entrance point to a critical state of the monitored structural element. Finally, possible influence of the local seismic and microseismic activity on the stability of the monitored structure was investigated. The criterion for selecting relevant earthquakes was based on the estimation of the size of earthquake preparation zones. The presented results suggest the use of the AE technique as a tool for detecting both ongoing structural damage processes and microseismic activity during preparation stages of seismic events.
Quantifying short-lived events in multistate ionic current measurements.
Balijepalli, Arvind; Ettedgui, Jessica; Cornio, Andrew T; Robertson, Joseph W F; Cheung, Kin P; Kasianowicz, John J; Vaz, Canute
2014-02-25
We developed a generalized technique to characterize polymer-nanopore interactions via single channel ionic current measurements. Physical interactions between analytes, such as DNA, proteins, or synthetic polymers, and a nanopore cause multiple discrete states in the current. We modeled the transitions of the current to individual states with an equivalent electrical circuit, which allowed us to describe the system response. This enabled the estimation of short-lived states that are presently not characterized by existing analysis techniques. Our approach considerably improves the range and resolution of single-molecule characterization with nanopores. For example, we characterized the residence times of synthetic polymers that are three times shorter than those estimated with existing algorithms. Because the molecule's residence time follows an exponential distribution, we recover nearly 20-fold more events per unit time that can be used for analysis. Furthermore, the measurement range was extended from 11 monomers to as few as 8. Finally, we applied this technique to recover a known sequence of single-stranded DNA from previously published ion channel recordings, identifying discrete current states with subpicoampere resolution.
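A small sketch of the residence-time statistics mentioned above: dwell times in a given current state follow an exponential distribution, so the characteristic residence time is recovered by maximum likelihood (which for the exponential reduces to the sample mean); recovering shorter events simply extends the usable sample toward smaller times. The time scale below is synthetic.

```python
# MLE fit of exponentially distributed dwell times.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
dwell_us = rng.exponential(scale=35.0, size=500)       # true tau = 35 µs (assumed)

loc, scale = stats.expon.fit(dwell_us, floc=0)          # MLE with location pinned at 0
print(f"estimated tau ≈ {scale:.1f} µs")
```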
Extending the knowledge in histochemistry and cell biology.
Heupel, Wolfgang-Moritz; Drenckhahn, Detlev
2010-01-01
Central to modern Histochemistry and Cell Biology stands the need for visualization of cellular and molecular processes. In the past several years, a variety of techniques has emerged that bridges traditional light microscopy, fluorescence microscopy and electron microscopy with powerful software-based post-processing and computer modeling. Researchers now have various tools available to investigate problems of interest from a bird's-eye down to a worm's-eye view, focusing on tissues, cells, proteins or, finally, single molecules. Applications of new approaches in combination with well-established traditional techniques of mRNA, DNA or protein analysis have led to enlightening and prudent studies that have paved the way toward a better understanding of not only physiological but also pathological processes in the field of cell biology. This review is intended to summarize articles representing the progress made in "histo-biochemical" techniques and their manifold applications.
Diagnostics and Active Control of Aircraft Interior Noise
NASA Technical Reports Server (NTRS)
Fuller, C. R.
1998-01-01
This project deals with developing advanced methods for investigating and controlling interior noise in aircraft. The work concentrates on developing and applying the techniques of Near Field Acoustic Holography (NAH) and Principal Component Analysis (PCA) to the aircraft interior noise dynamic problem. This involves investigating the current state of the art, developing new techniques and then applying them to the particular problem being studied. The knowledge gained under the first part of the project was then used to develop and apply new, advanced noise control techniques for reducing interior noise. A new fully active control approach based on the PCA was developed and implemented on a test cylinder. Finally an active-passive approach based on tunable vibration absorbers was to be developed and analytically applied to a range of test structures from simple plates to aircraft fuselages.
Verification and extension of the MBL technique for photo resist pattern shape measurement
NASA Astrophysics Data System (ADS)
Isawa, Miki; Tanaka, Maki; Kazumi, Hideyuki; Shishido, Chie; Hamamatsu, Akira; Hasegawa, Norio; De Bisschop, Peter; Laidler, David; Leray, Philippe; Cheng, Shaunee
2011-03-01
In order to achieve pattern shape measurement with CD-SEM, the Model Based Library (MBL) technique is under development. In this study, several libraries, consisting of double-trapezoid models placed in an optimal layout, were used to measure various layout patterns. To verify the accuracy of MBL photoresist pattern shape measurement, CD-AFM measurements were carried out as a reference metrology. The two sets of results were compared, and we confirmed a linear correlation between them. Then, to expand the application field of the MBL technique, it was applied to end-of-line (EOL) shape measurement to demonstrate its capability. Finally, we confirmed the possibility that MBL could be applied to more local-area shape measurement, such as hot-spot analysis.
Carriles, Ramón; Schafer, Dawn N.; Sheetz, Kraig E.; Field, Jeffrey J.; Cisek, Richard; Barzda, Virginijus; Sylvester, Anne W.; Squier, Jeffrey A.
2009-01-01
We review the current state of multiphoton microscopy. In particular, the requirements and limitations associated with high-speed multiphoton imaging are considered. A description of the different scanning technologies such as line scan, multifoci approaches, multidepth microscopy, and novel detection techniques is given. The main nonlinear optical contrast mechanisms employed in microscopy are reviewed, namely, multiphoton excitation fluorescence, second harmonic generation, and third harmonic generation. Techniques for optimizing these nonlinear mechanisms through a careful measurement of the spatial and temporal characteristics of the focal volume are discussed, and a brief summary of photobleaching effects is provided. Finally, we consider three new applications of multiphoton microscopy: nonlinear imaging in microfluidics as applied to chemical analysis and the use of two-photon absorption and self-phase modulation as contrast mechanisms applied to imaging problems in the medical sciences. PMID:19725639
Exploring the Spatiotemporal Organization of Membrane Proteins in Living Plant Cells.
Wang, Li; Xue, Yiqun; Xing, Jingjing; Song, Kai; Lin, Jinxing
2018-04-29
Plasma membrane proteins have important roles in transport and signal transduction. Deciphering the spatiotemporal organization of these proteins provides crucial information for elucidating the links between the behaviors of different molecules. However, monitoring membrane proteins without disrupting their membrane environment remains difficult. Over the past decade, many studies have developed single-molecule techniques, opening avenues for probing the stoichiometry and interactions of membrane proteins in their native environment by providing nanometer-scale spatial information and nanosecond-scale temporal information. In this review, we assess recent progress in the development of labeling and imaging technology for membrane protein analysis. We focus in particular on several single-molecule techniques for quantifying the dynamics and assembly of membrane proteins. Finally, we provide examples of how these new techniques are advancing our understanding of the complex biological functions of membrane proteins.
Application of a system modification technique to dynamic tuning of a spinning rotor blade
NASA Technical Reports Server (NTRS)
Spain, C. V.
1987-01-01
An important consideration in the development of modern helicopters is the vibratory response of the main rotor blade. One way to minimize vibration levels is to ensure that natural frequencies of the spinning main rotor blade are well removed from integer multiples of the rotor speed. A technique for dynamically tuning a finite-element model of a rotor blade to accomplish that end is demonstrated. A brief overview is given of the general purpose finite element system known as Engineering Analysis Language (EAL) which was used in this work. A description of the EAL System Modification (SM) processor is then given along with an explanation of special algorithms developed to be used in conjunction with SM. Finally, this technique is demonstrated by dynamically tuning a model of an advanced composite rotor blade.
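A simple sketch of the tuning criterion behind this work: express each natural frequency of the spinning blade in per-rev units and check its distance to the nearest integer multiple of the rotor speed. The frequencies, rotor speed, and margin are placeholders, not output of the EAL model.

```python
# Check blade natural frequencies for separation from n/rev crossings.
import numpy as np

rotor_hz = 4.3                                   # assumed rotor speed
natural_hz = np.array([5.1, 12.4, 21.9, 30.2])   # FE natural frequencies (spinning)

per_rev = natural_hz / rotor_hz
sep = np.abs(per_rev - np.round(per_rev))        # distance to nearest n/rev
for f, p, s in zip(natural_hz, per_rev, sep):
    flag = "RETUNE" if s < 0.2 else "ok"         # 0.2/rev margin (assumed)
    print(f"{f:5.1f} Hz = {p:5.2f}/rev, margin {s:4.2f} -> {flag}")
```

In the paper's workflow, a "RETUNE" flag would drive the system-modification step that adjusts model stiffness or mass to shift the offending frequency.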
Prediction of aircraft handling qualities using analytical models of the human pilot
NASA Technical Reports Server (NTRS)
Hess, R. A.
1982-01-01
The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot-induced oscillations (PIO) is formulated. Finally, a model-based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.
Techniques for shuttle trajectory optimization
NASA Technical Reports Server (NTRS)
Edge, E. R.; Shieh, C. J.; Powers, W. F.
1973-01-01
The application of recently developed function-space Davidon-type techniques to the shuttle ascent trajectory optimization problem is discussed, along with an investigation of the recently developed PRAXIS algorithm for parameter optimization. At the outset of this analysis, the major anticipated deficiency of the function-space algorithms was their potential storage requirements. Since most previous analyses of the methods involved problems of relatively low dimension, no storage problems had been encountered; in shuttle trajectory optimization, however, storage is a problem, and it was handled efficiently. Topics discussed include: the shuttle ascent model and the development of the particular optimization equations; the function-space algorithms; the operation of the algorithm and typical simulations; variable final-time problem considerations; and a modification of Powell's algorithm.
An analytical approach for predicting pilot induced oscillations
NASA Technical Reports Server (NTRS)
Hess, R. A.
1981-01-01
The optimal control model (OCM) of the human pilot is applied to the study of aircraft handling qualities. Attention is focused primarily on longitudinal tasks. The modeling technique differs from previous applications of the OCM in that considerable effort is expended in simplifying the pilot/vehicle analysis. After briefly reviewing the OCM, a technique for modeling the pilot controlling higher order systems is introduced. Following this, a simple criterion for determining the susceptibility of an aircraft to pilot-induced oscillations (PIO) is formulated. Finally, a model-based metric for pilot rating prediction is discussed. The resulting modeling procedure provides a relatively simple, yet unified approach to the study of a variety of handling qualities problems.
Cost-Effectiveness Research in Neurosurgery: We Can and We Must.
Stein, Sherman C
2018-01-05
Rapid advancement of medical and surgical therapies, coupled with the recent preoccupation with limiting healthcare costs, makes a collision of the 2 objectives imminent. This article explains the value of cost-effectiveness analysis (CEA) in reconciling the 2 competing goals, and provides a brief introduction to evidence-based CEA techniques. The historical role of CEA in determining whether new neurosurgical strategies provide value for cost is summarized briefly, as are the limitations of the technique. Finally, the unique ability of the neurosurgical community to provide input to the CEA process is emphasized, as are the potential risks of leaving these important decisions in the hands of others. Copyright © 2018 by the Congress of Neurological Surgeons.
Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors
Dutton, Neale A. W.; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K.
2016-01-01
SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed. PMID:27447643
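A hedged sketch of the peak separation and width (PSW) idea: fit two adjacent single-photon peaks in a photon counting histogram and report read noise as peak width divided by peak separation, in electron units. The histogram below is simulated (Poisson photon counts plus Gaussian read noise), not sensor data.

```python
# Read noise from a photon counting histogram via a two-Gaussian fit.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(8)
sep_dn, sigma_dn = 10.0, 2.0               # DN per electron; read noise in DN
samples = rng.poisson(1.0, 50_000) * sep_dn + rng.normal(0, sigma_dn, 50_000)
hist, edges = np.histogram(samples, bins=200)
centers = 0.5 * (edges[:-1] + edges[1:])
mask = centers < 15                        # fit only the first two photon peaks

def two_peaks(x, a0, a1, mu, d, s):
    return (a0 * np.exp(-0.5 * ((x - mu) / s) ** 2)
            + a1 * np.exp(-0.5 * ((x - mu - d) / s) ** 2))

p, _ = curve_fit(two_peaks, centers[mask], hist[mask], p0=[1e3, 1e3, 0.0, 8.0, 1.5])
print(f"read noise ≈ {abs(p[4]) / p[3]:.2f} e-")   # width / separation
```

With the parameters above the estimate should land near the simulated 0.2 e-, i.e. comfortably in the DSERN regime the abstract describes.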
Neurophotonics: non-invasive optical techniques for monitoring brain functions
Torricelli, Alessandro; Contini, Davide; Mora, Alberto Dalla; Pifferi, Antonio; Re, Rebecca; Zucchelli, Lucia; Caffini, Matteo; Farina, Andrea; Spinelli, Lorenzo
2014-01-01
Summary: The aim of this review is to present the state of the art of neurophotonics, a recently founded discipline lying at the interface between optics and neuroscience. While neurophotonics also includes invasive techniques for animal studies, in this review we focus only on the non-invasive methods that use near infrared light to probe functional activity in the brain, namely the fast optical signal, diffuse correlation spectroscopy, and functional near infrared spectroscopy methods. We also present an overview of the physical principles of light propagation in biological tissues, and of the main physiological sources of signal. Finally, we discuss the open issues in models, instrumentation, data analysis and clinical approaches. PMID:25764252
The Japan Society for Innovative Cuisine: Exploring New Visions of Japanese Cuisine.
Yamazaki, Hanae; Fushiki, Tohru
2015-01-01
Kyoto cuisine has a long history and its traditions have been practiced for hundreds of years. In Kyoto, a group of scientists and renowned chefs strives to better understand traditional Kyoto cuisine in order to foster culinary innovation within traditional Kyoto cuisine. We launched a research project in April 2009 using a specially equipped "laboratory-kitchen" located in Kyoto University. Chefs chose a variety of topics related to basic concepts and techniques for cooking. We conducted culinary experimentation, thorough analysis, and diligent discussion on each topic for approximately 6 mo. In the symposium, chefs will present the results of their experiments, discussing their techniques and bringing samples of final products.
Laufer, Shlomi; D'Angelo, Anne-Lise D; Kwan, Calvin; Ray, Rebbeca D; Yudkowsky, Rachel; Boulet, John R; McGaghie, William C; Pugh, Carla M
2017-12-01
Develop new performance evaluation standards for the clinical breast examination (CBE). There are several technical aspects of a proper CBE. Our recent work discovered a significant, linear relationship between palpation force and CBE accuracy. This article investigates the relationship between other technical aspects of the CBE and accuracy. This performance assessment study involved data collection from physicians (n = 553) attending 3 different clinical meetings between 2013 and 2014: American Society of Breast Surgeons, American Academy of Family Physicians, and American College of Obstetricians and Gynecologists. Four previously validated, sensor-enabled breast models were used for clinical skills assessment. Models A and B had solitary, superficial, 2 cm and 1 cm soft masses, respectively. Models C and D had solitary, deep, 2 cm hard and moderately firm masses, respectively. Finger movements (search technique) from 1137 CBE video recordings were independently classified by 2 observers. Final classifications were compared with CBE accuracy. Accuracy rates were model A = 99.6%, model B = 89.7%, model C = 75%, and model D = 60%. Final classification categories for search technique included rubbing movement, vertical movement, piano fingers, and other. Interrater reliability was substantial (k = 0.79). Rubbing movement was approximately 4 times more likely to yield an accurate assessment (odds ratio 3.81, P < 0.001) compared with vertical movement and piano fingers. Piano fingers had the highest failure rate (36.5%). Regression analysis of search pattern, search technique, palpation force, examination time, and 6 demographic variables revealed that search technique independently and significantly affected CBE accuracy (P < 0.001). Our results support measurement and classification of CBE techniques and provide the foundation for a new paradigm in teaching and assessing hands-on clinical skills. The newly described piano fingers palpation technique was noted to have unusually high failure rates. Medical educators should be aware of the potential differences in effectiveness for various CBE techniques.
Michigan Physicians' Conference on Elder Abuse. Final Report.
ERIC Educational Resources Information Center
Sengstock, Mary C.; O'Brien, James G.
The final report describes the Michigan Physicians' Conference on Elder Abuse project. The project conference had four major content areas, including: a general introduction to the problem of elder abuse; clinical symptoms of abuse; legal issues; and referral and case management techniques. Training techniques included lectures, group discussion,…
Guarnieri, Adriano; Moreno-Montañés, Javier; Sabater, Alfonso L; Gosende-Chico, Inmaculada; Bonet-Farriol, Elvira
2013-11-01
To analyze the changes in incision size after implantation of a toric intraocular lens (IOL) using 2 methods. Department of Ophthalmology, Clínica Universidad de Navarra, Pamplona, Spain. Prospective case series. Coaxial phacoemulsification and IOL implantation through a 2.2 mm clear corneal incision using a cartridge injector were performed. Wound-assisted or cartridge-insertion techniques were used to implant the IOLs. The results were analyzed according to IOL spherical and cylindrical powers. Corneal hysteresis (CH) and the corneal resistance factor (CRF) were measured and evaluated based on the changes in incision size. Incision size increased in 30 (41.7%) of 72 eyes in the wound-assisted group and 71 (98.6%) of 72 eyes in the cartridge-insertion group. The mean incision size after IOL implantation was 2.27 ± 0.06 (SD) mm and 2.37 ± 0.05 mm, respectively (P<.01). The final incision size and IOL spherical power were correlated significantly in both the wound-assisted technique group (P=.02) and the cartridge-insertion technique group (P=.03); IOL toricity was not (P=.19 and P=.28, respectively). The CH and CRF values were not correlated with the final incision size. The final incision size and the changes in incision size after IOL implantation were greater with the cartridge-insertion technique than with the wound-assisted technique. The increase was related to IOL spherical power in both groups but not to IOL toricity. Corneal biomechanical properties were not correlated with the final incision size. Copyright © 2013 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Attallah, Omneya; Karthikesalingam, Alan; Holt, Peter J E; Thompson, Matthew M; Sayers, Rob; Bown, Matthew J; Choke, Eddie C; Ma, Xianghong
2017-08-03
Feature selection (FS) is essential in the medical area because it reduces the effort and time physicians need to measure unnecessary features. Choosing useful variables is a difficult task in the presence of censoring, the characteristic unique to survival analysis. Most survival FS methods depend on Cox's proportional hazard model; machine learning techniques (MLT) are preferred but not commonly used because of censoring. Techniques previously proposed to adapt MLT for FS with survival data cannot cope with high levels of censoring. The authors' previous publications proposed a technique to deal with highly censored data and used existing FS techniques to reduce dataset dimension. In this paper, a new FS technique is proposed and combined with feature transformation and the proposed uncensoring approaches to select a reduced set of features and produce a stable predictive model. Specifically, a FS technique based on an artificial neural network (ANN) MLT is proposed to deal with highly censored Endovascular Aortic Repair (EVAR) survival data. EVAR datasets, containing almost 91% censored patients, were collected from two vascular centers between 2004 and 2010 in order to produce a final stable model. The proposed approach uses a wrapper FS method with an ANN to select a reduced subset of features that predicts the risk of EVAR re-intervention after 5 years for patients from two different centers located in the United Kingdom, allowing it to be potentially applied to cross-center predictions. The proposed model is compared with two popular FS techniques, the Akaike and Bayesian information criteria (AIC, BIC), used with Cox's model. The final model outperforms the other methods in distinguishing the high- and low-risk groups: its concordance index and estimated AUC are better than those of Cox's model based on the AIC, BIC, Lasso, and SCAD approaches. These models have p-values lower than 0.05, meaning that patients in different risk groups can be separated significantly and those who would need re-intervention can be correctly predicted. The proposed approach will save the time and effort physicians spend collecting unnecessary variables. The final reduced model was able to predict the long-term risk of aortic complications after EVAR. This predictive model can help clinicians decide patients' future observation plans.
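The paper's exact wrapper is not reproduced in the abstract, so the sketch below only illustrates the general wrapper idea: greedy forward selection scored by the cross-validated AUC of a small scikit-learn neural network. It assumes censoring has already been handled upstream (e.g., labels restricted to patients with known 5-year re-intervention status); all names and settings are illustrative.

```python
# Minimal sketch of wrapper-style forward feature selection with an ANN.
# Assumes X is a (patients x features) array and y a binary outcome derived
# after the uncensoring step; network size and stopping rule are arbitrary.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

def forward_select(X, y, max_features=10, cv=5):
    remaining = list(range(X.shape[1]))
    selected, best_score = [], -np.inf
    while remaining and len(selected) < max_features:
        scores = []
        for f in remaining:
            cols = selected + [f]
            clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                                random_state=0)
            s = cross_val_score(clf, X[:, cols], y, cv=cv,
                                scoring="roc_auc").mean()
            scores.append((s, f))
        s, f = max(scores)
        if s <= best_score:           # stop when no candidate improves AUC
            break
        best_score = s
        selected.append(f)
        remaining.remove(f)
    return selected, best_score
```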
Effective visibility analysis method in virtual geographic environment
NASA Astrophysics Data System (ADS)
Li, Yi; Zhu, Qing; Gong, Jianhua
2008-10-01
Visibility analysis in virtual geographic environments has broad applications in social life, but practical use demands improved efficiency and accuracy, as well as consideration of the restrictions of human vision. The paper first introduces a highly efficient 3D data modeling method, which generates and organizes the 3D data model using R-tree and LOD techniques. It then presents a new visibility algorithm that achieves real-time viewshed calculation while accounting for the sheltering effects of the DEM and 3D building models and for the restrictions the human eye places on viewshed generation. Finally, an experiment is conducted to show that the visibility analysis is fast and accurate enough to meet the demands of digital city applications.
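A minimal line-of-sight viewshed sketch on a DEM grid is given below. It ignores the paper's R-tree/LOD organization and human-vision constraints and simply tests whether terrain rises above the sight line; the sampling density, eye height, and radius are assumptions.

```python
# Line-of-sight viewshed sketch: a cell is visible if no terrain sample
# along the sight line rises above the line joining the observer's eye
# and the target cell. dem is a 2D elevation array (assumed input).
import numpy as np

def visible(dem, obs, tgt, eye_height=1.7, samples=100):
    """obs/tgt: (row, col) grid cells."""
    (r0, c0), (r1, c1) = obs, tgt
    z0 = dem[r0, c0] + eye_height
    z1 = dem[r1, c1]
    for t in np.linspace(0.0, 1.0, samples)[1:-1]:
        r = r0 + t * (r1 - r0)
        c = c0 + t * (c1 - c0)
        terrain = dem[int(round(r)), int(round(c))]
        sightline = z0 + t * (z1 - z0)
        if terrain > sightline:       # terrain blocks the ray
            return False
    return True

def viewshed(dem, obs, max_radius=50):
    vis = np.zeros_like(dem, dtype=bool)
    rows, cols = dem.shape
    for r in range(rows):
        for c in range(cols):
            if (r - obs[0])**2 + (c - obs[1])**2 <= max_radius**2:
                vis[r, c] = visible(dem, obs, (r, c))
    return vis
```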
Improvements to the gridding of precipitation data across Europe under the E-OBS scheme
NASA Astrophysics Data System (ADS)
Cornes, Richard; van den Besselaar, Else; Jones, Phil; van der Schrier, Gerard; Verver, Ge
2016-04-01
Gridded precipitation data are a valuable resource for analyzing past variations and trends in the hydroclimate. Such data also provide a reference with which model simulations may be driven, compared, and/or adjusted. The E-OBS precipitation dataset is widely used for such analyses across Europe, and is particularly valuable since it provides a spatially complete, daily field across the European domain. In this analysis, improvements to the E-OBS precipitation dataset will be presented that aim to provide a more reliable estimate of grid-box precipitation values, particularly in mountainous areas and in regions with a relative sparsity of input station data. The established three-stage E-OBS gridding scheme is retained, whereby monthly precipitation totals are gridded using a thin-plate spline; daily anomalies are gridded using indicator kriging; and the final dataset is produced by multiplying the two grids. The current analysis focuses on improving the monthly thin-plate spline, which has overall control on the final daily dataset. The results from different techniques are compared, and the influence on the final daily data is assessed by comparing the data against gridded country-wide datasets produced by various National Meteorological Services.
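A simplified sketch of the three-stage combination follows. The thin-plate spline step uses SciPy's RBFInterpolator; the indicator-kriging step of the real scheme is approximated here by the same spline for brevity, so this is a structural illustration only.

```python
# Three-stage combination sketch (simplified): grid monthly totals with a
# thin-plate spline, grid daily anomalies (indicator kriging in E-OBS,
# approximated here), then multiply the two grids for the daily field.
import numpy as np
from scipy.interpolate import RBFInterpolator

def grid_field(xy_stations, values, xy_grid):
    """xy_stations: (n, 2) coords; values: (n,); xy_grid: (m, 2) targets."""
    tps = RBFInterpolator(xy_stations, values, kernel="thin_plate_spline",
                          smoothing=1.0)
    return tps(xy_grid)

# monthly_grid = grid_field(xy_stations, monthly_total, xy_grid)
# anomaly_grid = grid_field(xy_stations, daily_anomaly, xy_grid)
# daily_precip = monthly_grid * anomaly_grid   # final daily estimate
```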
A novel feature ranking method for prediction of cancer stages using proteomics data
Saghapour, Ehsan; Sehhati, Mohammadreza
2017-01-01
Proteomic analysis of cancer stages has provided new opportunities for the development of novel, highly sensitive diagnostic tools that help in the early detection of cancer. This paper introduces a new feature ranking approach called FRMT. FRMT is based on the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), which selects the most discriminative proteins from proteomics data for cancer staging. In this approach, the outcomes of 10 feature selection techniques were combined by the TOPSIS method to select the final discriminative proteins from seven different proteomic databases of protein expression profiles. In the proposed workflow, feature selection methods and protein expressions are considered as the criteria and alternatives in TOPSIS, respectively. The proposed method is tested on seven different classifier models in a 10-fold cross-validation procedure repeated 30 times on the seven cancer datasets. The results demonstrate the higher stability and superior classification performance of the method in comparison with other methods, as well as lower sensitivity to the applied classifier. Moreover, the final selected proteins are informative and have potential for application in real medical practice. PMID:28934234
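For reference, a compact TOPSIS implementation is sketched below, with an invented decision matrix standing in for the feature-selection scores (criteria) of candidate proteins (alternatives).

```python
# TOPSIS sketch: rank alternatives (proteins) scored by several criteria
# (here, outputs of different feature-selection methods). The decision
# matrix, weights, and criterion directions are illustrative.
import numpy as np

def topsis(D, weights, benefit):
    """D: (alternatives x criteria); benefit[j]=True if larger is better."""
    R = D / np.linalg.norm(D, axis=0)           # vector-normalize columns
    V = R * weights                              # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti,  axis=1)
    return d_neg / (d_pos + d_neg)               # closeness: higher is better

scores = topsis(D=np.random.rand(20, 10),        # 20 proteins, 10 FS methods
                weights=np.full(10, 0.1),
                benefit=np.ones(10, dtype=bool))
ranking = np.argsort(scores)[::-1]               # best-ranked proteins first
```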
Final Report 2007: DOE-FG02-87ER60561
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kilbourn, Michael R
2007-04-26
This project involved a multi-faceted approach to the improvement of techniques used in Positron Emission Tomography (PET), from radiochemistry to image processing and data analysis. New methods for radiochemical syntheses were examined, new radiochemicals prepared for evaluation and eventual use in human PET studies, and new pre-clinical methods examined for validation of biochemical parameters in animal studies. The value of small animal PET imaging in measuring small changes of in vivo biochemistry was examined and directly compared to traditional tissue sampling techniques. In human imaging studies, the ability to perform single experimental sessions utilizing two overlapping injections of radiopharmaceuticals was tested, and it was shown that valid biochemical measures for both radiotracers can be obtained through careful pharmacokinetic modeling of the PET emission data. Finally, improvements in reconstruction algorithms for PET data from small animal PET scanners were realized and have been implemented in commercial releases. Together, the project represented an integrated effort to improve and extend all basic science aspects of PET imaging at both the animal and human level.
Möller, Mecker G; Lugo-Baruqui, Jose Alejandro; Milikowski, Clara; Salgado, Christopher J
2014-04-01
Extramammary Paget's disease (EMPD) is an adenocarcinoma of the apocrine glands with unknown exact prevalence and obscure etiology. It has been divided into primary EMPD and secondary EMPD, in which an internal malignancy is usually associated. Treatment for primary EMPD usually consists of wide lesion excision with negative margins. Multiple methods have been proposed to obtain free-margin status of the disease. These include visible border lesion excision, punch biopsies, and micrographic and frozen-section surgery, with different results but still high recurrence rates. The investigators propose a method consisting of a staged contoured marginal excision using "en face" permanent pathologic analysis preceding the steps of central excision of the lesion and the final reconstruction of the surgical defect. Advantages of this method include adequate margin control allowing final reconstruction and tissue preservation, while minimizing patient discomfort. The staged contoured marginal and central excision technique offers a new alternative to the armamentarium for surgical oncologists for the management of EMPD in which margin control is imperative for control of recurrence rates. Copyright © 2014 Elsevier Inc. All rights reserved.
Akcay, Merve; Arslan, Hakan; Mese, Merve; Durmus, Nazlı; Capar, Ismail Davut
2017-09-01
The aim of this in vitro study was to evaluate the efficacy of different irrigation techniques including laser-activated irrigation using an erbium:yttrium-aluminum-garnet (Er:YAG) laser with a novel tip design (photon-induced photoacoustic streaming (PIPS)), Er:YAG laser with Preciso tip, sonic activation, and passive ultrasonic activation on the final irrigation solution penetration into dentinal tubules by using a laser scanning confocal microscope. In this study, 65 extracted single-rooted human mandibular premolars were instrumented up to size 40 and randomly divided into 5 groups (n = 13) based on the activation technique of the final irrigation solution as follows: conventional irrigation (control group), sonic activation, passive ultrasonic activation, Er:YAG-PIPS tip activation, and Er:YAG-Preciso tip activation. In each group, 5 mL of 5% NaOCl labeled with fluorescent dye was used during the activation as the final irrigation solution. Specimens were sectioned at 2.5 and 8 mm from the apex and then examined under a confocal microscope to calculate the dentinal tubule penetration area. Data were analyzed using two-way analysis of variance (ANOVA) and Tukey's post hoc tests (α = 0.05). Both Er:YAG laser (Preciso/PIPS) activations exhibited a significantly higher penetration area than the other groups (P < 0.05). Additionally, passive ultrasonic activation had significantly higher penetration than the sonic activation group and the control group. Statistically significant differences were also found between each root canal third (coronal > middle > apical) (P < 0.001). The results from the present study support the use of Er:YAG laser activation (Preciso/PIPS) to improve the effectiveness of the final irrigation procedure by increasing the irrigant penetration area into the dentinal tubules. The activation of the irrigant and the creation of the streaming with the Er:YAG laser have a positive effect on the irrigant penetration.
Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan
2015-06-01
Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. In this study, to improve quantitative analysis of near-infrared spectra, derivatives of noisy raw spectral data need to be estimated with high accuracy. A new spectral estimator based on the singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, and a stability analysis of the estimator is given. Theoretical analysis and simulation results confirm that the derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated using the analysis of beer spectra. The derivative spectra of the beer and marzipan datasets are used to build calibration models using partial least squares (PLS) modeling. The results show that PLS based on the new estimator can achieve better performance compared with the Savitzky-Golay algorithm and can serve as an alternative choice for quantitative analytical applications.
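The SPSE estimator itself is not reproduced in the abstract, so the sketch below shows only the Savitzky-Golay baseline it is compared against, applied to a synthetic noisy band; the window length and polynomial order are arbitrary choices.

```python
# Savitzky-Golay first-derivative baseline on a synthetic noisy "spectrum".
# The band shape, noise level, and filter settings are assumptions.
import numpy as np
from scipy.signal import savgol_filter

wavelength = np.linspace(1100, 2500, 700)               # nm, synthetic axis
spectrum = np.exp(-((wavelength - 1700) / 80.0) ** 2)   # synthetic band
noisy = spectrum + np.random.normal(0, 0.01, spectrum.size)

# Quadratic fit over a 21-point window; delta is the wavelength spacing,
# so the derivative comes out in absorbance units per nm.
d1 = savgol_filter(noisy, window_length=21, polyorder=2,
                   deriv=1, delta=wavelength[1] - wavelength[0])
```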
Incorporating an ERP Project into Undergraduate Instruction
Nyhus, Erika; Curtis, Nancy
2016-01-01
Electroencephalogram (EEG) is a relatively non-invasive, simple technique, and recent advances in open source analysis tools make it feasible to implement EEG as a component in undergraduate neuroscience curriculum. We have successfully led students to design novel experiments, record EEG data, and analyze event-related potentials (ERPs) during a one-semester laboratory course for undergraduates in cognitive neuroscience. First, students learned how to set up an EEG recording and completed an analysis tutorial. Students then learned how to set up a novel EEG experiment; briefly, they formed groups of four and designed an EEG experiment on a topic of their choice. Over the course of two weeks students collected behavioral and EEG data. Each group then analyzed their behavioral and ERP data and presented their results both as a presentation and as a final paper. Upon completion of the group project students reported a deeper understanding of cognitive neuroscience methods and a greater appreciation for the strengths and weaknesses of the EEG technique. Although recent advances in open source software made this project possible, it also required access to EEG recording equipment and proprietary software. Future efforts should be directed at making publicly available datasets to learn ERP analysis techniques and making publicly available EEG recording and analysis software to increase the accessibility of hands-on research experience in undergraduate cognitive neuroscience laboratory courses. PMID:27385925
Does player time-in-game affect tackle technique in elite level rugby union?
Tierney, Gregory J; Denvir, Karl; Farrell, Garreth; Simms, Ciaran K
2018-02-01
It has been hypothesised that fatigue may be a major factor in tackle-related injury risk in rugby union and hence more injuries occur in the later stages of a game. The aim of this study is to identify changes in ball carrier or tackler proficiency characteristics, using elite level match video data, as player time-in-game increases. Qualitative observational cohort study. Three 2014/15 European Rugby Champions Cup games were selected for ball carrier and tackler proficiency analysis. Analysis was only conducted on players who started and remained on the field for the entire game. A separate analysis was conducted on 10 randomly selected 2014/15 European Rugby Champions Cup/Pro 12 games to assess the time distribution of tackles throughout a game. A Chi-square test and one-way ANOVA with post-hoc testing were conducted to identify significant differences (p<0.05) for proficiency characteristics and tackle counts between quarters in the game, respectively. Player time-in-game did not affect tackle proficiency for either the ball carrier or the tackler. Any results that showed statistical significance did not indicate a trend of deterioration in proficiency with increased player time-in-game. The time distribution analysis indicated that more tackles occurred in the final quarter of the game than in the first (p=0.04) and second (p<0.01). It appears that player time-in-game does not affect tackler or ball carrier tackle technique proficiency at the elite level. More tackles occurring in the final quarter of a game provides an alternative explanation for more tackle-related injuries occurring at this stage. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
Orbiter/payload proximity operations: Lateral approach technique
NASA Technical Reports Server (NTRS)
Bell, J. A.; Jones, H. L.; Mcadoo, S. F.
1977-01-01
The lateral approach is presented for proximity operations associated with the retrieval of free-flying payloads. An out-of-plane final approach emphasizing onboard software support is recommended for all except the latter segment of the final approach, in which manual control is considered mandatory. An overall assessment of various candidate proximity operations techniques is made.
100 Most Influential Publications in Scoliosis Surgery.
Zhou, James Jun; Koltz, Michael T; Agarwal, Nitin; Tempel, Zachary J; Kanter, Adam S; Okonkwo, David O; Hamilton, D Kojo
2017-03-01
Bibliometric analysis. To apply the established technique of citation analysis to identify the 100 most influential articles in scoliosis surgery research published between 1900 and 2015. Previous studies have applied the technique of citation analysis to other areas of study. This is the first article to apply this technique to the field of scoliosis surgery. A two-step search of the Thomson Reuters Web of Science was conducted to identify all articles relevant to the field of scoliosis surgery. The top 100 articles with the most citations were identified based on analysis of titles and abstracts. Further statistical analysis was conducted to determine whether measures of author reputation and overall publication influence affected the rate at which publications were recognized and incorporated by other researchers in the field. Total citations for the final 100 publications included in the list ranged from 82 to 509. The period for publication ranged from 1954 to 2010. Most studies were published in the journal Spine (n = 63). The most frequently published topics of study were surgical techniques (n = 35) and outcomes (n = 35). Measures of author reputation (number of total studies in the top 100, number of first-author studies in the top 100) were found to have no effect on the rate at which studies were adopted by other researchers (number of years until first citation, and number of years until maximum citations). The number of citations/year a publication received was found to be negatively correlated with the rate at which it was adopted by other researchers, indicating that more influential manuscripts attained more rapid recognition by the scientific community at large. In assembling this publication, we have strived to identify and recognize the 100 most influential articles in scoliosis surgery research from 1900 to 2015. N/A.
Structural Path Analysis of Fossil Fuel Based CO2 Emissions: A Case Study for China.
Yang, Zhiyong; Dong, Wenjie; Xiu, Jinfeng; Dai, Rufeng; Chou, Jieming
2015-01-01
Environmentally extended input-output analysis (EEIOA) has long been used to quantify global and regional environmental impacts and to clarify emission transfers. Structural path analysis (SPA), a technique based on EEIOA, is especially useful for measuring significant flows in this environmental-economic system. This paper constructs an imports-adjusted single-region input-output (SRIO) model considering only domestic final use elements, and it uses the SPA technique to highlight crucial routes along the production chain in both final use and sectoral perspectives. The results indicate that future mitigation policies on household consumption should change direct energy use structures in rural areas, cut unreasonable demand for power and chemical products, and focus on urban areas due to their consistently higher magnitudes than rural areas in the structural routes. Impacts originating from government spending should be tackled by managing onsite energy use in 3 major service sectors and promoting cleaner fuels and energy-saving techniques in the transport sector. Policies on investment should concentrate on sectoral interrelationships along the production chain by setting up standards to regulate upstream industries, especially for the services, construction and equipment manufacturing sectors, which have high demand pulling effects. Apart from the similar methods above, mitigating policies in exports should also consider improving embodied technology and quality in manufactured products to achieve sustainable development. Additionally, detailed sectoral results in the coal extraction industry highlight the onsite energy use management in large domestic companies, emphasize energy structure rearrangement, and indicate resources and energy safety issues. Conclusions based on the construction and public administration sectors reveal that future mitigation in secondary and tertiary industries should be combined with upstream emission intensive industries in a systematic viewpoint to achieve sustainable development. Overall, SPA is a useful tool in empirical studies, and it can be used to analyze national environmental impacts and guide future mitigation policies.
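The SPA mechanics can be sketched by expanding the Leontief inverse as a power series and scoring individual production paths by their embodied emissions. The three-sector coefficients below are invented for illustration, not China's data.

```python
# SPA sketch: total emissions E = e (I - A)^-1 f = e (I + A + A^2 + ...) f.
# A path (s0 -> s1 -> ... -> i) contributes e[s0] * A[s0,s1] * ... * f[i];
# enumerate paths up to a cut-off order and rank by contribution.
import numpy as np

A = np.array([[0.1, 0.2, 0.0],
              [0.0, 0.1, 0.3],
              [0.2, 0.0, 0.1]])      # technical coefficients (illustrative)
f = np.array([100.0, 50.0, 80.0])    # final demand by sector
e = np.array([0.9, 0.3, 0.5])        # direct emission intensities, tCO2/unit

def spa(A, f, e, max_order=3, top=10):
    n = len(f)
    paths = []
    # frontier holds (path, downstream value A[..]*...*f[i]) pairs
    frontier = [((i,), f[i]) for i in range(n)]
    for _ in range(max_order + 1):
        for path, downstream in frontier:
            paths.append((path, e[path[0]] * downstream))
        nxt = []
        for path, downstream in frontier:
            j = path[0]
            for k in range(n):
                if A[k, j] > 0.0:    # extend the path one tier upstream
                    nxt.append(((k,) + path, A[k, j] * downstream))
        frontier = nxt
    paths.sort(key=lambda p: -p[1])
    return paths[:top]

for path, value in spa(A, f, e):
    print(" -> ".join(str(s) for s in path), f"{value:.1f} tCO2")
```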
Novel Framework for Reduced Order Modeling of Aero-engine Components
NASA Astrophysics Data System (ADS)
Safi, Ali
The present study focuses on the popular dynamic reduction methods used in the design of complex assemblies (millions of Degrees of Freedom) where numerous iterations are involved to achieve the final design. Aerospace manufacturers such as Rolls Royce and Pratt & Whitney are actively seeking techniques that reduce computational time while maintaining accuracy of the models. This involves modal analysis of components with complex geometries to determine the dynamic behavior due to non-linearity and complicated loading conditions. In such cases, sub-structuring and dynamic reduction techniques prove to be an efficient tool to reduce design cycle time. The components whose designs are finalized can be dynamically reduced to mass and stiffness matrices at the boundary nodes in the assembly. These matrices conserve the dynamics of the component in the assembly, and thus avoid repeated calculations during the analysis runs for design modification of other components. This thesis presents a novel framework for modeling and meshing of any complex structure, in this case an aero-engine casing. In this study the effect of meshing techniques on the run time is highlighted. The modal analysis is carried out using an extremely fine mesh to ensure all minor details in the structure are captured correctly in the Finite Element (FE) model. This is used as the reference model, to compare against the results of the reduced model. The study also shows the conditions/criteria under which dynamic reduction can be implemented effectively, proving the accuracy of the Craig-Bampton (C.B.) method and the limitations of Static Condensation. The study highlights the longer runtime needed to produce the reduced matrices of components compared to the overall runtime of the complete unreduced model. Once the components are reduced, however, the assembly run is significantly faster. Hence the decision to use Component Mode Synthesis (CMS) should be taken judiciously, considering the number of iterations that may be required during the design cycle.
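A small-matrix sketch of the Craig-Bampton reduction mentioned above: interior degrees of freedom are condensed onto the boundary DOFs (constraint modes) plus a truncated set of fixed-interface normal modes. The partitioning and mode count here are assumptions for illustration.

```python
# Craig-Bampton reduction sketch: K, M are the full stiffness/mass matrices,
# boundary lists the DOFs retained in the assembly interface.
import numpy as np
from scipy.linalg import eigh

def craig_bampton(K, M, boundary, n_modes):
    n = K.shape[0]
    interior = [d for d in range(n) if d not in boundary]
    idx_ii = np.ix_(interior, interior)
    idx_ib = np.ix_(interior, boundary)

    # Constraint (static) modes: interior response to unit boundary motion
    Phi_c = -np.linalg.solve(K[idx_ii], K[idx_ib])
    # Fixed-interface normal modes: boundary clamped, keep lowest n_modes
    w2, Phi_n = eigh(K[idx_ii], M[idx_ii])
    Phi_n = Phi_n[:, :n_modes]

    nb = len(boundary)
    T = np.zeros((n, nb + n_modes))
    T[boundary, :nb] = np.eye(nb)
    T[interior, :nb] = Phi_c
    T[interior, nb:] = Phi_n
    # Reduced matrices conserve the component dynamics at the interface
    return T.T @ K @ T, T.T @ M @ T
```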
Analysis of magnetic field levels at KSC
NASA Technical Reports Server (NTRS)
Christodoulou, Christos G.
1994-01-01
The scope of this work is to evaluate the magnetic field levels of distribution systems and other equipment at Kennedy Space Center (KSC). Magnetic field levels in several operational areas and various facilities are investigated. Three-dimensional mappings and contour plots are provided along with the measured data. Furthermore, the portions of the magnetic fields generated by the 60 Hz fundamental frequency and by harmonics are examined. Finally, possible mitigation techniques for attenuating fields from electric panels are discussed.
Lichens as bioindicators of air quality. Forest Service general technical report (Final)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stolte, K.; Doty, R.; Mangis, D.
1993-03-01
The report is the result of a workshop held in Denver, Colorado on April 9-11, 1991. It summarizes the current literature and techniques for using lichens to monitor air quality. Experts in lichenology and ecology contributed information on lichen floristics, characterization of monitoring sites, lichen species and communities, identifying lichen species sensitive to pollutants, active monitoring with transplants, chemical analysis of lichens, and case studies as examples of lichen biomonitoring scenarios.
Correcting for the effects of pupil discontinuities with the ACAD method
NASA Astrophysics Data System (ADS)
Mazoyer, Johan; Pueyo, Laurent; N'Diaye, Mamadou; Mawet, Dimitri; Soummer, Rémi; Norman, Colin
2016-07-01
The current generation of ground-based coronagraphic instruments uses deformable mirrors to correct for phase errors and to improve contrast levels at small angular separations. Building on these techniques, several space- and ground-based instruments now under development use two deformable mirrors to correct for both phase and amplitude errors. However, as wavefront control techniques improve, more complex telescope pupil geometries (support structures, segmentation) will soon be a limiting factor for these next-generation coronagraphic instruments. The technique presented in this proceeding, the Active Correction of Aperture Discontinuities method, takes advantage of the fact that most future coronagraphic instruments will include two deformable mirrors, and finds the mirror shapes and actuator movements that correct for the effects introduced by these complex pupil geometries. For any coronagraph previously designed for continuous apertures, this technique obtains similar contrast performance with a complex aperture (with segmentation and secondary-mirror support structures), with high throughput and with the flexibility to adapt to changing pupil geometry (e.g., in case of segment failure or maintenance of the segments). We present the results of the parametric analysis performed on the WFIRST pupil, for which we obtained high contrast levels with several deformable-mirror setups (size, separation between the mirrors), coronagraphs (vortex charge 2, vortex charge 4, APLC), and spectral bandwidths. However, because contrast levels and separations are not the only metrics that determine the scientific return of an instrument, we also included in this study the influence of the deformable-mirror shapes on the throughput of the instrument and on its sensitivity to pointing jitter. Finally, we present results obtained on another potential space-based segmented telescope aperture. The main result of this proceeding is that we now obtain performance comparable to that of the coronagraphs previously designed for WFIRST. First results from the parametric analysis strongly suggest that the two-deformable-mirror setup (mirror size and the distance between them) has an important impact on the contrast and throughput performance of the final instrument.
A convex optimization approach for identification of human tissue-specific interactomes.
Mohammadi, Shahin; Grama, Ananth
2016-06-15
Analysis of organism-specific interactomes has yielded novel insights into cellular function and coordination, understanding of pathology, and identification of markers and drug targets. Genes, however, can exhibit varying levels of cell type specificity in their expression, and their coordinated expression manifests in tissue-specific function and pathology. Tissue-specific/tissue-selective interaction mechanisms have significant applications in drug discovery, as they are more likely to reveal drug targets. Furthermore, tissue-specific transcription factors (tsTFs) are significantly implicated in human disease, including cancers. Finally, disease genes and protein complexes have the tendency to be differentially expressed in tissues in which defects cause pathology. These observations motivate the construction of refined tissue-specific interactomes from organism-specific interactomes. We present a novel technique for constructing human tissue-specific interactomes. Using a variety of validation tests (Edge Set Enrichment Analysis, Gene Ontology Enrichment, Disease-Gene Subnetwork Compactness), we show that our proposed approach significantly outperforms state-of-the-art techniques. Finally, using case studies of Alzheimer's and Parkinson's diseases, we show that tissue-specific interactomes derived from our study can be used to construct pathways implicated in pathology and demonstrate the use of these pathways in identifying novel targets. http://www.cs.purdue.edu/homes/mohammas/projects/ActPro.html mohammadi@purdue.edu. © The Author 2016. Published by Oxford University Press.
NASA Astrophysics Data System (ADS)
Cenek, Martin; Dahl, Spencer K.
2016-11-01
Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
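One simplified reading of the framework's core step is sketched below: cluster per-step agent behavior vectors into common patterns, then estimate a row-normalized matrix of pattern-to-pattern transition probabilities. The behavior features here are synthetic and the cluster count is an arbitrary choice, not the paper's.

```python
# Sketch: identify common behavior patterns by clustering, then record the
# probability of agents switching between patterns from step t to t+1.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# behaviors[t, a, :] = feature vector of agent a at step t (synthetic here)
behaviors = rng.normal(size=(200, 50, 4))
T, A, F = behaviors.shape
n_patterns = 5

labels = KMeans(n_clusters=n_patterns, n_init=10, random_state=0).fit_predict(
    behaviors.reshape(T * A, F)).reshape(T, A)

# P[i, j] = Pr(pattern j at t+1 | pattern i at t), counted over all agents
P = np.zeros((n_patterns, n_patterns))
for t in range(T - 1):
    for a in range(A):
        P[labels[t, a], labels[t + 1, a]] += 1
P /= np.maximum(P.sum(axis=1, keepdims=True), 1)   # row-normalize safely
```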
Genetic programming based ensemble system for microarray data classification.
Liu, Kun-Hong; Tong, Muchenxuan; Xie, Shu-Tong; Yee Ng, Vincent To
2015-01-01
Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a genetic programming (GP) based new ensemble system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this ensemble framework with three operators: Min, Max, and Average. Each individual of the GP is an ensemble system, and they become more and more accurate in the evolutionary process. The feature selection technique and balanced subsampling technique are applied to increase the diversity in each ensemble system. The final ensemble committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary class and six multiclass microarray datasets, and results show that the algorithm can achieve better results in most cases compared with some other ensemble systems. By using elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved.
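The Min/Max/Average combination of decision-tree outputs can be sketched as below; the GP search that evolves the ensemble structure in the paper is omitted, and the plain row/feature subsampling shown is a stand-in for the paper's balanced variant.

```python
# Sketch: diverse decision trees combined by Min, Max and Average operators
# over predicted class probabilities. Dataset and sizes are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

trees, feats = [], []
rng = np.random.default_rng(0)
for k in range(7):
    rows = rng.choice(len(X), size=len(X) // 2, replace=False)
    cols = rng.choice(X.shape[1], size=10, replace=False)
    trees.append(DecisionTreeClassifier(max_depth=4, random_state=k)
                 .fit(X[np.ix_(rows, cols)], y[rows]))
    feats.append(cols)

probas = np.stack([t.predict_proba(X[:, c]) for t, c in zip(trees, feats)])
combined = {"min": probas.min(axis=0),     # the three combination operators
            "max": probas.max(axis=0),
            "avg": probas.mean(axis=0)}
pred = combined["avg"].argmax(axis=1)      # final class decision
```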
Application of multivariate statistical techniques in microbial ecology
Paliy, O.; Shankar, V.
2016-01-01
Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large scale ecological datasets. An especially noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791
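As a small illustration of the exploratory class of methods discussed in the review, the sketch below runs a PCA ordination on a synthetic community count table after a centered log-ratio transform (one common preprocessing choice among several).

```python
# Exploratory ordination sketch: PCA on CLR-transformed relative abundances.
# The count table is synthetic; real studies would use observed taxa counts.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
counts = rng.poisson(5, size=(40, 200)) + 1         # samples x taxa, no zeros

rel = counts / counts.sum(axis=1, keepdims=True)    # relative abundances
clr = np.log(rel) - np.log(rel).mean(axis=1, keepdims=True)

scores = PCA(n_components=2).fit_transform(clr)     # 2D ordination coordinates
```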
Variational Bayesian Parameter Estimation Techniques for the General Linear Model
Starke, Ludger; Ostwald, Dirk
2017-01-01
Variational Bayes (VB), variational maximum likelihood (VML), restricted maximum likelihood (ReML), and maximum likelihood (ML) are cornerstone parametric statistical estimation techniques in the analysis of functional neuroimaging data. However, the theoretical underpinnings of these model parameter estimation techniques are rarely covered in introductory statistical texts. Because of the widespread practical use of VB, VML, ReML, and ML in the neuroimaging community, we reasoned that a theoretical treatment of their relationships and their application in a basic modeling scenario may be helpful for both neuroimaging novices and practitioners alike. In this technical study, we thus revisit the conceptual and formal underpinnings of VB, VML, ReML, and ML and provide a detailed account of their mathematical relationships and implementational details. We further apply VB, VML, ReML, and ML to the general linear model (GLM) with non-spherical error covariance as commonly encountered in the first-level analysis of fMRI data. To this end, we explicitly derive the corresponding free energy objective functions and ensuing iterative algorithms. Finally, in the applied part of our study, we evaluate the parameter and model recovery properties of VB, VML, ReML, and ML, first in an exemplary setting and then in the analysis of experimental fMRI data acquired from a single participant under visual stimulation. PMID:28966572
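A toy illustration of the ML versus ReML distinction in the spherical-noise special case: the ML variance estimator divides the residual sum of squares by n, while the ReML-style estimator divides by n - p, correcting for the fitted coefficients. Data are simulated; this is a sketch, not the paper's non-spherical derivation.

```python
# GLM y = X @ beta + noise; compare ML and ReML-style variance estimates.
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=2.0, size=n)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]     # ML/OLS coefficients
resid = y - X @ beta_hat
sigma2_ml = resid @ resid / n           # ML estimate (biased downward)
sigma2_reml = resid @ resid / (n - p)   # ReML-style unbiased estimate
```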
McNabb, Matthew; Cao, Yu; Devlin, Thomas; Baxter, Blaise; Thornton, Albert
2012-01-01
Mechanical Embolus Removal in Cerebral Ischemia (MERCI) has been supported by medical trials as an improved method of treating ischemic stroke past the safe window of time for administering clot-busting drugs, and was released for medical use in 2004. Analyzing real-world data collected from MERCI clinical trials is key to providing insights on the effectiveness of MERCI. Most of the existing data analysis on MERCI results has thus far employed conventional statistical techniques. To the best of our knowledge, advanced data analytics and data mining techniques have not yet been systematically applied. To address this issue, in this thesis we conduct a comprehensive study on employing state-of-the-art machine learning algorithms to generate prediction criteria for the outcome of MERCI patients. Specifically, we investigate how to choose the most significant attributes of a data set with limited instance examples. We propose several search algorithms to identify the significant attributes, followed by a thorough performance analysis for each algorithm. Finally, we apply our proposed approach to the real-world, de-identified patient data provided by Erlanger Southeast Regional Stroke Center, Chattanooga, TN. Our experimental results demonstrate that the proposed approach performs well.
NASA Astrophysics Data System (ADS)
Lertwiram, Namzilp; Tran, Gia Khanh; Mizutani, Keiichi; Sakaguchi, Kei; Araki, Kiyomichi
Deploying relays can address the shadowing problem between a transmitter (Tx) and a receiver (Rx). Moreover, the Multiple-Input Multiple-Output (MIMO) technique has been introduced to improve wireless link capacity, and it can be applied in relay networks to enhance system performance. However, the efficiency of relaying schemes and relay placement has not been well investigated in experiment-based studies. This paper provides a propagation measurement campaign for a MIMO two-hop relay network in the 5 GHz band in an L-shaped corridor environment with various relay locations. Furthermore, it proposes a Relay Placement Estimation (RPE) scheme to identify the optimum relay location, i.e., the point at which network performance is highest. Analysis of channel capacity shows that the relaying technique is beneficial over direct transmission in strongly shadowed environments, while it is ineffective in non-shadowed environments. In addition, the optimum relay location estimated with the RPE scheme agrees with the location where the network achieves the highest performance as identified by network capacity. Finally, the capacity analysis shows that two-way MIMO relaying employing network coding performs best, while the cooperative relaying scheme is not effective because the shadowing effect weakens the signal strength of the direct link.
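The figure of merit behind such capacity comparisons is the standard MIMO channel capacity with equal power allocation, sketched below on a synthetic Rayleigh channel rather than the measured one.

```python
# Standard MIMO capacity: C = log2 det(I + (snr/Nt) H H^H), bit/s/Hz.
# H is a synthetic i.i.d. Rayleigh channel, not measurement data.
import numpy as np

def mimo_capacity(H, snr_linear):
    nr, nt = H.shape
    G = np.eye(nr) + (snr_linear / nt) * (H @ H.conj().T)
    return np.log2(np.real(np.linalg.det(G)))  # G is Hermitian PD, det real

rng = np.random.default_rng(0)
H = (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))) / np.sqrt(2)
print(mimo_capacity(H, snr_linear=10 ** (20 / 10)))  # 4x4 link at 20 dB SNR
```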
Multidirectional mobilities: Advanced measurement techniques and applications
NASA Astrophysics Data System (ADS)
Ivarsson, Lars Holger
Today high noise-and-vibration comfort has become a mark of quality for products in sectors such as the automotive industry, aircraft, components, households and manufacturing. Consequently, already in the design phase of products, tools are required to predict the final vibration and noise levels. These tools have to be applicable over a wide frequency range with sufficient accuracy. During recent decades a variety of tools have been developed such as transfer path analysis (TPA), input force estimation, substructuring, coupling by frequency response functions (FRF) and hybrid modelling. While these methods have a well-developed theoretical basis, their application combined with experimental data often suffers from a lack of information concerning rotational DOFs. In order to measure response in all 6 DOFs (including rotation), a sensor has been developed, whose special features are discussed in the thesis. This transducer simplifies the response measurements, although in practice the excitation of moments appears to be more difficult. Several excitation techniques have been developed to enable measurement of multidirectional mobilities. For rapid and simple measurement of the loaded mobility matrix, a MIMO (Multiple Input Multiple Output) technique is used. The technique has been tested and validated on several structures of different complexity. A second technique for measuring the loaded 6-by-6 mobility matrix has been developed. This technique employs a model of the excitation set-up, and with this model the mobility matrix is determined from sequential measurements. Measurements on "real" structures show that both techniques give results of similar quality, and both are recommended for practical use. As a further step, a technique for measuring the unloaded mobilities is presented. It employs the measured loaded mobility matrix in order to calculate compensation forces and moments, which are later applied in order to compensate for the loading of the measurement equipment. The developed measurement techniques have been used in a hybrid coupling of a plate-and-beam structure to study different aspects of the coupling technique. Results show that RDOFs are crucial and have to be included in this case. The importance of stiffness residuals when mobilities are estimated from modal superposition is demonstrated. Finally, it is shown that proper curve fitting can correct errors from inconsistently measured data.
Hyphenated analytical techniques for materials characterisation
NASA Astrophysics Data System (ADS)
Armstrong, Gordon; Kailas, Lekshmi
2017-09-01
This topical review will provide a survey of the current state of the art in 'hyphenated' techniques for characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with 'hyphenated' techniques or hybrid instrumentation. The review will begin by defining 'complementary', 'hybrid' and 'hyphenated' techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectrometry and energy dispersive x-ray analysis in electron microscopy as two examples, since techniques combining complementary methods for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine properties including physical, mechanical, electrical and thermal behaviour, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the practical issues that arise in combining different techniques. We will consider how the complementary and varied information obtained by combining these techniques may be interpreted together to better understand the sample in greater detail than was possible before, and also how combining different techniques can simplify sample preparation and ensure reliable comparisons are made between multiple analyses on the same samples, a topic of particular importance as nanoscale technologies become more prevalent in applied and industrial research and development (R&D). The review will conclude with a brief outline of the emerging state of the art in the research laboratory, and a suggested approach to using hyphenated techniques, whether in the teaching, quality control or R&D laboratory.
Gross-Rother, J; Herrmann, N; Blech, M; Pinnapireddy, S R; Garidel, P; Bakowsky, U
2018-05-30
Particle detection and analysis techniques are essential in the biopharmaceutical industry to evaluate the quality of parenteral formulations with regard to product safety and product quality and to meet the regulations set by the authority agencies. Several particle analysis systems are available on the market, but it is challenging for the operator to identify the method best suited to a given sample. At the same time these techniques are the basis for a better understanding of biophysical processes, e.g. protein interaction and aggregation. The STEP-Technology® (Space and Time resolved Extinction Profiles), as used in the analytical photocentrifuge LUMiSizer®, has been shown to be an effective and promising technique for investigating particle suspensions and emulsions in various fields. In this study, we evaluated the potential and limitations of this technique for biopharmaceutical model samples. As a first experimental approach, we measured silica and polystyrene (PS) particle standard suspensions with known particle density and refractive index (RI). The evaluation was performed using a variety of relevant data sets to demonstrate the significant influence of the particle density on the final particle size distribution (PSD). Turbidity, the property most critical for successful detection, was characterized, and limits were set based on the absorbance at 320 nm (A320 values). Furthermore, we produced chemically cross-linked protein particle suspensions to model physically "stable" protein aggregates. The results of the LUMiSizer® analysis were compared with the orthogonal methods of nanoparticle tracking analysis (NTA), dynamic light scattering (DLS) and micro-flow imaging (MFI). Sedimentation velocity distributions showed similar tendencies, but the PSDs and absolute size values could not be obtained. In conclusion, we could demonstrate several applications as well as limitations of this technique for biopharmaceutical samples. Compared with orthogonal methods, this technique is a valuable complementary approach if particle properties such as density or refractive index can be determined. Copyright © 2018 Elsevier B.V. All rights reserved.
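Why particle density matters for the final PSD can be seen from the Stokes-diameter conversion used in analytical centrifugation: the measured sedimentation velocity maps to a diameter through the particle-fluid density difference. The fluid properties and rotor acceleration below are illustrative assumptions, not instrument settings.

```python
# Stokes' law in a centrifugal field: v = d^2 (rho_p - rho_f) g_eff / (18 eta),
# solved for the diameter d. Values are illustrative.
import numpy as np

def stokes_diameter(v, rho_p, rho_f=998.0, eta=1.0e-3, g_eff=2000 * 9.81):
    """v: sedimentation velocity (m/s); densities in kg/m^3; eta in Pa*s;
    g_eff: centrifugal acceleration (here assumed ~2000 g)."""
    return np.sqrt(18.0 * eta * v / ((rho_p - rho_f) * g_eff))

v = 1.0e-6  # 1 um/s measured sedimentation velocity
for rho_p in (1050.0, 1200.0, 2000.0):   # protein-like, PS-like, silica-like
    print(rho_p, stokes_diameter(v, rho_p))  # same v, very different diameters
```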
McCormick, Frank; Gupta, Anil; Bruce, Ben; Harris, Josh; Abrams, Geoff; Wilson, Hillary; Hussey, Kristen; Cole, Brian J.
2014-01-01
Purpose: The purpose of this study was to measure and compare the subjective, objective, and radiographic healing outcomes of single-row (SR), double-row (DR), and transosseous equivalent (TOE) suture techniques for arthroscopic rotator cuff repair. Materials and Methods: A retrospective comparative analysis of arthroscopic rotator cuff repairs by one surgeon from 2004 to 2010 at a minimum 2-year follow-up was performed. Cohorts were matched for age, sex, and tear size. Subjective outcome variables included ASES, Constant, SST, UCLA, and SF-12 scores. Objective outcome variables included strength and active range of motion (ROM). Radiographic healing was assessed by magnetic resonance imaging (MRI). Statistical analysis was performed using analysis of variance (ANOVA), Mann-Whitney and Kruskal-Wallis tests, and the Fisher exact probability test, with significance set at P < 0.05. Results: Sixty-three patients completed the study requirements (20 SR, 21 DR, 22 TOE). There was a clinically and statistically significant improvement in outcomes with all repair techniques (ASES mean improvement, P < 0.0001). The mean final ASES scores were: SR 83 (SD 21.4); DR 87 (SD 18.2); TOE 87 (SD 13.2) (P = 0.73). There was a statistically significant improvement in strength for each repair technique (P < 0.001). There was no significant difference between techniques across all secondary outcome assessments: ASES improvement, Constant, SST, UCLA, SF-12, ROM, strength, and MRI re-tear rates. There was a decrease in re-tear rates from single-row (22%) to double-row (18%) to transosseous equivalent (11%); however, this difference was not statistically significant (P = 0.6). Conclusions: Compared to preoperatively, arthroscopic rotator cuff repair, using SR, DR, or TOE techniques, yielded a clinically and statistically significant improvement in subjective and objective outcomes at a minimum 2-year follow-up. Level of Evidence: Therapeutic level 3. PMID:24926159
Finet, Gérard; Derimay, François; Motreff, Pascal; Guerin, Patrice; Pilet, Paul; Ohayon, Jacques; Darremont, Olivier; Rioufol, Gilles
2015-08-24
This study used a fractal bifurcation bench model to compare 6 optimization sequences for coronary bifurcation provisional stenting, including 1 novel sequence without kissing balloon inflation (KBI), comprising initial proximal optimizing technique (POT) + side-branch inflation (SBI) + final POT, called "re-POT." In provisional bifurcation stenting, KBI fails to improve the rate of major adverse cardiac events. Proximal geometric deformation increases the rate of in-stent restenosis and target lesion revascularization. A bifurcation bench model was used to compare KBI alone, KBI after POT, KBI with asymmetric inflation pressure after POT, and 2 sequences without KBI: initial POT plus SBI, and initial POT plus SBI with final POT (called "re-POT"). For each protocol, 5 stents were tested using 2 different drug-eluting stent designs: that is, a total of 60 tests. Compared with the classic KBI-only sequence and those associating POT with modified KBI, the re-POT sequence gave significantly (p < 0.05) better geometric results: it reduced SB ostium stent-strut obstruction from 23.2 ± 6.0% to 5.6 ± 8.3%, provided perfect proximal stent apposition with almost perfect circularity (ellipticity index reduced from 1.23 ± 0.02 to 1.04 ± 0.01), reduced proximal area overstretch from 24.2 ± 7.6% to 8.0 ± 0.4%, and reduced global strut malapposition from 40 ± 6.2% to 2.6 ± 1.4%. In comparison with 5 other techniques, the re-POT sequence significantly optimized the final result of provisional coronary bifurcation stenting, maintaining circular geometry while significantly reducing SB ostium strut obstruction and global strut malapposition. These experimental findings confirm that provisional stenting may be optimized more effectively without KBI using re-POT. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Oswald, Hayden; Molthan, Andrew L.
2011-01-01
Satellite remote sensing has gained widespread use in the field of operational meteorology. Although raw satellite imagery is useful, several techniques exist which can convey multiple types of data in a more efficient way. One of these techniques is multispectral compositing. The NASA Short-term Prediction Research and Transition (SPoRT) Center has developed two multispectral satellite imagery products which utilize data from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard NASA's Terra and Aqua satellites, based upon products currently generated and used by the European Organization for the Exploitation of Meteorological Satellites (EUMETSAT). The nighttime microphysics product allows users to identify clouds occurring at different altitudes, but emphasizes fog and low cloud detection. This product improves upon current spectral difference and single channel infrared techniques. Each of the current products has its own set of advantages for nocturnal fog detection, but each also has limiting drawbacks which can hamper the analysis process. The multispectral product combines each current product with a third channel difference. Since the final image is enhanced with color, it simplifies the fog identification process. Analysis has shown that the nighttime microphysics imagery product represents a substantial improvement to conventional fog detection techniques, as well as provides a preview of future satellite capabilities to forecasters.
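To make the compositing idea concrete, here is a minimal sketch of how a night-time RGB product can be assembled from brightness-temperature channel differences. The channel selections and scaling ranges are illustrative assumptions, not the exact SPoRT/EUMETSAT recipe.

```python
# Schematic night-time composite: three synthetic brightness-temperature
# fields combined into RGB from two channel differences plus one channel.
# Channel choices and scaling ranges are illustrative assumptions.
import numpy as np

def scale(band, lo, hi):
    """Linearly map [lo, hi] to [0, 1], clipping outside the range."""
    return np.clip((band - lo) / (hi - lo), 0.0, 1.0)

rng = np.random.default_rng(1)
shape = (256, 256)
bt039 = 270 + 10 * rng.standard_normal(shape)   # ~3.9 um brightness temp (K)
bt108 = 275 + 10 * rng.standard_normal(shape)   # ~10.8 um
bt120 = 274 + 10 * rng.standard_normal(shape)   # ~12.0 um

rgb = np.dstack([
    scale(bt120 - bt108, -4.0, 2.0),    # red: split-window difference
    scale(bt108 - bt039, 0.0, 10.0),    # green: highlights fog/low cloud
    scale(bt108, 243.0, 293.0),         # blue: window-channel temperature
])
# `rgb` is ready for display, e.g. with matplotlib's plt.imshow(rgb).
```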
Logistic regression for risk factor modelling in stuttering research.
Reed, Phil; Wu, Yaqionq
2013-06-01
To outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. The principles underlying the application of logistic regression are illustrated, and the types of questions to which the technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research and to formulating appropriate research strategies that accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. The way the statistical procedures are employed is demonstrated with some hypothetical data. Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. After reading this article you will: (a) Summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) Follow the steps in performing a logistic regression analysis; (c) Describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) Be able to summarize its advantages over other techniques like estimation of group differences and simple regression. Copyright © 2012 Elsevier Inc. All rights reserved.
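As a hedged illustration of the modelling approach the tutorial describes, the sketch below fits a logistic regression to invented recovery/persistence data; the predictor names are hypothetical, not taken from the article.

```python
# Minimal logistic regression sketch: predicting recovery vs. persistence
# from hypothetical risk factors. Variable names and data are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
onset_age = rng.uniform(2, 6, n)            # age at stuttering onset (years)
family_history = rng.integers(0, 2, n)      # 1 = persistence in the family
logit = -0.5 + 0.4 * onset_age + 0.9 * family_history
recovered = rng.random(n) < 1 / (1 + np.exp(logit))   # synthetic outcome

X = sm.add_constant(np.column_stack([onset_age, family_history]))
model = sm.Logit(recovered.astype(int), X).fit(disp=0)
print(model.summary(xname=["const", "onset_age", "family_history"]))
print("Odds ratios:", np.exp(model.params))   # effect sizes per predictor
```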
Summary of Work for Joint Research Interchanges with DARWIN Integrated Product Team 1998
NASA Technical Reports Server (NTRS)
Hesselink, Lambertus
1999-01-01
The intent of Stanford University's SciVis group is to develop technologies that enable comparative analysis and visualization techniques for simulated and experimental flow fields. These techniques would then be made available under the Joint Research Interchange for potential injection into the DARWIN Workspace Environment (DWE). In the past, we have focused on techniques that exploited feature-based comparisons such as shock and vortex extractions. Our current research effort focuses on finding a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment are not needed in the comparison; this is often a problem with many data comparison techniques. In addition, since only topology-based information is stored and compared for each field, there is a significant compression of information that enables large databases to be quickly searched. This report will briefly (1) describe current technologies in the area of comparison techniques, (2) describe the theory of our new method, and (3) summarize a few of the results.
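The following sketch conveys the flavor of topology-based comparison on a toy problem: critical points of two 2D vector fields are located and classified, then their inventories can be compared. It is a simplification for illustration, not a reconstruction of the report's method.

```python
# Illustrative topology-based comparison: find and classify critical
# points of two synthetic 2D vector fields, then compare the inventories.
import numpy as np

def critical_points(u, v, xs, ys):
    """Cells where both components change sign; classified via the Jacobian."""
    pts = []
    for i in range(len(ys) - 1):
        for j in range(len(xs) - 1):
            cu, cv = u[i:i+2, j:j+2], v[i:i+2, j:j+2]
            if cu.min() < 0 < cu.max() and cv.min() < 0 < cv.max():
                dx, dy = xs[j+1] - xs[j], ys[i+1] - ys[i]
                jac = np.array([
                    [(cu[:, 1] - cu[:, 0]).mean() / dx, (cu[1] - cu[0]).mean() / dy],
                    [(cv[:, 1] - cv[:, 0]).mean() / dx, (cv[1] - cv[0]).mean() / dy],
                ])
                kind = "saddle" if np.linalg.det(jac) < 0 else "node/focus/center"
                pts.append((round(xs[j], 3), round(ys[i], 3), kind))
    return pts

xs = ys = np.linspace(-1, 1, 64)
X, Y = np.meshgrid(xs, ys)
for name, (u, v) in [("source", (X, Y)), ("rotation", (Y, -X))]:
    print(name, critical_points(u, v, xs, ys))
# Two fields "match" topologically when their critical-point sets agree.
```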
Improving microstructural quantification in FIB/SEM nanotomography.
Taillon, Joshua A; Pellegrinelli, Christopher; Huang, Yi-Lin; Wachsman, Eric D; Salamanca-Riba, Lourdes G
2018-01-01
FIB/SEM nanotomography (FIB-nt) is a powerful technique for the determination and quantification of the three-dimensional microstructure of subsurface features. Often, the microstructure of a sample is the ultimate determiner of the overall performance of a system, and a detailed understanding of its properties is crucial in advancing the materials engineering of the resulting device. While the FIB-nt technique has developed significantly in the 15 years since its introduction, advanced nanotomographic analysis is still far from routine, and a number of challenges remain in data acquisition and post-processing. In this work, we present a number of techniques to improve the quality of the acquired data, together with easy-to-implement methods to obtain "advanced" microstructural quantifications. The techniques are applied to a solid oxide fuel cell cathode of interest to the electrochemistry community, but the methodologies are easily adaptable to a wide range of material systems. Finally, results from an analyzed sample are presented as a practical example of how these techniques can be implemented. Copyright © 2017 Elsevier B.V. All rights reserved.
Srbek, Jan; Klejdus, Bořivoj; Douša, Michal; Břicháč, Jiří; Stasiak, Pawel; Reitmajer, Josef; Nováková, Lucie
2014-12-01
In this study, direct analysis in real time-mass spectrometry (DART-MS) was assessed for the analysis of various pharmaceutical formulations, with the intention of summarizing possible applications for routine pharmaceutical development. As DART is an ambient ionization technique, it allows direct analysis of pharmaceutical samples in solid or liquid form without complex sample preparation, which is often the most time-consuming part of an analytical method. This makes the technique suitable for many application fields, including pharmaceutical drug development. DART mass spectra of more than twenty selected tablets and other common pharmaceutical formulations, i.e. injection solutions, ointments and suppositories developed in the pharmaceutical industry during several recent years, are presented. Moreover, as thin-layer chromatography (TLC) is still very popular for monitoring reactions in synthetic chemistry, several substances were analyzed directly from TLC plates to demonstrate the simplicity of the technique: pure substance solutions were spotted onto a TLC plate and then analyzed with DART without separation. This was the first DART-MS study of pharmaceutical dosage forms using the DART-Orbitrap combination. Sample analysis by DART-MS took only a few seconds, which was enough time to collect a sufficient number of data points for compound identification. The experimental setup provided excellent mass accuracy and high resolution of the mass spectra, which allowed unambiguous identification of the compounds of interest. Finally, DART mass spectrometry was also used to monitor the distribution of a selected impurity in atorvastatin tablets. These measurements demonstrated DART to be a robust ionization technique that provides easy-to-interpret mass spectra for a broad range of compounds. DART has high-throughput potential for various types of pharmaceutical analyses because it eliminates the time needed for sample cleanup and chromatographic separation. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Latorre, Borja; Peña-Sancho, Carolina; Angulo-Jaramillo, Rafaël; Moret-Fernández, David
2015-04-01
Measurement of soil hydraulic properties is of paramount importance in fields such as agronomy, hydrology and soil science. Based on an analysis of the Haverkamp et al. (1994) model, the aim of this paper is to present a technique to estimate the soil hydraulic properties (sorptivity, S, and hydraulic conductivity, K) from full-time cumulative infiltration curves. The method (NSH) was validated by means of 12 synthetic infiltration curves generated with HYDRUS-3D from known soil hydraulic properties. The K values used to simulate the synthetic curves were compared to those estimated with the proposed method. A procedure to identify and remove the effect of the contact sand layer on the cumulative infiltration curve was also developed. A sensitivity analysis was performed using the water level measurement as the uncertainty source. Finally, the procedure was evaluated using different infiltration times and data noise. Since a good correlation (R² = 0.98) was obtained between the K used in HYDRUS-3D to model the infiltration curves and the values estimated by the NSH method, it can be concluded that this technique is robust enough to estimate the soil hydraulic conductivity from complete infiltration curves. The numerical procedure to detect and remove the influence of the contact sand layer on the K and S estimates also proved robust and efficient. An effect of infiltration-curve noise on the K estimate was observed, with the uncertainty increasing with increasing noise. Finally, the results showed that infiltration time is an important factor in estimating K: lower values of K, or smaller uncertainty, required longer infiltration times.
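A minimal sketch of the curve-fitting step follows, under simplifying assumptions: only the short-time two-term form I(t) = S·√t + c·t is fitted, where c lumps together conductivity and geometry terms; recovering K itself requires the full Haverkamp et al. (1994) formulation, which is not reproduced here.

```python
# Fit the short-time two-term infiltration form I(t) = S*sqrt(t) + c*t
# to a synthetic cumulative infiltration curve by nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

def two_term(t, S, c):
    return S * np.sqrt(t) + c * t

rng = np.random.default_rng(3)
t = np.linspace(1, 600, 120)                    # time (s)
I_obs = two_term(t, 0.05, 1e-4) + rng.normal(0, 0.005, t.size)  # noisy curve (cm)

(S_hat, c_hat), cov = curve_fit(two_term, t, I_obs, p0=(0.01, 1e-5))
print(f"S = {S_hat:.4f} cm/s^0.5, c = {c_hat:.2e} cm/s")
print("1-sigma uncertainties:", np.sqrt(np.diag(cov)))
```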
On application of image analysis and natural language processing for music search
NASA Astrophysics Data System (ADS)
Gwardys, Grzegorz
2013-10-01
In this paper, I investigate the problem of finding the most similar music tracks using techniques popular in Natural Language Processing, such as TF-IDF and LDA. I defined a document as a music track. Each music track is transformed into a spectrogram; thanks to that, I can use well-known techniques to obtain words from images. I used the SURF operator to detect characteristic points and a novel approach for their description. Standard k-means was used for clustering. Clustering is here identical to dictionary building, so afterwards I can transform spectrograms into text documents and perform TF-IDF and LDA. Finally, I can make a query in the obtained vector space. The research was done on 16 music tracks for training and 336 for testing, split into four categories: Hiphop, Jazz, Metal and Pop. Although the technique used is completely unsupervised, the results are satisfactory and encourage further research.
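A condensed sketch of this pipeline follows (spectrogram, local descriptors, k-means dictionary, TF-IDF vectors). ORB is substituted for the patented SURF detector here, and random noise stands in for real audio; everything else follows the same idea.

```python
# Condensed sketch: spectrogram -> local image descriptors -> k-means
# "dictionary" -> TF-IDF vectors. ORB replaces SURF (SURF is not shipped
# with stock OpenCV builds); noise stands in for real music tracks.
import numpy as np
import cv2
from scipy.signal import spectrogram
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfTransformer

N_WORDS = 50

def descriptors(audio, fs):
    """Render one track's spectrogram as an 8-bit image and describe it."""
    _, _, sxx = spectrogram(audio, fs=fs, nperseg=256)
    img = cv2.normalize(np.log1p(sxx), None, 0, 255, cv2.NORM_MINMAX)
    _, desc = cv2.ORB_create(nfeatures=200).detectAndCompute(img.astype(np.uint8), None)
    return desc if desc is not None else np.zeros((0, 32), np.uint8)

fs = 8000
rng = np.random.default_rng(4)
tracks = [rng.standard_normal(fs * 5) for _ in range(8)]     # stand-in "music"
descs = [descriptors(a, fs) for a in tracks]

kmeans = KMeans(n_clusters=N_WORDS, n_init=4, random_state=0)
kmeans.fit(np.vstack(descs).astype(np.float32))              # visual dictionary

counts = np.array([np.bincount(kmeans.predict(d.astype(np.float32)),
                               minlength=N_WORDS)
                   if len(d) else np.zeros(N_WORDS, int) for d in descs])
tfidf = TfidfTransformer().fit_transform(counts)             # tracks -> vectors
# Cosine-nearest rows of `tfidf` are the "most similar tracks".
```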
Boix, Macarena; Cantó, Begoña
2013-04-01
Accurate image segmentation is used in medical diagnosis since this technique is a noninvasive pre-processing step for biomedical treatment. In this work we present an efficient segmentation method for medical image analysis; in particular, blood cells can be segmented with this method. To that end, we combine the wavelet transform with morphological operations. Moreover, the wavelet thresholding technique is used to eliminate noise and prepare the image for suitable segmentation. In wavelet denoising we determine the best wavelet as the one that yields a segmentation with the largest area in the cell. We study different wavelet families and conclude that the wavelet db1 is the best; it can serve for future work on blood pathologies. The proposed method generates good results when applied to several images. Finally, the proposed algorithm, implemented in the MATLAB environment, is verified on a selection of blood cell images.
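A rough Python analogue of the described MATLAB pipeline, with illustrative parameter values: db1 wavelet denoising by soft-thresholding detail coefficients, then Otsu thresholding and morphological clean-up.

```python
# db1 wavelet denoising, then Otsu segmentation and morphology clean-up.
import numpy as np
import pywt
from skimage import filters, morphology

def wavelet_denoise(img, wavelet="db1", level=2):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # Universal threshold estimated from the finest diagonal detail band.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(img.size))
    kept = [coeffs[0]] + [tuple(pywt.threshold(d, thr, mode="soft") for d in band)
                          for band in coeffs[1:]]
    return pywt.waverec2(kept, wavelet)

rng = np.random.default_rng(5)
cells = np.zeros((128, 128))
cells[40:70, 50:90] = 1.0                           # one rectangular "cell"
noisy = cells + rng.normal(0, 0.2, cells.shape)

smooth = wavelet_denoise(noisy)
mask = smooth > filters.threshold_otsu(smooth)      # Otsu segmentation
mask = morphology.remove_small_objects(morphology.closing(mask), 64)
print("segmented area (pixels):", int(mask.sum()))
```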
NASA Technical Reports Server (NTRS)
Crawford, Daniel J.; Burdette, Daniel W.; Capron, William R.
1993-01-01
The methodology and techniques used to collect and analyze look-point position data from a real-time ATC display-format comparison experiment are documented. That study compared the delivery precision and controller workload of three final approach spacing aid display formats. Using an oculometer, controller lookpoint position data were collected, associated with gaze objects (e.g., moving aircraft) on the ATC display, and analyzed to determine eye-scan behavior. The equipment involved and algorithms for saving, synchronizing with the ATC simulation output, and filtering the data are described. Target (gaze object) and cross-check scanning identification algorithms are also presented. Data tables are provided of total dwell times, average dwell times, and cross-check scans. Flow charts, block diagrams, file record descriptors, and source code are included. The techniques and data presented are intended to benefit researchers in other studies that incorporate non-stationary gaze objects and oculometer equipment.
Development and fabrication of patient-specific knee implant using additive manufacturing techniques
NASA Astrophysics Data System (ADS)
Zammit, Robert; Rochman, Arif
2017-10-01
Total knee replacement is the most effective treatment to relieve pain and restore normal function in a diseased knee joint. The aim of this research was to develop a patient-specific knee implant that can be fabricated using additive manufacturing techniques and has reduced wear rates thanks to highly wear-resistant materials. The proposed design was chosen based on implant requirements, such as reduced wear rates as well as strong fixation. The patient-specific knee implant improves on conventional knee implants by modifying the articulating surfaces and bone-implant interfaces. Moreover, tribological tests of different polymeric wear couples were carried out to determine the optimal materials for the articulating surfaces. Finite element analysis was utilized to evaluate the stresses sustained by the proposed design. Finally, the patient-specific knee implant was successfully built using additive manufacturing techniques.
Burnett, T. L.; McDonald, S. A.; Gholinia, A.; Geurts, R.; Janus, M.; Slater, T.; Haigh, S. J.; Ornek, C.; Almuaili, F.; Engelberg, D. L.; Thompson, G. E.; Withers, P. J.
2014-01-01
Increasingly researchers are looking to bring together perspectives across multiple scales, or to combine insights from different techniques, for the same region of interest. To this end, correlative microscopy has already yielded substantial new insights in two dimensions (2D). Here we develop correlative tomography where the correlative task is somewhat more challenging because the volume of interest is typically hidden beneath the sample surface. We have threaded together x-ray computed tomography, serial section FIB-SEM tomography, electron backscatter diffraction and finally TEM elemental analysis all for the same 3D region. This has allowed observation of the competition between pitting corrosion and intergranular corrosion at multiple scales revealing the structural hierarchy, crystallography and chemistry of veiled corrosion pits in stainless steel. With automated correlative workflows and co-visualization of the multi-scale or multi-modal datasets the technique promises to provide insights across biological, geological and materials science that are impossible using either individual or multiple uncorrelated techniques. PMID:24736640
NASA Astrophysics Data System (ADS)
Le, Thien-Phu
2017-10-01
The frequency-scale domain decomposition technique has recently been proposed for operational modal analysis. The technique is based on the Cauchy mother wavelet. In this paper, the approach is extended to the Morlet mother wavelet, which is very popular in signal processing due to its superior time-frequency localization. Based on the regressive form and an appropriate norm of the Morlet mother wavelet, the continuous wavelet transform of the power spectral density of ambient responses enables modes in the frequency-scale domain to be highlighted. Analytical developments first demonstrate the link between modal parameters and the local maxima of the continuous wavelet transform modulus. The link formula is then used as the foundation of the proposed modal identification method. Its practical procedure, combined with the singular value decomposition algorithm, is presented step by step. The proposition is finally verified using numerical examples and a laboratory test.
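A simplified illustration of the core idea follows: a mode is located as a local maximum of the Morlet CWT modulus of the response PSD. The paper's regressive Morlet formulation and the damping identification step are not reproduced here.

```python
# A lightly damped "mode" is excited by noise and its frequency is found
# as the maximum of the Morlet CWT modulus of the Welch PSD.
import numpy as np
import pywt
from scipy import signal

fs = 200.0
rng = np.random.default_rng(6)
b, a = signal.iirpeak(10.0, Q=30.0, fs=fs)       # stand-in for a 10 Hz mode
resp = signal.lfilter(b, a, rng.standard_normal(int(60 * fs)))

f, psd = signal.welch(resp, fs=fs, nperseg=1024)
coef, _ = pywt.cwt(psd, np.arange(2, 64), "morl")   # CWT along frequency axis
s_idx, f_idx = np.unravel_index(np.abs(coef).argmax(), coef.shape)
print(f"modal peak near {f[f_idx]:.2f} Hz (scale index {s_idx})")
```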
Facilitating LOS Debriefings: A Training Manual
NASA Technical Reports Server (NTRS)
McDonnell, Lori K.; Jobe, Kimberly K.; Dismukes, R. Key
1997-01-01
This manual is a practical guide to help airline instructors effectively facilitate debriefings of Line Oriented Simulations (LOS). It is based on a recently completed study of Line Oriented Flight Training (LOFT) debriefings at several U.S. airlines. This manual presents specific facilitation tools instructors can use to achieve debriefing objectives. The approach of the manual is flexible so it can be tailored to the individual needs of each airline. Part One clarifies the purpose and objectives of facilitation in the LOS setting. Part Two provides recommendations for clarifying roles and expectations and presents a model for organizing discussion. Part Three suggests techniques for eliciting active crew participation and in-depth analysis and evaluation. Finally, in Part Four, these techniques are organized according to the facilitation model. Examples of how to use the techniques effectively are provided throughout, including strategies to try when the debriefing objectives are not being fully achieved.
aCGH-MAS: Analysis of aCGH by means of Multiagent System
Benito, Rocío; Bajo, Javier; Rodríguez, Ana Eugenia; Abáigar, María
2015-01-01
There are currently different techniques, such as CGH arrays, for studying genetic variations in patients. CGH arrays analyze gains and losses in different regions of the chromosome. Regions with gains or losses in pathologies are important for selecting relevant genes or CNVs (copy-number variations) associated with the variations detected within chromosomes. Information corresponding to mutations, genes, proteins, variations, CNVs, and diseases can be found in different databases, and it would be of interest to incorporate information from different sources to extract what is relevant. This work proposes a multiagent system to manage the information of aCGH arrays, with the aim of providing an intuitive and extensible system to analyze and interpret the results. The agent roles integrate statistical techniques to select relevant variations, visualization techniques for interpreting the final results, and a CBR system to extract relevant information from different information sources. PMID:25874203
Approaches to self-assembly of colloidal monolayers: A guide for nanotechnologists.
Lotito, Valeria; Zambelli, Tomaso
2017-08-01
Self-assembly of quasi-spherical colloidal particles in two-dimensional (2D) arrangements is essential for a wide range of applications from optoelectronics to surface engineering, from chemical and biological sensing to light harvesting and environmental remediation. Several self-assembly approaches have flourished throughout the years, with specific features in terms of complexity of implementation, sensitivity to process parameters, and characteristics of the final colloidal assembly. Selecting the proper method for a given application amidst the vast literature in this field can be a challenging task. In this review, we present an extensive classification and comparison of the different techniques adopted for 2D self-assembly in order to provide useful guidelines for scientists approaching this field. After an overview of the main applications of 2D colloidal assemblies, we describe the main mechanisms underlying their formation and introduce the mathematical tools commonly used to analyse their final morphology. Subsequently, we examine in detail each class of self-assembly techniques, with an explanation of the physical processes intervening in crystallization and a thorough investigation of the technical peculiarities of the different practical implementations. We point out the specific characteristics of the set-ups and apparatuses developed for self-assembly in terms of complexity, requirements, reproducibility, robustness, sensitivity to process parameters and morphology of the final colloidal pattern. Such an analysis will help the reader to identify more easily the approach most suitable for a given application and will draw attention to the importance of the details of each implementation for the final results. Copyright © 2017 Elsevier B.V. All rights reserved.
Implementation of a General Real-Time Visual Anomaly Detection System Via Soft Computing
NASA Technical Reports Server (NTRS)
Dominguez, Jesus A.; Klinko, Steve; Ferrell, Bob; Steinrock, Todd (Technical Monitor)
2001-01-01
The intelligent visual system detects anomalies or defects in real time under normal lighting operating conditions. The application is basically a learning machine that integrates fuzzy logic (FL), artificial neural network (ANN), and genetic algorithm (GA) schemes to process the image, run the learning process, and finally detect the anomalies or defects. The system acquires the image, performs segmentation to separate the object being tested from the background, preprocesses the image using fuzzy reasoning, performs the final segmentation using fuzzy reasoning techniques to retrieve regions with potential anomalies or defects, and finally retrieves them using a learning model built via ANN and GA techniques. FL provides a powerful framework for knowledge representation and overcomes the uncertainty and vagueness typically found in image analysis; ANN provides learning capabilities; and GA leads to robust learning results. An application prototype currently runs on a regular PC under Windows NT, and preliminary work has been performed to build an embedded version with multiple image processors. The application prototype is being tested at the Kennedy Space Center (KSC), Florida, to visually detect anomalies along slide basket cables utilized by the astronauts to evacuate the NASA Shuttle launch pad in an emergency. The potential applications of this anomaly detection system in an open environment are quite wide. Another current, potentially viable application at NASA is in detecting anomalies on the NASA Space Shuttle Orbiter's radiator panels.
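As a small, hedged illustration of the fuzzy preprocessing stage only (the ANN and GA components are not reproduced), the classic fuzzy intensification (INT) operator can be sketched as follows.

```python
# Fuzzy preprocessing stage only: the classic intensification (INT)
# operator sharpens each pixel's membership to "bright" vs "dark".
import numpy as np

def fuzzy_intensify(img, passes=2):
    g = img.astype(float)
    mu = (g - g.min()) / (g.max() - g.min() + 1e-12)   # fuzzify to [0, 1]
    for _ in range(passes):                            # INT operator
        mu = np.where(mu <= 0.5, 2 * mu**2, 1 - 2 * (1 - mu)**2)
    return mu                                          # contrast-enhanced map

rng = np.random.default_rng(7)
frame = rng.integers(0, 256, (64, 64))
enhanced = fuzzy_intensify(frame)
print(float(enhanced.min()), float(enhanced.max()))    # stays within [0, 1]
```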
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David Wilkins
2012-03-20
This thesis presents the first measurement of 6 hadronic event shapes in proton-proton collisions at a center-of-mass energy of √s = 7 TeV using the ATLAS detector at the Large Hadron Collider. Results are presented at the particle-level, permitting comparisons to multiple Monte Carlo event generator tools. Numerous tools and techniques that enable detailed analysis of the hadronic final state at high luminosity are described. The approaches presented utilize the dual strengths of the ATLAS calorimeter and tracking systems to provide high resolution and robust measurements of the hadronic jets that constitute both a background and a signal throughout ATLAS physics analyses. The study of the hadronic final state is then extended to jet substructure, where the energy flow and topology within individual jets is studied at the detector level and techniques for estimating systematic uncertainties for such measurements are commissioned in the first data. These first substructure measurements in ATLAS include the jet mass and sub-jet multiplicity as well as those concerned with multi-body hadronic decays and color flow within jets. Finally, the first boosted hadronic object observed at the LHC - the decay of the top quark to a single jet - is presented.
Metri, Malasiddappa; Hegde, Swaroop; Dinesh, K; Indiresha, H N; Nagaraj, Shruthi; Bhandi, Shilpa H
2015-11-01
To evaluate the effectiveness of two final irrigation techniques for the removal of the precipitate formed by the interaction between sodium hypochlorite (NaOCl) and chlorhexidine (CHX). Sixty freshly extracted human maxillary incisor teeth were taken and randomly divided into three groups of 20 teeth each. Group 1 (control group) was irrigated with 5 ml of 2.5% NaOCl and a final flush of 5 ml of 2% chlorhexidine. Group 2 was irrigated with 5 ml of 2.5% NaOCl and 5 ml of 2% chlorhexidine, followed by 5 ml of saline and agitation with F-files. Group 3 was irrigated with 5 ml of 2.5% NaOCl and 5 ml of 2% chlorhexidine, followed by 5 ml of 15% citric acid and passive ultrasonic agitation. A thin longitudinal groove was made along the buccal and lingual aspects of the root using diamond disks, and the tooth was split with chisel and mallet. Both halves of each split tooth were examined under a stereomicroscope. Results were tabulated and analyzed statistically using analysis of variance (ANOVA) and the Mann-Whitney U test. There was a significant difference between the mean values (p < 0.05) in groups 2 and 3 compared to group 1 at each level. Passive ultrasonic irrigation was more effective than the F-file agitation technique in removing the precipitate at all three levels measured. The combined sodium hypochlorite and chlorhexidine irrigation protocol has been practiced for many years to achieve good results. However, it has an adverse effect in the form of a precipitate, which is considered to be carcinogenic; hence this precipitate should be removed.
Purification and proteomic analysis of plant plasma membranes.
Alexandersson, Erik; Gustavsson, Niklas; Bernfur, Katja; Karlsson, Adine; Kjellbom, Per; Larsson, Christer
2008-01-01
All techniques needed for proteomic analyses of plant plasma membranes are described in detail, from isolation of plasma membranes to protein identification by mass spectrometry (MS). Plasma membranes are isolated by aqueous two-phase partitioning yielding vesicles with a cytoplasmic side-in orientation and a purity of about 95%. These vesicles are turned inside-out by treatment with Brij 58, which removes soluble contaminating proteins enclosed in the vesicles as well as loosely attached proteins. The final plasma membrane preparation thus retains all integral proteins and many peripheral proteins. Proteins are separated by one-dimensional sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE), and protein bands are excised and digested with trypsin. Peptides in tryptic digests are separated by nanoflow liquid chromatography and either fed directly into an ESI-MS or spotted onto matrix-assisted laser desorption ionization (MALDI) plates for analysis with MALDI-MS. Finally, data processing and database searching are used for protein identification to define a plasma membrane proteome.
Acquisition and analysis of accelerometer data
NASA Astrophysics Data System (ADS)
Verges, Keith R.
1990-08-01
Acceleration data reduction must be undertaken with a complete understanding of the physical process, the means by which the data are acquired, and finally, the calculations necessary to put the data into a meaningful format. Discussed here are the acceleration sensor requirements dictated by the measurements desired. Sensor noise, dynamic range, and linearity will be determined from the physical parameters of the experiment. The digitizer requirements are discussed. Here the system from sensor to digital storage medium will be integrated, and rules of thumb for experiment duration, filter response, and number of bits are explained. Data reduction techniques after storage are also discussed. Time domain operations including decimating, digital filtering, and averaging are covered, as well as frequency domain methods, including windowing and the difference between power and amplitude spectra, and simple noise determination via coherence analysis. Finally, an example experiment using the Teledyne Geotech Model 44000 Seismometer to measure from 1 Hz to 10(exp -6) Hz is discussed. The sensor, data acquisition system, and example spectra are presented.
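The sketch below walks through three of the operations discussed (anti-aliased decimation, windowed PSD estimation, and coherence between two sensors) on synthetic accelerometer records; all parameter choices are illustrative.

```python
# Decimation, windowed PSD, and coherence between two synthetic sensors
# that share a 5 Hz vibration plus independent noise.
import numpy as np
from scipy import signal

fs = 1000.0
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(8)
common = np.sin(2 * np.pi * 5 * t)                   # shared 5 Hz vibration
acc1 = common + 0.3 * rng.standard_normal(t.size)
acc2 = common + 0.3 * rng.standard_normal(t.size)

acc1_dec = signal.decimate(acc1, 10)                 # anti-aliased decimation
f_psd, psd = signal.welch(acc1_dec, fs=fs / 10, nperseg=1024, window="hann")
f_coh, coh = signal.coherence(acc1, acc2, fs=fs, nperseg=4096)
print(f"PSD peak at {f_psd[psd.argmax()]:.2f} Hz; "
      f"coherence at 5 Hz ~ {coh[np.argmin(np.abs(f_coh - 5))]:.2f}")
```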
Performance analysis and prediction in triathlon.
Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B
2016-01-01
Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.
Nonlinear multivariate and time series analysis by neural network methods
NASA Astrophysics Data System (ADS)
Hsieh, William W.
2004-03-01
Methods in multivariate statistical analysis are essential for working with large amounts of geophysical data, data from observational arrays, from satellites, or from numerical model output. In classical multivariate statistical analysis, there is a hierarchy of methods, starting with linear regression at the base, followed by principal component analysis (PCA) and finally canonical correlation analysis (CCA). A multivariate time series method, the singular spectrum analysis (SSA), has been a fruitful extension of the PCA technique. The common drawback of these classical methods is that only linear structures can be correctly extracted from the data. Since the late 1980s, neural network methods have become popular for performing nonlinear regression and classification. More recently, neural network methods have been extended to perform nonlinear PCA (NLPCA), nonlinear CCA (NLCCA), and nonlinear SSA (NLSSA). This paper presents a unified view of the NLPCA, NLCCA, and NLSSA techniques and their applications to various data sets of the atmosphere and the ocean (especially for the El Niño-Southern Oscillation and the stratospheric quasi-biennial oscillation). These data sets reveal that the linear methods are often too simplistic to describe real-world systems, with a tendency to scatter a single oscillatory phenomenon into numerous unphysical modes or higher harmonics, which can be largely alleviated in the new nonlinear paradigm.
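A compact sketch of NLPCA as a bottleneck autoencoder, the standard neural realization of the idea: the network reconstructs curved data through a single nonlinear component. Architecture and training settings are illustrative assumptions, not those of the paper.

```python
# NLPCA as a bottleneck autoencoder: reconstruct curved 2D data through
# a single nonlinear component.
import numpy as np
import torch
from torch import nn

rng = np.random.default_rng(9)
theta = rng.uniform(0, np.pi, 500)
arc = np.column_stack([np.cos(theta), np.sin(theta)])
arc += 0.05 * rng.standard_normal(arc.shape)        # noisy curved "mode"
x = torch.tensor(arc, dtype=torch.float32)

model = nn.Sequential(
    nn.Linear(2, 8), nn.Tanh(), nn.Linear(8, 1),    # encoder -> 1D component
    nn.Linear(1, 8), nn.Tanh(), nn.Linear(8, 2),    # decoder back to 2D
)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), x)
    loss.backward()
    opt.step()
# A 1-component linear PCA cannot follow the arc; the nonlinear mode can.
print("reconstruction MSE:", float(loss))
```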
EDNA: Expert fault digraph analysis using CLIPS
NASA Technical Reports Server (NTRS)
Dixit, Vishweshwar V.
1990-01-01
Traditionally, fault models are represented by trees. Recently, digraph models have been proposed (Sack). Digraph models closely imitate the real system dependencies and hence are easy to develop, validate and maintain. However, they can also contain directed cycles, and analysis algorithms are hard to find; available algorithms tend to be complicated and slow. On the other hand, tree analysis (VGRH, Tayl) is well understood and rooted in a vast research effort and analytical techniques; the tree analysis algorithms are sophisticated and orders of magnitude faster. Transformation of a (cyclic) digraph into trees (CLP, LP) is a viable approach to blend the advantages of the two representations. Neither the digraphs nor the trees provide the ability to handle heuristic knowledge; an expert system, to capture the engineering knowledge, is essential. We propose an approach here, namely, expert network analysis, which combines the digraph representation with tree algorithms. The models are augmented by probabilistic and heuristic knowledge. CLIPS, an expert system shell from NASA-JSC, will be used to develop a tool. The technique provides the ability to handle probabilities and heuristic knowledge; mixed analysis, with only some nodes carrying probabilities, is possible. The tool provides a graphics interface for input, query, and update. With this combined approach, it is expected to be a valuable tool in the design process as well as in the capture of final design knowledge.
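One hedged illustration of the digraph-to-tree step (not the CLIPS rule base): directed cycles can be collapsed into super-nodes via strongly connected components, leaving an acyclic structure that tree algorithms can traverse. Node names below are invented.

```python
# Collapse directed cycles (SCCs) of a fault digraph into super-nodes,
# leaving a DAG that can be unrolled into a fault-tree-like structure.
import networkx as nx

g = nx.DiGraph([
    ("pump_fail", "low_flow"), ("low_flow", "overheat"),
    ("overheat", "pump_fail"),                 # a feedback cycle
    ("overheat", "top_event"), ("valve_stuck", "low_flow"),
])
dag = nx.condensation(g)                       # each SCC becomes one node
assert nx.is_directed_acyclic_graph(dag)

root = next(n for n, m in dag.nodes(data="members") if "top_event" in m)
tree = nx.bfs_tree(dag.reverse(copy=True), root)   # unroll toward causes
for node in tree.nodes:
    print(node, sorted(dag.nodes[node]["members"]))
```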
Digital image processing and analysis for activated sludge wastewater treatment.
Khan, Muhammad Burhan; Lee, Xue Yong; Nisar, Humaira; Ng, Choon Aun; Yeap, Kim Ho; Malik, Aamir Saeed
2015-01-01
Activated sludge systems are generally used in wastewater treatment plants for processing domestic influent. Conventionally, activated sludge wastewater treatment is monitored by measuring physico-chemical parameters like total suspended solids (TSSol), sludge volume index (SVI) and chemical oxygen demand (COD). For these measurements, tests are conducted in the laboratory and take many hours to give the final result. Digital image processing and analysis offer a better alternative, not only to monitor and characterize the current state of activated sludge but also to predict its future state. Characterization by image processing and analysis is done by correlating the time evolution of parameters extracted by image analysis of flocs and filaments with the physico-chemical parameters. This chapter briefly reviews activated sludge wastewater treatment and the procedures of image acquisition, preprocessing, segmentation and analysis in the specific context of activated sludge treatment. In the latter part, additional procedures like z-stacking and image stitching are introduced for wastewater image preprocessing, which have not previously been used in the context of activated sludge. Different preprocessing and segmentation techniques are proposed, along with a survey of imaging procedures reported in the literature. Finally, the image-analysis-based morphological parameters and their correlation with the monitoring and prediction of activated sludge are discussed. Hence, it is observed that image analysis can play a very useful role in the monitoring of activated sludge wastewater treatment plants.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beer, G K; Hendrix, J L; Rowe, J
1998-06-26
The stray light or "ghost" analysis of the National Ignition Facility's (NIF) Final Optics Assembly (FOA) has proved to be one of the most complex ghost analyses ever attempted. The NIF FOA consists of a bundle of four beam lines that: 1) provides the vacuum seal to the target chamber, 2) converts 1ω to 3ω light, 3) focuses the light on the target, 4) separates a fraction of the 3ω beam for energy diagnostics, 5) separates the three wavelengths to diffract unwanted 1ω and 2ω light away from the target, 6) provides spatial beam smoothing, and 7) provides a debris barrier between the target chamber and the switchyard mirrors. The three wavelengths of light and seven optical elements with three diffractive optic surfaces generate three million ghosts through 4th order. Approximately 24,000 of these ghosts have peak fluence exceeding 1 J/cm². The sheer number of ghost paths requires a visualization method that allows overlapping ghosts on optics and mechanical components to be summed and then mapped to the optical and mechanical component surfaces in 3D space. This paper addresses the following aspects of the NIF Final Optics ghost analysis: 1) materials issues for stray light mitigation, 2) limitations of current software tools (especially in modeling diffractive optics), 3) computer resource limitations affecting automated coherent raytracing, 4) folding the stray light analysis into the opto-mechanical design process, 5) analysis and visualization tools from simple hand calculations to specialized stray light analysis computer codes, and 6) attempts at visualizing these ghosts, one using a CAD model and another using a high-end data visualization software approach.
NASA Astrophysics Data System (ADS)
Yamazaki, Takaharu; Futai, Kazuma; Tomita, Tetsuya; Sato, Yoshinobu; Yoshikawa, Hideki; Tamura, Shinichi; Sugamoto, Kazuomi
2011-03-01
To achieve 3D kinematic analysis of total knee arthroplasty (TKA), 2D/3D registration techniques, which use X-ray fluoroscopic images and a computer-aided design (CAD) model of the knee implant, have attracted attention in recent years. These techniques can provide information on the movement of the radiopaque femoral and tibial components but not on the radiolucent polyethylene insert, because the insert silhouette does not appear clearly on X-ray images. It has therefore been difficult to obtain the 3D kinematics of the polyethylene insert, particularly a mobile-bearing insert that moves on the tibial component. This study presents a technique, and its accuracy, for 3D kinematic analysis of the mobile-bearing insert in TKA using X-ray fluoroscopy, and finally reports clinical applications. For 3D pose estimation of the mobile-bearing insert using X-ray fluoroscopy, tantalum beads and a CAD model incorporating the beads are utilized, and the 3D pose of the insert model is estimated using a feature-based 2D/3D registration technique. To validate the accuracy of the technique, experiments including a computer simulation test were performed. The results showed the pose estimation accuracy was sufficient for analyzing mobile-bearing TKA kinematics (RMS error: about 1.0 mm, 1.0 degree). In the clinical applications, seven patients with mobile-bearing TKA were studied and analyzed during deep knee bending motion. Consequently, the present technique enables a better understanding of mobile-bearing TKA kinematics, and this type of evaluation should be helpful for improving implant design and optimizing TKA surgical techniques.
[The choice of color in fixed prosthetics: what steps should be followed for a reliable outcome?].
Vanheusden, Alain; Mainjot, Amélie
2004-01-01
The creation of a perfectly matched esthetic fixed restoration is undeniably one of the most difficult challenges in modern dentistry. The final outcome depends on several essential steps: the use of an appropriate light source, the accurate analysis and correct evaluation of the patient's tooth parameters (morphology, colour, surface texture, ...), the clear and precise transmission of these data to the laboratory, and their sound interpretation by a dental technician who has perfectly mastered esthetic prosthetic techniques. The purpose of this paper is to give the practitioner a reproducible clinical method for achieving a reliable dental colorimetric analysis.
Correlations Between the Contributions of Individual IVS Analysis Centers
NASA Technical Reports Server (NTRS)
Bockmann, Sarah; Artz, Thomas; Nothnagel, Axel
2010-01-01
Within almost all space-geodetic techniques, contributions of different analysis centers (ACs) are combined in order to improve the robustness of the final product. So far, the contributing series have been assumed to be independent, as each AC processes the observations in different ways. However, the series cannot be completely independent, because each analyst uses the same set of original observations and many of the applied models are subject to conventions shared by all ACs. In this paper, it is shown that neglecting correlations between the contributing series yields overly optimistic formal errors and small, but insignificant, errors in the estimated parameters derived from the adjustment of the combined solution.
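A small numerical illustration of the paper's central point: when n series share a common error with correlation ρ, the scatter of their combination follows σ·√((1+(n−1)ρ)/n) rather than the naive σ/√n that an independence assumption gives. The numbers below are invented.

```python
# Numerical check: correlated series combine less favourably than the
# independence assumption suggests.
import numpy as np

n, rho, trials = 6, 0.5, 100_000
rng = np.random.default_rng(10)
common = rng.standard_normal(trials)
series = np.sqrt(rho) * common + np.sqrt(1 - rho) * rng.standard_normal((n, trials))
combined = series.mean(axis=0)                 # each series has unit variance

print("naive formal error:", 1 / np.sqrt(n))
print("actual scatter    :", combined.std())
print("theory            :", np.sqrt((1 + (n - 1) * rho) / n))
```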
Experimental and numerical investigation on laser-assisted bending of pre-loaded metal plate
NASA Astrophysics Data System (ADS)
Nowak, Zdzisław; Nowak, Marcin; Widłaszewski, Jacek; Kurp, Piotr
2018-01-01
The laser forming technique has an important disadvantage: the plastic deformation generated by a single laser beam pass is limited. To increase the plastic deformation, it is possible to apply external forces during the laser forming process. In this paper, we investigate the influence of external pre-loads on the laser bending of a steel plate. The pre-loads investigated generate bending towards the laser beam. The thermal, elastic-plastic analysis is performed using the commercial nonlinear finite element analysis package ABAQUS. The focus of the paper is to identify how this pattern of pre-load influences the final bend angle of the plate.
The magnifying glass - A feature space local expansion for visual analysis. [and image enhancement
NASA Technical Reports Server (NTRS)
Juday, R. D.
1981-01-01
The Magnifying Glass Transformation (MGT) technique is proposed as a multichannel spectral operation yielding visual imagery that is enhanced in a specified spectral vicinity, guided by the statistics of training samples. An example application is one in which the discrimination among spectral neighbors within an interactive display may be increased without altering the appearance of distant objects or the overall interpretation. A direct histogram specification technique is applied to the channels within the multispectral image so that a subset of the spectral domain occupies an increased fraction of the domain. The transformation is carried out by obtaining the training information, establishing the condition of the covariance matrix, determining the influenced solid, and initializing the lookup table. Finally, the image is transformed.
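A single-channel, hedged analogue of the idea: a piecewise-linear lookup that expands a chosen interval of the data range while compressing the rest, like a magnifying glass in the intensity domain. The real MGT operates on multichannel statistics; this only sketches the "magnification".

```python
# Piecewise-linear lookup: expand [a, b] to a larger share of the output
# range, compressing everything else.
import numpy as np

def magnify(band, a, b, out_lo=0.35, out_hi=0.65):
    """Map [min,a]->[0,out_lo], [a,b]->[out_lo,out_hi], [b,max]->[out_hi,1]."""
    xp = [band.min(), a, b, band.max()]
    fp = [0.0, out_lo, out_hi, 1.0]
    return np.interp(band, xp, fp)

rng = np.random.default_rng(11)
band = rng.uniform(0, 255, (64, 64))
stretched = magnify(band, a=100, b=120)     # the "magnified" neighbourhood
print(float(stretched.min()), float(stretched.max()))
```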
Prediction of Microstructure in HAZ of Welds
NASA Astrophysics Data System (ADS)
Khurana, S. P.; Yancey, R.; Jung, G.
2004-06-01
A modeling technique for predicting microstructure in the heat-affected zone (HAZ) of the hypoeutectoid steels is presented. This technique aims at predicting the phase fractions of ferrite, pearlite, bainite and martensite present in the HAZ after the cool down of a weld. The austenite formation kinetics and austenite decomposition kinetics are calculated using the transient temperature profile. The thermal profile in the weld and the HAZ is calculated by finite-element analysis (FEA). Two kinds of austenite decomposition models are included. The final phase fractions are predicted with the help of a continuous cooling transformation (CCT) diagram of the material. In the calculation of phase fractions either the experimental CCT diagram or the mathematically calculated CCT diagram can be used.
Homogenous polynomially parameter-dependent H∞ filter designs of discrete-time fuzzy systems.
Zhang, Huaguang; Xie, Xiangpeng; Tong, Shaocheng
2011-10-01
This paper proposes a novel H∞ filtering technique for a class of discrete-time fuzzy systems. First, a novel kind of fuzzy H∞ filter, which is homogenous polynomially parameter-dependent on the membership functions with an arbitrary degree, is developed to guarantee the asymptotic stability and a prescribed H∞ performance of the filtering error system. Second, relaxed conditions for H∞ performance analysis are proposed by using a new fuzzy Lyapunov function and the Finsler lemma with homogenous polynomial matrix Lagrange multipliers. Then, based on a new kind of slack variable technique, relaxed linear matrix inequality-based H∞ filtering conditions are proposed. Finally, two numerical examples are provided to illustrate the effectiveness of the proposed approach.
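For reference, a standard statement of the prescribed H∞ performance used in such filter designs (notation assumed here, not quoted from the paper) is:

```latex
% Standard prescribed H-infinity performance for the filtering error
% system (notation assumed): under zero initial conditions, for every
% nonzero disturbance w in l2 the filtering error e must satisfy
\sum_{k=0}^{\infty} e^{\top}(k)\, e(k) \;<\; \gamma^{2} \sum_{k=0}^{\infty} w^{\top}(k)\, w(k),
% i.e. the l2-induced gain from w to e stays below the prescribed level gamma.
```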
A review of thermal methods and technologies for diabetic foot assessment.
Sousa, Paula; Felizardo, Virginie; Oliveira, Daniel; Couto, Rafael; Garcia, Nuno M
2015-07-01
Temperature analysis has been considered as a complementary method in medical evaluation and diagnosis. Several studies demonstrated that monitoring the temperature variations of the feet of diabetic patients can be helpful in the early identification of diabetic foot manifestations, and also in changing behaviors, which may contribute to reducing its incidence. In this review, several and most used techniques for assessing the temperature of the feet are presented, along with original published work on specific applications in diabetic foot complications. A review of solutions and equipment that operate according to the temperature assessment techniques is also presented. Finally, a comparison between the various technologies is presented, and the authors share their perspective on what will be the state of affairs in 5 years.
Pulsar timing and general relativity
NASA Technical Reports Server (NTRS)
Backer, D. C.; Hellings, R. W.
1986-01-01
Techniques are described for accounting for relativistic effects in the analysis of pulsar signals. Design features of instrumentation used to achieve millisecond accuracy in the signal measurements are discussed. The accuracy of the data permits modeling the pulsar physical characteristics from the natural glitches in the emissions. Relativistic corrections are defined for adjusting for differences between the pulsar motion in its spacetime coordinate system relative to the terrestrial coordinate system, the earth's motion, and the gravitational potentials of solar system bodies. Modifications of the model to allow for a binary pulsar system are outlined, including treatment of the system as a point mass. Finally, a quadrupole model is presented for gravitational radiation and techniques are defined for using pulsars in the search for gravitational waves.
Applying data fusion techniques for benthic habitat mapping and monitoring in a coral reef ecosystem
NASA Astrophysics Data System (ADS)
Zhang, Caiyun
2015-06-01
Accurate mapping and effective monitoring of benthic habitat in the Florida Keys are critical in developing management strategies for this valuable coral reef ecosystem. For this study, a framework was designed for automated benthic habitat mapping by combining multiple data sources (hyperspectral, aerial photography, and bathymetry data) and four contemporary imagery processing techniques (data fusion, Object-based Image Analysis (OBIA), machine learning, and ensemble analysis). In the framework, 1-m digital aerial photograph was first merged with 17-m hyperspectral imagery and 10-m bathymetry data using a pixel/feature-level fusion strategy. The fused dataset was then preclassified by three machine learning algorithms (Random Forest, Support Vector Machines, and k-Nearest Neighbor). Final object-based habitat maps were produced through ensemble analysis of outcomes from three classifiers. The framework was tested for classifying a group-level (3-class) and code-level (9-class) habitats in a portion of the Florida Keys. Informative and accurate habitat maps were achieved with an overall accuracy of 88.5% and 83.5% for the group-level and code-level classifications, respectively.
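A compact sketch of the preclassification-plus-ensemble stage follows, with synthetic features standing in for the fused spectral/bathymetry object attributes; scikit-learn's voting ensemble is used as a stand-in for the paper's ensemble analysis.

```python
# Three classifiers named in the abstract (Random Forest, SVM, k-NN)
# combined by a voting ensemble, on synthetic stand-in features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=12, n_classes=3,
                           n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier([
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("svm", SVC(probability=True, random_state=0)),
    ("knn", KNeighborsClassifier(n_neighbors=7)),
], voting="soft")
ensemble.fit(X_tr, y_tr)
print("held-out accuracy:", ensemble.score(X_te, y_te))
```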
Liang, Liang; Liu, Minliang; Martin, Caitlin; Sun, Wei
2018-01-01
Structural finite-element analysis (FEA) has been widely used to study the biomechanics of human tissues and organs, as well as tissue-medical device interactions and treatment strategies. However, patient-specific FEA models usually require complex procedures to set up and long computing times to obtain final simulation results, preventing prompt feedback to clinicians in time-sensitive clinical applications. In this study, using machine learning techniques, we developed a deep learning (DL) model to directly estimate the stress distributions of the aorta. The DL model was designed and trained to take the same input as the FEA and directly output the aortic wall stress distributions, bypassing the FEA calculation process. The trained DL model is capable of predicting the stress distributions with average errors of 0.492% and 0.891% in the von Mises stress distribution and peak von Mises stress, respectively. This study marks, to our knowledge, the first demonstration of the feasibility and great potential of using the DL technique as a fast and accurate surrogate of FEA for stress analysis. © 2018 The Author(s).
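A toy sketch of the surrogate idea under stated assumptions: a cheap synthetic function stands in for the FEA solver, and a small network learns the input-to-stress mapping; the paper's aortic shape encoding and network architecture are not reproduced.

```python
# Surrogate sketch: learn a mapping from a shape encoding to a stress
# field, skipping the (here synthetic) solver at inference time.
import numpy as np
import torch
from torch import nn

rng = np.random.default_rng(12)
n_cases, n_shape, n_nodes = 500, 10, 200
shape = rng.standard_normal((n_cases, n_shape)).astype(np.float32)
basis = rng.standard_normal((n_shape, n_nodes)).astype(np.float32)
stress = np.tanh(shape @ basis)                      # stand-in "FEA" results

x, y = torch.tensor(shape), torch.tensor(stress)
net = nn.Sequential(nn.Linear(n_shape, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, n_nodes))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(1000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(x), y)
    loss.backward()
    opt.step()
print("surrogate training MSE:", float(loss))        # inference takes milliseconds
```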
Improving sensor data analysis through diverse data source integration
NASA Astrophysics Data System (ADS)
Casper, Jennifer; Albuquerque, Ronald; Hyland, Jeremy; Leveille, Peter; Hu, Jing; Cheung, Eddy; Mauer, Dan; Couture, Ronald; Lai, Barry
2009-05-01
Daily sensor data volumes are increasing from gigabytes to multiple terabytes, but the manpower and resources needed to analyze the growing amount of data are not increasing at the same rate. Current volumes of diverse data, both live streaming and historical, are not fully analyzed; analysts are mostly left to analyze the individual data sources manually, which is both time consuming and mentally exhausting, and expanding data collections only exacerbate this problem. Improved data management techniques and analysis methods are required to process the increasing volumes of historical and live streaming data sources simultaneously, and to reduce an analyst's decision response time while enabling more intelligent and immediate situation awareness. This paper describes the Sensor Data and Analysis Framework (SDAF) system, built to provide analysts with the ability to pose integrated queries on diverse live and historical data sources and to plug in needed algorithms for upstream processing and filtering. The SDAF system was inspired by input and feedback from field analysts and experts. This paper presents SDAF's capabilities, implementation, and the reasoning behind implementation decisions. Finally, lessons learned from preliminary tests and deployments are captured for future work.
NASA Astrophysics Data System (ADS)
Chiarucci, Riccardo; Madeo, Dario; Loffredo, Maria I.; Castellani, Eleonora; Santarcangelo, Enrica L.; Mocenni, Chiara
2014-07-01
Assessment of hypnotic susceptibility is usually obtained through the application of psychological instruments. A satisfying classification obtained through quantitative measures is still missing, although it would be very useful for both diagnostic and clinical purposes. Aiming at investigating the relationship between the cortical brain activity and the hypnotic susceptibility level, we propose the combined use of two methodologies - Recurrence Quantification Analysis and Detrended Fluctuation Analysis - both inherited from nonlinear dynamics. Indicators obtained through the application of these techniques to EEG signals of individuals in their ordinary state of consciousness allowed us to obtain a clear discrimination between subjects with high and low susceptibility to hypnosis. Finally a neural network approach was used to perform classification analysis.
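A minimal NumPy implementation of one of the two techniques named above, Detrended Fluctuation Analysis: the scaling exponent α is the slope of log F(n) versus log n for the integrated, locally detrended signal. Scale choices are illustrative.

```python
# Minimal first-order DFA: alpha ~ 0.5 for white noise, ~ 1.0 for 1/f noise.
import numpy as np

def dfa_alpha(x, scales=(16, 32, 64, 128, 256)):
    y = np.cumsum(x - np.mean(x))                     # integrated profile
    fluct = []
    for n in scales:
        segs = len(y) // n
        f2 = 0.0
        for s in range(segs):
            seg = y[s * n:(s + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrend
            f2 += np.mean((seg - trend) ** 2)
        fluct.append(np.sqrt(f2 / segs))
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rng = np.random.default_rng(13)
print("white noise alpha ~", round(dfa_alpha(rng.standard_normal(8192)), 2))
```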
Tebani, Abdellah; Afonso, Carlos; Bekri, Soumeya
2018-05-01
This work reports the second part of a review intending to give the state of the art of major metabolic phenotyping strategies. It deals in particular with their inherent advantages and limits regarding data analysis issues and biological information retrieval tools, along with translational challenges. This part starts by introducing the main data preprocessing strategies for the different kinds of metabolomics data. It then describes the main data analysis techniques, including univariate and multivariate aspects. It also addresses the challenges related to metabolite annotation and characterization. Finally, functional analysis, including pathway and network strategies, is discussed. The last section of this review is devoted to practical considerations and current challenges and pathways to bring metabolomics into clinical environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genest-Beaulieu, C.; Bergeron, P., E-mail: genest@astro.umontreal.ca, E-mail: bergeron@astro.umontreal.ca
We present a comparative analysis of atmospheric parameters obtained with the so-called photometric and spectroscopic techniques. Photometric and spectroscopic data for 1360 DA white dwarfs from the Sloan Digital Sky Survey (SDSS) are used, as well as spectroscopic data from the Villanova White Dwarf Catalog. We first test the calibration of the ugriz photometric system by using model atmosphere fits to observed data. Our photometric analysis indicates that the ugriz photometry appears well calibrated when the SDSS to AB95 zeropoint corrections are applied. The spectroscopic analysis of the same data set reveals that the so-called high-log g problem can be solved by applying published correction functions that take into account three-dimensional hydrodynamical effects. However, a comparison between the SDSS and the White Dwarf Catalog spectra also suggests that the SDSS spectra still suffer from a small calibration problem. We then compare the atmospheric parameters obtained from both fitting techniques and show that the photometric temperatures are systematically lower than those obtained from spectroscopic data. This systematic offset may be linked to the hydrogen line profiles used in the model atmospheres. We finally present the results of an analysis aimed at measuring surface gravities using photometric data only.
Noninvasive diagnostic techniques in cardiology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Verani, M.S.
1983-10-01
Noninvasive cardiology has made notable progress in the last several years. A variety of sophisticated tests are now available to the clinician, providing both anatomic and physiologic information. The result has been an improvement in the level of diagnostic accuracy, which in the final analysis translates into better patient care. Newer tests such as the cardiac CAT scan and nuclear magnetic resonance, using incredibly advanced technologies, continue to be investigated and almost certainly will play an important role in cardiovascular diagnosis in years to come.
Remote sensing applied to agriculture: Basic principles, methodology, and applications
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Mendonca, F. J.
1981-01-01
The general principles of remote sensing techniques as applied to agriculture and the methods of data analysis are described. The theoretical spectral responses of crops; the reflectance, transmittance, and absorptance of plants; interactions of plants and soils with reflected energy; leaf morphology; and factors which affect the reflectance of vegetation cover are discussed. The methodologies of visual and computer-aided analyses of LANDSAT data are presented. Finally, a case study wherein infrared film was used to detect crop anomalies and other data applications are described.
Radiative Penguin Decays at the B Factories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koneke, Karsten; /MIT, LNS
2007-11-16
In this article, I review the most recent results in radiative penguin decays from the B factories Belle and BABAR. Most notably, I will talk about the recent new observations in the decays B → (ρ/ω)γ, a new analysis technique in b → sγ, and first measurements of radiative penguin decays in the B_s^0 meson system. Finally, I will summarize the current status and future prospects of radiative penguin B physics at the B factories.
2006-03-01
utilized; for normality, the Shapiro-Wilk test; and finally, for constant variance the Breusch-Pagan test was used. The Durbin-Watson test results... against this violation, it is of concern with respect to the validity of this model. In order to execute the Breusch-Pagan test, it is necessary to obtain... a SSE of 23,297.73. The p-value for the Breusch-Pagan test was obtained via use of a spreadsheet program (Microsoft's Excel) and the expression
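For reference, the same test can be run without a spreadsheet. This is a minimal sketch using statsmodels on synthetic data (all numbers invented); a small p-value flags non-constant variance:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
# Heteroscedastic noise: variance grows with x, which the test should flag.
y = 2.0 + 0.5 * x + rng.normal(0, 0.2 + 0.3 * x)

X = sm.add_constant(x)
fit = sm.OLS(y, X).fit()

lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(fit.resid, X)
print(f"Breusch-Pagan LM p-value: {lm_pvalue:.4g}")  # small p => reject constant variance
```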
[Application of immunologic methods to the analysis of bio-leaching bacteria].
Coto, O; Fernández, A I; León, T; Rodríguez, D
1994-09-01
Pure cultures of Thiobacillus ferrooxidans and mixed cultures of Thiobacillus ferrooxidans and Leptospirillum ferrooxidans isolated from the Matahambre mine (Cuba) were used to adapt immunodiffusion and immunoelectron microscopy to the study of iron-oxidizing bacteria. The possibilities, advantages, and limits of these techniques have been studied from both the identification and the serological characterization points of view. Finally, the efficiency of these methods was tested by applying them to the identification of microorganisms from acidic waters from the mine.
Zargar, Homayoun; Krishnan, Jayram; Autorino, Riccardo; Akca, Oktay; Brandao, Luis Felipe; Laydner, Humberto; Samarasekera, Dinesh; Ko, Oliver; Haber, Georges-Pascal; Kaouk, Jihad H; Stein, Robert J
2014-10-01
Robotic technology is increasingly adopted in urologic surgery, and a variety of techniques have been described for minimally invasive treatment of upper tract urothelial cancer (UTUC). To describe a simplified surgical technique of robot-assisted nephroureterectomy (RANU) and to report our single-center surgical outcomes. Patients with a history of UTUC treated with this modality between April 2010 and August 2013 were included in the analysis. Institutional review board approval was obtained. Informed consent was signed by all patients. A simplified single-step RANU not requiring repositioning or robot redocking. Lymph node dissection was performed selectively. Descriptive analysis of patients' characteristics, perioperative outcomes, histopathology, and short-term follow-up data was performed. The analysis included 31 patients (mean age: 72.4±10.6 yr; mean body mass index: 26.6±5.1 kg/m²). Twenty-six of 30 tumors (86%) were high grade. Mean tumor size was 3.1±1.8 cm. Of the 31 patients, 13 (42%) had pT3 stage disease. One periureteric positive margin was noted in a patient with bulky T3 disease. The mean number of lymph nodes removed was 9.4 (standard deviation: 5.6; range: 3-21). Two of 14 patients (14%) had positive lymph nodes on final histology. No patients required a blood transfusion. Six patients experienced complications postoperatively, with only one being a high-grade (Clavien 3b) complication. Median hospital stay was 5 d. Within the follow-up period, seven patients experienced bladder recurrences and four patients developed metastatic disease. Our RANU technique eliminates the need for patient repositioning or robot redocking. It can be safely reproduced, with surgical outcomes comparable to other established techniques. We describe a surgical technique using the da Vinci robot for minimally invasive treatment of patients presenting with upper tract urothelial cancer. This technique can be safely implemented with good surgical outcomes. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Martin, Clessen J.
Volume 2, the appendix to the final report of Project FAST, consists of prose selections used to study the effects of text reduction techniques on the comprehension and recall of written materials among visually handicapped and hearing impaired subjects. Each selection is presented in various versions such as 10 percent subjective deleted, 20…
Visual Exploration of Semantic Relationships in Neural Word Embeddings
Liu, Shusen; Bremer, Peer-Timo; Thiagarajan, Jayaraman J.; ...
2017-08-29
Constructing distributed representations for words through neural language models and using the resulting vector spaces for analysis has become a crucial component of natural language processing (NLP). But, despite their widespread application, little is known about the structure and properties of these spaces. To gain insights into the relationship between words, the NLP community has begun to adapt high-dimensional visualization techniques. Particularly, researchers commonly use t-distributed stochastic neighbor embeddings (t-SNE) and principal component analysis (PCA) to create two-dimensional embeddings for assessing the overall structure and exploring linear relationships (e.g., word analogies), respectively. Unfortunately, these techniques often produce mediocre or even misleading results and cannot address domain-specific visualization challenges that are crucial for understanding semantic relationships in word embeddings. We introduce new embedding techniques for visualizing semantic and syntactic analogies, and the corresponding tests to determine whether the resulting views capture salient structures. Additionally, we introduce two novel views for a comprehensive study of analogy relationships. Finally, we augment t-SNE embeddings to convey uncertainty information in order to allow a reliable interpretation. Combined, the different views address a number of domain-specific tasks difficult to solve with existing tools.
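The two baseline projections the paper argues against are easy to reproduce. This sketch uses random vectors as a stand-in for trained word embeddings and the standard scikit-learn calls:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

rng = np.random.default_rng(1)
# Stand-in for word vectors: 500 "words" in a 100-dimensional embedding space.
vectors = rng.normal(size=(500, 100))

# PCA projection: preserves linear structure such as analogy directions.
pca_2d = PCA(n_components=2).fit_transform(vectors)

# t-SNE: preserves local neighborhoods, useful for inspecting cluster structure.
tsne_2d = TSNE(n_components=2, perplexity=30, init="pca",
               random_state=1).fit_transform(vectors)
print(pca_2d.shape, tsne_2d.shape)  # (500, 2) (500, 2)
```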
Analysis and synthesis of laughter
NASA Astrophysics Data System (ADS)
Sundaram, Shiva; Narayanan, Shrikanth
2004-10-01
There is much enthusiasm in the text-to-speech community for synthesis of emotional and natural speech. One idea being proposed is to include emotion-dependent paralinguistic cues during synthesis to convey emotions effectively. This requires modeling and synthesis techniques for the various cues of different emotions. Motivated by this, a technique to synthesize human laughter is proposed. Laughter is a complex mechanism of expression and has high variability in terms of types and usage in human-human communication. People have their own characteristic way of laughing. Laughter can be seen as a controlled/uncontrolled physiological process of a person resulting from an initial excitation in context. A parametric model based on damped simple harmonic motion is developed here to effectively capture these diversities while maintaining an individual's characteristics. Limited laughter/speech data from actual humans and ease of synthesis are the constraints imposed on the accuracy of the model. Analysis techniques are also developed to determine the parameters of the model for a given individual or laughter type. Finally, the effectiveness of the model in capturing individual characteristics and naturalness compared to real human laughter has been analyzed.
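The backbone of such a parametric model is a damped oscillator used as an amplitude envelope. The sketch below uses invented parameter values (syllable rate, damping, carrier pitch), not the authors' fitted values:

```python
import numpy as np

def damped_shm(t, amp, zeta, freq, phase=0.0):
    """Damped simple harmonic motion: amp * exp(-zeta*t) * cos(2*pi*freq*t + phase)."""
    return amp * np.exp(-zeta * t) * np.cos(2 * np.pi * freq * t + phase)

fs = 16000
t = np.arange(0, 1.5, 1 / fs)
# Hypothetical parameters: a ~5 Hz syllable rate enveloped by a decaying oscillation.
envelope = np.abs(damped_shm(t, amp=1.0, zeta=2.0, freq=5.0))
carrier = np.sin(2 * np.pi * 220.0 * t)  # crude voiced source at 220 Hz
laugh = envelope * carrier               # laugh-like bout of decaying "syllables"
print(laugh.shape)
```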
Modern separation techniques coupled to high performance mass spectrometry for glycolipid analysis.
Sarbu, Mirela; Zamfir, Alina Diana
2018-01-21
Glycolipids (GLs), involved in biological processes and pathologies such as viral, neurodegenerative, and oncogenic transformations, are in the focus of research related to method development for structural analysis. This review highlights modern separation techniques coupled to mass spectrometry (MS) for the investigation of GLs from various biological matrices. The first section is dedicated to methods which, although they provide the separation in a non-liquid phase, are able to supply important data on the composition of complex mixtures. While classical thin layer chromatography (TLC) is useful for MS analyses of the fractionated samples, ultramodern ion mobility (IMS), characterized by high reproducibility, makes it possible to discover minor species and to use low sample amounts, in addition to providing conformational separation with isomer discrimination. The second section highlights the advantages, applications, and limitations of liquid-based separation techniques such as high performance liquid chromatography (HPLC) and hydrophilic interaction liquid chromatography (HILIC) in direct or indirect coupling to MS for glycolipidomics surveys. The on- and off-line capillary electrophoresis (CE) MS, offering a remarkable separation efficiency for GLs, is also presented and critically assessed from the technical and application perspective in the final part of the review. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Laser fringe anemometry for aero engine components
NASA Technical Reports Server (NTRS)
Strazisar, A. J.
1986-01-01
Advances in flow measurement techniques in turbomachinery continue to be paced by the need to obtain detailed data for use in validating numerical predictions of the flowfield and for use in the development of empirical models for those flow features which cannot be readily modelled numerically. The use of laser anemometry in turbomachinery research has grown over the last 14 years in response to these needs. Based on past applications and current developments, this paper reviews the key issues which are involved when considering the application of laser anemometry to the measurement of turbomachinery flowfields. Aspects of laser fringe anemometer optical design which are applicable to turbomachinery research are briefly reviewed. Application problems which are common to both laser fringe anemometry (LFA) and laser transit anemometry (LTA) such as seed particle injection, optical access to the flowfield, and measurement of rotor rotational position are covered. The efficiency of various data acquisition schemes is analyzed and issues related to data integrity and error estimation are addressed. Real-time data analysis techniques aimed at capturing flow physics in real time are discussed. Finally, data reduction and analysis techniques are discussed and illustrated using examples taken from several LFA turbomachinery applications.
Density-cluster NMA: A new protein decomposition technique for coarse-grained normal mode analysis.
Demerdash, Omar N A; Mitchell, Julie C
2012-07-01
Normal mode analysis has emerged as a useful technique for investigating protein motions on long time scales. This is largely due to the advent of coarse-graining techniques, particularly Hooke's Law-based potentials and the rotational-translational blocking (RTB) method for reducing the size of the force-constant matrix, the Hessian. Here we present a new method for domain decomposition for use in RTB that is based on hierarchical clustering of atomic density gradients, which we call Density-Cluster RTB (DCRTB). The method reduces the number of degrees of freedom by 85-90% compared with the standard blocking approaches. We compared the normal modes from DCRTB against standard RTB using 1-4 residues in sequence in a single block, with good agreement between the two methods. We also show that Density-Cluster RTB and standard RTB perform well in capturing the experimentally determined direction of conformational change. Significantly, we report superior correlation of DCRTB with B-factors compared with 1-4 residue per block RTB. Finally, we show significant reduction in computational cost for Density-Cluster RTB that is nearly 100-fold for many examples. Copyright © 2012 Wiley Periodicals, Inc.
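A rough illustration of the block-decomposition mechanics using SciPy's hierarchical clustering follows; note that the paper clusters atomic density gradients, whereas plain coordinates are used here only as a stand-in to show how atoms are grouped into rigid blocks:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
coords = rng.uniform(0, 50, size=(300, 3))  # stand-in for atomic coordinates (Angstrom)

# Hierarchical (Ward) clustering of atoms into rigid blocks for an RTB-style
# reduction: ~30 blocks replace 300 atoms in the reduced Hessian.
Z = linkage(coords, method="ward")
blocks = fcluster(Z, t=30, criterion="maxclust")
print(np.unique(blocks).size)  # number of blocks actually produced
```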
Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F
2010-01-01
The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. Hazard Analysis and Critical Control Point (HACCP) analysis, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HSC manipulation process performed at our blood center. The data analysis showed that the hazards with higher RPN values and greater impact on the process are loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to be in compliance with the standards in force and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
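A minimal worked example of the RPN bookkeeping described above; the hazards and their 1-10 scores are invented for illustration only:

```python
# RPN (Risk Priority Number) as used in FMEA/HACCP-style risk analysis:
# RPN = severity x occurrence x detectability, each scored on a 1-10 scale.
hazards = {
    "loss of dose":              {"severity": 9, "occurrence": 3, "detectability": 6},
    "loss of tracking":          {"severity": 8, "occurrence": 2, "detectability": 7},
    "manual data transcription": {"severity": 6, "occurrence": 5, "detectability": 5},
}

def rpn(scores):
    return scores["severity"] * scores["occurrence"] * scores["detectability"]

# Rank hazards by RPN, highest (most urgent to mitigate) first.
for name, scores in sorted(hazards.items(), key=lambda kv: -rpn(kv[1])):
    print(f"{name}: RPN = {rpn(scores)}")
```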
Cascade Error Projection: A Learning Algorithm for Hardware Implementation
NASA Technical Reports Server (NTRS)
Duong, Tuan A.; Daud, Taher
1996-01-01
In this paper, we work out a detailed mathematical analysis for a new learning algorithm termed Cascade Error Projection (CEP) and a general learning framework. This framework can be used to obtain the cascade correlation learning algorithm by choosing a particular set of parameters. Furthermore, the CEP learning algorithm operates on only one layer, whereas the other set of weights can be calculated deterministically. In association with the dynamical stepsize change concept to convert the weight update from an infinite space into a finite space, the relation between the current stepsize and the previous energy level is also given, and the estimation procedure for the optimal stepsize is used for validation of our proposed technique. Weight values of zero are used to start the learning for every layer, and a single hidden unit is applied instead of a pool of candidate hidden units as in the cascade correlation scheme. Therefore, simplicity in hardware implementation is also obtained. Furthermore, this analysis allows us to select from other methods (such as conjugate gradient descent or Newton's second-order method) one which will be a good candidate for the learning technique. The choice of learning technique depends on the constraints of the problem (e.g., speed, performance, and hardware implementation); one technique may be more suitable than others. Moreover, for a discrete weight space, the theoretical analysis presents the capability of learning with limited weight quantization. Finally, 5- to 8-bit parity and chaotic time series prediction problems are investigated; the simulation results demonstrate that 4-bit or more weight quantization is sufficient for learning a neural network using CEP. In addition, it is demonstrated that this technique is able to compensate for lower bit weight resolution by incorporating additional hidden units. However, generalization results may suffer somewhat with lower bit weight quantization.
Nanomaterial characterization through image treatment, 3D reconstruction and AI techniques
NASA Astrophysics Data System (ADS)
Lopez de Uralde Huarte, Juan Jose
Nanotechnology is not only the science of the future, but indeed the science of today. It is used in all sectors, from health to energy, including information technologies and transport. For the present investigation, we have taken carbon black as a use case. This nanomaterial is mixed with a wide variety of materials to improve their properties, such as abrasion resistance, tire and plastic wear, or tinting strength in pigments. Nowadays, indirect methods of analysis, like oil absorption or nitrogen adsorption, are the most common techniques in the nanomaterial industry. These procedures measure the change in physical state while adding oil or nitrogen. In this way, the surface area is estimated and related to the properties of the material. Nevertheless, we have chosen to improve the existing direct methods, which consist in analysing microscopy images of nanomaterials. We have made progress in the image processing treatments and in the extracted features; in fact, some of them outperform the existing features in the literature. In addition, we have applied, for the first time in the literature, machine learning to aggregate categorization. In this way, we identify their morphology automatically, which will determine the final properties of the material it is mixed with. Finally, we have presented an aggregate reconstruction genetic algorithm that, with only two orthogonal images, provides more information than a tomography, which needs many images. To summarize, we have improved the state of the art in direct analysing techniques, allowing in the near future the replacement of the current indirect techniques.
Vera-Sánchez, Juan Antonio; Ruiz-Morales, Carmen; González-López, Antonio
2018-03-01
To provide a multi-stage model to calculate uncertainty in radiochromic film dosimetry with Monte-Carlo techniques. This new approach is applied to single-channel and multichannel algorithms. Two lots of Gafchromic EBT3 are exposed in two different Varian linacs. They are read with an EPSON V800 flatbed scanner. The Monte-Carlo techniques in uncertainty analysis provide a numerical representation of the probability density functions of the output magnitudes. From this numerical representation, traditional parameters of uncertainty analysis as the standard deviations and bias are calculated. Moreover, these numerical representations are used to investigate the shape of the probability density functions of the output magnitudes. Also, another calibration film is read in four EPSON scanners (two V800 and two 10000XL) and the uncertainty analysis is carried out with the four images. The dose estimates of single-channel and multichannel algorithms show a Gaussian behavior and low bias. The multichannel algorithms lead to less uncertainty in the final dose estimates when the EPSON V800 is employed as reading device. In the case of the EPSON 10000XL, the single-channel algorithms provide less uncertainty in the dose estimates for doses higher than four Gy. A multi-stage model has been presented. With the aid of this model and the use of the Monte-Carlo techniques, the uncertainty of dose estimates for single-channel and multichannel algorithms are estimated. The application of the model together with Monte-Carlo techniques leads to a complete characterization of the uncertainties in radiochromic film dosimetry. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
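The Monte-Carlo step itself is straightforward to sketch: sample every input of the dose computation from an assumed distribution and read the uncertainty off the resulting output distribution. Everything below (the calibration form and all numbers) is invented for illustration, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(3)

def dose_from_netod(net_od, a, b):
    """Hypothetical single-channel calibration: dose = a*netOD + b*netOD**2."""
    return a * net_od + b * net_od ** 2

# Monte-Carlo propagation: sample the inputs, then the output samples form a
# numerical representation of the dose probability density.
n = 100_000
net_od = rng.normal(0.35, 0.005, n)   # scanner reading noise (assumed)
a = rng.normal(10.0, 0.2, n)          # calibration-fit uncertainty (assumed)
b = rng.normal(35.0, 1.0, n)

dose = dose_from_netod(net_od, a, b)
print(f"dose = {dose.mean():.2f} Gy, std = {dose.std():.2f} Gy")
# Bias and coverage intervals can be read directly off the `dose` distribution.
```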
Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging
NASA Astrophysics Data System (ADS)
Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke
2011-12-01
In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS, and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th century. Elemental analysis of the samples showed that they contained traces of gold, copper, iron, and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging could be used to complement the results of high energy-based techniques.
Building Change Detection from LIDAR Point Cloud Data Based on Connected Component Analysis
NASA Astrophysics Data System (ADS)
Awrangjeb, M.; Fraser, C. S.; Lu, G.
2015-08-01
Building data are one of the important data types in a topographic database. Building change detection after a period of time is necessary for many applications, such as identification of informal settlements. Based on the detected changes, the database has to be updated to ensure its usefulness. This paper proposes an improved building detection technique, which is a prerequisite for many building change detection techniques. The improved technique examines the gap between neighbouring buildings in the building mask in order to avoid undersegmentation errors. Then, a new building change detection technique from LIDAR point cloud data is proposed. Buildings which are totally new or demolished are directly added to the change detection output. However, for demolished or extended building parts, a connected component analysis algorithm is applied, and for each connected component its area, width, and height are estimated in order to ascertain whether it can be considered a demolished or new building part. Finally, a graphical user interface (GUI) has been developed to update detected changes to the existing building map. Experimental results show that the improved building detection technique can offer not only higher performance in terms of completeness and correctness, but also a lower number of undersegmentation errors compared to its original counterpart. The proposed change detection technique produces no omission errors and thus can be exploited for enhanced automated building information updating within a topographic database. Using the developed GUI, the user can quickly examine each suggested change and indicate his or her decision with a minimum number of mouse clicks.
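The connected-component filtering step can be sketched in a few lines with SciPy; the mask, grid resolution, and area threshold below are all invented stand-ins:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(4)
# Stand-in binary change mask (True = candidate new/demolished part) on a 0.5 m grid.
mask = ndimage.binary_dilation(rng.random((200, 200)) > 0.99, iterations=2)

# Connected component labelling (8-connectivity), then a per-component area test.
labels, n = ndimage.label(mask, structure=np.ones((3, 3)))
cell_area = 0.5 * 0.5                    # m^2 per grid cell (assumed resolution)
min_area = 3.0                           # smallest plausible building part, m^2 (assumed)
areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1)) * cell_area
accepted = np.flatnonzero(areas >= min_area) + 1   # labels that pass the area test
print(f"{n} components, {accepted.size} pass the area test")
```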
Comparison of ITRF2014 station coordinate input time series of DORIS, VLBI and GNSS
NASA Astrophysics Data System (ADS)
Tornatore, Vincenza; Tanır Kayıkçı, Emine; Roggero, Marco
2016-12-01
In this paper, station coordinate time series from three space geodesy techniques that contributed to the realization of the International Terrestrial Reference Frame 2014 (ITRF2014) are compared. In particular, the height component time series extracted from the official combined intra-technique solutions submitted for ITRF2014 by the DORIS, VLBI, and GNSS Combination Centers have been investigated. The main goal of this study is to assess the level of agreement among these three space geodetic techniques. A novel analytic method, modeling time series as discrete-time Markov processes, is presented and applied to the compared time series. The analysis method has proven to be particularly suited to obtaining quasi-cyclostationary residuals, an important property for carrying out a reliable harmonic analysis. We looked for common signatures among the three techniques. Frequencies and amplitudes of the detected signals are reported along with their percentage of incidence. Our comparison shows that two of the estimated signals, with periods of one year and 14 days, are common to all the techniques. Different hypotheses on the nature of the 14-day signal are presented. As a final check, we compared the estimated velocities and their standard deviations (STD) for sites with co-located VLBI, GNSS, and DORIS stations, obtaining a good agreement among the three techniques both in the horizontal (1.0 mm/yr mean STD) and in the vertical (0.7 mm/yr mean STD) component, although some sites show larger STDs, mainly due to lack of data, different data spans, or noisy observations.
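Once the residuals are quasi-cyclostationary, the harmonic analysis amounts to a least-squares fit of sinusoids at candidate periods. A minimal sketch on synthetic height residuals (the periods tested match the abstract; amplitudes and noise are invented):

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(0, 3000)                     # daily epochs (days)
# Synthetic height series (mm) with annual and ~14-day signals plus noise.
y = (2.0 * np.sin(2 * np.pi * t / 365.25)
     + 0.8 * np.sin(2 * np.pi * t / 14.0 + 0.4)
     + rng.normal(0, 1.0, t.size))

# Least-squares harmonic analysis: design matrix with sin/cos pairs per period.
periods = [365.25, 14.0]
A = np.column_stack([np.ones_like(t, dtype=float)]
                    + [f(2 * np.pi * t / p) for p in periods for f in (np.sin, np.cos)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
for i, p in enumerate(periods):
    amp = np.hypot(coef[1 + 2 * i], coef[2 + 2 * i])
    print(f"period {p:7.2f} d: amplitude = {amp:.2f} mm")
```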
NASA Astrophysics Data System (ADS)
Sandoz, J.-P.; Steenaart, W.
1984-12-01
The nonuniform sampling digital phase-locked loop (DPLL) with sequential loop filter, in which the correction sizes are controlled by the accumulated differences of two additional phase comparators, is graphically analyzed. In the absence of noise and frequency drift, the analysis gives some physical insight into the acquisition and tracking behavior. Taking noise into account, a mathematical model is derived and a random walk technique is applied to evaluate the rms phase error and the mean acquisition time. Experimental results confirm the appropriate simplifying hypotheses used in the numerical analysis. Two related performance measures defined in terms of the rms phase error and the acquisition time for a given SNR are used. These measures provide a common basis for comparing different digital loops and, to a limited extent, also with a first-order linear loop. Finally, the behavior of a modified DPLL under frequency deviation in the presence of Gaussian noise is tested experimentally and by computer simulation.
Generation of phase edge singularities by coplanar three-beam interference and their detection.
Patorski, Krzysztof; Sluzewski, Lukasz; Trusiak, Maciej; Pokorski, Krzysztof
2017-02-06
In recent years singular optics has gained considerable attention in science and technology. Up to now, optical vortices (phase point dislocations) have been of main interest. This paper presents the first general analysis of the formation of phase edge singularities by coplanar three-beam interference. They can be generated, for example, by three-slit interference or by self-imaging in the Fresnel diffraction field of a sinusoidal grating. We derive a general condition on the ratio of amplitudes of the interfering beams that results in phase edge dislocations; the lateral separation of the dislocations depends on this ratio as well. Analytically derived properties are corroborated by numerical and experimental studies. We develop a simple, robust, common-path optical self-imaging configuration aided by a coherent tilted reference wave and spatial filtering. Finally, we propose an automatic fringe pattern analysis technique for detecting phase edge dislocations, based on the continuous wavelet transform. The presented studies open new possibilities for developing grating-based sensing techniques for precision metrology of very small phase differences.
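The amplitude-ratio condition is easy to see numerically: for a central beam plus two symmetric side beams, the total field is purely real, and wherever it changes sign the phase jumps by π, i.e., an edge dislocation appears. A sketch with invented wavelength, tilt, and amplitudes (dislocations occur here because the central amplitude is smaller than twice the side-beam amplitude):

```python
import numpy as np

x = np.linspace(-50e-6, 50e-6, 2001)       # lateral coordinate (m)
k = 2 * np.pi / 632.8e-9                   # HeNe wavelength (assumed)
theta = 0.01                               # side-beam tilt in radians (assumed)

# Coplanar three-beam interference: central beam plus two symmetric side beams.
a0, a1 = 1.0, 0.6                          # amplitude ratio chosen for illustration
field = (a0
         + a1 * np.exp(1j * k * np.sin(theta) * x)
         + a1 * np.exp(-1j * k * np.sin(theta) * x))
# The field reduces to the real function a0 + 2*a1*cos(k*sin(theta)*x);
# its sign changes mark pi phase jumps, i.e., phase edge dislocations.
edges = np.flatnonzero(np.diff(np.sign(field.real)))
print(f"{edges.size} phase edge dislocations in the window")
```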
NASA Astrophysics Data System (ADS)
Arif, Sajjad; Tanwir Alam, Md; Ansari, Akhter H.; Bilal Naim Shaikh, Mohd; Arif Siddiqui, M.
2018-05-01
The tribological performance of aluminium hybrid composites reinforced with micro SiC (5 wt%) and nano zirconia (0, 3, 6, and 9 wt%), fabricated through a powder metallurgy technique, was investigated using statistical and artificial neural network (ANN) approaches. The influence of zirconia reinforcement, sliding distance, and applied load was analyzed with tests based on a full factorial design of experiments. Analysis of variance (ANOVA) was used to evaluate the percentage contribution of each process parameter to wear loss. The ANOVA approach suggested that wear loss is mainly influenced by sliding distance, followed by zirconia reinforcement and applied load. Further, a feed-forward back-propagation neural network was applied to the input/output data for predicting and analyzing the wear behaviour of the fabricated composite. A very close correlation between experimental and ANN outputs was achieved by implementing the model. Finally, the ANN model was effectively used to find the influence of the various control factors on the wear behaviour of the hybrid composites.
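A sketch of the ANOVA step on a synthetic full-factorial wear data set (factor levels follow the abstract; response coefficients and noise are invented), computing each factor's percentage contribution to the total sum of squares:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(6)
# Full factorial: zirconia wt%, applied load (N), sliding distance (m), 2 replicates.
levels = [(z, l, d) for z in (0, 3, 6, 9) for l in (10, 20, 30) for d in (500, 1000, 1500)]
df = pd.DataFrame(levels * 2, columns=["zirconia", "load", "distance"])
df["wear"] = (50 - 2.5 * df["zirconia"] + 0.4 * df["load"]
              + 0.02 * df["distance"] + rng.normal(0, 2, len(df)))

model = ols("wear ~ C(zirconia) + C(load) + C(distance)", data=df).fit()
table = sm.stats.anova_lm(model, typ=2)
# Percentage contribution of each factor (the residual row absorbs the rest).
print(100 * table["sum_sq"] / table["sum_sq"].sum())
```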
Recent advances in capillary ultrahigh pressure liquid chromatography.
Blue, Laura E; Franklin, Edward G; Godinho, Justin M; Grinias, James P; Grinias, Kaitlin M; Lunn, Daniel B; Moore, Stephanie M
2017-11-10
In the twenty years since its initial demonstration, capillary ultrahigh pressure liquid chromatography (UHPLC) has proven to be one of the most powerful separation techniques for the analysis of complex mixtures. This review focuses on the most recent advances made since 2010 towards increasing the performance of such separations. Improvements in capillary column preparation techniques that have led to columns with unprecedented performance are described. New stationary phases and phase supports that have been reported over the past decade are detailed, with a focus on their use in capillary formats. A discussion of the instrument developments that have been required to ensure that extra-column effects do not diminish the intrinsic efficiency of these columns during analysis is also included. Finally, the impact of these capillary UHPLC topics on the field of proteomics and ways in which capillary UHPLC may continue to be applied to the separation of complex samples are addressed. Copyright © 2017 Elsevier B.V. All rights reserved.
Performance-based, cost- and time-effective PCB analytical methodology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alvarado, J. S.
1998-06-11
Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently at low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval.
Estimating Interaction Effects With Incomplete Predictor Variables
Enders, Craig K.; Baraldi, Amanda N.; Cham, Heining
2014-01-01
The existing missing data literature does not provide a clear prescription for estimating interaction effects with missing data, particularly when the interaction involves a pair of continuous variables. In this article, we describe maximum likelihood and multiple imputation procedures for this common analysis problem. We outline 3 latent variable model specifications for interaction analyses with missing data. These models apply procedures from the latent variable interaction literature to analyses with a single indicator per construct (e.g., a regression analysis with scale scores). We also discuss multiple imputation for interaction effects, emphasizing an approach that applies standard imputation procedures to the product of 2 raw score predictors. We thoroughly describe the process of probing interaction effects with maximum likelihood and multiple imputation. For both missing data handling techniques, we outline centering and transformation strategies that researchers can implement in popular software packages, and we use a series of real data analyses to illustrate these methods. Finally, we use computer simulations to evaluate the performance of the proposed techniques. PMID:24707955
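A minimal sketch of the "impute the raw-score product as its own variable" idea, using scikit-learn's IterativeImputer on synthetic data (all coefficients and the missingness rate are invented):

```python
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(7)
n = 500
x = rng.normal(size=n)
z = 0.4 * x + rng.normal(size=n)
y = 0.3 * x + 0.2 * z + 0.25 * x * z + rng.normal(size=n)
df = pd.DataFrame({"x": x, "z": z, "xz": x * z, "y": y})

# Impose missingness on a predictor, then impute with the product term included
# as just another column, so the interaction is preserved in the imputations.
df.loc[rng.random(n) < 0.25, "x"] = np.nan
df.loc[df["x"].isna(), "xz"] = np.nan      # the product is missing whenever x is
imputed = IterativeImputer(random_state=0).fit_transform(df)
df_imp = pd.DataFrame(imputed, columns=df.columns)
print(df_imp.isna().sum().sum())           # 0 missing values remain
```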
Development of a Software Safety Process and a Case Study of Its Use
NASA Technical Reports Server (NTRS)
Knight, J. C.
1997-01-01
Research in the year covered by this reporting period has been primarily directed toward the following areas: (1) Formal specification of user interfaces; (2) Fault-tree analysis including software; (3) Evaluation of formal specification notations; (4) Evaluation of formal verification techniques; (5) Expanded analysis of the shell architecture concept; (6) Development of techniques to address the problem of information survivability; and (7) Development of a sophisticated tool for the manipulation of formal specifications written in Z. This report summarizes activities under the grant. The technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers. The remainder of this report is organized as follows. In the next section, an overview of the project is given. This is followed by a summary of accomplishments during the reporting period and details of students funded. Seminars presented describing work under this grant are listed in the following section, and the final section lists publications resulting from this grant.
Structure of the Nucleon and its Excitations
NASA Astrophysics Data System (ADS)
Kamleh, Waseem; Leinweber, Derek; Liu, Zhan-wei; Stokes, Finn; Thomas, Anthony; Thomas, Samuel; Wu, Jia-jun
2018-03-01
The structure of the ground state nucleon and its finite-volume excitations are examined from three different perspectives. Using new techniques to extract the relativistic components of the nucleon wave function, the node structure of both the upper and lower components of the nucleon wave function are illustrated. A non-trivial role for gluonic components is manifest. In the second approach, the parity-expanded variational analysis (PEVA) technique is utilised to isolate states at finite momenta, enabling a novel examination of the electric and magnetic form factors of nucleon excitations. Here the magnetic form factors of low-lying odd-parity nucleons are particularly interesting. Finally, the structure of the nucleon spectrum is examined in a Hamiltonian effective field theory analysis incorporating recent lattice-QCD determinations of low-lying two-particle scattering-state energies in the finite volume. The Roper resonance of Nature is observed to originate from multi-particle coupled-channel interactions while the first radial excitation of the nucleon sits much higher at approximately 1.9 GeV.
An inkjet vision measurement technique for high-frequency jetting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kwon, Kye-Si, E-mail: kskwon@sch.ac.kr; Jang, Min-Hyuck; Park, Ha Yeong
2014-06-15
Inkjet technology has been used as a manufacturing tool for printed electronics. To increase productivity, the jetting frequency needs to be increased. With high-frequency jetting, the printed pattern quality can become non-uniform, since jetting performance characteristics, including the jetting speed and droplet volume, can vary significantly as the jetting frequency increases. Therefore, high-frequency jetting behavior must be evaluated properly for improvement. However, it is difficult to measure high-frequency jetting behavior using previous vision analysis methods, because subsequent droplets are close together or even merged. In this paper, we present vision measurement techniques to evaluate the drop formation of high-frequency jetting. The proposed method is based on tracking target droplets such that subsequent droplets can be excluded from the image analysis by focusing on the target droplet. Finally, a frequency sweeping method for jetting speed and droplet volume is presented to understand the overall effects of jetting frequency on jetting performance.
Gloger, Oliver; Kühn, Jens; Stanski, Adam; Völzke, Henry; Puls, Ralf
2010-07-01
Automatic 3D liver segmentation in magnetic resonance (MR) data sets has proven to be a very challenging task in the domain of medical image analysis. There exist numerous approaches for automatic 3D liver segmentation on computed tomography data sets that have influenced the segmentation of MR images. In contrast to previous approaches to liver segmentation in MR data sets, we use all available MR channel information of different weightings and formulate liver tissue and position probabilities in a probabilistic framework. We apply multiclass linear discriminant analysis as a fast and efficient dimensionality reduction technique and generate probability maps that are then used for segmentation. We develop a fully automatic three-step 3D segmentation approach based upon a modified region growing approach and a further threshold technique. Finally, we incorporate characteristic prior knowledge to improve the segmentation results. This novel 3D segmentation approach is modularized and can be applied to normal and fat-accumulated liver tissue properties. Copyright 2010 Elsevier Inc. All rights reserved.
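The probability-map step can be sketched with scikit-learn's LDA, which serves both as supervised dimensionality reduction and as a probabilistic classifier; the channel means and labels below are invented stand-ins for labelled multi-weighting MR voxels:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(8)
# Stand-in multi-channel MR data: 4 weightings per voxel, 2000 labelled voxels.
X_liver = rng.normal(loc=[1.0, 0.8, 0.4, 0.6], scale=0.2, size=(1000, 4))
X_other = rng.normal(loc=[0.5, 0.5, 0.5, 0.5], scale=0.3, size=(1000, 4))
X = np.vstack([X_liver, X_other])
y = np.array([1] * 1000 + [0] * 1000)

lda = LinearDiscriminantAnalysis().fit(X, y)
new_voxels = rng.normal(0.6, 0.3, size=(5, 4))
prob_map = lda.predict_proba(new_voxels)[:, 1]  # liver-tissue probability per voxel
print(np.round(prob_map, 3))  # values like these feed the region-growing step
```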
Secondary ion mass spectrometry: The application in the analysis of atmospheric particulate matter
Huang, Di; Hua, Xin; Xiu, Guang-Li; ...
2017-07-24
Currently, considerable attention has been paid to atmospheric particulate matter (PM) investigation due to its importance in human health and global climate change. Surface characterization, single particle analysis, and depth profiling of PM are important for a better understanding of its formation processes and for predicting its impact on the environment and human beings. Secondary ion mass spectrometry (SIMS) is a surface technique with high surface sensitivity, high spatial resolution chemical imaging, and unique depth profiling capabilities. Recent research shows that SIMS has great potential in analyzing both surface and bulk chemical information of PM. In this review, we give a brief introduction to the SIMS working principle and survey recent applications of SIMS in PM characterization. In particular, analyses of different types of PM sources by various SIMS techniques are discussed concerning their advantages and limitations. Finally, we discuss future developments and needs of SIMS in atmospheric aerosol measurement, with a perspective on broader environmental sciences.
Fortunato, Luca; Jeong, Sanghyun; Wang, Yiran; Behzad, Ali R; Leiknes, TorOve
2016-12-01
Fouling in membrane bioreactors (MBR) is acknowledged to be complex and poorly understood. An integrated characterization methodology was employed in this study to understand the fouling in a gravity-driven submerged MBR (GD-SMBR). It involved the use of different analytical tools, including optical coherence tomography (OCT), liquid chromatography with organic carbon detection (LC-OCD), total organic carbon (TOC), flow cytometry (FCM), adenosine triphosphate analysis (ATP), and scanning electron microscopy (SEM). The three-dimensional (3D) biomass morphology was acquired in real time through non-destructive and in situ OCT scanning of 75% of the total membrane surface directly in the tank. Results showed that the biomass layer was homogeneously distributed on the membrane surface. The amount of biomass measured in situ was linked with the results of the final destructive autopsy techniques. The LC-OCD analysis indicated the abundance of low molecular weight (LMW) organics in the fouling composition. Three different SEM techniques were applied to investigate the detailed fouling morphology on the membrane. Copyright © 2016 Elsevier Ltd. All rights reserved.
Analysis of unsteady compressible viscous layers
NASA Technical Reports Server (NTRS)
Power, G. D.; Verdon, J. M.; Kousen, K. A.
1990-01-01
The development of an analysis to predict the unsteady compressible flows in blade boundary layers and wakes is presented. The equations that govern the flows in these regions are transformed using an unsteady turbulent generalization of the Levy-Lees transformation. The transformed equations are solved using a finite difference technique in which the solution proceeds by marching in time and in the streamwise direction. Both laminar and turbulent flows are studied, the latter using algebraic turbulence and transition models. Laminar solutions for a flat plate are shown to approach classical asymptotic results for both high and low frequency unsteady motions. Turbulent flat-plate results are in qualitative agreement with previous predictions and measurements. Finally, the numerical technique is also applied to the stator and rotor of a low-speed turbine stage to determine unsteady effects on surface heating. The results compare reasonably well with measured heat transfer data and indicate that nonlinear effects have minimal impact on the mean and unsteady components of the flow.
[Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].
Golder, W
1999-09-01
To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993, and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal, and advanced statistics. Sample size, significance level, and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures for 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and in articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors make increasing use of statistical experts' opinions and programs. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.
Freire, Carmen S. R.; Coutinho, João A. P.; Silvestre, Armando J. D.; Freire, Mara G.
2016-01-01
Due to their unique properties, in recent years, ionic liquids (ILs) have been largely investigated in the field of analytical chemistry. Particularly during the last sixteen years, they have been successfully applied in the chromatographic and electrophoretic analysis of value-added compounds extracted from biomass. Considering the growing interest in the use of ILs in this field, this critical review provides a comprehensive overview on the improvements achieved using ILs as constituents of mobile or stationary phases in analytical techniques, namely in capillary electrophoresis and its different modes, in high performance liquid chromatography, and in gas chromatography, for the separation and analysis of natural compounds. The impact of the IL chemical structure and the influence of secondary parameters, such as the IL concentration, temperature, pH, voltage and analysis time (when applied), are also critically addressed regarding the achieved separation improvements. Major conclusions on the role of ILs in the separation mechanisms and the performance of these techniques in terms of efficiency, resolution and selectivity are provided. Based on a critical analysis of all published results, some target-oriented ILs are suggested. Finally, current drawbacks and future challenges in the field are highlighted. In particular, the design and use of more benign and effective ILs as well as the development of integrated (and thus more sustainable) extraction–separation processes using IL aqueous solutions are suggested within a green chemistry perspective. PMID:27667965
A review on the determination of isotope ratios of boron with mass spectrometry.
Aggarwal, Suresh Kumar; You, Chen-Feng
2017-07-01
The present review discusses the different mass spectrometric techniques, viz., thermal ionization mass spectrometry (TIMS), inductively coupled plasma mass spectrometry (ICPMS), and secondary ion mass spectrometry (SIMS), used to determine the ¹¹B/¹⁰B isotope ratio and the boron concentration required for various applications in earth sciences, marine geochemistry, nuclear technology, environmental and agriculture sciences, etc. The details of the techniques are highlighted: P-TIMS, which uses Cs₂BO₂⁺ ions; N-TIMS, which uses BO₂⁻ ions; and MC-ICPMS, which uses B⁺ ions for bulk analysis, or B⁻ and B⁺ ions for in situ micro-analysis with SIMS. The capabilities, advantages, limitations, and problems of each mass spectrometric technique are summarized. The results of international interlaboratory comparison experiments conducted at different times are summarized. The certified isotopic reference materials available for boron are also listed. Recent developments in laser ablation (LA) ICPMS and QQQ-ICPMS for solids analysis and MS/MS analysis, respectively, are included. The different aspects of sample preparation and the analytical chemistry of boron are summarized. Finally, the requirements on boron isotope ratios for future applications are also given. Presently, MC-ICPMS provides the best precision and accuracy (0.2-0.4‰) in isotope ratio measurements, whereas N-TIMS holds the potential to analyze the smallest amounts of boron, but has an issue of bias (+2‰ to +4‰) which needs further investigation. © 2016 Wiley Periodicals, Inc. Mass Spec Rev 36:499-519, 2017. © 2016 Wiley Periodicals, Inc.
Seitz, Kelsey E; Smith, Cynthia R; Marks, Stanley L; Venn-Watson, Stephanie K; Ivančić, Marina
2016-12-01
The objective of this study was to establish a comprehensive technique for ultrasound examination of the dolphin hepatobiliary system and apply this technique to 30 dolphins to determine what, if any, sonographic changes are associated with blood-based indicators of metabolic syndrome (insulin greater than 14 μIU/ml or glucose greater than 112 mg/dl) and iron overload (transferrin saturation greater than 65%). A prospective study of individuals in a cross-sectional population with and without elevated postprandial insulin levels was performed. Twenty-nine bottlenose dolphins (Tursiops truncatus) in a managed collection were included in the final data analysis. An in-water ultrasound technique was developed that included detailed analysis of the liver and pancreas. Dolphins with hyperinsulinemia had larger livers compared with dolphins with nonelevated concentrations. Using stepwise multivariate regression including blood-based indicators of metabolic syndrome in dolphins, glucose was the best predictor of and had a positive linear association with liver size (P = 0.007, R² = 0.24). Bottlenose dolphins are susceptible to metabolic syndrome and associated complications that affect the liver, including fatty liver disease and iron overload. This study facilitated the establishment of a technique for rapid, diagnostic, and noninvasive ultrasonographic evaluation of the dolphin liver. In addition, the study identified ultrasound-detectable hepatic changes associated primarily with elevated glucose concentration in dolphins. Future investigations will strive to detail the pathophysiological mechanisms of these changes.
Ott, Laura E; Carson, Susan
2014-01-01
Flow cytometry and enzyme-linked immunosorbent assay (ELISA) are commonly used techniques associated with clinical and research applications within the immunology and medical fields. The use of these techniques is becoming increasingly valuable in many life science and engineering disciplines as well. Herein, we report the development and evaluation of a novel half-semester course that focused on introducing undergraduate and graduate students to advance conceptual and technical skills associated with flow cytometry and ELISA, with emphasis on applications, experimental design, and data analysis. This course was offered in the North Carolina State University Biotechnology Program over three semesters and consisted of weekly lectures and laboratories. Students performed and/or analyzed flow cytometry and ELISA in three separate laboratory exercises: (1) identification of transgenic zebrafish hematopoietic cells, (2) analysis of transfection efficiency, and (3) analysis of cytokine production upon lipopolysaccharide stimulation. Student learning outcomes were achieved as demonstrated by multiple means of assessment, including three laboratory reports, a data analysis laboratory practicum, and a cumulative final exam. Further, anonymous student self-assessment revealed increased student confidence in the knowledge and skill sets defined in the learning outcomes. Copyright © 2014 The International Union of Biochemistry and Molecular Biology.
McCann, Cooper; Repasky, Kevin S.; Morin, Mikindra; ...
2017-05-23
Hyperspectral image analysis has benefited from an array of methods that take advantage of the increased spectral depth compared to multispectral sensors; however, the focus of these developments has been on supervised classification methods. Lack of a priori knowledge regarding land cover characteristics can make unsupervised classification methods preferable under certain circumstances. An unsupervised classification technique is presented in this paper that utilizes physically relevant basis functions to model the reflectance spectra. The fit parameters used to generate the basis functions allow clustering based on spectral characteristics rather than spectral channels and provide both noise and data reduction. Histogram splitting of the fit parameters is then used as a means of producing an unsupervised classification. Unlike current unsupervised classification techniques that rely primarily on Euclidean distance measures to determine similarity, this technique uses the natural splitting of the fit parameters associated with the basis functions, creating clusters that are similar in terms of physical parameters. The data set used in this work is the publicly available data collected at Indian Pines, Indiana, which provides reference data allowing comparisons of the efficacy of different unsupervised analyses. The unsupervised histogram splitting technique presented in this paper is shown to be better than the standard unsupervised ISODATA clustering technique, with an overall accuracy of 34.3/19.0% before merging and 40.9/39.2% after merging. The improvement is also seen in kappa before/after merging: 24.8/30.5 for the histogram splitting technique compared to 15.8/28.5 for ISODATA.
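A toy version of histogram splitting on a single fit parameter (all distributions invented): locate the two dominant modes of the parameter's histogram and split at the valley between them, so the classes follow the data's natural breaks rather than a Euclidean distance to cluster centers:

```python
import numpy as np

rng = np.random.default_rng(9)
# Stand-in fit parameter for many pixels: bimodal, with a natural valley
# separating two physically distinct classes.
param = np.concatenate([rng.normal(0.3, 0.05, 5000), rng.normal(0.7, 0.08, 5000)])

counts, edges = np.histogram(param, bins=64)
smooth = np.convolve(counts, np.ones(5) / 5, mode="same")   # light smoothing

# Locate the two dominant modes, then split at the deepest valley between them.
order = np.argsort(smooth)[::-1]
p1 = order[0]
p2 = next(i for i in order if abs(i - p1) > 5)   # second mode, away from the first
lo, hi = sorted((p1, p2))
valley = lo + np.argmin(smooth[lo:hi + 1])
threshold = 0.5 * (edges[valley] + edges[valley + 1])
labels = (param > threshold).astype(int)
print(f"split at {threshold:.3f}; class sizes: {np.bincount(labels)}")
```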
Optimal interpolation analysis of leaf area index using MODIS data
Gu, Yingxin; Belair, Stephane; Mahfouf, Jean-Francois; Deblonde, Godelieve
2006-01-01
A simple data analysis technique for vegetation leaf area index (LAI) using Moderate Resolution Imaging Spectroradiometer (MODIS) data is presented. The objective is to generate LAI data that are appropriate for numerical weather prediction. A series of techniques and procedures, including data quality control, time-series data smoothing, and simple data analysis, is applied. The LAI analysis is an optimal combination of the MODIS observations and a derived climatology, weighted by their associated errors σo and σc. The “best estimate” LAI is derived from a simple three-point smoothing technique combined with a selection of maximum LAI (after data quality control) values to ensure higher quality. The LAI climatology is a time-smoothed mean of the “best estimate” LAI over the years 2002–2004. The observation error is obtained by comparing the MODIS observed LAI with the “best estimate” LAI, and the climatological error is obtained by comparing the “best estimate” LAI with the climatological LAI. The LAI analysis is the result of a weighting between these two errors. The method is demonstrated on the 15-km grid of the Meteorological Service of Canada (MSC)'s regional numerical weather prediction model. The final LAI analyses have a relatively smooth temporal evolution, which makes them more appropriate for environmental prediction than the original MODIS LAI observations. They are also more realistic than the LAI data currently used operationally at the MSC, which are based on land-cover databases.
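A weighting between an observation error σo and a climatology error σc of the kind the abstract describes reduces, in the scalar case, to standard inverse-variance blending. The sketch below shows that scalar form; the exact weighting used by the authors may differ.

```python
import numpy as np

def oi_lai(lai_obs, lai_clim, sigma_o, sigma_c):
    """Scalar optimal-interpolation blend of observation and climatology.

    The analysis is the climatology corrected toward the observation with
    gain K = sigma_c**2 / (sigma_c**2 + sigma_o**2): a low-error observation
    pulls the analysis away from climatology, and vice versa.
    """
    k = sigma_c**2 / (sigma_c**2 + sigma_o**2)
    return lai_clim + k * (lai_obs - lai_clim)

# Example: a noisy MODIS LAI value blended with a smoother climatology.
print(oi_lai(lai_obs=3.8, lai_clim=3.0, sigma_o=0.5, sigma_c=0.25))  # 3.16
```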
From Pacemaker to Wearable: Techniques for ECG Detection Systems.
Kumar, Ashish; Komaragiri, Rama; Kumar, Manjeet
2018-01-11
With the alarming rise in deaths due to cardiovascular diseases (CVD), medical research places notable importance on techniques and methods to detect CVDs. As noted by the World Health Organization, technological advances in the field of cardiac function assessment have become central to leading research on CVDs, in which electrocardiogram (ECG) analysis is the most practical and convenient tool used to test the range of heart-related irregularities. Most approaches in the literature on ECG signal analysis consider noise removal, rhythm-based analysis, and heartbeat detection to improve the performance of a cardiac pacemaker. Advances in ECG segment detection and beat classification have had limited evaluation and still require clinical approval. In this paper, approaches and techniques to implement an on-chip ECG detector for a cardiac pacemaker system are discussed. Moreover, the challenges of ECG signal morphology analysis drawn from the medical literature are extensively reviewed. It is found that robustness to noise, wavelet parameter choice, numerical efficiency, and detection performance are the essential performance indicators required of a state-of-the-art ECG detector. Furthermore, many algorithms described in the existing literature are not verified using ECG data from standard databases. Some ECG detection algorithms show very high detection performance in terms of the total number of detected QRS complexes; however, this performance is often verified using only a few datasets. Finally, gaps in current advancements and testing are identified, and the primary remaining challenge is to implement a bullseye test for the evaluation of morphology analysis.
Three Different Methods of Estimating LAI in a Small Watershed
NASA Astrophysics Data System (ADS)
Speckman, H. N.; Ewers, B. E.; Beverly, D.
2015-12-01
Leaf area index (LAI) is a critical input to models that improve predictive understanding of ecology, hydrology, and climate change. Multiple techniques exist to quantify LAI, most of which are labor intensive, and they often fail to converge on similar estimates. Recent large-scale bark-beetle-induced mortality greatly altered LAI, which is now dominated by younger and more metabolically active trees compared to the pre-beetle forest. Tree mortality increases error in optical LAI estimates because live and dead branches are not differentiated in dense canopy. Our study aims to quantify LAI using three different methods, and then to compare the techniques to each other and to topographic drivers to develop an effective predictive model of LAI. This study focuses on quantifying LAI within a small (~120 ha) beetle-infested watershed in Wyoming's Snowy Range Mountains. The first technique estimated LAI from in-situ hemispherical canopy photographs that were then analyzed with Hemisfer software. The second technique used Kaufmann's (1982) allometrics from forest inventories conducted throughout the watershed, accounting for stand basal area, species composition, and the extent of bark-beetle-driven mortality. The final technique used airborne light detection and ranging (LIDAR) first returns to estimate canopy heights and crown area; LIDAR final returns provided topographical information and were ground-truthed during forest inventories. Once the data were collected, a fractal analysis was conducted comparing the three methods. Species composition was driven by slope position and elevation. Ultimately, the three techniques provided very different estimates of LAI, but each had its advantages: estimates from hemispherical photos were well correlated with SWE and snow depth measurements, forest inventories provided insight into stand health and composition, and LIDAR covered a very large area quickly and efficiently.
Carmichael, Mary C; St Clair, Candace; Edwards, Andrea M; Barrett, Peter; McFerrin, Harris; Davenport, Ian; Awad, Mohamed; Kundu, Anup; Ireland, Shubha Kale
2016-01-01
Xavier University of Louisiana leads the nation in awarding BS degrees in the biological sciences to African-American students. In this multiyear study with ∼5500 participants, data-driven interventions were adopted to improve student academic performance in a freshman-level general biology course. The three hour-long exams were common and administered concurrently to all students. New exam questions were developed using Bloom's taxonomy, and exam results were analyzed statistically with validated assessment tools. All but the comprehensive final exam were returned to students for self-evaluation and remediation. Among other approaches, course rigor was monitored by using an identical set of 60 questions on the final exam across 10 semesters. Analysis of the identical sets of 60 final exam questions revealed that overall averages increased from 72.9% (2010) to 83.5% (2015). Regression analysis demonstrated a statistically significant correlation between high-risk students and their averages on the 60 questions. Additional analysis demonstrated statistically significant improvements for at least one letter grade from midterm to final and a 20% increase in the course pass rates over time, also for the high-risk population. These results support the hypothesis that our data-driven interventions and assessment techniques are successful in improving student retention, particularly for our academically at-risk students. © 2016 M. C. Carmichael et al.
The development of an audit technique to assess the quality of safety barrier management.
Guldenmund, Frank; Hale, Andrew; Goossens, Louis; Betten, Jeroen; Duijm, Nijs Jan
2006-03-31
This paper describes the development of a management model to control barriers devised to prevent major hazard scenarios. Additionally, an audit technique is explained that assesses the quality of such a management system. The final purpose of the audit technique is to quantify those aspects of the management system that have a direct impact on the reliability and effectiveness of the barriers and, hence, the probability of the scenarios involved. First, an outline of the management model is given and its elements are explained. Then, the development of the audit technique is described. Because the audit technique uses actual major hazard scenarios and barriers within these as its focus, the technique achieves a concreteness and clarity that many other techniques often lack. However, this strength is also its limitation, since the full safety management system is not covered with the technique. Finally, some preliminary experiences obtained from several test sites are compiled and discussed.
NASA Astrophysics Data System (ADS)
Lucier, Amie Marie
The role of geomechanical analysis in characterizing the feasibility of CO2 sequestration in deep saline aquifers is addressed in two investigations. The first investigation was completed as part of the Ohio River Valley CO2 Storage Project. We completed a geomechanical analysis of the Rose Run Sandstone, a potential injection zone, and its adjacent formations at the American Electric Power's 1.3 GW Mountaineer Power Plant in New Haven, West Virginia. The results of this analysis were then used to evaluate the feasibility of anthropogenic CO2 sequestration in the potential injection zone. First, we incorporated the results of the geomechanical analysis with a geostatistical aquifer model in CO2 injection flow simulations to test the effects of introducing a hydraulic fracture to increase injectivity. Then, we determined that horizontal injection wells at the Mountaineer site are feasible because the high rock strength ensures that such wells would be stable in the local stress state. Finally, we evaluated the potential for injection-induced seismicity. The second investigation concerning CO2 sequestration was motivated by the modeling and fluid flow simulation results from the first study. The geomechanics-based assessment workflow follows a bottom-up approach for evaluating regional deep saline aquifer CO2 injection and storage feasibility. The CO2 storage capacity of an aquifer is a function of its porous volume as well as its CO2 injectivity. For a saline aquifer to be considered feasible in this assessment it must be able to store a specified amount of CO2 at a reasonable cost per ton of CO 2. The proposed assessment workflow has seven steps. The workflow was applied to a case study of the Rose Run sandstone in the eastern Ohio River Valley. We found that it is feasible in this region to inject and store 113 Mt CO2/yr for 30 years at an associated well cost of less than 1.31 US$/t CO2, but only if injectivity enhancement techniques such as hydraulic fracturing and injection induced micro-seismicity are implemented. The second issue to which we apply geomechanical analysis in this thesis is mining-induced stress perturbations and induced seismicity in the TauTona gold mine, which is located in the Witwatersrand Basin of South Africa and is one of the deepest underground mines in the world. In the first investigation, we developed and tested a new technique for determining the virgin stress state near the TauTona gold mine. This technique follows an iterative forward modeling approach that combines observations of drilling induced borehole failures in borehole images, boundary element modeling of the mining-induced stress perturbations, and forward modeling of borehole failures based on the results of the boundary element modeling. The final result was a well constrained range of principal stress orientations and magnitudes that are consistent with all the observed failures and other stress indicators. In the second investigation, we used this constrained stress state to examine the likelihood of faulting to occur both on pre-existing fault planes that are optimally oriented to the virgin stress state and on faults affected by the mining-perturbed stress field, the latter of which is calculated with boundary element modeling. We made several recommendations that could potentially increase safety in deep South African mines as development continues. 
Finally, the third issue addressed in this thesis is the detection of stress-induced shear wave velocity anisotropy in a sub-salt environment. In this study, we tested a technique proposed by Boness and Zoback (2006) to identify structure-induced velocity anisotropy and isolate possible stress-induced velocity anisotropy. The investigation used cross-dipole sonic data from three deep water sub-salt wells in the Gulf of Mexico. First, we determined the parameters necessary to ensure the quality of the fast azimuth data used in our analysis. We then characterized the quality controlled measured fast directions as either structure-induced or stress-induced based on the results of the Boness and Zoback (2006) technique. We found that this technique supplements the use of dispersion curve analysis for characterizing anisotropy mechanisms. We also find that this technique has the potential to provide information on the stresses that can be used to validate numerical models of salt-related stress perturbations. (Abstract shortened by UMI.)
A Smoothing Technique for the Multifractal Analysis of a Medium Voltage Feeders Electric Current
NASA Astrophysics Data System (ADS)
de Santis, Enrico; Sadeghian, Alireza; Rizzi, Antonello
2017-12-01
The current paper presents a data-driven detrending technique for smoothing complex sinusoidal trends from a real-world electric load time series before applying Multifractal Detrended Fluctuation Analysis (MFDFA). The algorithm, which we call Smoothed Sort and Cut Fourier Detrending (SSC-FD), is based on a suitable smoothing of high-power periodicities operating directly in the Fourier spectrum, through a polynomial fitting technique applied to the DFT. The main aim is to disambiguate the characteristic slowly varying periodicities, which can impair the MFDFA analysis, from the residual signal, in order to study its correlation properties. The algorithm's performance is evaluated on a simple benchmark consisting of a persistent series with known Hurst exponent and ten superimposed sinusoidal harmonics. Moreover, the behavior of the algorithm parameters is assessed by computing the MFDFA on the well-known sunspot data, whose correlation characteristics are reported in the literature. In both cases, the SSC-FD method eliminates the apparent crossover induced by the synthetic and natural periodicities. Results are compared with some existing detrending methods within the MFDFA paradigm. Finally, a study of the multifractal characteristics of the electric load time series detrended by the SSC-FD algorithm is provided, showing strongly persistent behavior and an appreciable width of the multifractal spectrum, which supports the conclusion that the series at hand has multifractal characteristics.
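The general idea of smoothing periodicities directly in the Fourier spectrum can be sketched as follows: fit a polynomial to the log-magnitude of the DFT, attenuate bins that rise far above the fit, and invert the transform. This is a hedged stand-in for SSC-FD, not the published algorithm; thresholds and the polynomial degree are illustrative.

```python
import numpy as np

def spectral_detrend(x, poly_deg=5, thresh=3.0):
    """Suppress high-power periodicities by smoothing the Fourier spectrum.

    A polynomial is fitted to the log-magnitude of the DFT; bins exceeding
    the fit by `thresh` (in log units) are pulled down to the fitted
    envelope, keeping their phase, before inverse transforming.
    """
    spec = np.fft.rfft(x)
    mag = np.abs(spec)
    freqs = np.arange(len(mag))
    logmag = np.log(mag + 1e-12)
    fit = np.polyval(np.polyfit(freqs, logmag, poly_deg), freqs)
    peaked = (logmag - fit) > thresh
    scale = np.where(peaked, np.exp(fit - logmag), 1.0)
    return np.fft.irfft(spec * scale, n=len(x))

t = np.arange(4096)
noise = np.cumsum(np.random.default_rng(1).normal(size=t.size))  # persistent
signal = noise + 50 * np.sin(2 * np.pi * t / 128)  # strong periodicity added
detrended = spectral_detrend(signal)  # ready for MFDFA-style analysis
```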
Hierarchical modeling for reliability analysis using Markov models. B.S./M.S. Thesis - MIT
NASA Technical Reports Server (NTRS)
Fagundo, Arturo
1994-01-01
Markov models represent an extremely attractive tool for the reliability analysis of many systems. However, Markov model state space grows exponentially with the number of components in a given system. Thus, for very large systems Markov modeling techniques alone become intractable in both memory and CPU time. Often a particular subsystem can be found within some larger system where the dependence of the larger system on the subsystem is of a particularly simple form. This simple dependence can be used to decompose such a system into one or more subsystems. A hierarchical technique is presented which can be used to evaluate these subsystems in such a way that their reliabilities can be combined to obtain the reliability for the full system. This hierarchical approach is unique in that it allows the subsystem model to pass multiple aggregate state information to the higher level model, allowing more general systems to be evaluated. Guidelines are developed to assist in the system decomposition. An appropriate method for determining subsystem reliability is also developed. This method gives rise to some interesting numerical issues. Numerical error due to roundoff and integration are discussed at length. Once a decomposition is chosen, the remaining analysis is straightforward but tedious. However, an approach is developed for simplifying the recombination of subsystem reliabilities. Finally, a real world system is used to illustrate the use of this technique in a more practical context.
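To make the subsystem-evaluation step concrete, the sketch below solves a small Markov reliability model with a matrix exponential and then recombines subsystem reliabilities in the simplest (series, independent) case. The thesis passes richer aggregate state information between levels; this scalar recombination is only the base case, and the rates and states are hypothetical.

```python
import numpy as np
from scipy.linalg import expm

def subsystem_reliability(Q, p0, up_states, t):
    """Probability mass in 'up' states at time t for one subsystem.

    Q is the generator matrix (rows sum to zero) of the subsystem's
    Markov chain; p0 is the initial state distribution; p(t) = p0 expm(Qt).
    """
    p_t = p0 @ expm(Q * t)
    return p_t[up_states].sum()

# Two-component parallel subsystem: states = (2 up, 1 up, 0 up).
lam = 1e-3  # per-hour failure rate of each component (illustrative)
Q = np.array([[-2 * lam, 2 * lam, 0.0],
              [0.0, -lam, lam],
              [0.0, 0.0, 0.0]])
p0 = np.array([1.0, 0.0, 0.0])
r_sub = subsystem_reliability(Q, p0, up_states=[0, 1], t=100.0)

# Simplest hierarchical recombination: independent subsystems in series,
# so subsystem reliabilities multiply.
r_system = r_sub * 0.999  # second subsystem evaluated the same way
print(r_sub, r_system)
```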
Pereira, Maria J; Amaral, Joao S; Silva, Nuno J O; Amaral, Vitor S
2016-12-01
Determining and acting on thermo-physical properties at the nanoscale is essential for understanding/managing heat distribution in micro/nanostructured materials and miniaturized devices. Adequate thermal nano-characterization techniques are required to address thermal issues compromising device performance. Scanning thermal microscopy (SThM) is a probing and acting technique based on atomic force microscopy using a nano-probe designed to act as a thermometer and resistive heater, achieving high spatial resolution. Enabling direct observation and mapping of thermal properties such as thermal conductivity, SThM is becoming a powerful tool with a critical role in several fields, from material science to device thermal management. We present an overview of the different thermal probes, followed by the contribution of SThM in three currently significant research topics. First, in thermal conductivity contrast studies of graphene monolayers deposited on different substrates, SThM proves itself a reliable technique to clarify the intriguing thermal properties of graphene, which is considered an important contributor to improve the performance of downscaled devices and materials. Second, SThM's ability to perform sub-surface imaging is highlighted by thermal conductivity contrast analysis of polymeric composites. Finally, an approach to induce and study local structural transitions in ferromagnetic shape memory alloy Ni-Mn-Ga thin films using localized nano-thermal analysis is presented.
Now and Next-Generation Sequencing Techniques: Future of Sequence Analysis Using Cloud Computing
Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav
2012-01-01
Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed “cloud computing”) has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows. PMID:23248640
Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.
Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav
2012-01-01
Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed "cloud computing") has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows.
Study of the solid state of carbamazepine after processing with gas anti-solvent technique.
Moneghini, M; Kikic, I; Voinovich, D; Perissutti, B; Alessi, P; Cortesi, A; Princivalle, F; Solinas, D
2003-09-01
The purpose of this study was to investigate the influence of supercritical CO2 processing on the physico-chemical properties of carbamazepine, a poorly soluble drug. The gas anti-solvent (GAS) technique was used to precipitate the drug from three different solvents (acetone, ethyl acetate and dichloromethane) to study how they would affect the final product. The samples were analysed before and after treatment by scanning electron microscopy and laser granulometry for possible changes in crystal habit. In addition, the solid state of the samples was studied by X-ray powder diffraction, differential scanning calorimetry, diffuse reflectance Fourier-transform infrared spectroscopy and hot-stage microscopy. Finally, in vitro dissolution tests were carried out. Solid-state analysis of the samples, both untreated and treated with CO2, showed that the applied method caused a transition from the starting form III to form I and a dramatic change in crystal morphology, resulting in needle-shaped crystals, regardless of the chosen solvent. To identify which process was responsible for these results, carbamazepine was further precipitated from the same three solvents by a traditional evaporation method (RV samples). On the basis of this cross-testing, the solvents were found to be responsible for the reorganisation into a different polymorphic form, and the potential of the GAS process to produce micron-sized needle-shaped particles, with an enhanced dissolution rate compared to the RV carbamazepine, was ascertained.
Sustainability assessment of tertiary wastewater treatment technologies: a multi-criteria analysis.
Plakas, K V; Georgiadis, A A; Karabelas, A J
2016-01-01
Multi-criteria analysis gives researchers, designers and decision-makers the opportunity to examine decision options in a multi-dimensional fashion. On this basis, four tertiary wastewater treatment (WWT) technologies were assessed regarding their sustainability performance in producing recycled wastewater, considering a 'triple bottom line' approach (i.e. economic, environmental, and social). These are powdered activated carbon adsorption coupled with ultrafiltration membrane separation (PAC-UF), reverse osmosis, ozone/ultraviolet-light oxidation, and heterogeneous photocatalysis coupled with low-pressure membrane separation (photocatalytic membrane reactor, PMR). The participatory method called the simple multi-attribute rating technique exploiting ranks (SMARTER) was employed for assigning weights to selected sustainability indicators. This sustainability assessment approach resulted in the development of a composite index as a final metric for each WWT technology evaluated. The PAC-UF technology appears to be the most appropriate, attaining the highest composite value for sustainability performance. A scenario analysis confirmed the results of the original scenario in five out of seven cases. In parallel, the PMR was highlighted as the technology with the least variability in its performance. Nevertheless, additional actions and approaches are proposed to strengthen the objectivity of the final results.
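SMARTER conventionally derives weights from indicator rankings via rank-order centroids. The sketch below computes those weights and a weighted composite index; the indicator scores and their ranking are hypothetical, not taken from the study.

```python
def roc_weights(n):
    """Rank-order-centroid weights used by SMARTER:
    w_i = (1/n) * sum_{k=i}^{n} 1/k for rank i of n criteria."""
    return [sum(1.0 / k for k in range(i, n + 1)) / n for i in range(1, n + 1)]

def composite_index(scores, weights):
    """Weighted sum of normalized indicator scores (higher = more sustainable)."""
    return sum(s * w for s, w in zip(scores, weights))

# Three indicators ranked economic > environmental > social (hypothetical).
w = roc_weights(3)  # [0.611, 0.278, 0.111]
print(composite_index([0.7, 0.9, 0.5], w))  # composite value for one technology
```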
Ramazani, Mohsen; Asnaashari, Mohammad; Ahmadi, Roghayyeh; Zarenejad, Nafiseh; Rafie, Alireza; Yazadani Charati, Jamshid
2018-01-01
This in vitro study aimed to compare the effect of agitating the final root canal irrigant solutions with ultrasonics or an 808 nm diode laser on the apical seal of the canal. A total of 90 extracted human maxillary central incisors were prepared up to size #45 and were randomly assigned to four experimental groups (n=20) and two control groups (n=5), as follows: I) 3 mL of 5.25% NaOCl was agitated as the final irrigant with ultrasonics for 30 s, with the ultrasonic tip placed 1 mm short of the working length; II) 3 mL of 5.25% NaOCl was agitated as the final irrigant with the 808 nm diode laser for 30 s, with the fiber tip placed 1 mm short of the working length and moved spirally in the coronal direction; III) 3 mL of 17% EDTA was agitated as the final irrigant with the 808 nm diode laser for 30 s, applied as in group II; IV) 3 mL of 17% EDTA was agitated as the final irrigant with ultrasonics for 30 s, applied as in group I. The apical seal was assessed by the dual-chamber technique using bovine serum albumin protein. Kruskal-Wallis and Mann-Whitney tests were used for statistical analysis, with a significance level of 0.05. The average leakage in the negative control, positive control, and groups I, II, III and IV was 0.00, 13.5±5.1, 1.72±2.9, 5.12±5.6, 3.36±3.7 and 2.4±4.2, respectively. Statistical analysis showed significant differences between groups (P<0.05), including a significant difference between groups I and II in terms of protein leakage. Agitating 5.25% sodium hypochlorite solution as the final irrigant with ultrasonics was more effective in reducing apical leakage than the other approaches.
Hosseinzadeh, Hossein; Pashaei, Shahryar; Hosseinzadeh, Soleyman; Khodaparast, Zahra; Ramin, Sonia; Saadat, Younes
2018-05-31
In the present work, polymer-coated multiwalled carbon nanotubes (MWCNTs) were prepared via the RAFT method. First, a novel trithiocarbonate-based RAFT agent was prepared and chemically attached to the surface of the MWCNTs. RAFT co-polymerization of acrylic acid and acrylamide monomers was then conducted with the prepared RAFT agent. In the next stage, the surface morphology and chemical properties of the prepared components were fully examined using FTIR, 1H NMR, SEM, TEM, XRD and TGA/DTG techniques. The modified MWCNT composite was then employed as an adsorbent for copper(II) ions. The results indicated that ion adsorption depends primarily on adsorption time, solution pH, initial copper concentration, and adsorbent dosage. Further, kinetic and isotherm analysis demonstrated that the adsorption followed the pseudo-second-order and Langmuir isotherm models, respectively. Based on the results of the thermodynamic study, the ion adsorption process was endothermic and spontaneous. Finally, based on the experimental results, the surface-functionalized MWCNTs with hydrophilic groups could be successfully used as a promising selective adsorbent material in wastewater treatment. Copyright © 2018 Elsevier B.V. All rights reserved.
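Fitting the two models the abstract names is a routine nonlinear regression; the sketch below does so with scipy. The uptake data are fabricated for illustration only and do not come from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def pseudo_second_order(t, qe, k2):
    """q(t) = k2*qe^2*t / (1 + k2*qe*t): adsorbed amount vs. contact time."""
    return k2 * qe**2 * t / (1 + k2 * qe * t)

def langmuir(Ce, qmax, KL):
    """qe = qmax*KL*Ce / (1 + KL*Ce): uptake vs. equilibrium concentration."""
    return qmax * KL * Ce / (1 + KL * Ce)

# Hypothetical Cu(II) uptake data (mg/g), for illustration only.
t = np.array([5, 10, 20, 40, 80, 160], float)
q_t = np.array([12, 19, 27, 33, 37, 39], float)
(qe_fit, k2_fit), _ = curve_fit(pseudo_second_order, t, q_t, p0=[40, 0.01])

Ce = np.array([5, 10, 25, 50, 100], float)
q_e = np.array([18, 28, 38, 44, 47], float)
(qmax_fit, KL_fit), _ = curve_fit(langmuir, Ce, q_e, p0=[50, 0.05])
print(qe_fit, k2_fit, qmax_fit, KL_fit)
```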
Prediction of properties of wheat dough using intelligent deep belief networks
NASA Astrophysics Data System (ADS)
Guha, Paramita; Bhatnagar, Taru; Pal, Ishan; Kamboj, Uma; Mishra, Sunita
2017-11-01
In this paper, the rheological and chemical properties of wheat dough are predicted using deep belief networks. Wheat grains were stored under controlled environmental conditions. The internal parameters of the grains, viz. protein, fat, carbohydrates, moisture and ash, were determined using standard chemical analysis, and the viscosity of the dough was measured with a rheometer. Here, fat, carbohydrates, moisture, ash and temperature are considered as inputs, whereas protein and viscosity are the outputs. The prediction algorithm is built on a deep neural network in which each layer is trained greedily using restricted Boltzmann machine (RBM) networks, and the overall network is finally fine-tuned using a standard neural network technique. In most of the literature, fine-tuning is done using the back-propagation technique. In this paper, a new algorithm is proposed in which each layer is tuned using an RBM and the final network is fine-tuned using a deep neural network (DNN). It has been observed that with the proposed algorithm, the errors between actual and predicted outputs are smaller than with the conventional algorithm. Hence, the proposed network can be considered beneficial, as it predicts the outputs more accurately. Numerical results along with discussions are presented.
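The pipeline shape can be sketched with scikit-learn: stacked Bernoulli RBMs learn features greedily, and a neural-network regressor on top stands in for the fine-tuning stage. Note this is only an approximation of a true DBN (which would copy RBM weights into the final network); all data and layer sizes are hypothetical.

```python
# Minimal sketch, assuming inputs scaled to [0, 1] as BernoulliRBM expects.
import numpy as np
from sklearn.neural_network import BernoulliRBM, MLPRegressor
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)
X = rng.random((200, 5))   # fat, carbohydrates, moisture, ash, temperature
y = X @ np.array([0.3, -0.2, 0.5, 0.1, 0.4]) + 0.05 * rng.normal(size=200)

model = Pipeline([
    ("rbm1", BernoulliRBM(n_components=32, learning_rate=0.05, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=16, learning_rate=0.05, random_state=0)),
    ("mlp", MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)),
])
model.fit(X, y)            # RBMs pretrain layer by layer, MLP fits on top
print(model.predict(X[:3]))
```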
A multi-strategy approach to informative gene identification from gene expression data.
Liu, Ziying; Phan, Sieu; Famili, Fazel; Pan, Youlian; Lenferink, Anne E G; Cantin, Christiane; Collins, Catherine; O'Connor-McCourt, Maureen D
2010-02-01
An unsupervised multi-strategy approach has been developed to identify informative genes from high-throughput genomic data. Several statistical methods have been used in the field to identify differentially expressed genes. Since different methods generate different lists of genes, it is very challenging to determine the most reliable gene list and the appropriate method. This paper presents a multi-strategy method in which a combination of several data analysis techniques is applied to a given dataset and a confidence measure is established to select genes from the lists generated by these techniques to form the core of our final selection. The remaining genes, which form the peripheral region, are subject to exclusion from or inclusion into the final selection. This paper demonstrates the methodology through its application to an in-house cancer genomics dataset and a public dataset. The results indicate that our method provides a more reliable list of genes, which are validated using biological knowledge, biological experiments, and literature search. We further evaluated our multi-strategy method by consolidating two pairs of independent datasets, each pair concerning the same disease but generated by different labs using different platforms. The results showed that our method produced far better results.
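The core/periphery idea can be illustrated with a simple voting scheme over the gene lists produced by each technique. The gene names and the vote threshold below are hypothetical; the paper's confidence measure is more elaborate than raw vote counting.

```python
from collections import Counter

def core_and_periphery(gene_lists, core_votes):
    """Split genes by how many analysis techniques selected them.

    gene_lists: one list of selected genes per technique.
    core_votes: minimum number of techniques that must agree for the core.
    """
    votes = Counter(g for genes in gene_lists for g in set(genes))
    core = {g for g, v in votes.items() if v >= core_votes}
    periphery = {g for g, v in votes.items() if 0 < v < core_votes}
    return core, periphery

t_test = ["BRCA1", "TP53", "MYC", "EGFR"]
fold_change = ["TP53", "MYC", "KRAS"]
rank_product = ["TP53", "MYC", "EGFR", "PTEN"]
core, periphery = core_and_periphery(
    [t_test, fold_change, rank_product], core_votes=3)
print(core)       # {'TP53', 'MYC'}: selected by all three techniques
print(periphery)  # candidates for manual inclusion or exclusion
```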
BahadarKhan, Khan; A Khaliq, Amir; Shahid, Muhammad
2016-01-01
Diabetic retinopathy (DR) harms the retinal blood vessels in the eye, causing visual deficiency. The appearance and structure of blood vessels in retinal images play an essential part in the diagnosis of eye diseases. We propose a computationally light unsupervised automated technique, with promising results, for detecting the retinal vasculature using a morphological Hessian-based approach and region-based Otsu thresholding. Contrast-limited adaptive histogram equalization (CLAHE) and morphological filters are used for enhancement and to remove low-frequency noise or geometrical objects, respectively. The Hessian matrix and eigenvalue approach is applied in a modified form at two different scales to extract wide- and thin-vessel enhanced images separately. Otsu thresholding is then applied in a novel way to classify vessel and non-vessel pixels in both enhanced images. Finally, post-processing steps eliminate unwanted regions/segments, non-vessel pixels, disease abnormalities and noise to obtain the final segmented image. The proposed technique has been evaluated on the openly accessible DRIVE (Digital Retinal Images for Vessel Extraction) and STARE (STructured Analysis of the REtina) databases, along with ground truth data precisely marked by experts. PMID:27441646
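The main stages map naturally onto scikit-image primitives: CLAHE enhancement, Hessian-based (Frangi) vesselness at two scale ranges, and Otsu thresholding. This is a hedged stand-in for the authors' pipeline; their modified Hessian form, scales, and region-based thresholding differ from the global versions used here.

```python
import numpy as np
from skimage import data, exposure, filters

retina = data.retina()[..., 1] / 255.0           # green channel, best contrast
enhanced = exposure.equalize_adapthist(retina)   # CLAHE enhancement

# Hessian-based vesselness at two scale ranges: thin and wide vessels.
thin = filters.frangi(enhanced, sigmas=range(1, 3))
wide = filters.frangi(enhanced, sigmas=range(3, 8))

# Otsu threshold applied to each vessel-enhanced image, then combined.
mask = ((thin > filters.threshold_otsu(thin)) |
        (wide > filters.threshold_otsu(wide)))
print(mask.mean())  # fraction of pixels classified as vessel
```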
Khachatryan, Vardan
2015-06-05
A search for a massive resonance decaying into a standard-model-like Higgs boson (H) and a W or Z boson is reported. The analysis is performed on a data sample corresponding to an integrated luminosity of 19.7 fb⁻¹, collected in proton-proton collisions at a centre-of-mass energy of 8 TeV with the CMS detector at the LHC. Signal events, in which the decay products of Higgs, W, or Z bosons at high Lorentz boost are contained within single reconstructed jets, are identified using jet substructure techniques, including the tagging of b hadrons. This is the first search for heavy resonances decaying to HW or HZ resulting in an all-jet final state, as well as the first application of jet substructure techniques to identify H → WW* → 4q decays at high Lorentz boost. No significant signal is observed, and limits are set at 95% confidence level on the production cross sections of W' and Z' in a model with mass-degenerate charged and neutral spin-1 resonances.
Latent transition analysis of pre-service teachers' efficacy in mathematics and science
NASA Astrophysics Data System (ADS)
Ward, Elizabeth Kennedy
This study modeled changes in pre-service teacher efficacy in mathematics and science over the final year of teacher preparation using latent transition analysis (LTA), a longitudinal form of analysis that builds on two modeling traditions: latent class analysis (LCA) and auto-regressive modeling. Data were collected using the STEBI-B, MTEBI-r, and ABNTMS instruments. The findings suggest that LTA is a viable technique for teacher efficacy research. Teacher efficacy is modeled as a construct with two dimensions: personal teaching efficacy (PTE) and outcome expectancy (OE). The findings suggest that the mathematics and science teaching efficacy (PTE) of pre-service teachers is a multi-class phenomenon. The analyses revealed a four-class model of PTE at the beginning and end of the final year of teacher training. Results indicate that when pre-service teachers transition between classes, they tend to move from a lower-efficacy class into a higher-efficacy class. In addition, time-varying variables (attitudes and beliefs) and time-invariant variables (previous coursework, previous experiences, and teacher perceptions) are statistically significant predictors of efficacy class membership. Finally, the analyses suggest that the measures used to assess outcome expectancy are not suitable for LCA and LTA procedures.
Investigation of acoustic emission coupling techniques
NASA Technical Reports Server (NTRS)
Jolly, W. D.
1988-01-01
A three-phase research program was initiated by NASA in 1983 to investigate the use of acoustic monitoring techniques to detect incipient failure in turbopump bearings. Two prototype acoustic coupler probes were designed and evaluated, and four units of the final probe design were fabricated. Success in this program could lead to development of an on-board monitor which could detect bearing damage in flight and reduce or eliminate the need for disassembly after each flight. This final report reviews the accomplishments of the first two phases and presents the results of fabrication and testing completed in the final phase of the research program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miron, M.S.; Christopher, C.; Hirshfield, S.
1978-05-01
Psycholinguistics provides crisis managers in nuclear threat incidents with a quantitative methodology which can aid in the determination of threat credibility, authorship identification and perpetrator apprehension. The objective of this contract is to improve and enhance present psycholinguistic software systems by means of newly-developed, computer-automated techniques which significantly extend the technology of automated content and stylistic analysis of nuclear threat. In accordance with this overall objective, the first two contract Tasks have been completed and are reported on in this document. The first Task specifies the development of software support for the purpose of syntax regularization of vocabulary to root form. The second calls for the exploration and development of alternative approaches to correlative analysis of vocabulary usage.
NASA Astrophysics Data System (ADS)
Parente, Mario; Makarewicz, Heather D.; Bishop, Janice L.
2011-04-01
This study advances curve-fitting modeling of absorption bands in reflectance spectra and applies the new model to spectra of Martian meteorites ALH 84001 and EETA 79001 and to data from the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM). This study also details a recently introduced automated parameter initialization technique. We assess the performance of this automated procedure by comparing it to the currently available initialization method and perform a sensitivity analysis of the fit results to variation in the initial guesses. We explore issues related to removal of the continuum, offer guidelines for continuum removal when modeling the absorptions, and explore different continuum-removal techniques. We further evaluate the suitability of curve-fitting techniques using Gaussians/Modified Gaussians to decompose spectra into individual end-member bands. We show that nonlinear least squares techniques such as the Levenberg-Marquardt algorithm achieve results comparable to the MGM model (Sunshine and Pieters, 1993; Sunshine et al., 1990) for meteorite spectra. Finally, we use Gaussian modeling to fit CRISM spectra of pyroxene- and olivine-rich terrains on Mars. Analysis of CRISM spectra from two regions shows that the pyroxene-dominated rock spectra measured at Juventae Chasma were modeled well with low-Ca pyroxene, while the pyroxene-rich spectra acquired at Libya Montes required both low-Ca and high-Ca pyroxene for a good fit.
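Gaussian band decomposition with Levenberg-Marquardt least squares can be sketched in a few lines of scipy. The synthetic "pyroxene-like" spectrum and the fixed initial guesses below are illustrative; the paper's automated initialization and continuum handling are more involved.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(wl, a1, c1, w1, a2, c2, w2):
    """Two absorption Gaussians on a flat (continuum-removed) baseline."""
    g1 = a1 * np.exp(-0.5 * ((wl - c1) / w1) ** 2)
    g2 = a2 * np.exp(-0.5 * ((wl - c2) / w2) ** 2)
    return 1.0 - g1 - g2

wl = np.linspace(0.8, 2.6, 300)  # wavelength in microns
true = two_gaussians(wl, 0.25, 1.0, 0.12, 0.18, 2.0, 0.20)  # ~1 and ~2 um bands
noisy = true + np.random.default_rng(2).normal(0, 0.005, wl.size)

# curve_fit defaults to Levenberg-Marquardt for unconstrained problems;
# the hand-picked p0 stands in for the automated initialization step.
p0 = [0.2, 1.05, 0.1, 0.2, 1.95, 0.2]
popt, _ = curve_fit(two_gaussians, wl, noisy, p0=p0)
print(popt)  # recovered band depths, centers, and widths
```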
NASA Astrophysics Data System (ADS)
Lee, Dong-Sup; Cho, Dae-Seung; Kim, Kookhyun; Jeon, Jae-Jin; Jung, Woo-Jin; Kang, Myeng-Hwan; Kim, Jae-Ho
2015-01-01
Independent component analysis (ICA), one of the blind source separation methods, can extract unknown source signals from received signals alone. This is accomplished by finding statistical independence among signal mixtures, and it has been successfully applied in myriad fields such as medical science and image processing. Nevertheless, inherent problems have been reported when using this technique: instability and invalid ordering of the separated signals, particularly when a conventional ICA technique is used for vibratory source signal identification in complex structures. In this study, a simple iterative algorithm based on conventional ICA is proposed to mitigate these problems. The proposed method extracts more stable source signals in a valid order through an iterative reordering of the extracted mixing matrix to reconstruct finally converged source signals, referring to the magnitudes of the correlation coefficients between the intermediately separated signals and signals measured on or near the sources. To review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses were carried out for a virtual response model and a 30 m class submarine model. Moreover, to investigate the applicability of the proposed method to a real problem involving a complex structure, an experiment was carried out on a scaled submarine mockup. The results show that the proposed method resolves the inherent problems of conventional ICA.
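The reordering step can be illustrated with a conventional ICA pass followed by matching each separated component to the reference channel it correlates with most strongly (also fixing the arbitrary sign). This sketch omits the paper's iterative reconstruction loop and uses synthetic signals.

```python
import numpy as np
from sklearn.decomposition import FastICA

def separate_and_reorder(mixtures, references):
    """Unmix signals, then order components to match reference channels.

    mixtures:   (n_samples, n_channels) received signals.
    references: (n_samples, n_sources) signals measured on/near each source.
    """
    ica = FastICA(n_components=references.shape[1], random_state=0)
    comps = ica.fit_transform(mixtures)     # (n_samples, n_sources)
    ordered = []
    for ref in references.T:
        corrs = [np.corrcoef(ref, c)[0, 1] for c in comps.T]
        best = int(np.argmax(np.abs(corrs)))
        ordered.append(np.sign(corrs[best]) * comps[:, best])
    return np.column_stack(ordered)

t = np.linspace(0, 1, 2000)
s = np.c_[np.sin(2 * np.pi * 13 * t), np.sign(np.sin(2 * np.pi * 7 * t))]
x = s @ np.array([[1.0, 0.6], [0.4, 1.0]])  # two sources mixed at two receivers
recovered = separate_and_reorder(x, s)      # components in source order
```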
Research on the Intensity Analysis and Result Visualization of Construction Land in Urban Planning
NASA Astrophysics Data System (ADS)
Cui, J.; Dong, B.; Li, J.; Li, L.
2017-09-01
As fundamental work in urban planning, the intensity analysis of construction land involves much repetitive data processing that is prone to errors or loss of data precision, and current urban planning lacks efficient methods and tools for visualizing the analysis results. In this research a portable tool was developed using the Model Builder technique embedded in ArcGIS to provide automatic data processing and rapid result visualization. A series of basic modules provided by ArcGIS are linked together to form a complete data processing chain in the tool. Once the required data are imported, the analysis results and related maps and graphs, including the intensity values, the zoning map, the skyline analysis map, etc., are produced automatically. Finally, the tool is installation-free and can be deployed quickly between planning teams.
Assimilation of NUCAPS Retrieved Profiles in GSI for Unique Forecasting Applications
NASA Technical Reports Server (NTRS)
Berndt, Emily Beth; Zavodsky, Bradley; Srikishen, Jayanthi; Blankenship, Clay
2015-01-01
Hyperspectral IR profiles can be assimilated in GSI as a separate observation type alongside radiosondes, with only changes to tables in the fix directory. Assimilation of the profiles does change the analysis fields, as evidenced by innovations larger than ±2.0 K, which mark where individual profiles impact the final temperature analysis. The updated temperature analysis is colder behind the cold front and warmer in the warm sector, while the updated moisture analysis is modified mostly in the low levels and tends to be drier than the original model background. Analysis of model output shows that differences relative to 13-km RAP analyses are smaller when profiles are assimilated with NUCAPS errors, and that CAPE is under-forecasted when assimilating NUCAPS profiles, which could be problematic for severe weather forecasting. Refining the assimilation technique to incorporate an error covariance matrix and creating a separate GSI module to assimilate satellite profiles may improve results.
Integrating computer programs for engineering analysis and design
NASA Technical Reports Server (NTRS)
Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.
1983-01-01
The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.
Sulfur analysis by inductively coupled plasma-mass spectrometry: A review
NASA Astrophysics Data System (ADS)
Giner Martínez-Sierra, J.; Galilea San Blas, O.; Marchante Gayón, J. M.; García Alonso, J. I.
2015-06-01
In recent years the number of applications of sulfur (S) analysis using inductively coupled plasma mass spectrometry (ICP-MS) as detector has increased significantly. In this article we describe in some depth the application of ICP-MS for S analysis with emphasis placed on the sulfur-specific detection by hyphenated techniques such as LC, GC, CE and LA coupled on-line to ICP-MS. The different approaches available for sulfur isotope ratio measurements by ICP-MS are also detailed. Particular attention has been paid to the quantification of peptides/proteins and the analysis of metallopeptides/metalloproteins via sulfur by LC-ICP-MS. Likewise, the speciation analysis of metal-based pharmaceuticals and metallodrugs and non-metal selective detection of pharmaceuticals via S are highlighted. Labeling procedures for metabolic applications are also included. Finally, the measurement of natural variations in S isotope composition with multicollector ICP-MS instruments is also covered in this review.
Determination of carbohydrates in medicinal plants--comparison between TLC, mf-MELDI-MS and GC-MS.
Qureshi, Muhammad Nasimullah; Stecher, Guenther; Sultana, Tahira; Abel, Gudrun; Popp, Michael; Bonn, Guenther K
2011-01-01
Quality control in the pharmaceutical and phytopharmaceutical industries requires fast and reliable methods for the analysis of raw materials and final products. This study evaluates different analytical approaches in order to identify the most suitable technique for the analysis of carbohydrates in herbal drug preparations. The specific focus is on thin-layer chromatography (TLC), gas chromatography (GC), and a newly developed mass spectrometric method, matrix-free material-enhanced laser desorption/ionisation time-of-flight mass spectrometry (mf-MELDI-MS). The samples employed were standards and microwave-assisted water extracts from Quercus. TLC analysis proved the presence of mono-, di- and trisaccharides in the biological sample and hinted at the existence of an unknown carbohydrate of higher oligomerisation degree. After evaluation of different derivatisation techniques, GC-MS confirmed the TLC data for mono- to trisaccharides, additionally delivering quantitative values, though requiring a considerable amount of time. A carbohydrate of higher oligomerisation degree could not be found. The application of mf-MELDI-MS further confirmed the presence of carbohydrates up to trisaccharides, also hinting at the presence of a tetrasaccharide, and delivered further data about other substances present in the extract. Quantitative determination yielded 1.750, 1.736 and 0.336 mg/mL for glucose, sucrose and raffinose, respectively. Evaluation of the three techniques clearly demonstrated the superior performance of mf-MELDI-MS for the qualitative analysis of complex mixtures, as targets need no modification and analysis requires only a few minutes. In addition, GC-MS is suitable for quantitative analysis. Copyright © 2011 John Wiley & Sons, Ltd.
Techniques for information extraction from compressed GPS traces : final report.
DOT National Transportation Integrated Search
2015-12-31
Developing techniques for extracting information requires a good understanding of the methods used to compress the traces. Many techniques for compressing trace data consisting of position (i.e., latitude/longitude) and time values have been developed…
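One widely used trace-compression method, shown below for orientation (the report does not specify which methods it studies), is Douglas-Peucker line simplification: keep only the points that deviate more than a tolerance from the chord between segment endpoints.

```python
import math

def douglas_peucker(points, epsilon):
    """Classic trajectory compression: recursively keep points deviating
    more than epsilon from the chord between a segment's endpoints."""
    if len(points) < 3:
        return points
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = math.hypot(dx, dy) or 1e-12
    # Perpendicular distance of each interior point to the chord.
    dists = [abs(dy * (x - x1) - dx * (y - y1)) / norm for x, y in points[1:-1]]
    i_max = max(range(len(dists)), key=dists.__getitem__)
    if dists[i_max] > epsilon:
        left = douglas_peucker(points[: i_max + 2], epsilon)
        right = douglas_peucker(points[i_max + 1 :], epsilon)
        return left[:-1] + right  # avoid duplicating the split point
    return [points[0], points[-1]]

trace = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7)]
print(douglas_peucker(trace, epsilon=0.5))  # drops near-collinear fixes
```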
[Functional magnetic resonance imaging in psychiatry and psychotherapy].
Derntl, B; Habel, U; Schneider, F
2010-01-01
Owing to technical improvements, functional magnetic resonance imaging (fMRI) has become the most popular and versatile imaging method in psychiatric research. The scope of this manuscript is to briefly introduce the basics of MR physics, the blood oxygenation level-dependent (BOLD) contrast, and the principles of MR study design and functional data analysis. The presentation of exemplary studies on emotion recognition and empathy in schizophrenia patients highlights the importance of MR methods in psychiatry. Finally, we demonstrate insights into new developments that will further boost MR techniques in clinical research and help to gain more insight into the dysfunctional neural networks underlying cognitive and emotional deficits in psychiatric patients. Moreover, some techniques such as neurofeedback seem promising for evaluating therapy effects on the behavioral and neural levels.
A combined analysis technique for the search for fast magnetic monopoles with the MACRO detector
NASA Astrophysics Data System (ADS)
MACRO Collaboration; Ambrosio, M.; Antolini, R.; Auriemma, G.; Bakari, D.; Baldini, A.; Barbarino, G. C.; Barish, B. C.; Battistoni, G.; Becherini, Y.; Bellotti, R.; Bemporad, C.; Bernardini, P.; Bilokon, H.; Bloise, C.; Bower, C.; Brigida, M.; Bussino, S.; Cafagna, F.; Calicchio, M.; Campana, D.; Carboni, M.; Caruso, R.; Cecchini, S.; Cei, F.; Chiarella, V.; Choudhary, B. C.; Coutu, S.; De Cataldo, G.; Dekhissi, H.; De Marzo, C.; De Mitri, I.; Derkaoui, J.; De Vincenzi, M.; DiCredico, A.; Erriquez, O.; Favuzzi, C.; Forti, C.; Fusco, P.; Giacomelli, G.; Giannini, G.; Giglietto, N.; Giorgini, M.; Grassi, M.; Grillo, A.; Guarino, F.; Gustavino, C.; Habig, A.; Heinz, R.; Iarocci, E.; Katsavounidis, E.; Katsavounidis, I.; Kearns, E.; Kim, H.; Kyriazopoulou, S.; Lamanna, E.; Lane, C.; Levin, D. S.; Lipari, P.; Longley, N. P.; Longo, M. J.; Loparco, F.; Maaroufi, F.; Mancarella, G.; Mandrioli, G.; Manzoor, S.; Margiotta, A.; Marini, A.; Martello, D.; Marzari-Chiesa, A.; Mazziotta, M. N.; Michael, D. G.; Monacelli, P.; Montaruli, T.; Monteno, M.; Mufson, S.; Musser, J.; Nicolò, D.; Nolty, R.; Orth, C.; Osteria, G.; Palamara, O.; Patera, V.; Patrizii, L.; Pazzi, R.; Peck, C. W.; Perrone, L.; Petrera, S.; Popa, V.; Reynoldson, J.; Ronga, F.; Rrhioua, A.; Satriano, C.; Scapparone, E.; Scholberg, K.; Sciubba, A.; Serra, P.; Sioli, M.; Sirri, G.; Sitta, M.; Spinelli, P.; Spinetti, M.; Spurio, M.; Steinberg, R.; Stone, J. L.; Sulak, L. R.; Surdo, A.; Tarlè, G.; Togo, V.; Vakili, M.; Walter, C. W.; Webb, R.
2002-08-01
We describe a search method for fast moving (β = v/c > 5×10⁻³) magnetic monopoles using simultaneously the scintillator, streamer tube and track-etch subdetectors of the MACRO apparatus. The first two subdetectors are used primarily for the identification of candidates, while the track-etch one is used as the final tool for their rejection or confirmation. Using this technique, a first sample of more than two years of data has been analyzed without any evidence of a magnetic monopole. We set a 90% CL upper limit on the local monopole flux of 1.5×10⁻¹⁵ cm⁻² s⁻¹ sr⁻¹ in the velocity range 5×10⁻³ ≤ β ≤ 0.99 and for nucleon decay catalysis cross-sections smaller than ~1 mb.
Dryland pasture and crop conditions as seen by HCMM. [Washita Watershed, Oklahoma
NASA Technical Reports Server (NTRS)
Harlan, J. C. (Principal Investigator); Rosenthal, W. D.; Blanchard, B. J.
1981-01-01
Techniques developed from aircraft flights over the Washita watershed in central Oklahoma were applied to HCMM data analysis. Results show that (1) canopy temperatures were accurately measured remotely; (2) pasture surface temperature differences detected relative soil moisture differences; (3) pasture surface temperature differences were related to stress in nearby wheat fields; and (4) no relationship was developed between final yield differences, thermal infrared data, and soil moisture stress at critical growth stages due to a lack of satellite thermal data at critical growth stages. The HCMM thermal data proved to be quite adequate in detecting relative moisture differences; however, with a 16 day day/night overpass frequency, more frequent overpasses are required to analyze more cases within a 7 day period after the storm. Better normalization techniques are also required.
Optically gated beating-heart imaging
Taylor, Jonathan M.
2014-01-01
The constant motion of the beating heart presents an obstacle to clear optical imaging, especially 3D imaging, in small animals where direct optical imaging would otherwise be possible. Gating techniques exploit the periodic motion of the heart to computationally “freeze” this movement and overcome motion artifacts. Optically gated imaging represents a recent development of this, where image analysis is used to synchronize acquisition with the heartbeat in a completely non-invasive manner. This article will explain the concept of optical gating, discuss a range of different implementation strategies and their strengths and weaknesses. Finally we will illustrate the usefulness of the technique by discussing applications where optical gating has facilitated novel biological findings by allowing 3D in vivo imaging of cardiac myocytes in their natural environment of the beating heart. PMID:25566083
NASA Technical Reports Server (NTRS)
Sung, Q. C.; Miller, L. D.
1977-01-01
Three methods were tested for collecting the training sets needed to establish the spectral signatures of the land uses/land covers sought, owing to the difficulties of retrospective collection of representative ground control data. Computer preprocessing techniques applied to the digital images to improve the final classification results were geometric correction, spectral band or image ratioing, and statistical cleaning of the representative training sets. A minimal level of statistical verification was performed based on comparisons between the airphoto estimates and the classification results. The verification provided further support for the selection of MSS bands 5 and 7. It also indicated that the maximum likelihood ratioing technique achieves classification results more consistent with the airphoto estimates than stepwise discriminant analysis.
Preparing Colorful Astronomical Images III: Cosmetic Cleaning
NASA Astrophysics Data System (ADS)
Frattare, L. M.; Levay, Z. G.
2003-12-01
We present cosmetic cleaning techniques for use with mainstream graphics software (Adobe Photoshop) to produce presentation-quality images and illustrations from astronomical data. These techniques have been used on numerous images from the Hubble Space Telescope when producing photographic, print and web-based products for news, education and public presentation as well as illustrations for technical publication. We expand on a previous paper to discuss the treatment of various detector-attributed artifacts such as cosmic rays, chip seams, gaps, optical ghosts, diffraction spikes and the like. While Photoshop is not intended for quantitative analysis of full dynamic range data (as are IRAF or IDL, for example), we have had much success applying Photoshop's numerous, versatile tools to final presentation images. Other pixel-to-pixel applications such as filter smoothing and global noise reduction will be discussed.
High-Power Microwave Transmission and Mode Conversion Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vernon, Ronald J.
2015-08-14
This is a final technical report for a long-term project to develop improved designs and design tools for the microwave hardware and components associated with the DOE Plasma Fusion Program. We have developed basic theory, software, fabrication techniques, and low-power measurement techniques for the design of microwave hardware associated with gyrotrons, microwave mode converters and high-power microwave transmission lines. Specifically, in this report we discuss our work on designing quasi-optical mode converters for single and multiple frequencies, a new method for the analysis of perturbed-wall waveguide mode converters, perturbed-wall launcher design for TE0n-mode gyrotrons, quasi-optical traveling-wave resonator design for high-power testing of microwave components, and possible improvements to the HSX microwave transmission line.
Reducing the Requirements and Cost of Astronomical Telescopes
NASA Technical Reports Server (NTRS)
Smith, W. Scott; Whitaker, Ann F. (Technical Monitor)
2002-01-01
Limits on astronomical telescope apertures are being rapidly approached. These limits result from logistics, increasing complexity, and finally budgetary constraints. In a historical perspective, great strides have been made in the areas of aperture, adaptive optics, wavefront sensors, detectors, stellar interferometers and image reconstruction. What will be the next advances? Emerging data analysis techniques based on communication theory hold the promise of yielding more information from observational data through significant computer post-processing. This paper explores some current telescope limitations and ponders the possibility of increasing the yield of scientific data by migrating computer post-processing techniques to higher dimensions. Some of these processes hold the promise of reducing the requirements on the basic telescope hardware, making the next generation of instruments more affordable.
NASA Astrophysics Data System (ADS)
Li, H.; Kusky, T. M.; Peng, S.; Zhu, M.
2012-12-01
Thermal infrared (TIR) remote sensing is an important technique in the exploration of geothermal resources. In this study, a geothermal survey was conducted in the Tengchong area of Yunnan province, China, using multi-temporal MODIS LST (Land Surface Temperature) data. The monthly night MODIS LST data for the study area from Mar. 2000 to Mar. 2011 were collected and analyzed. The 132-month average LST map was derived, and three geothermal anomalies were identified. The findings of this study agree well with the results from relative geothermal gradient measurements. Finally, we conclude that TIR remote sensing is a cost-effective technique for detecting geothermal anomalies. Combining TIR remote sensing with geological analysis and an understanding of geothermal mechanisms is an accurate and efficient approach to geothermal area detection.
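The averaging-and-thresholding step described above can be sketched as follows; this is our assumed reading of the workflow, not the authors' code, and the 2-sigma anomaly threshold is an illustrative choice.

```python
# Minimal sketch (assumed workflow): average a stack of monthly night-time
# LST grids and flag pixels that stand out as geothermal anomaly candidates.
import numpy as np

def anomaly_map(lst_stack):
    """lst_stack: (n_months, ny, nx) array of LST grids (e.g., 132 months)."""
    mean_lst = np.nanmean(lst_stack, axis=0)          # multi-year average map
    mu, sigma = np.nanmean(mean_lst), np.nanstd(mean_lst)
    return mean_lst, mean_lst > mu + 2.0 * sigma      # boolean anomaly mask

stack = np.random.default_rng(1).normal(290.0, 2.0, (132, 60, 80))
stack[:, 30:33, 40:43] += 8.0                         # synthetic hot spot
mean_map, mask = anomaly_map(stack)
print(int(mask.sum()), "anomalous pixels")
```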
Application of multivariable search techniques to structural design optimization
NASA Technical Reports Server (NTRS)
Jones, R. T.; Hague, D. S.
1972-01-01
Multivariable optimization techniques are applied to a particular class of minimum weight structural design problems: the design of an axially loaded, pressurized, stiffened cylinder. Minimum weight designs are obtained by a variety of search algorithms: first- and second-order, elemental perturbation, and randomized techniques. An exterior penalty function approach to constrained minimization is employed. Some comparisons are made with solutions obtained by an interior penalty function procedure. In general, it would appear that an interior penalty function approach may not be as well suited to the class of design problems considered as the exterior penalty function approach. It is also shown that a combination of search algorithms will tend to arrive at an extremal design in a more reliable manner than a single algorithm. The effect of incorporating realistic geometrical constraints on stiffener cross-sections is investigated. A limited comparison is made between minimum weight cylinders designed on the basis of a linear stability analysis and cylinders designed on the basis of empirical buckling data. Finally, a technique for locating more than one extremal is demonstrated.
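The exterior penalty function approach lends itself to a compact sketch: constraint violations are squared, weighted, and added to the objective, and the weight is increased between successive unconstrained minimizations. The problem below is a toy stand-in, not the stiffened-cylinder design.

```python
# Minimal sketch of an exterior penalty function method: minimize f(x)
# subject to inequality constraints g_i(x) <= 0 by penalizing violations.
import numpy as np
from scipy.optimize import minimize

def exterior_penalty(objective, constraints, x0, r0=1.0, growth=10.0, rounds=6):
    x = np.asarray(x0, dtype=float)
    r = r0
    for _ in range(rounds):                 # raise the penalty weight each round
        def penalized(x):
            violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
            return objective(x) + r * violation
        x = minimize(penalized, x).x        # unconstrained search (BFGS)
        r *= growth
    return x

# Toy "minimum weight" stand-in: minimize x0 + x1 with x0*x1 >= 4, x >= 0.
sol = exterior_penalty(
    objective=lambda x: x[0] + x[1],
    constraints=[lambda x: 4.0 - x[0] * x[1], lambda x: -x[0], lambda x: -x[1]],
    x0=[3.0, 3.0])
print(sol)  # ~ [2, 2]
```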
Electroencephalography signatures of attention-deficit/hyperactivity disorder: clinical utility.
Alba, Guzmán; Pereda, Ernesto; Mañas, Soledad; Méndez, Leopoldo D; González, Almudena; González, Julián J
2015-01-01
This work reviews the techniques, and the most important results, of using electroencephalography (EEG) to extract measures that can be clinically useful for studying subjects with attention-deficit/hyperactivity disorder (ADHD). First, we discuss briefly and in simple terms the EEG analysis and processing techniques most used in the context of ADHD. We review techniques that analyze individual EEG channels (univariate measures) as well as those that study the statistical interdependence between different EEG channels (multivariate measures), the so-called functional brain connectivity. Among the former, we review the classical indices of absolute and relative spectral power and estimations of the complexity of the channels, such as the approximate entropy and the Lempel-Ziv complexity. Among the latter, we focus on the magnitude squared coherence and on different measures based on the concept of generalized synchronization and its estimation in state space. Second, from a historical point of view, we present the most important results achieved with these techniques and their clinical utility (sensitivity, specificity, and accuracy) in diagnosing ADHD. Finally, we propose future research lines based on these results.
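Of the univariate measures listed, relative spectral power is the simplest to illustrate. The sketch below uses Welch's method on a synthetic channel; the sampling rate, band edges, and normalization range are assumptions, not values from the review.

```python
# Minimal sketch (assumed parameters): relative spectral power of a single
# EEG channel via Welch's method -- one of the univariate measures reviewed.
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def relative_band_power(signal, fs):
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    broadband = (freqs >= 1) & (freqs <= 30)
    total = trapezoid(psd[broadband], freqs[broadband])
    out = {}
    for band, (lo, hi) in BANDS.items():
        sel = (freqs >= lo) & (freqs < hi)
        out[band] = trapezoid(psd[sel], freqs[sel]) / total
    return out

fs = 256
t = np.arange(20 * fs) / fs
eeg = np.sin(2 * np.pi * 6 * t) + 0.5 * np.random.default_rng(2).normal(size=t.size)
print(relative_band_power(eeg, fs))  # theta-dominant, as constructed
```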
Advanced techniques for characterization of ion beam modified materials
Zhang, Yanwen; Debelle, Aurélien; Boulle, Alexandre; ...
2014-10-30
Understanding the mechanisms of damage formation in materials irradiated with energetic ions is essential for the field of ion-beam materials modification and engineering. Utilizing incident ions, electrons, photons, and positrons, various analysis techniques, including Rutherford backscattering spectrometry (RBS), electron RBS, Raman spectroscopy, high-resolution X-ray diffraction, small-angle X-ray scattering, and positron annihilation spectroscopy, are routinely used or gaining increasing attention in characterizing ion beam modified materials. The distinctive information, recent developments, and some perspectives in these techniques are reviewed in this paper. Applications of these techniques are discussed to demonstrate their unique ability for studying ion-solid interactions and the corresponding radiation effects in modified depths ranging from a few nm to a few tens of μm, and to provide information on electronic and atomic structure of the materials, defect configuration and concentration, as well as phase stability, amorphization and recrystallization processes. Finally, such knowledge contributes to our fundamental understanding over a wide range of extreme conditions essential for enhancing material performance and also for design and synthesis of new materials to address a broad variety of future energy applications.
Application of multivariate statistical techniques in microbial ecology.
Paliy, O; Shankar, V
2016-03-01
Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. Particularly noticeable progress has been made in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
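As one concrete instance of an exploratory multivariate technique, the sketch below runs a PCA ordination on a synthetic sample-by-taxon table; the Hellinger transformation is a common preprocessing choice in this field, though not one mandated by the review.

```python
# Minimal sketch (not from the review): exploratory ordination of a
# sample-by-taxon abundance matrix with PCA after Hellinger transformation.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
counts = rng.poisson(5.0, size=(20, 50)).astype(float)   # 20 samples, 50 taxa
hellinger = np.sqrt(counts / counts.sum(axis=1, keepdims=True))

pca = PCA(n_components=2)
scores = pca.fit_transform(hellinger)
print("variance explained:", pca.explained_variance_ratio_)
print("first sample coordinates:", scores[0])
```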
Infrared/microwave (IR/MW) micromirror array beam combiner design and analysis.
Tian, Yi; Lv, Lijun; Jiang, Liwei; Wang, Xin; Li, Yanhong; Yu, Haiming; Feng, Xiaochen; Li, Qi; Zhang, Li; Li, Zhuo
2013-08-01
We investigated the design method of an infrared (IR)/microwave (MW) micromirror-array type of beam combiner. The micromirrors are microscopic, with dimensions comparable to the MW wavelengths, so the MW does not react to them, whereas the much shorter optical wavelengths are reflected. Hence, the MW multilayered substrate was simplified and designed using transmission line theory. The beam combiner used an IR wavefront-division imaging technique to reflect the IR radiation image to the pupil of the unit under test (UUT) in a parallel light path. In addition, the boresight error detected by phase monopulse radar was analyzed using the method of moments (MoM) with multilevel fast multipole method (MLFMM) acceleration. The boresight error introduced by the finite size of the beam combiner was less than 1°. Finally, in order to verify the wavefront-division imaging technique, a prototype micromirror array was fabricated and IR images were tested. The IR images obtained by the thermal imager verified the correctness of the wavefront-division imaging technique.
A simple method for processing data with least square method
NASA Astrophysics Data System (ADS)
Wang, Chunyan; Qi, Liqun; Chen, Yongxiang; Pang, Guangning
2017-08-01
The least square method is widely used in data processing and error estimation. It has become an essential technique for parameter estimation, data processing, regression analysis, and experimental data fitting, and a criterion tool for statistical inference. In measurement data analysis, complex distributions are usually treated on the least squares principle, i.e., matrices are used to solve for the final estimate and to improve its accuracy. In this paper, a new method for solving the least squares problem is presented that is based on algebraic computation and is relatively straightforward and easy to understand. The practicability of the method is illustrated with a concrete example.
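For reference, the standard matrix route the paper builds on can be written in a few lines: the normal equations beta = (X^T X)^{-1} X^T y for a straight-line fit, checked against a library solver. The data are synthetic.

```python
# Minimal sketch of ordinary least squares for a straight-line fit: the
# normal-equation form, checked against numpy's lstsq solver.
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0.0, 10.0, 30)
y = 2.5 * x + 1.0 + rng.normal(0.0, 0.5, x.size)

X = np.column_stack([np.ones_like(x), x])        # design matrix [1, x]
beta_normal = np.linalg.solve(X.T @ X, X.T @ y)  # normal equations
beta_lstsq = np.linalg.lstsq(X, y, rcond=None)[0]

print(beta_normal)   # ~ [1.0, 2.5]
print(beta_lstsq)    # same solution, numerically more stable route
```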
Impulsive synchronization of stochastic reaction-diffusion neural networks with mixed time delays.
Sheng, Yin; Zeng, Zhigang
2018-07-01
This paper discusses impulsive synchronization of stochastic reaction-diffusion neural networks with Dirichlet boundary conditions and hybrid time delays. By virtue of inequality techniques, theories of stochastic analysis, linear matrix inequalities, and the contradiction method, sufficient criteria are proposed to ensure exponential synchronization of the addressed stochastic reaction-diffusion neural networks with mixed time delays via a designed impulsive controller. Compared with some recent studies, the neural network models herein are more general, some restrictions are relaxed, and the obtained conditions enhance and generalize some published ones. Finally, two numerical simulations are performed to substantiate the validity and merits of the developed theoretical analysis. Copyright © 2018 Elsevier Ltd. All rights reserved.
Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis.
Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué
2015-10-01
In this paper, a new methodology for diagnosing skin cancer from images of dermatologic spots using image processing is presented. Currently, skin cancer is one of the most frequent diseases in humans. The methodology is based on Fourier spectral analysis using filters such as the classic, inverse, and nonlinear k-law filters. The sample images were obtained by a medical specialist, and a new spectral technique was developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally, a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%.
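A hedged sketch of the kind of k-law spectral processing described: the magnitude of the image spectrum is raised to a power k while the phase is kept, and a scalar index summarizes the filtered spectrum. The exponent, index definition, and data are illustrative assumptions, not the authors' published procedure.

```python
# Minimal sketch (assumed form): Fourier spectral analysis of an image with
# a k-law nonlinearity, |F|^k * exp(i*phase), summarized by a scalar index.
import numpy as np

def k_law_index(image, k=0.3):
    F = np.fft.fft2(image)
    filtered = np.abs(F) ** k * np.exp(1j * np.angle(F))   # k-law nonlinearity
    power = np.abs(filtered) ** 2
    return power.sum() / power.size                        # scalar index

rng = np.random.default_rng(5)
smooth_spot = rng.normal(0.5, 0.01, (64, 64))   # low-texture stand-in
complex_spot = rng.normal(0.5, 0.2, (64, 64))   # high-texture stand-in
print(k_law_index(smooth_spot), k_law_index(complex_spot))
```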
QFD analysis of RSRM aqueous cleaners
NASA Technical Reports Server (NTRS)
Marrs, Roy D.; Jones, Randy K.
1995-01-01
This paper presents a Quality Function Deployment (QFD) analysis of the final down-selected aqueous cleaners to be used on the Redesigned Solid Rocket Motor (RSRM) program. The new cleaner will replace solvent vapor degreasing. The RSRM Ozone Depleting Compound Elimination program is discontinuing the methyl chloroform vapor degreasing process and replacing it with a spray-in-air aqueous cleaning process. Previously, 15 cleaners were down-selected to two candidates by passing screening tests involving toxicity, flammability, cleaning efficiency, contaminant solubility, corrosion potential, cost, and bond strength. The two down-selected cleaners were further evaluated with more intensive testing and evaluated using QFD techniques to assess suitability for cleaning RSRM case and nozzle surfaces in preparation for adhesive bonding.
Image analysis by integration of disparate information
NASA Technical Reports Server (NTRS)
Lemoigne, Jacqueline
1993-01-01
Image analysis often starts with some preliminary segmentation which provides a representation of the scene needed for further interpretation. Segmentation can be performed in several ways, categorized as pixel-based, edge-based, and region-based. Each of these approaches is affected differently by various factors, and the final result may be improved by integrating several or all of these methods, thus taking advantage of their complementary nature. In this paper, we propose an approach that integrates pixel-based and edge-based results by utilizing an iterative relaxation technique. This approach has been implemented on a massively parallel computer and tested on remotely sensed imagery from the Landsat Thematic Mapper (TM) sensor.
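One plausible reading of the integration step is sketched below: per-pixel class probabilities (the pixel-based result) are iteratively smoothed by their neighbors, except where an edge map blocks the exchange. The update rule and parameters are our assumptions, not the paper's algorithm.

```python
# Minimal sketch (our reading of the idea, not the paper's algorithm):
# combine pixel-based class probabilities with an edge map via iterative
# relaxation -- neighbors reinforce each other's labels except across edges.
import numpy as np

def relax_labels(prob, edges, iters=10, alpha=0.5):
    """prob: (ny, nx, n_classes) initial class probabilities (pixel-based);
    edges: (ny, nx) edge strength in [0, 1]; returns refined probabilities."""
    p = prob.copy()
    for _ in range(iters):
        # average of 4-neighbours (wrap-around for brevity)
        nb = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
              np.roll(p, 1, 1) + np.roll(p, -1, 1)) / 4.0
        w = alpha * (1.0 - edges)[..., None]     # edges block the smoothing
        p = (1.0 - w) * p + w * nb
        p /= p.sum(axis=2, keepdims=True)        # renormalize per pixel
    return p

rng = np.random.default_rng(6)
prob = rng.dirichlet([1.0, 1.0], size=(32, 32))  # noisy 2-class init
edges = np.zeros((32, 32)); edges[:, 16] = 1.0   # one vertical edge
labels = relax_labels(prob, edges).argmax(axis=2)
print(labels.shape)
```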
Classification of fMRI resting-state maps using machine learning techniques: A comparative study
NASA Astrophysics Data System (ADS)
Gallos, Ioannis; Siettos, Constantinos
2017-11-01
We compare the efficiency of Principal Component Analysis (PCA) and nonlinear manifold learning algorithms (ISOMAP and Diffusion Maps) for classifying brain maps between groups of schizophrenia patients and healthy controls from fMRI scans acquired during a resting-state experiment. After a standard pre-processing pipeline, we applied spatial independent component analysis (ICA) to reduce (a) the noise and (b) the spatial-temporal dimensionality of the fMRI maps. On the cross-correlation matrix of the ICA components, we applied PCA, ISOMAP, and Diffusion Maps to find an embedded low-dimensional space. Finally, support vector machine (SVM) and k-NN algorithms were used to evaluate the performance of the algorithms in classifying between the two groups.
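A minimal sketch of such a comparison on synthetic stand-in data follows, using scikit-learn's PCA and Isomap with SVM and k-NN classifiers; Diffusion Maps is omitted because scikit-learn has no built-in implementation, and the feature vectors merely stand in for the ICA cross-correlations.

```python
# Minimal sketch (synthetic stand-in data): compare PCA and Isomap as
# dimensionality-reduction steps before SVM and k-NN classification.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0.0, 1.0, (30, 100)),    # "patients"
               rng.normal(0.6, 1.0, (30, 100))])   # "controls"
y = np.repeat([0, 1], 30)

for reducer in (PCA(n_components=5), Isomap(n_components=5)):
    for clf in (SVC(kernel="linear"), KNeighborsClassifier(n_neighbors=5)):
        pipe = make_pipeline(reducer, clf)
        score = cross_val_score(pipe, X, y, cv=5).mean()
        print(type(reducer).__name__, type(clf).__name__, round(score, 2))
```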
Stochastic analysis of a novel nonautonomous periodic SIRI epidemic system with random disturbances
NASA Astrophysics Data System (ADS)
Zhang, Weiwei; Meng, Xinzhu
2018-02-01
In this paper, a new stochastic nonautonomous SIRI epidemic model is formulated. Given that the incidence rates of diseases may change with the environment, we propose a novel type of transmission function. The main aim of this paper is to obtain the thresholds of the stochastic SIRI epidemic model. To this end, we investigate the dynamics of the stochastic system and establish the conditions for extinction and persistence in mean of the disease by constructing suitable Lyapunov functions and using stochastic analysis techniques. Furthermore, we show that the stochastic system has at least one nontrivial positive periodic solution. Finally, numerical simulations are introduced to illustrate our results.
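While the paper's contribution is analytical, the model class is easy to simulate. Below is an Euler-Maruyama sketch of a stochastic SIRI system with relapse (R back to I) and noise on the transmission term; the equations and all parameter values are illustrative assumptions, not the paper's model.

```python
# Minimal sketch (illustrative parameters, not the paper's): Euler-Maruyama
# simulation of a stochastic SIRI model, where recovered individuals may
# relapse (R -> I) and the transmission term is perturbed by white noise.
import numpy as np

def simulate_siri(T=100.0, dt=0.01, beta=0.4, gamma=0.2, delta=0.05,
                  mu=0.02, Lam=0.02, sigma=0.1, seed=8):
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    S, I, R = np.empty(n), np.empty(n), np.empty(n)
    S[0], I[0], R[0] = 0.9, 0.1, 0.0
    for k in range(n - 1):
        dB = rng.normal(0.0, np.sqrt(dt))            # Brownian increment
        infect = beta * S[k] * I[k]
        S[k+1] = S[k] + (Lam - infect - mu * S[k]) * dt - sigma * S[k] * I[k] * dB
        I[k+1] = I[k] + (infect - (mu + gamma) * I[k] + delta * R[k]) * dt \
                 + sigma * S[k] * I[k] * dB
        R[k+1] = R[k] + (gamma * I[k] - (mu + delta) * R[k]) * dt
    return S, I, R

S, I, R = simulate_siri()
print("final I:", I[-1])
```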
NASA Astrophysics Data System (ADS)
Chen, Hansheng; Yun, Fan; Qu, Jiangtao; Li, Yingfei; Cheng, Zhenxiang; Fang, Ruhao; Ye, Zhixiao; Ringer, Simon P.; Zheng, Rongkun
2018-05-01
Quantitative correlation between intrinsic coercivity and grain boundaries in three dimensions is critical to further improve the performance of sintered Nd-Fe-B permanent magnets. Here, we quantitatively reveal the local composition variation across and especially along grain boundaries using the powerful atomic-scale analysis technique known as atom probe tomography. We also estimate the saturation magnetization, magnetocrystalline anisotropy constant, and exchange stiffness of the grain boundaries on the basis of the experimentally determined structure and composition. Finally, using micromagnetic simulations, we quantify the intrinsic coercivity degradation caused by inhomogeneous grain boundaries. This approach can be applied to other magnetic materials for the analysis and optimization of magnetic properties.
Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F
2005-12-08
This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.
Zhang, Xue-Jun; Li, Hai-Ling; Deng, Da-Yi; Ji, Chong; Yao, Xiao-Dong; Liu, Jia-Xin
2018-05-29
Tetanus is still a major cause of human deaths in several developing countries. In particular, the neonatal form remains a significant public health problem. According to the World Health Organization, administration of tetanus toxoid is recommended for neonatal tetanus patients. Furthermore, tetanus antitoxin or anti-tetanus immunoglobulin (Ig) is used for mild cases or intensive care. This paper discusses a novel purification technique for improving equine anti-tetanus Ig production. First, equine plasma was subjected to two salting-out steps with ammonium sulfate; second, the ultrafiltration concentrate was purified by a single protein G affinity chromatography step; finally, the purified F(ab')2 fragments were characterized using biochemical and proteomic methods and shown to be pure and homogeneous. Compared with the product of the original technique, the specific activity increased by 80% (about 90,000 IU/g) and the recovery of F(ab')2 was approximately 75%. Furthermore, proteomic profiling of the whole process was demonstrated by nano-HPLC-MS and bioinformatics analysis. The new technique produces equine anti-tetanus immunoglobulin F(ab')2 fragments from crude plasma in high quality and yield, and it could also be used for industrial scale-up. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Bo, Z.; Chen, J. H.
2010-02-01
The dimensional analysis technique is used to formulate a correlation between the ozone generation rate and various parameters that are important in the design and operation of positive wire-to-plate corona discharges in indoor air. The dimensionless relation is determined by linear regression analysis based on the results of 36 laboratory-scale experiments. The derived equation is validated against experimental data and a numerical model published in the literature. Applications of the derived equation are illustrated through an example: selecting an appropriate set of operating conditions in the design/operation of a photocopier so as to comply with federal regulations on ozone emission. Finally, a new current-voltage characteristic equation is proposed for positive wire-to-plate corona discharges based on the derived dimensionless equation.
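The generic step behind such correlations, fitting a power law between dimensionless groups by linear regression in log-log space, can be sketched as follows; the data and exponents are fabricated for illustration and are not the paper's results.

```python
# Minimal sketch (fabricated example data): fit a power-law correlation
# pi0 = C * pi1^a * pi2^b between dimensionless groups by least squares
# in log-log space -- the generic step behind dimensional-analysis fits.
import numpy as np

rng = np.random.default_rng(9)
pi1 = 10 ** rng.uniform(0.0, 2.0, 36)             # 36 "experiments"
pi2 = 10 ** rng.uniform(-1.0, 1.0, 36)
pi0 = 3.0 * pi1**0.8 * pi2**-0.5 * 10 ** rng.normal(0.0, 0.02, 36)

# log(pi0) = log(C) + a*log(pi1) + b*log(pi2): solve by least squares
X = np.column_stack([np.ones(36), np.log10(pi1), np.log10(pi2)])
coef, *_ = np.linalg.lstsq(X, np.log10(pi0), rcond=None)
print("C =", 10 ** coef[0], " a =", coef[1], " b =", coef[2])
```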
de Paiva, Anderson Paulo
2018-01-01
This research evaluates the influence of the Brazilian accreditation methodology on the sustainability of organizations. Critical factors for implementing accreditation were also examined, including measurement of the relationships between these factors and organizational sustainability. The study was based on a survey of organizations accredited by ONA (National Accreditation Organization); 288 responses were received from top-level managers. Quantitative data from the measurement models were analyzed by principal component factor analysis. The final model was evaluated using confirmatory factor analysis and structural equation modeling techniques. The results are vital for defining the factors that affect accreditation processes, providing a better understanding for accredited organizations and for Brazilian accreditation. PMID:29599939
Ice Growth Measurements from Image Data to Support Ice Crystal and Mixed-Phase Accretion Testing
NASA Technical Reports Server (NTRS)
Struk, Peter M.; Lynch, Christopher J.
2012-01-01
This paper describes the imaging techniques as well as the analysis methods used to measure the ice thickness and growth rate in support of ice-crystal icing tests performed at the National Research Council of Canada (NRC) Research Altitude Test Facility (RATFac). A detailed description of the camera setup, which involves both still and video cameras, as well as the analysis methods using the NASA Spotlight software, is presented. Two cases showing significant ice growth, one from each of two different test entries, are analyzed in detail, describing the ice thickness and growth rate, which is generally linear. Estimates of the bias uncertainty are presented for all measurements. Finally, some of the challenges related to the imaging and analysis methods are discussed, as well as methods used to overcome them.
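The reported linear growth suggests a straightforward reduction: fit a line to thickness-versus-time points extracted from the images and quote the slope with its standard error. The sketch below does this on synthetic values; it is not the NASA Spotlight workflow itself.

```python
# Minimal sketch (synthetic measurements): estimate an ice growth rate by
# fitting a line to thickness-vs-time data, with a standard error on the
# slope from the fit covariance. All values are illustrative.
import numpy as np

t = np.arange(0.0, 60.0, 5.0)                         # time, s
rng = np.random.default_rng(10)
thickness = 0.05 * t + rng.normal(0.0, 0.05, t.size)  # mm, ~linear growth

coeffs, cov = np.polyfit(t, thickness, 1, cov=True)
rate, rate_se = coeffs[0], np.sqrt(cov[0, 0])
print(f"growth rate = {rate:.4f} +/- {rate_se:.4f} mm/s")
```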
Further studies on stability analysis of nonlinear Roesser-type two-dimensional systems
NASA Astrophysics Data System (ADS)
Dai, Xiao-Lin
2014-04-01
This paper is concerned with further relaxations of the stability analysis of nonlinear Roesser-type two-dimensional (2D) systems in the Takagi-Sugeno fuzzy form. To achieve the goal, a novel slack matrix variable technique, which is homogeneous polynomially parameter-dependent on the normalized fuzzy weighting functions with arbitrary degree, is developed, and the algebraic properties of the normalized fuzzy weighting functions are collected into a set of augmented matrices. Consequently, more information about the normalized fuzzy weighting functions is involved and the relaxation quality of the stability analysis is significantly improved. Moreover, the obtained result is formulated in the form of linear matrix inequalities, which can be easily solved via standard numerical software. Finally, a numerical example is provided to demonstrate the effectiveness of the proposed result.
Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek
2018-02-01
A novel methodology for grouping and ranking with application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects, analytical procedures applied to furan determination in food samples, described by 10 variables referring to their analytical performance and environmental and economic aspects. Multivariate statistical analysis makes it possible to limit the amount of input data for the ranking analysis. The assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. We show how the information obtained from the two tools is complementary, and the applicability of combining grouping with ranking is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Dissanayake, A.; AlFaify, S.; Garratt, E.; Nandasiri, M. I.; Taibu, R.; Tecos, G.; Hamdan, N. M.; Kayani, A.
2011-06-01
Thin, hydrogenated aluminum hydride films were deposited on silicon substrates using unbalanced magnetron (UBM) sputtering of a high-purity aluminum target under electrically grounded conditions. Argon was used as the sputtering gas, and hydrogenation was carried out by diluting the growth plasma with hydrogen. The effect of the hydrogen partial pressure on the final concentration of trapped elements, including hydrogen, was studied using ion beam analysis (IBA) techniques. Moreover, the in-situ thermal stability of the trapped hydrogen in the films was studied using Rutherford Backscattering Spectrometry (RBS), Non-Rutherford Backscattering Spectrometry (NRBS), and Elastic Recoil Detection Analysis (ERDA). The microstructure of the films was investigated by SEM analysis. The hydrogen content of the thin films was found to decrease as the films were heated above 110 °C in vacuum.
Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures
NASA Technical Reports Server (NTRS)
Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel
1990-01-01
A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite bonded joints below the cryogenic temperature of 30 K (-405 F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings, and composite gusset/clip joints, all bonded to M55J/954-6 and T300/954-6 hybrid composite tubes (75 mm square). Analytical experience and design work on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited, and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Increasing this challenge is the difficulty of testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce the required analytical tools and to develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate the anticipated stress states in the flight joints; subsequently, the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on the developed techniques and properties.
NASA Astrophysics Data System (ADS)
Verhiest, K.; Mullens, S.; De Wispelaere, N.; Claessens, S.; DeBremaecker, A.; Verbeken, K.
2012-09-01
In this study, oxide dispersion strengthened (ODS) 316L steel samples were manufactured by the 3-dimensional fiber deposition (3DFD) technique. The performance of 3DFD as a colloidal consolidation technique for obtaining porous green bodies based on yttria (Y2O3) nano-slurries or paste is discussed in this experimental work. The influence of the sintering temperature and time on sample densification and grain growth was investigated. Hot consolidation was performed to obtain final product quality in terms of reduced residual porosity and final dispersion homogeneity.
pcr: an R package for quality assessment, analysis and testing of qPCR data
Ahmed, Mahmoud
2018-01-01
Background: Real-time quantitative PCR (qPCR) is a broadly used technique in biomedical research. Currently, a few different analysis models are used to determine the quality of data and to quantify the mRNA level across experimental conditions. Methods: We developed an R package to implement methods for quality assessment, analysis, and testing of qPCR data for statistical significance. Double Delta CT and standard curve models were implemented to quantify the relative expression of target genes from CT values in standard qPCR control-group experiments. In addition, calculation of amplification efficiency and curves from serial-dilution qPCR experiments is used to assess the quality of the data. Finally, two-group tests and linear models were used to test for significance of the difference in expression between control groups and conditions of interest. Results: Using two datasets from qPCR experiments, we applied the different quality assessment, analysis, and statistical testing methods in the pcr package and compared the results to the original published articles. The final relative expression values from the different models, as well as the intermediary outputs, were checked against the expected results in the original papers and were found to be accurate and reliable. Conclusion: The pcr package provides an intuitive and unified interface for its main functions, allowing biologists to perform all necessary steps of qPCR analysis and produce graphs in a uniform way. PMID:29576953
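To show the arithmetic of the Double Delta CT model outside R (the pcr package itself is an R library), here is a hedged Python sketch; the CT values are invented and the function is not part of the package's API.

```python
# Minimal sketch of the Double Delta CT calculation (illustration only; the
# pcr package is in R). Fold change = 2^(-ddCT), with CT values normalized
# to a reference gene and then to a control group.
import numpy as np

def double_delta_ct(ct_target, ct_ref, ct_target_ctrl, ct_ref_ctrl):
    """Each argument: replicate CT values. Returns relative fold change."""
    d_ct = np.mean(ct_target) - np.mean(ct_ref)                 # treated
    d_ct_ctrl = np.mean(ct_target_ctrl) - np.mean(ct_ref_ctrl)  # control
    dd_ct = d_ct - d_ct_ctrl
    return 2.0 ** (-dd_ct)

# Treated sample amplifies ~2 cycles earlier -> ~4-fold up-regulation.
print(double_delta_ct([22.1, 22.0, 21.9], [18.0, 18.1, 17.9],
                      [24.0, 24.1, 23.9], [18.0, 18.0, 18.1]))
```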
Peters, Timothy
2015-04-01
Recent studies have shown that the claim that King George III suffered from acute porphyria is seriously at fault. This article explores some of the causes of this misdiagnosis and the consequences of the misleading claims, also reporting on the nature of the king's recurrent mental illness according to computer diagnostics. In addition, techniques of cognitive archaeology are used to investigate the nature of the king's final decade of mental illness, which resulted in the appointment of the Prince of Wales as Prince Regent. The results of this analysis confirm that the king suffered from bipolar disorder type I, with a final decade of dementia, due, in part, to the neurotoxicity of his recurrent episodes of acute mania. © 2015 Royal College of Physicians.
Technique and final cause in psychoanalysis: four ways of looking at one moment.
Lear, Jonathan
2009-12-01
This paper argues that if one considers just a single clinical moment there may be no principled way to choose among different approaches to psychoanalytic technique. One must in addition take into account what Aristotle called the final cause of psychoanalysis, which this paper argues is freedom. However, freedom is itself an open-ended concept with many aspects that need to be explored and developed from a psychoanalytic perspective. This paper considers one analytic moment from the perspectives of the techniques of Paul Gray, Hans Loewald, the contemporary Kleinians and Jacques Lacan. It argues that, if we are to evaluate these techniques, we must take into account the different conceptions of freedom they are trying to facilitate.
Lee, Hyokyeong; Moody-Davis, Asher; Saha, Utsab; Suzuki, Brian M; Asarnow, Daniel; Chen, Steven; Arkin, Michelle; Caffrey, Conor R; Singh, Rahul
2012-01-01
Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helmintic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helmintic diseases which together afflict a large part of humankind.
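A minimal sketch of the clustering step on synthetic phenotype traces follows; Euclidean distance and average-linkage hierarchical clustering are stand-ins chosen for brevity, not methods taken from the paper.

```python
# Minimal sketch (synthetic phenotype traces): hierarchical clustering of
# phenotypic time-series by pairwise distance, a simple stand-in for the
# time-series comparison and clustering steps described above.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(11)
t = np.linspace(0.0, 1.0, 50)
# Two response archetypes: rapid decline vs. slow decline of motility.
fast = np.exp(-6.0 * t) + rng.normal(0.0, 0.03, (10, t.size))
slow = np.exp(-1.0 * t) + rng.normal(0.0, 0.03, (10, t.size))
series = np.vstack([fast, slow])                 # 20 parasites x 50 frames

z = linkage(pdist(series, metric="euclidean"), method="average")
labels = fcluster(z, t=2, criterion="maxclust")
print(labels)                                    # the two groups recovered
```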
The Analysis of a Vortex Type Magnetohydrodynamic Induction Generator
NASA Technical Reports Server (NTRS)
Lengyel, L. L.
1962-01-01
Consideration is given to the performance characteristics of an AC magnetohydrodynamic power generator. A rotating magnetic field is imposed on the vortex flow of an electrically conducting fluid, which is injected tangentially into an annulus formed by two nonconducting concentric cylinders and two nonconducting end plates. A perturbation technique is used to determine the two-dimensional velocity and the three-dimensional electromagnetic field and current distributions. Finally, the generated power, the ohmic losses, the effective power, and the electrical efficiency of the converter system are calculated.
Sathiyaraj, T; Balasubramaniam, P
2017-11-30
This paper presents a new set of sufficient conditions for controllability of fractional higher-order stochastic integrodifferential systems with fractional Brownian motion (fBm) in finite-dimensional space using fractional calculus, a fixed point technique, and a stochastic analysis approach. In particular, we discuss complete controllability for nonlinear fractional stochastic integrodifferential systems under the proved result that the corresponding linear fractional system is controllable. Finally, an example is presented to illustrate the efficiency of the obtained theoretical results. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
The development of advanced manufacturing systems
NASA Astrophysics Data System (ADS)
Doumeingts, Guy; Vallespir, Bruno; Darricau, Didier; Roboam, Michel
Various methods for the design of advanced manufacturing systems (AMSs) are reviewed. The specifications for AMSs and the problems inherent in their development are first discussed. Three models, the Computer Aided Manufacturing-International model, the National Bureau of Standards model, and the GRAI model, are considered in detail. Hierarchical modeling tools such as structured analysis and design techniques, Petri nets, and the ICAM definition method are used in the development of integrated manufacturing models. Finally, the GRAI method is demonstrated in the design of specifications for the production management system of the Snecma AMS.
An analysis of the development of port operation in Da Nang Port, Vietnam
NASA Astrophysics Data System (ADS)
Nguyen, T. D. H.; Cools, M.
2018-04-01
This paper presents the operating status of Da Nang Port, Vietnam over the period 2012-2016. Port operations showed positive changes, reflected in a significant increase in total throughput, especially containerized cargo volumes. Classical decomposition techniques are used to find the trend-cycle and seasonal components of the monthly throughput flows. Appropriate predictive models for the different kinds of throughput are proposed. Finally, a development strategy towards containerization, together with investment policies for facilities, equipment, and infrastructure, is suggested based on the predictive results.
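The decomposition step can be sketched with statsmodels' classical seasonal_decompose on a simulated monthly throughput series; the trend, seasonal amplitude, and noise level are invented for illustration.

```python
# Minimal sketch (simulated monthly throughput): classical decomposition of
# a monthly series into trend-cycle and seasonal components.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(12)
months = pd.date_range("2012-01", periods=60, freq="MS")    # 2012-2016
trend = np.linspace(100.0, 180.0, 60)                       # steady growth
season = 10.0 * np.sin(2.0 * np.pi * np.arange(60) / 12.0)  # yearly cycle
throughput = pd.Series(trend + season + rng.normal(0, 3, 60), index=months)

result = seasonal_decompose(throughput, model="additive", period=12)
print(result.trend.dropna().iloc[:3])       # recovered trend-cycle
print(result.seasonal.iloc[:12].round(1))   # recovered seasonal pattern
```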
Enactments in Psychoanalysis: Therapeutic Benefits.
Stern, Stanley
The therapeutic benefits of enactments are addressed. Relevant literature reveals disparate conceptions about the nature and use of enactments. Clarification of the term is discussed. This analyst's theoretical and technical evolution is addressed; it is inextricably related to using enactments. How can it not be? A taxonomy of enactments is presented. The article considers that enactments may be fundamental in the evolution from orthodox to contemporary analytic technique. Assumptions underlying enactments are explored, as are guidelines for using enactments. Finally, the article posits that enactments have widened the scope of analysis and contributed to its vitality.
Perisic, Nebojsa; Afseth, Nils Kristian; Ofstad, Ragni; Hassani, Sahar; Kohler, Achim
2013-05-01
In this paper, a combination of NIR spectroscopy and FTIR and Raman microspectroscopy was used to elucidate the effects of different salts (NaCl, KCl and MgSO(4)) on structural proteins and their hydration in muscle tissue. The multivariate multi-block technique consensus principal component analysis enabled integration of the different vibrational spectroscopic techniques: macroscopic information obtained by NIR spectroscopy is directly related to microscopic information obtained by FTIR and Raman microspectroscopy. Changes in protein secondary structure observed at different salt concentrations were linked to changes in protein hydration affinity. The evidence for this was given by connecting the underlying FTIR bands of the amide I region (1700-1600 cm(-1)) and the water region (3500-3000 cm(-1)) with water vibrations obtained by NIR spectroscopy. In addition, Raman microspectroscopy demonstrated that different cations affected the structures of aromatic amino acid residues differently, which indicates that cation-π interactions play an important role in determining the final structure of protein molecules. Copyright © 2012 Elsevier Ltd. All rights reserved.