Sample records for established analysis techniques

  1. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General Services Administration... price analysis technique in order to establish a fair and reasonable price. DATES: Interested parties....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use to...

  2. Methodology for assessing the effectiveness of access management techniques : executive summary.

    DOT National Transportation Integrated Search

    1998-09-14

    A methodology for assessing the effectiveness of access management techniques on suburban arterial highways is developed. The methodology is described as a seven-step process as follows: (1) establish the purpose of the analysis, (2) establish the mea...

  3. Methodology for assessing the effectiveness of access management techniques : final report, September 14, 1998.

    DOT National Transportation Integrated Search

    1998-09-14

    A methodology for assessing the effectiveness of access management techniques on suburban arterial highways is developed. The methodology is described as a seven-step process as follows: (1) establish the purpose of the analysis, (2) establish the mea...

  4. [Development of sample pretreatment techniques-rapid detection coupling methods for food security analysis].

    PubMed

    Huang, Yichun; Ding, Weiwei; Zhang, Zhuomin; Li, Gongke

    2013-07-01

    This paper summarizes recent developments in rapid detection methods for food security, such as sensors, optical techniques, portable spectral analysis, enzyme-linked immunosorbent assays, portable gas chromatographs, etc. Additionally, the applications of these rapid detection methods coupled with sample pretreatment techniques in real food security analysis are reviewed. The coupling technique has the potential to provide a reference for establishing selective, precise and quantitative rapid detection methods in food security analysis.

  5. Current concepts in cleft care: A multicenter analysis.

    PubMed

    Thiele, Oliver C; Kreppel, Matthias; Dunsche, Anton; Eckardt, Andre M; Ehrenfeld, Michael; Fleiner, Bernd; Gaßling, Volker; Gehrke, Gerd; Gerressen, Marcus; Gosau, Martin; Gröbe, Alexander; Haßfeld, Stefan; Heiland, Max; Hoffmeister, Bodo; Hölzle, Frank; Klein, Cornelius; Krüger, Maximilian; Kübler, Alexander C; Kübler, Norbert R; Kuttenberger, Johannes J; Landes, Constantin; Lauer, Günter; Martini, Markus; Merholz, Erich T; Mischkowski, Robert A; Al-Nawas, Bilal; Nkenke, Emeka; Piesold, Jörn U; Pradel, Winnie; Rasse, Michael; Rachwalski, Martin; Reich, Rudolf H; Rothamel, Daniel; Rustemeyer, Jan; Scheer, Martin; Schliephake, Henning; Schmelzeisen, Rainer; Schramm, Alexander; Schupp, Wiebke; Spitzer, Wolfgang J; Stocker, Erwin; Stoll, Christian; Terheyden, Hendrik; Voigt, Alexander; Wagner, Wilfried; Weingart, Dieter; Werkmeister, Richard; Wiltfang, Jörg; Ziegler, Christoph M; Zöller, Joachim E

    2018-04-01

    The current surgical techniques used in cleft repair are well established, but different centers use different approaches. To determine the best treatment for patients, a multi-center comparative study is required. In this study, we surveyed all craniofacial departments registered with the German Society of Maxillofacial Surgery to determine which cleft repair techniques are currently in use. Our findings revealed much variation in cleft repair between different centers. Although most centers did use a two-stage approach, the operative techniques and timing of lip and palate closure were different in every center. This shows that a retrospective comparative analysis of patient outcome between the participating centers is not possible and illustrates the need for prospective comparative studies to establish the optimal technique for reconstructive cleft surgery. Copyright © 2018 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  6. Behavior Change Techniques in Apps for Medication Adherence: A Content Analysis.

    PubMed

    Morrissey, Eimear C; Corbett, Teresa K; Walsh, Jane C; Molloy, Gerard J

    2016-05-01

    There are a vast number of smartphone applications (apps) aimed at promoting medication adherence on the market; however, the theory and evidence base in terms of applying established health behavior change techniques underpinning these apps remains unclear. This study aimed to code these apps using the Behavior Change Technique Taxonomy (v1) for the presence or absence of established behavior change techniques. The sample of apps was identified through systematic searches in both the Google Play Store and Apple App Store in February 2015. All apps that fell into the search categories were downloaded for analysis. The downloaded apps were screened with exclusion criteria, and suitable apps were reviewed and coded for behavior change techniques in March 2015. Two researchers performed coding independently. In total, 166 medication adherence apps were identified and coded. The number of behavior change techniques contained in an app ranged from zero to seven (mean=2.77). A total of 12 of a possible 96 behavior change techniques were found to be present across apps. The most commonly included behavior change techniques were "action planning" and "prompt/cues," which were included in 96% of apps, followed by "self-monitoring" (37%) and "feedback on behavior" (36%). The current extent to which established behavior change techniques are used in medication adherence apps is limited. The development of medication adherence apps may not have benefited from advances in the theory and practice of health behavior change. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  7. Applications of multi-frequency single beam sonar fisheries analysis methods for seep quantification and characterization

    NASA Astrophysics Data System (ADS)

    Price, V.; Weber, T.; Jerram, K.; Doucet, M.

    2016-12-01

    The analysis of multi-frequency, narrow-band single-beam acoustic data for fisheries applications has long been established, with methodology focusing on characterizing targets in the water column by utilizing complex algorithms and false-color time series data to create and compare frequency response curves for dissimilar biological groups. These methods were built on concepts developed for multi-frequency analysis of satellite imagery in terrestrial studies and have been applied to a broad range of data types and applications. Single-beam systems operating at multiple frequencies are also used for the detection and identification of seeps in water column data. Here we incorporate the same analysis and visualization techniques used for fisheries applications to attempt to characterize and quantify seeps by creating and comparing frequency response curves and applying false coloration to shallow and deep multi-channel seep data. From this information, we can establish methods to differentiate bubble size in the echogram and differentiate seep composition. These techniques are also useful in differentiating plume content from biological noise (volume reverberation) created by euphausiid layers and fish with gas-filled swim bladders. Combining the multiple frequencies using false coloring and other image analysis techniques, after applying established normalization and beam-pattern correction algorithms, is a novel approach to quantitatively describing seeps. Further, this information could be paired with geological models, backscatter, and bathymetry data to assess seep distribution.
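
    To make the false-coloration step concrete: mapping three frequency channels to RGB planes is the visualization move this abstract describes. The sketch below assumes normalized volume-backscatter echograms; the frequency choices (18/38/120 kHz) and dB limits are illustrative assumptions, not the authors' values.

```python
# Hypothetical sketch: false-color composite from three single-beam
# echosounder channels, as used in multi-frequency fisheries analysis.
# Frequencies and dB display range are assumed, not from the record.
import numpy as np

def false_color(sv18, sv38, sv120, db_min=-80.0, db_max=-30.0):
    """Map three volume-backscatter echograms (dB) to an RGB image.

    Each argument is a 2-D array (depth bins x pings) for one frequency.
    Targets with different frequency responses (e.g., bubbles vs. swim
    bladders) separate into distinct hues in the composite.
    """
    def norm(sv):
        # Clip to the display range and rescale to [0, 1].
        return np.clip((sv - db_min) / (db_max - db_min), 0.0, 1.0)
    # Stack the normalized channels as the R, G, B planes.
    return np.dstack([norm(sv18), norm(sv38), norm(sv120)])
```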

  8. Panel Discussion on Multi-Disciplinary Analysis

    NASA Technical Reports Server (NTRS)

    Garcia, Robert

    2002-01-01

    The Marshall Space Flight Center (MSFC) is hosting the Thermal and Fluids Analysis Workshop (TFAWS) during the week of September 10, 2001. Included in this year's TFAWS is a panel session on Multidisciplinary Analysis techniques. The intent is to provide an opportunity for the users to gain information as to what product may be best suited for their applications environment and to provide feedback to you, the developers, on future desired developments. Potential users of multidisciplinary analysis (MDA) techniques are often overwhelmed by the number of choices available to them via commercial products and by the pace of new developments in this area. The purpose of this panel session is to provide a forum wherein MDA tools available and under development can be discussed, compared, and contrasted. The intent of this panel is to provide the end-user with the information necessary to make educated decisions on how to proceed with selecting their MDA tool. It is anticipated that the discussions this year will focus on MDA techniques that couple discipline codes or algorithms (as opposed to monolithic, unified MDA approaches). The MDA developers will be asked to prepare a product overview presentation addressing specific questions provided by the panel organizers. The purpose of these questions will be to establish the method employed by the particular MDA technique for communication between the discipline codes, to establish the similarities and differences amongst the various approaches, and to establish the range of experience and applications for each particular MDA approach.

  9. Integration of different data gap filling techniques to facilitate assessment of polychlorinated biphenyls: A proof of principle case study (ASCCT meeting)

    EPA Science Inventory

    Data gap filling techniques are commonly used to predict hazard in the absence of empirical data. The most established techniques are read-across, trend analysis and quantitative structure-activity relationships (QSARs). Toxic equivalency factors (TEFs) are less frequently used d...

  10. Summary of 1971 water remote sensing investigations

    NASA Technical Reports Server (NTRS)

    Tilton, E. L., III

    1972-01-01

    The Earth Resources Laboratory sea remote sensing program has concentrated on project planning, data acquisition procedures, and data preparation techniques to establish a firm procedural basis for the program. Most of these procedural elements were established and proven during the three missions conducted in 1971. It is anticipated that the program in 1972 will see the analysis completed on the Mississippi Sound series and the first series of Eastern Gulf experiments allowing increased emphasis to be given to more intensive technique development studies, the interrelationship of parameters for the measurement and prediction of water circulation, and the demonstration of the application of these techniques.

  11. Hazards and benefits of in-vivo Raman spectroscopy of human skin

    NASA Astrophysics Data System (ADS)

    Carter, Elizabeth A.; Williams, Adrian C.; Barry, Brian W.; Edwards, Howell G.

    1999-04-01

    The resurgence of Raman spectroscopy in the late 1980s has led to an increase in the use of the technique for the analysis of biological tissues. Consequently, Raman spectroscopy is now regarded as a well-established, non-invasive, non-destructive technique, which is used to obtain good-quality spectra from biological tissues with minimal fluorescence. What is presently of interest to our group is to develop further and establish the technique for in vivo investigations of healthy and diseased skin. This presentation discusses some potentially valuable clinical applications of the technique, and also highlights some of the experimental difficulties that were encountered when examining patients who were receiving treatment for psoriasis.

  12. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    PubMed

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  13. Naval War College Review. Volume 67, Number 1, Winter 2014

    DTIC Science & Technology

    2014-01-01

    ...“squishier” terms, phrases, and concepts yielded by qualitative techniques, such as grounded theory, ethnography, case studies, or content analysis... variety of established, qualitative techniques (i.e., grounded theory, content analysis, and survey research) to “triangulate” the game’s findings... maritime research, regional studies, distance education, war gaming, and education/programs at the operational level of war. Each of these intricate

  14. Adhesion, friction, wear, and lubrication research by modern surface science techniques.

    NASA Technical Reports Server (NTRS)

    Keller, D. V., Jr.

    1972-01-01

    The field of surface science has undergone intense revitalization with the introduction of low-energy electron diffraction, Auger electron spectroscopy, ellipsometry, and other surface analytical techniques which have been refined within the last decade. These developments have permitted submono- and monolayer structure analysis as well as chemical identification and quantitative analysis. The application of a number of these techniques to the solution of problems in the fields of friction, lubrication, and wear is examined in detail for the particular case of iron, and in general to illustrate how the accumulation of pure data will contribute toward the establishment of the physicochemical concepts required to understand the mechanisms operating in friction systems. In the case of iron, LEED, Auger and microcontact studies have established that hydrogen and light-saturated organic vapors do not establish interfaces which prevent iron from welding, whereas oxygen and some oxygen and sulfur compounds do reduce welding as well as the coefficient of friction. Interpretation of these data suggests a mechanism of sulfur interaction in lubricating systems.

  15. The Effect of Multispectral Image Fusion Enhancement on Human Efficiency

    DTIC Science & Technology

    2017-03-20

    ...human visual system by applying a technique commonly used in visual perception research: ideal observer analysis. Using this approach, we establish... applications, analytic techniques, and procedural methods used across studies. This paper uses ideal observer analysis to establish a framework that allows... augmented similarly to incorporate research involving more complex stimulus content. Additionally, the ideal observer can be adapted for a number of

  16. Report of the panel on international programs

    NASA Technical Reports Server (NTRS)

    Anderson, Allen Joel; Fuchs, Karl W.; Ganeka, Yasuhiro; Gaur, Vinod; Green, Andrew A.; Siegfried, W.; Lambert, Anthony; Rais, Jacub; Reighber, Christopher; Seeger, Herman

    1991-01-01

    The panel recommends that NASA participate and take an active role in the continuous monitoring of existing regional networks, the realization of high resolution geopotential and topographic missions, the establishment of interconnection of the reference frames as defined by different space techniques, the development and implementation of automation for all ground-to-space observing systems, calibration and validation experiments for measuring techniques and data, the establishment of international space-based networks for real-time transmission of high density space data in standardized formats, tracking and support for non-NASA missions, and the extension of state-of-the art observing and analysis techniques to developing nations.

  17. Heuristics to Facilitate Understanding of Discriminant Analysis.

    ERIC Educational Resources Information Center

    Van Epps, Pamela D.

    This paper discusses the principles underlying discriminant analysis and constructs a simulated data set to illustrate its methods. Discriminant analysis is a multivariate technique for identifying the best combination of variables to maximally discriminate between groups. Discriminant functions are established on existing groups and used to…
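
    As a concrete illustration of the technique this record describes, the following sketch fits a linear discriminant to simulated two-group data and reports the variable combination it finds. It is a generic example, not the paper's simulated data set.

```python
# Minimal sketch of discriminant analysis: find the combination of
# variables that best separates known groups, then check the fit.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Simulated data: two groups with shifted means on two variables.
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)),
               rng.normal(2.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

lda = LinearDiscriminantAnalysis().fit(X, y)
print("discriminant weights:", lda.coef_)     # the variable combination
print("training accuracy:", lda.score(X, y))  # separation achieved
```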

  18. Direct Analysis of Samples of Various Origin and Composition Using Specific Types of Mass Spectrometry.

    PubMed

    Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek

    2017-07-04

    One of the major sources of error that occur during chemical analysis utilizing the more conventional and established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this sample preparation stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass the sample preparation stage; and in this review, we comment on some of the new direct techniques that are mass-spectrometry based. The study presents information about measurement techniques using mass spectrometry that allow direct sample analysis without sample preparation, or with only limited pre-concentration steps. MALDI-MS, PTR-MS, SIFT-MS, and DESI-MS techniques are discussed. These solutions have numerous applications in different fields of human activity due to their interesting properties. The advantages and disadvantages of these techniques are presented. The trends in development of direct analysis using the aforementioned techniques are also presented.

  19. Fourier transform infrared spectroscopy techniques for the analysis of drugs of abuse

    NASA Astrophysics Data System (ADS)

    Kalasinsky, Kathryn S.; Levine, Barry K.; Smith, Michael L.; Magluilo, Joseph J.; Schaefer, Teresa

    1994-01-01

    Cryogenic deposition techniques for Gas Chromatography/Fourier Transform Infrared (GC/FT-IR) can be successfully employed in urinalysis for drugs of abuse with detection limits comparable to those of the established Gas Chromatography/Mass Spectrometry (GC/MS) technique. The additional confidence of the data that infrared analysis can offer has been helpful in identifying ambiguous results, particularly, in the case of amphetamines where drugs of abuse can be confused with over-the-counter medications or naturally occurring amines. Hair analysis has been important in drug testing when adulteration of urine samples has been a question. Functional group mapping can further assist the analysis and track drug use versus time.

  20. Trace-fiber color discrimination by electrospray ionization mass spectrometry: a tool for the analysis of dyes extracted from submillimeter nylon fibers.

    PubMed

    Tuinman, Albert A; Lewis, Linda A; Lewis, Samuel A

    2003-06-01

    The application of electrospray ionization mass spectrometry (ESI-MS) to trace-fiber color analysis is explored using acidic dyes commonly employed to color nylon-based fibers, as well as extracts from dyed nylon fibers. Qualitative information about constituent dyes and quantitative information about the relative amounts of those dyes present on a single fiber become readily available using this technique. Sample requirements for establishing the color identity of different samples (i.e., comparative trace-fiber analysis) are shown to be submillimeter. Absolute verification of dye mixture identity (beyond the comparison of molecular weights derived from ESI-MS) can be obtained by expanding the technique to include tandem mass spectrometry (ESI-MS/MS). For dyes of unknown origin, the ESI-MS/MS analyses may offer insights into the chemical structure of the compound-information not available from chromatographic techniques alone. This research demonstrates that ESI-MS is viable as a sensitive technique for distinguishing dye constituents extracted from a minute amount of trace-fiber evidence. A protocol is suggested to establish/refute the proposition that two fibers--one of which is available in minute quantity only--are of the same origin.

  1. Recent Advances in the Measurement of Arsenic, Cadmium, and Mercury in Rice and Other Foods

    PubMed Central

    Punshon, Tracy

    2015-01-01

    Trace element analysis of foods is of increasing importance because of raised consumer awareness and the need to evaluate and establish regulatory guidelines for toxic trace metals and metalloids. This paper reviews recent advances in the analysis of trace elements in food, including challenges, state-of-the-art methods, and the use of spatially resolved techniques for localizing the distribution of As and Hg within rice grains. Total elemental analysis of foods is relatively well established, but the push for ever lower detection limits requires that methods be robust against potential matrix interferences, which can be particularly severe for food. Inductively coupled plasma mass spectrometry (ICP-MS) is the method of choice, allowing for multi-element and highly sensitive analyses. For arsenic, speciation analysis is necessary because the inorganic forms are more likely to be subject to regulatory limits. Chromatographic techniques coupled to ICP-MS are most often used for arsenic speciation, and a range of methods now exist for a variety of different arsenic species in different food matrices. Speciation and spatial analysis of foods, especially rice, can also be achieved with synchrotron techniques. Sensitive analytical techniques and methodological advances provide robust methods for the assessment of several metals in animal and plant-based foods, in particular for arsenic, cadmium and mercury in rice and arsenic speciation in foodstuffs. PMID:25938012

  2. A rational framework for production decision making in blood establishments.

    PubMed

    Ramoa, Augusto; Maia, Salomé; Lourenço, Anália

    2012-07-24

    SAD_BaSe is a blood bank data analysis software, created to assist in the management of blood donations and the blood production chain in blood establishments. In particular, the system keeps track of several collection and production indicators, enables the definition of collection and production strategies, and the measurement of quality indicators required by the Quality Management System regulating the general operation of blood establishments. This paper describes the general scenario of blood establishments and its main requirements in terms of data management and analysis. It presents the architecture of SAD_BaSe and identifies its main contributions. Specifically, it brings forward the generation of customized reports driven by decision making needs and the use of data mining techniques in the analysis of donor suspensions and donation discards.

  4. Viewpoint on ISA TR84.0.02--simplified methods and fault tree analysis.

    PubMed

    Summers, A E

    2000-01-01

    ANSI/ISA-S84.01-1996 and IEC 61508 require the establishment of a safety integrity level for any safety instrumented system or safety related system used to mitigate risk. Each stage of design, operation, maintenance, and testing is judged against this safety integrity level. Quantitative techniques can be used to verify whether the safety integrity level is met. ISA-dTR84.0.02 is a technical report under development by ISA, which discusses how to apply quantitative analysis techniques to safety instrumented systems. This paper discusses two of those techniques: (1) Simplified equations and (2) Fault tree analysis.
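
    For illustration, the simplified-equation route mentioned here reduces, for a single-channel (1oo1) safety function, to the widely used approximation PFDavg ≈ λDU × TI / 2. The failure rate and proof-test interval below are assumed values for the sketch, not figures from the technical report.

```python
# Hedged sketch of a "simplified equations" SIL verification: for a
# 1oo1 safety instrumented function, average probability of failure on
# demand is approximated as the dangerous-undetected failure rate times
# half the proof-test interval. Numbers are illustrative only.
lambda_du = 2e-6          # dangerous undetected failures per hour (assumed)
test_interval_h = 8760.0  # proof-test interval: one year, in hours

pfd_avg = lambda_du * test_interval_h / 2.0
print(f"PFDavg = {pfd_avg:.2e}")  # ~8.8e-3 -> SIL 2 band (1e-3 to 1e-2)
```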

  5. [The future of forensic DNA analysis for criminal justice].

    PubMed

    Laurent, François-Xavier; Vibrac, Geoffrey; Rubio, Aurélien; Thévenot, Marie-Thérèse; Pène, Laurent

    2017-11-01

    In the criminal framework, the analysis of approximately 20 DNA microsatellites enables the establishment of a genetic profile with a high statistical power of discrimination. This technique gives us the possibility to establish or exclude a match between a biological trace detected at a crime scene and a suspect whose DNA was collected via an oral swab. However, conventional techniques tend to complicate the interpretation of complex DNA samples, such as degraded DNA and DNA mixtures. The aim of this review is to highlight the power of new forensic DNA methods (including high-throughput sequencing or single-cell sequencing) to facilitate the expert's interpretation in full compliance with existing French legislation. © 2017 médecine/sciences – Inserm.

  6. Multidisciplinary design optimization using multiobjective formulation techniques

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Pagaldipti, Narayanan S.

    1995-01-01

    This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. The accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. Results obtained show significant improvements in the aircraft's aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.

  7. A Computer Analysis of Library Postcards. (CALP)

    ERIC Educational Resources Information Center

    Stevens, Norman D.

    1974-01-01

    A description of a sophisticated application of computer techniques to the analysis of a collection of picture postcards of library buildings in an attempt to establish the minimum architectural requirements needed to distinguish one style of library building from another. (Author)

  8. 38 CFR 1.921 - Analysis of costs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... effectiveness of alternative collection techniques, establish guidelines with respect to points at which costs... 38 Pensions, Bonuses, and Veterans' Relief § 1.921 Analysis of costs... Standards for Collection of Claims § 1.921 Analysis of costs. VA collection procedures should provide for...

  9. LIVER ULTRASONOGRAPHY IN DOLPHINS: USE OF ULTRASONOGRAPHY TO ESTABLISH A TECHNIQUE FOR HEPATOBILIARY IMAGING AND TO EVALUATE METABOLIC DISEASE-ASSOCIATED LIVER CHANGES IN BOTTLENOSE DOLPHINS (TURSIOPS TRUNCATUS).

    PubMed

    Seitz, Kelsey E; Smith, Cynthia R; Marks, Stanley L; Venn-Watson, Stephanie K; Ivančić, Marina

    2016-12-01

    The objective of this study was to establish a comprehensive technique for ultrasound examination of the dolphin hepatobiliary system and apply this technique to 30 dolphins to determine what, if any, sonographic changes are associated with blood-based indicators of metabolic syndrome (insulin greater than 14 μIU/ml or glucose greater than 112 mg/dl) and iron overload (transferrin saturation greater than 65%). A prospective study of individuals in a cross-sectional population with and without elevated postprandial insulin levels was performed. Twenty-nine bottlenose dolphins (Tursiops truncatus) in a managed collection were included in the final data analysis. An in-water ultrasound technique was developed that included detailed analysis of the liver and pancreas. Dolphins with hyperinsulinemia had larger livers compared with dolphins with nonelevated insulin concentrations. Using stepwise, multivariate regression including blood-based indicators of metabolic syndrome in dolphins, glucose was the best predictor of and had a positive linear association with liver size (P = 0.007, R² = 0.24). Bottlenose dolphins are susceptible to metabolic syndrome and associated complications that affect the liver, including fatty liver disease and iron overload. This study facilitated the establishment of a technique for a rapid, diagnostic, and noninvasive ultrasonographic evaluation of the dolphin liver. In addition, the study identified ultrasound-detectable hepatic changes associated primarily with elevated glucose concentration in dolphins. Future investigations will strive to detail the pathophysiological mechanisms for these changes.

  10. Performance analysis of clustering techniques over microarray data: A case study

    NASA Astrophysics Data System (ADS)

    Dash, Rasmita; Misra, Bijan Bihari

    2018-03-01

    Handling big data is one of the major issues in the field of statistical data analysis. In such investigations, cluster analysis plays a vital role in dealing with large-scale data. There are many clustering techniques with different cluster analysis approaches, but which approach suits a particular dataset is difficult to predict. To deal with this problem, a grading approach is introduced over many clustering techniques to identify a stable technique. The grading approach depends on the characteristics of the dataset as well as on the validity indices, so a two-stage grading approach is implemented. In this study the grading approach is implemented over five clustering techniques: hybrid swarm based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experimentation is conducted over five microarray datasets with seven validity indices. The finding of the grading approach that a clustering technique is significant is also confirmed by the Nemenyi post-hoc hypothesis test.
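
    A minimal sketch of the grading idea follows: run several clustering techniques on one dataset and score each with a validity index, then compare. The algorithms and the silhouette index below stand in for the study's five techniques and seven indices; this is an analogy under stated assumptions, not the authors' implementation.

```python
# Illustrative comparison of clustering techniques via a validity index.
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, AgglomerativeClustering, Birch
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
methods = {
    "k-means": KMeans(n_clusters=4, n_init=10, random_state=0),
    "agglomerative": AgglomerativeClustering(n_clusters=4),
    "birch": Birch(n_clusters=4),
}
for name, model in methods.items():
    labels = model.fit_predict(X)
    # Higher silhouette = tighter, better-separated clusters.
    print(f"{name}: silhouette = {silhouette_score(X, labels):.3f}")
```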

  11. [THE POSSIBILITY OF APPLICATION OF COLORIMETRY TECHNIQUE OF DETECTION OF LEVELS OF OXIDATIVE STRESS AND ANTIOXIDANT CAPACITY OF SERUM].

    PubMed

    Sapojnikova, M A; Strakhova, L A; Blinova, T V; Makarov, I A; Rakhmanov, R S; Umniagina, I A

    2015-11-01

    The analysis concerned indicators of oxidative status and antioxidant capacity of serum. The indicators were obtained by a colorimetry technique based on detection of peroxides in blood serum in examined patients of different categories: healthy persons aged from 17 to 20 years and from 30 to 60 years, and patients with bronchopulmonary pathology. A low level of oxidative stress and high antioxidant capacity of serum were established in individuals of younger age. With increasing age, the degree of expression of oxidative stress augmented and the level of antioxidant defense lowered. Almost all patients with bronchopulmonary pathology had a high level of oxidative stress and a low level of antioxidant defense. Quantitative analysis of the examined indicators established their conformity with health status.

  12. Social impact analysis: monetary valuation

    USGS Publications Warehouse

    Wainger, Lisa A.; Johnston, Robert J.; Bagstad, Kenneth J.; Casey, Frank; Vegh, Tibor

    2014-01-01

    This section provides basic guidance for using and conducting economic valuation, including criteria for judging whether valuation is appropriate for supporting decisions. It provides an introduction to the economic techniques used to measure changes in social welfare and describes which methods may be most appropriate for use in valuing particular ecosystem services. Rather than providing comprehensive valuation instructions, it directs readers to additional resources. More generally, it establishes that the valuation of ecosystem services is grounded in a long history of non-market valuation and discusses how ecosystem services valuation can be conducted within established economic theory and techniques.

  13. Generalized simulation technique for turbojet engine system analysis

    NASA Technical Reports Server (NTRS)

    Seldner, K.; Mihaloew, J. R.; Blaha, R. J.

    1972-01-01

    A nonlinear analog simulation of a turbojet engine was developed. The purpose of the study was to establish simulation techniques applicable to propulsion system dynamics and controls research. A schematic model was derived from a physical description of a J85-13 turbojet engine. Basic conservation equations were applied to each component along with their individual performance characteristics to derive a mathematical representation. The simulation was mechanized on an analog computer. The simulation was verified in both steady-state and dynamic modes by comparing analytical results with experimental data obtained from tests performed at the Lewis Research Center with a J85-13 engine. In addition, comparison was also made with performance data obtained from the engine manufacturer. The comparisons established the validity of the simulation technique.

  14. Emotional Freedom Techniques for Anxiety: A Systematic Review With Meta-analysis.

    PubMed

    Clond, Morgan

    2016-05-01

    Emotional Freedom Technique (EFT) combines elements of exposure and cognitive therapies with acupressure for the treatment of psychological distress. Randomized controlled trials retrieved by literature search were assessed for quality using the criteria developed by the American Psychological Association's Division 12 Task Force on Empirically Validated Treatments. As of December 2015, 14 studies (n = 658) met inclusion criteria. Results were analyzed using an inverse variance weighted meta-analysis. The pre-post effect size for the EFT treatment group was 1.23 (95% confidence interval, 0.82-1.64; p < 0.001), whereas the effect size for combined controls was 0.41 (95% confidence interval, 0.17-0.67; p = 0.001). Emotional freedom technique treatment demonstrated a significant decrease in anxiety scores, even when accounting for the effect size of control treatment. However, there were too few data available comparing EFT to standard-of-care treatments such as cognitive behavioral therapy, and further research is needed to establish the relative efficacy of EFT to established protocols.
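
    The inverse-variance weighting named in this abstract pools each study's effect by the reciprocal of its squared standard error. The sketch below works through the arithmetic on three invented study results, not the review's data.

```python
# Worked sketch of an inverse-variance weighted pooled effect size.
import math

effects = [1.10, 1.35, 0.95]   # per-study standardized effect sizes (made up)
ses = [0.30, 0.25, 0.40]       # their standard errors (made up)

weights = [1.0 / se**2 for se in ses]              # w_i = 1 / SE_i^2
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))          # SE of the pooled effect
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect = {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```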

  15. Costs of genetic testing: Supporting Brazilian Public Policies for the incorporating of molecular diagnostic technologies

    PubMed Central

    Schlatter, Rosane Paixão; Matte, Ursula; Polanczyk, Carisi Anne; Koehler-Santos, Patrícia; Ashton-Prolla, Patricia

    2015-01-01

    This study identifies and describes the operating costs associated with the molecular diagnosis of diseases, such as hereditary cancer. To approximate the costs associated with these tests, data informed by Standard Operating Procedures for various techniques was collected from hospital software and a survey of market prices. Costs were established for four scenarios of capacity utilization to represent the possibility of suboptimal use in research laboratories. Cost description was based on a single site. The results show that only one technique was not impacted by rising costs due to underutilized capacity. Several common techniques were considerably more expensive at 30% capacity, including polymerase chain reaction (180%), microsatellite instability analysis (181%), gene rearrangement analysis by multiplex ligation probe amplification (412%), non-labeled sequencing (173%), and quantitation of nucleic acids (169%). These findings should be relevant for the definition of public policies and suggest that investment of public funds in the establishment of centralized diagnostic research centers would reduce costs to the Public Health System. PMID:26500437

  16. The Critical Incident Technique: An Effective Tool for Gathering Experience from Practicing Engineers

    ERIC Educational Resources Information Center

    Hanson, James H.; Brophy, Patrick D.

    2012-01-01

    Not all knowledge and skills that educators want to pass to students exists yet in textbooks. Some still resides only in the experiences of practicing engineers (e.g., how engineers create new products, how designers identify errors in calculations). The critical incident technique, CIT, is an established method for cognitive task analysis. It is…

  17. Guidance on individual monitoring programmes for radioisotopic techniques in molecular and cellular biology.

    PubMed

    Macías, M T; Navarro, T; Lavara, A; Robredo, L M; Sierra, I; Lopez, M A

    2003-01-01

    The radioisotope techniques used in molecular and cellular biology involve external and internal irradiation risk. The personal dosemeter may be a reasonable indicator for external irradiation. However, it is necessary to control the possible internal contamination associated with the development of these techniques. The aim of this project is to analyse the most usual techniques and to establish programmes of internal monitoring for specific radionuclides (32P, 35S, 14C, 3H, 125I and 131I). To elaborate these programmes it was necessary to analyse the radioisotope techniques. Two models have been applied (NRPB and IAEA) to the more significant techniques, according to the physical and chemical nature of the radionuclides, their potential importance in occupational exposure and the possible injury to the genetic material of the cell. The results allowed the identification of the techniques with possible risk of internal contamination. It was necessary to identify groups of workers that require individual monitoring. The risk groups have been established among the professionals exposed, according to different parameters: the general characteristics of the receptor, the radionuclides used (the same user can work with one, two or three radionuclides at the same time) and the results of the models applied. Also a control group was established. The study of possible intakes in these groups has been made by urinalysis and whole-body counting. The theoretical results are consistent with the experimental results. They have allowed guidance on individual monitoring to be proposed. Basically, the document shows: (1) the analysis of the radioisotopic techniques, taking into account the special containment equipment; (2) the establishment of the need for individual monitoring; and (3) the required frequency of measurements in a routine programme.

  18. Comparing the reliability of a trigonometric technique to goniometry and inclinometry in measuring ankle dorsiflexion.

    PubMed

    Sidaway, Ben; Euloth, Tracey; Caron, Heather; Piskura, Matthew; Clancy, Jessica; Aide, Alyson

    2012-07-01

    The purpose of this study was to compare the reliability of three previously used techniques for the measurement of ankle dorsiflexion ROM, open-chained goniometry, closed-chained goniometry, and inclinometry, to a novel trigonometric technique. Twenty-one physiotherapy students used four techniques (open-chained goniometry, closed-chained goniometry, inclinometry, and trigonometry) to assess dorsiflexion range of motion in 24 healthy volunteers. All student raters underwent training to establish competence in the four techniques. Raters then measured dorsiflexion with a randomly assigned measuring technique four times over two sessions, one week apart. Data were analyzed using a technique-by-session analysis of variance, technique measurement variability being the primary index of reliability. Comparisons were also made between the measurements derived from the four techniques and those obtained from a computerized video analysis system. Analysis of the rater measurement variability around the technique means revealed significant differences between techniques, with the least variation being found in the trigonometric technique. Significant differences were also found between the technique means, but no differences between sessions were evident. The trigonometric technique produced mean ROMs closest in value to those derived from computer analysis. Application of the trigonometric technique resulted in the least variability in measurement across raters and consequently should be considered for use when changes in dorsiflexion ROM need to be reliably assessed. Copyright © 2012 Elsevier B.V. All rights reserved.
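
    The abstract does not spell out the trigonometric technique itself, so the following is only a generic sketch of the underlying idea: recover a joint angle from linear distance measurements via trigonometry rather than reading a protractor-style device. All measurement names and values are hypothetical.

```python
# Generic sketch: derive an angle from two tape-measured legs of a
# right triangle, then take dorsiflexion ROM as the change in angle.
# The specific landmarks and numbers below are invented.
import math

def angle_from_distances(vertical_cm, horizontal_cm):
    """Angle (degrees) of the hypotenuse above the horizontal."""
    return math.degrees(math.atan2(vertical_cm, horizontal_cm))

neutral = angle_from_distances(vertical_cm=38.0, horizontal_cm=12.0)
dorsiflexed = angle_from_distances(vertical_cm=38.0, horizontal_cm=19.0)
print(f"dorsiflexion ROM ~ {neutral - dorsiflexed:.1f} degrees")
```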

  19. Analysis of Ergot Alkaloids

    PubMed Central

    Crews, Colin

    2015-01-01

    The principles and application of established and newer methods for the quantitative and semi-quantitative determination of ergot alkaloids in food, feed, plant materials and animal tissues are reviewed. The techniques of sampling, extraction, clean-up, detection, quantification and validation are described. The major procedures for ergot alkaloid analysis comprise liquid chromatography with tandem mass spectrometry (LC-MS/MS) and liquid chromatography with fluorescence detection (LC-FLD). Other methods based on immunoassays are under development and variations of these and minor techniques are available for specific purposes. PMID:26046699

  20. Establishing Evidence for Internal Structure Using Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Watson, Joshua C.

    2017-01-01

    Exploratory factor analysis (EFA) is a data reduction technique used to condense data into smaller sets of summary variables by identifying underlying factors potentially accounting for patterns of collinearity among said variables. Using an illustrative example, the 5 general steps of EFA are described with best practices for decision making…
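
    A minimal sketch of the data-reduction step described here: fit an exploratory factor analysis and inspect the loadings. scikit-learn's FactorAnalysis stands in for a full EFA workflow (which would add rotation and factor-retention checks); the data are simulated.

```python
# EFA sketch: four observed variables generated from two latent factors;
# the fitted loadings should pair the variables by factor.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
f1, f2 = rng.normal(size=(2, 200))            # two latent factors
X = np.column_stack([f1 + rng.normal(0, .3, 200),
                     f1 + rng.normal(0, .3, 200),
                     f2 + rng.normal(0, .3, 200),
                     f2 + rng.normal(0, .3, 200)])

fa = FactorAnalysis(n_components=2).fit(X)
print(np.round(fa.components_, 2))  # loadings: variables cluster by factor
```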

  1. Sampling methods for microbiological analysis of red meat and poultry carcasses.

    PubMed

    Capita, Rosa; Prieto, Miguel; Alonso-Calleja, Carlos

    2004-06-01

    Microbiological analysis of carcasses at slaughterhouses is required in the European Union for evaluating the hygienic performance of carcass production processes as required for effective hazard analysis critical control point implementation. The European Union microbial performance standards refer exclusively to the excision method, even though swabbing using the wet/dry technique is also permitted when correlation between both destructive and nondestructive methods can be established. For practical and economic reasons, the swab technique is the most extensively used carcass surface-sampling method. The main characteristics, advantages, and limitations of the common excision and swabbing methods are described here.

  2. Echocardiographic Evaluation of Left Atrial Mechanics: Function, History, Novel Techniques, Advantages, and Pitfalls.

    PubMed

    Leischik, Roman; Littwitz, Henning; Dworrak, Birgit; Garg, Pankaj; Zhu, Meihua; Sahn, David J; Horlitz, Marc

    2015-01-01

    Left atrial (LA) functional analysis has an established role in assessing left ventricular diastolic function. The current standard echocardiographic parameters used to study left ventricular diastolic function include pulsed-wave Doppler mitral inflow analysis, tissue Doppler imaging measurements, and LA dimension estimation. However, the above-mentioned parameters do not directly quantify LA performance. Deformation studies using strain and strain-rate imaging to assess LA function were validated in previous research, but this technique is not currently used in routine clinical practice. This review discusses the history, importance, and pitfalls of strain technology for the analysis of LA mechanics.

  3. An Analysis of a Comprehensive Evaluation Model for Guided Group Interaction Techniques with Juvenile Delinquents. Final Report.

    ERIC Educational Resources Information Center

    Silverman, Mitchell

    Reported are the first phase activities of a longitudinal project designed to evaluate the effectiveness of Guided Group Interaction (GGI) technique as a meaningful approach in the field of corrections. The main findings relate to the establishment of reliability for the main components of the Revised Behavior Scores System developed to assess the…

  4. Further fMRI Validation of the Visual Half Field Technique as an Indicator of Language Laterality: A Large-Group Analysis

    ERIC Educational Resources Information Center

    Van der Haegen, Lise; Cai, Qing; Seurinck, Ruth; Brysbaert, Marc

    2011-01-01

    The best established lateralized cerebral function is speech production, with the majority of the population having left hemisphere dominance. An important question is how to best assess the laterality of this function. Neuroimaging techniques such as functional Magnetic Resonance Imaging (fMRI) are increasingly used in clinical settings to…

  5. Doing That Thing That Scientists Do: A Discovery-Driven Module on Protein Purification and Characterization for the Undergraduate Biochemistry Laboratory Classroom

    ERIC Educational Resources Information Center

    Garrett, Teresa A.; Osmundson, Joseph; Isaacson, Marisa; Herrera, Jennifer

    2015-01-01

    In traditional introductory biochemistry laboratory classes students learn techniques for protein purification and analysis by following provided, established, step-by-step procedures. Students are exposed to a variety of biochemical techniques but are often not developing procedures or collecting new, original data. In this laboratory module,…

  6. Thermal Modeling of the Mars Reconnaissance Orbiter's Solar Panel and Instruments during Aerobraking

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Gasbarre, Joseph F.; Amundsen, Ruth M.

    2007-01-01

    The Mars Reconnaissance Orbiter (MRO) launched on August 12, 2005 and started aerobraking at Mars in March 2006. During the spacecraft's design phase, thermal models of the solar panels and instruments were developed to determine which components would be the most limiting thermally during aerobraking. Having determined the most limiting components, thermal limits in terms of heat rate were established. Advanced thermal modeling techniques were developed utilizing Thermal Desktop and Patran Thermal. Heat transfer coefficients were calculated using a Direct Simulation Monte Carlo technique. Analysis established that the solar panels were the most limiting components during the aerobraking phase of the mission.

  7. Role of Knowledge Management and Analytical CRM in Business: Data Mining Based Framework

    ERIC Educational Resources Information Center

    Ranjan, Jayanthi; Bhatnagar, Vishal

    2011-01-01

    Purpose: The purpose of the paper is to provide a thorough analysis of the concepts of business intelligence (BI), knowledge management (KM) and analytical CRM (aCRM) and to establish a framework for integrating all the three to each other. The paper also seeks to establish a KM and aCRM based framework using data mining (DM) techniques, which…

  8. In-situ Isotopic Analysis at Nanoscale using Parallel Ion Electron Spectrometry: A Powerful New Paradigm for Correlative Microscopy

    NASA Astrophysics Data System (ADS)

    Yedra, Lluís; Eswara, Santhana; Dowsett, David; Wirtz, Tom

    2016-06-01

    Isotopic analysis is of paramount importance across the entire gamut of scientific research. To advance the frontiers of knowledge, a technique for nanoscale isotopic analysis is indispensable. Secondary Ion Mass Spectrometry (SIMS) is a well-established technique for analyzing isotopes, but its spatial-resolution is fundamentally limited. Transmission Electron Microscopy (TEM) is a well-known method for high-resolution imaging down to the atomic scale. However, isotopic analysis in TEM is not possible. Here, we introduce a powerful new paradigm for in-situ correlative microscopy called the Parallel Ion Electron Spectrometry by synergizing SIMS with TEM. We demonstrate this technique by distinguishing lithium carbonate nanoparticles according to the isotopic label of lithium, viz. 6Li and 7Li and imaging them at high-resolution by TEM, adding a new dimension to correlative microscopy.

  9. Screen-Printed Electrodes Modified with “Green” Metals for Electrochemical Stripping Analysis of Toxic Elements

    PubMed Central

    Economou, Anastasios

    2018-01-01

    This work reviews the field of screen-printed electrodes (SPEs) modified with “green” metals for electrochemical stripping analysis of toxic elements. Electrochemical stripping analysis has been established as a useful trace analysis technique offering many advantages compared to competing optical techniques. Although mercury has been the preferred electrode material for stripping analysis, the toxicity of mercury and the associated legal requirements in its use and disposal have prompted research towards the development of “green” metals as alternative electrode materials. When combined with the screen-printing technology, such environment-friendly metals can lead to disposable sensors for trace metal analysis with excellent operational characteristics. This review focuses on SPEs modified with Au, Bi, Sb, and Sn for stripping analysis of toxic elements. Different modification approaches (electroplating, bulk modification, use of metal precursors, microengineering techniques) are considered and representative applications are described. A developing related field, namely biosensing based on stripping analysis of metallic nanoprobe labels, is also briefly mentioned. PMID:29596391

  11. Probabilistic Sensitivity Analysis for Launch Vehicles with Varying Payloads and Adapters for Structural Dynamics and Loads

    NASA Technical Reports Server (NTRS)

    McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.

    2012-01-01

    This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamic analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and cg location and requirements on adaptor stiffnesses while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis where payload and payload adaptor parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling based techniques. For contrast, some MPP based approaches are also examined.

  12. Mechanisms of Mitochondrial Defects in Gulf War Syndrome

    DTIC Science & Technology

    2014-10-01

    parameters: uncoupling ratio, net routine flux control ratio, respiratory control ratio, leak flux control ratio, phosphorylation respiratory... oxidative phosphorylation subunit) Quantitative analysis of individual mitochondrial proteins. The technique has been established and validated for muscle...Blue Native and Clear Native Analyses (non-denatured analysis of supercomplex formation and monomeric oxidative phosphorylation enzyme assembly

  13. Establishing a Common Vocabulary of Key Concepts for the Effective Implementation of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Cihon, Traci M.; Cihon, Joseph H.; Bedient, Guy M.

    2016-01-01

    The technical language of behavior analysis is arguably necessary to share ideas and research with precision among each other. However, it can hinder effective implementation of behavior analytic techniques when it prevents clear communication between the supervising behavior analyst and behavior technicians. The present paper provides a case…

  14. ABC Analysis for Inventory Management: Bridging the Gap between Research and Classroom

    ERIC Educational Resources Information Center

    Ravinder, Handanhal; Misra, Ram B.

    2014-01-01

    ABC analysis is a well-established categorization technique based on the Pareto Principle for determining which items should get priority in the management of a company's inventory. In discussing this topic, today's operations management and supply chain textbooks focus on dollar volume as the sole criterion for performing the categorization. The…
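
    A worked sketch of the dollar-volume categorization the textbooks describe: rank items by annual dollar volume and cut the cumulative share at the conventional 80%/95% thresholds. The item data are invented.

```python
# ABC categorization by dollar volume (Pareto-style cut-offs).
items = {"P1": 9500.0, "P2": 4200.0, "P3": 900.0, "P4": 600.0,
         "P5": 300.0, "P6": 150.0}            # annual dollar volume (made up)

total = sum(items.values())
cumulative = 0.0
for name, volume in sorted(items.items(), key=lambda kv: -kv[1]):
    cumulative += volume
    share = cumulative / total
    # A: top ~80% of value, B: next ~15%, C: the remaining tail.
    category = "A" if share <= 0.80 else ("B" if share <= 0.95 else "C")
    print(f"{name}: {volume:8.0f}  cumulative {share:5.1%}  -> {category}")
```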

  15. Surface Modeling, Solid Modeling and Finite Element Modeling. Analysis Capabilities of Computer-Assisted Design and Manufacturing Systems.

    ERIC Educational Resources Information Center

    Nee, John G.; Kare, Audhut P.

    1987-01-01

    Explores several concepts in computer assisted design/computer assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer based-systems with the CAD/CAM packages evaluated. (CW)

  16. Determining Kinetic Parameters for Isothermal Crystallization of Glasses

    NASA Technical Reports Server (NTRS)

    Ray, C. S.; Zhang, T.; Reis, S. T.; Brow, R. K.

    2006-01-01

    Non-isothermal crystallization techniques are frequently used to determine the kinetic parameters for crystallization in glasses. These techniques are experimentally simple and quick compared to the isothermal techniques. However, the analytical models used for non-isothermal data analysis, originally developed for describing isothermal transformation kinetics, are fundamentally flawed. The present paper describes a technique for determining the kinetic parameters for isothermal crystallization in glasses, which eliminates most of the common problems that generally make the studies of isothermal crystallization laborious and time consuming. In this technique, the volume fraction of glass that is crystallized as a function of time during an isothermal hold was determined using differential thermal analysis (DTA). The crystallization parameters for the lithium-disilicate (Li2O.2SiO2) model glass were first determined and compared to the same parameters determined by other techniques to establish the accuracy and usefulness of the present technique. This technique was then used to describe the crystallization kinetics of a complex Ca-Sr-Zn-silicate glass developed for sealing solid oxide fuel cells.
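
    For context, the standard isothermal analysis behind such studies fits the crystallized fraction to the JMAK (Avrami) equation x(t) = 1 − exp(−(kt)^n), so ln(−ln(1 − x)) versus ln t is a straight line with slope n. The sketch below recovers n and k from synthetic data, not the paper's DTA measurements.

```python
# JMAK/Avrami fit on synthetic isothermal crystallization data.
import numpy as np

t = np.array([10.0, 20.0, 40.0, 80.0])   # isothermal hold times, minutes
x = 1.0 - np.exp(-(0.02 * t) ** 3)       # synthetic fraction crystallized, n = 3

# Linearize: ln(-ln(1 - x)) = n*ln(k) + n*ln(t)
slope, intercept = np.polyfit(np.log(t), np.log(-np.log(1.0 - x)), 1)
n = slope
k = np.exp(intercept / n)
print(f"Avrami exponent n = {n:.2f}, rate constant k = {k:.3f} 1/min")
```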

  17. Using freelisting to identify, assess, and characterize age differences in shared cultural domains.

    PubMed

    Schrauf, Robert W; Sanchez, Julia

    2008-11-01

    Freelisting is a brief, paper-and-pencil technique in which participants make lists of items that they believe belong in a particular domain. Where cultural domains are shared, as for young and old in the same society, subtle intracultural differences may be difficult to detect. This article presents a series of techniques for revealing and describing this intracultural variation in freelisted data among young versus old age groups. Older (N = 30) and younger (N = 31) Mexicans in Mexico City made freelists in four quotidian domains: animals, emotions, illnesses, and gendered occupations. We used minimum residual factor analysis (consensus analysis) to establish domain coherence and assess overall consensus concerning contents of the domains. We established subvariation within the overall consensus by comparing levels of observed versus predicted inter-informant agreement. Results showed divergent patterns of inter-informant agreement between young and old participants across domains. Qualitative examination of items with higher salience for young versus old revealed age differences consistent with prior findings in each domain. The concatenation of these techniques renders freelisting an accessible, easily administered tool for probing age and group differences in cultural domains.
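
    A hedged illustration of one standard freelist statistic this analysis tradition relies on, Smith's salience index: an item listed early in a list scores near 1, unlisted items score 0, and scores are averaged over participants. The lists below are invented examples; this is not the authors' consensus-analysis code.

      freelists = [
          ["dog", "cat", "horse"],
          ["cat", "dog"],
          ["horse", "dog", "cow", "cat"],
      ]

      items = {item for fl in freelists for item in fl}
      salience = {}
      for item in items:
          scores = []
          for fl in freelists:
              L = len(fl)
              # (L - rank + 1) / L, with rank = 0-based index + 1; 0 if unlisted
              scores.append((L - fl.index(item)) / L if item in fl else 0.0)
          salience[item] = sum(scores) / len(freelists)

      for item, s in sorted(salience.items(), key=lambda kv: -kv[1]):
          print(f"{item}: {s:.2f}")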

  18. Exploitation of ERTS-1 imagery utilizing snow enhancement techniques

    NASA Technical Reports Server (NTRS)

    Wobber, F. J.; Martin, K. R.

    1973-01-01

    Photogeological analysis of ERTS-simulation and ERTS-1 imagery of snow-covered terrain within the ERAP Feather River site and within the New England (ERTS) test area provided new fracture detail which does not appear on available geological maps. Comparative analysis of snow-free ERTS-1 images has demonstrated that MSS Bands 5 and 7 supply the greatest amount of geological fracture detail. Interpretation of the first snow-covered ERTS-1 images in correlation with ground snow depth data indicates that a heavy blanket of snow (more than 9 inches) accentuates major structural features while a light "dusting" (less than 1 inch) accentuates more subtle topographic expressions. An effective mail-based method for acquiring timely ground-truth (snow depth) information was established and provides a ready correlation of fracture detail with snow depth so as to establish the working limits of the technique. The method is both efficient and inexpensive compared with the cost of similarly scaled direct field observations.

  19. Numerical analysis technique using the statistical energy analysis method concerning the blasting noise reduction by the sound insulation door used in tunnel constructions

    NASA Astrophysics Data System (ADS)

    Ishida, Shigeki; Mori, Atsuo; Shinji, Masato

    The main method of reducing the blasting noise generated in a tunnel under construction is to install a sound insulation door in the tunnel. However, no numerical analysis technique has been established to accurately predict the transmission loss achieved by the sound insulation door. In this study, we measured the blasting noise and the vibration of the sound insulation door in a tunnel during blasting, analyzed the results, and modified the door's acoustic model accordingly. In addition, we reproduced the noise reduction effect of the sound insulation door by the statistical energy analysis method and confirmed that numerical simulation is possible with this procedure.
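
    A hedged two-subsystem sketch of an SEA power balance of the kind the study applies: input power is balanced by internal dissipation (damping loss factors) and by coupling flow between subsystems (coupling loss factors). All numerical values below are illustrative assumptions, not values from the paper.

      import numpy as np

      omega = 2 * np.pi * 500          # band centre frequency (rad/s)
      eta1, eta2 = 0.02, 0.01          # damping loss factors of the two rooms
      eta12, eta21 = 0.005, 0.002      # coupling loss factors through the door
      P = np.array([1.0, 0.0])         # input power: source side only (W)

      # SEA power balance: P_i = omega * (eta_i E_i + eta_ij E_i - eta_ji E_j)
      A = omega * np.array([
          [eta1 + eta12, -eta21],
          [-eta12,       eta2 + eta21],
      ])
      E = np.linalg.solve(A, P)        # subsystem energies (J)
      print("energies:", E, "-> level difference (dB):",
            10 * np.log10(E[0] / E[1]))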

  20. Sensor failure and multivariable control for airbreathing propulsion systems. Ph.D. Thesis - Dec. 1979 Final Report

    NASA Technical Reports Server (NTRS)

    Behbehani, K.

    1980-01-01

    A new sensor/actuator failure analysis technique for turbofan jet engines was developed. Three phases of failure analysis, namely detection, isolation, and accommodation are considered. Failure detection and isolation techniques are developed by utilizing the concept of Generalized Likelihood Ratio (GLR) tests. These techniques are applicable to both time varying and time invariant systems. Three GLR detectors are developed for: (1) hard-over sensor failure; (2) hard-over actuator failure; and (3) brief disturbances in the actuators. The probability distribution of the GLR detectors and the detectability of sensor/actuator failures are established. Failure type is determined by the maximum of the GLR detectors. Failure accommodation is accomplished by extending the Multivariable Nyquist Array (MNA) control design techniques to nonsquare system designs. The performance and effectiveness of the failure analysis technique are studied by applying the technique to a turbofan jet engine, namely the Quiet Clean Short Haul Experimental Engine (QCSEE). Single and multiple sensor/actuator failures in the QCSEE are simulated and analyzed and the effects of model degradation are studied.
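
    A minimal sketch, under stated assumptions, of the GLR idea for one failure class: a step (hard-over) bias of unknown size in a Gaussian residual sequence, for which the likelihood ratio maximized over the bias magnitude has the closed form (sum of residuals since the onset)^2 / (sigma^2 * window length). The residuals, noise level, and threshold are invented; the paper's engine detectors are more elaborate.

      import numpy as np

      def glr_step_detector(r, sigma, threshold):
          """Return (max GLR statistic, estimated onset index, alarm flag)."""
          best, onset = 0.0, None
          for theta in range(len(r)):
              seg = r[theta:]
              stat = seg.sum() ** 2 / (sigma ** 2 * len(seg))
              if stat > best:
                  best, onset = stat, theta
          return best, onset, best > threshold

      rng = np.random.default_rng(5)
      residual = rng.normal(0.0, 1.0, 300)
      residual[200:] += 2.0                 # simulated hard-over sensor bias
      print(glr_step_detector(residual, sigma=1.0, threshold=20.0))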

  1. A LITERATURE REVIEW OF WIPE SAMPLING METHODS ...

    EPA Pesticide Factsheets

    Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, wetting solvent, and determinative step to be used, depending upon the contaminant of concern. The objective of this report is to concisely summarize the findings of a literature review that was conducted to identify the state-of-the-art wipe sampling techniques for a target list of compounds. This report describes the methods used to perform the literature review; a brief review of wipe sampling techniques in general; an analysis of physical and chemical properties of each target analyte; an analysis of wipe sampling techniques for the target analyte list; and a summary of the wipe sampling techniques for the target analyte list, including existing data gaps. In general, no overwhelming consensus can be drawn from the current literature on how to collect a wipe sample for the chemical warfare agents, organophosphate pesticides, and other toxic industrial chemicals of interest to this study. Different methods, media, and wetting solvents have been recommended and used by various groups and different studies. For many of the compounds of interest, no specific wipe sampling methodology has been established for their collection. Before a wipe sampling method (or methods) can be established for the co...

  2. Fifty years of solid-phase extraction in water analysis--historical development and overview.

    PubMed

    Liska, I

    2000-07-14

    The use of an appropriate sample handling technique is a must in an analysis of organic micropollutants in water. The efforts to use a solid phase for the recovery of analytes from a water matrix prior to their detection have a long history. Since the first experimental trials using activated carbon filters that were performed 50 years ago, solid-phase extraction (SPE) has become an established sample preparation technique. The initial experimental applications of SPE resulted in widespread use of this technique in current water analysis and also in the adoption of SPE into standardized analytical methods. During the decades of its evolution, chromatographers became aware of the advantages of SPE and, despite many innovations that appeared in the last decade, new SPE developments are still expected in the future. A brief overview of 50 years of the history of the use of SPE in organic trace analysis of water is given in the present paper.

  3. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.

  4. Estimating the settling velocity of bioclastic sediment using common grain-size analysis techniques

    USGS Publications Warehouse

    Cuttler, Michael V. W.; Lowe, Ryan J.; Falter, James L.; Buscombe, Daniel D.

    2017-01-01

    Most techniques for estimating settling velocities of natural particles have been developed for siliciclastic sediments. Therefore, to understand how these techniques apply to bioclastic environments, measured settling velocities of bioclastic sedimentary deposits sampled from a nearshore fringing reef in Western Australia were compared with settling velocities calculated using results from several common grain-size analysis techniques (sieve, laser diffraction and image analysis) and established models. The effects of sediment density and shape were also examined using a range of density values and three different models of settling velocity. Sediment density was found to have a significant effect on calculated settling velocity, causing a range in normalized root-mean-square error of up to 28%, depending upon settling velocity model and grain-size method. Accounting for particle shape reduced errors in predicted settling velocity by 3% to 6% and removed any velocity-dependent bias, which is particularly important for the fastest settling fractions. When shape was accounted for and measured density was used, normalized root-mean-square errors were 4%, 10% and 18% for laser diffraction, sieve and image analysis, respectively. The results of this study show that established models of settling velocity that account for particle shape can be used to estimate settling velocity of irregularly shaped, sand-sized bioclastic sediments from sieve, laser diffraction, or image analysis-derived measures of grain size with a limited amount of error. Collectively, these findings will allow for grain-size data measured with different methods to be accurately converted to settling velocity for comparison. This will facilitate greater understanding of the hydraulic properties of bioclastic sediment which can help to increase our general knowledge of sediment dynamics in these environments.
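
    One established, shape-aware settling-velocity model of the kind compared above is Ferguson and Church (2004); the sketch below evaluates it with the constants commonly quoted for natural grains. The densities, viscosity, and grain sizes are illustrative assumptions, not the study's measurements.

      import numpy as np

      def settling_velocity(d, rho_s=2710.0, rho_f=1025.0, nu=1.05e-6,
                            C1=18.0, C2=1.0, g=9.81):
          """Ferguson & Church (2004) settling velocity (m/s) for diameter d (m).
          C1, C2 encode grain shape; rho_s ~ carbonate grains, rho_f ~ seawater."""
          R = (rho_s - rho_f) / rho_f   # submerged specific gravity
          return R * g * d**2 / (C1 * nu + np.sqrt(0.75 * C2 * R * g * d**3))

      for d_mm in (0.125, 0.25, 0.5, 1.0):
          print(f"d = {d_mm} mm -> w = {settling_velocity(d_mm / 1000):.3f} m/s")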

  5. Portable Infrared Laser Spectroscopy for On-site Mycotoxin Analysis.

    PubMed

    Sieger, Markus; Kos, Gregor; Sulyok, Michael; Godejohann, Matthias; Krska, Rudolf; Mizaikoff, Boris

    2017-03-09

    Mycotoxins are toxic secondary metabolites of fungi that spoil food, and severely impact human health (e.g., causing cancer). Therefore, the rapid determination of mycotoxin contamination including deoxynivalenol and aflatoxin B1 in food and feed samples is of prime interest for commodity importers and processors. While chromatography-based techniques are well established in laboratory environments, only very few (i.e., mostly immunochemical) techniques exist enabling direct on-site analysis for traders and manufacturers. In this study, we present MYCOSPEC - an innovative approach for spectroscopic mycotoxin contamination analysis at EU regulatory limits for the first time utilizing mid-infrared tunable quantum cascade laser (QCL) spectroscopy. This analysis technique facilitates on-site mycotoxin analysis by combining QCL technology with GaAs/AlGaAs thin-film waveguides. Multivariate data mining strategies (i.e., principal component analysis) enabled the classification of deoxynivalenol-contaminated maize and wheat samples, and of aflatoxin B1 affected peanuts at EU regulatory limits of 1250 μg kg-1 and 8 μg kg-1, respectively.

  6. Portable Infrared Laser Spectroscopy for On-site Mycotoxin Analysis

    PubMed Central

    Sieger, Markus; Kos, Gregor; Sulyok, Michael; Godejohann, Matthias; Krska, Rudolf; Mizaikoff, Boris

    2017-01-01

    Mycotoxins are toxic secondary metabolites of fungi that spoil food, and severely impact human health (e.g., causing cancer). Therefore, the rapid determination of mycotoxin contamination including deoxynivalenol and aflatoxin B1 in food and feed samples is of prime interest for commodity importers and processors. While chromatography-based techniques are well established in laboratory environments, only very few (i.e., mostly immunochemical) techniques exist enabling direct on-site analysis for traders and manufacturers. In this study, we present MYCOSPEC - an innovative approach for spectroscopic mycotoxin contamination analysis at EU regulatory limits for the first time utilizing mid-infrared tunable quantum cascade laser (QCL) spectroscopy. This analysis technique facilitates on-site mycotoxin analysis by combining QCL technology with GaAs/AlGaAs thin-film waveguides. Multivariate data mining strategies (i.e., principal component analysis) enabled the classification of deoxynivalenol-contaminated maize and wheat samples, and of aflatoxin B1 affected peanuts at EU regulatory limits of 1250 μg kg−1 and 8 μg kg−1, respectively. PMID:28276454

  7. Portable Infrared Laser Spectroscopy for On-site Mycotoxin Analysis

    NASA Astrophysics Data System (ADS)

    Sieger, Markus; Kos, Gregor; Sulyok, Michael; Godejohann, Matthias; Krska, Rudolf; Mizaikoff, Boris

    2017-03-01

    Mycotoxins are toxic secondary metabolites of fungi that spoil food, and severely impact human health (e.g., causing cancer). Therefore, the rapid determination of mycotoxin contamination including deoxynivalenol and aflatoxin B1 in food and feed samples is of prime interest for commodity importers and processors. While chromatography-based techniques are well established in laboratory environments, only very few (i.e., mostly immunochemical) techniques exist enabling direct on-site analysis for traders and manufacturers. In this study, we present MYCOSPEC - an innovative approach for spectroscopic mycotoxin contamination analysis at EU regulatory limits for the first time utilizing mid-infrared tunable quantum cascade laser (QCL) spectroscopy. This analysis technique facilitates on-site mycotoxin analysis by combining QCL technology with GaAs/AlGaAs thin-film waveguides. Multivariate data mining strategies (i.e., principal component analysis) enabled the classification of deoxynivalenol-contaminated maize and wheat samples, and of aflatoxin B1 affected peanuts at EU regulatory limits of 1250 μg kg-1 and 8 μg kg-1, respectively.
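
    The three records above are database copies of the same study. As a hedged illustration of its chemometric step (not the MYCOSPEC pipeline itself), the sketch below runs PCA on synthetic "spectra" and classifies the scores; the data, the absorbing band, and the choice of a linear discriminant classifier are all stand-in assumptions.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      # 40 synthetic "spectra" (rows) x 200 wavenumber channels (columns):
      # class 0 = blank, class 1 = contaminated, separated along one band.
      X = rng.normal(size=(40, 200))
      y = np.repeat([0, 1], 20)
      X[y == 1, 80:90] += 1.5          # contaminated samples absorb in one band

      scores = PCA(n_components=3).fit_transform(X)   # dimensionality reduction
      clf = LinearDiscriminantAnalysis().fit(scores, y)
      print("training accuracy:", clf.score(scores, y))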

  8. International Student Recruitment Techniques: A Preliminary Analysis

    ERIC Educational Resources Information Center

    Onk, Veronica Bou; Joseph, Mathew

    2017-01-01

    Around the world, educational institutions focus their efforts on recruiting talented students, particularly from foreign countries. However, while well-established universities in developed countries can produce successful international recruitment campaigns, emerging universities still need assistance in producing a successful…

  9. A Proposed Model for the Analysis and Interpretation of Focus Groups in Evaluation Research

    ERIC Educational Resources Information Center

    Massey, Oliver T.

    2011-01-01

    Focus groups have an established history in applied research and evaluation. The fundamental methods of the focus group technique have been well discussed, as have their potential advantages. Less guidance tends to be provided regarding the analysis of data resulting from focus groups or how to organize and defend conclusions drawn from the…

  10. Potable water taste enhancement

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An analysis was conducted to determine the causes of and remedies for the unpalatability of potable water in manned spacecraft. Criteria and specifications for palatable water were established and a quantitative laboratory analysis technique was developed for determining the amounts of volatile organics in good-tasting water. Prototype spacecraft water reclamation systems are evaluated in terms of the essential palatability factors.

  11. Something old, something new, something borrowed, something blue: a framework for the marriage of health econometrics and cost-effectiveness analysis.

    PubMed

    Hoch, Jeffrey S; Briggs, Andrew H; Willan, Andrew R

    2002-07-01

    Economic evaluation is often seen as a branch of health economics divorced from mainstream econometric techniques. Instead, it is perceived as relying on statistical methods for clinical trials. Furthermore, the statistic of interest in cost-effectiveness analysis, the incremental cost-effectiveness ratio, is not amenable to regression-based methods, hence the traditional reliance on comparing aggregate measures across the arms of a clinical trial. In this paper, we explore the potential for health economists undertaking cost-effectiveness analysis to exploit the plethora of established econometric techniques through the use of the net-benefit framework - a recently suggested reformulation of the cost-effectiveness problem that avoids the reliance on cost-effectiveness ratios and their associated statistical problems. This allows the formulation of the cost-effectiveness problem within a standard regression type framework. We provide an example with empirical data to illustrate how a regression type framework can enhance the net-benefit method. We go on to suggest that practical advantages of the net-benefit regression approach include being able to use established econometric techniques, adjust for imperfect randomisation, and identify important subgroups in order to estimate the marginal cost-effectiveness of an intervention. Copyright 2002 John Wiley & Sons, Ltd.
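
    A minimal sketch of the net-benefit regression idea, assuming patient-level costs and effects are available: compute NB_i = lambda * E_i - C_i for each patient (lambda = willingness to pay per unit of effect) and regress NB on a treatment indicator, so the treatment coefficient estimates incremental net benefit. All data and the willingness-to-pay value below are simulated assumptions, not the paper's empirical example.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 200
      treat = rng.integers(0, 2, n)                            # 1 = new intervention
      cost = 5000 + 1500 * treat + rng.normal(0, 800, n)
      effect = 0.60 + 0.05 * treat + rng.normal(0, 0.10, n)    # e.g. QALYs

      lam = 50_000                    # willingness to pay per unit of effect
      nb = lam * effect - cost        # patient-level net benefit

      model = sm.OLS(nb, sm.add_constant(treat)).fit()
      print(model.params)             # slope = incremental net benefit estimate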

  12. Comparison of composite rotor blade models: A coupled-beam analysis and an MSC/NASTRAN finite-element model

    NASA Technical Reports Server (NTRS)

    Hodges, Robert V.; Nixon, Mark W.; Rehfield, Lawrence W.

    1987-01-01

    A methodology was developed for the structural analysis of composite rotor blades. This coupled-beam analysis is relatively simple to use compared with alternative analysis techniques. The beam analysis was developed for thin-wall single-cell rotor structures and includes the effects of elastic coupling. This paper demonstrates the effectiveness of the new composite-beam analysis method through comparison of its results with those of an established baseline analysis technique. The baseline analysis is an MSC/NASTRAN finite-element model built up from anisotropic shell elements. Deformations are compared for three linear static load cases of centrifugal force at design rotor speed, applied torque, and lift for an ideal rotor in hover. A D-spar designed to twist under axial loading is the subject of the analysis. Results indicate the coupled-beam analysis is well within engineering accuracy.

  13. Forensic Applications of LIBS

    NASA Astrophysics Data System (ADS)

    Hark, Richard R.; East, Lucille J.

    Forensic science is broadly defined as the application of science to matters of the law. Practitioners typically use multidisciplinary scientific techniques for the analysis of physical evidence in an attempt to establish or exclude an association between a suspect and the scene of a crime.

  14. Managing Variation in Services in a Software Product Line Context

    DTIC Science & Technology

    2010-05-01

    Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-021, ADA235785). Software Engineering Institute, Carnegie Mellon University, 1990... the systems in the product line, and a plan for building the systems. Product line scope and product line analysis define the boundaries and... systems, as well as expected ways in which they may vary. Product line analysis applies established modeling techniques to engineer the common and...

  15. Analysis of Extracellular Vesicles in the Tumor Microenvironment.

    PubMed

    Al-Nedawi, Khalid; Read, Jolene

    2016-01-01

    Extracellular vesicles (ECV) are membrane compartments shed from all types of cells in various physiological and pathological states. In recent years, ECV have gained increasing interest from the scientific community for their role as intercellular communicators that play important roles in modifying the tumor microenvironment. Multiple techniques have been established to collect ECV from conditioned media of cell culture or physiological fluids. The gold standard methodology is differential centrifugation. Although alternative techniques exist to collect ECV, these techniques have not proven suitable as a substitute for the ultracentrifugation procedure.

  16. Random safety auditing, root cause analysis, failure mode and effects analysis.

    PubMed

    Ursprung, Robert; Gray, James

    2010-03-01

    Improving quality and safety in health care is a major concern for health care providers, the general public, and policy makers. Errors and quality issues are leading causes of morbidity and mortality across the health care industry. There is evidence that patients in the neonatal intensive care unit (NICU) are at high risk for serious medical errors. To facilitate compliance with safe practices, many institutions have established quality-assurance monitoring procedures. Three techniques that have been found useful in the health care setting are failure mode and effects analysis, root cause analysis, and random safety auditing. When used together, these techniques are effective tools for system analysis and redesign focused on providing safe delivery of care in the complex NICU system. Copyright 2010 Elsevier Inc. All rights reserved.

  17. Characterization of oils and fats by 1H NMR and GC/MS fingerprinting: classification, prediction and detection of adulteration.

    PubMed

    Fang, Guihua; Goh, Jing Yeen; Tay, Manjun; Lau, Hiu Fung; Li, Sam Fong Yau

    2013-06-01

    The correct identification of oils and fats is important to consumers from both commercial and health perspectives. Proton nuclear magnetic resonance (¹H NMR) spectroscopy, gas chromatography-mass spectrometry (GC/MS) fingerprinting and chemometrics were employed successfully for the quality control of oils and fats. Principal component analysis (PCA) of both techniques showed group clustering of 14 types of oils and fats. Partial least squares discriminant analysis (PLS-DA) and orthogonal projections to latent structures discriminant analysis (OPLS-DA) using GC/MS data had excellent classification sensitivity and specificity compared to models using NMR data. Depending on the availability of the instruments, data from either technique can effectively be applied for the establishment of an oils and fats database to identify unknown samples. Partial least squares (PLS) models were successfully established for the detection of as low as 5% of lard and beef tallow spiked into canola oil, thus illustrating possible applications in Islamic and Jewish countries. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Model authoring system for fail safe analysis

    NASA Technical Reports Server (NTRS)

    Sikora, Scott E.

    1990-01-01

    The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which then are searched to generate either a graphical fault tree analysis or failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.

  19. The Sixth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Sixth Annual Thermal and Fluids Analysis Workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with information on widely used tools for thermal and fluids analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  20. A Didactic Experience of Statistical Analysis for the Determination of Glycine in a Nonaqueous Medium Using ANOVA and a Computer Program

    ERIC Educational Resources Information Center

    Santos-Delgado, M. J.; Larrea-Tarruella, L.

    2004-01-01

    The back-titration methods are compared statistically to determine glycine in a nonaqueous medium of acetic acid. Important variations in the mean values of glycine due to interaction effects are examined using the analysis of variance (ANOVA) technique and a statistical study performed with computer software.
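
    A hedged, minimal illustration of the kind of ANOVA comparison such an exercise involves; the replicate glycine recoveries below are invented placeholders, not the article's data.

      from scipy.stats import f_oneway

      method_a = [99.2, 99.5, 99.1, 99.4]   # % glycine recovered (hypothetical)
      method_b = [98.7, 98.9, 99.0, 98.8]
      method_c = [99.6, 99.3, 99.5, 99.7]

      F, p = f_oneway(method_a, method_b, method_c)
      print(f"F = {F:.2f}, p = {p:.4f}")    # small p => mean values differ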

  1. Atmospheric statistics for aerospace vehicle operations

    NASA Technical Reports Server (NTRS)

    Smith, O. E.; Batts, G. W.

    1993-01-01

    Statistical analysis of atmospheric variables was performed for the Shuttle Transportation System (STS) design trade studies and the establishment of launch commit criteria. Atmospheric constraint statistics have been developed for the NASP test flight, the Advanced Launch System, and the National Launch System. The concepts and analysis techniques discussed in the paper are applicable to the design and operations of any future aerospace vehicle.

  2. Training Staff Serving Clients with Intellectual Disabilities: A Meta-Analysis of Aspects Determining Effectiveness

    ERIC Educational Resources Information Center

    van Oorsouw, Wietske M. W. J.; Embregts, Petri J. C. M.; Bosman, Anna M. T.; Jahoda, Andrew

    2009-01-01

    The last decades have seen increased emphasis on the quality of training for direct-care staff serving people with intellectual disabilities. Nevertheless, it is unclear what the key aspects of effective training are. Therefore, the aim of the present meta-analysis was to establish the ingredients (i.e., goals, format, and techniques) for staff…

  3. Crystal growth, structural, optical, dielectric and thermal studies of an amino acid based organic NLO material: L-Phenylalanine L-phenylalaninium malonate

    NASA Astrophysics Data System (ADS)

    Prakash, M.; Geetha, D.; Lydia Caroline, M.; Ramesh, P. S.

    2011-12-01

    Good transparent single crystals of L-phenylalanine L-phenylalaninium malonate (LPPMA) have been grown successfully by the slow evaporation technique from aqueous solution. A single-crystal X-ray diffractometer was utilized to measure the unit cell parameters and to confirm the crystal structure. The chemical structure of the compound was established by the FT-NMR technique. The vibrational modes of the molecule were elucidated from FTIR spectra. Its optical behaviour has been examined by UV-vis spectral analysis, which shows the absence of absorbance in the visible region. Thermal properties of the LPPMA crystal were investigated by thermogravimetric analysis (TGA) and differential thermal analysis (DTA) techniques, which indicate that the material does not decompose before melting. The melting point of the grown crystal was observed as 180 °C by melting point apparatus. The NLO property was confirmed by the powder technique of Kurtz and Perry. The dielectric behaviour of the sample was also studied for the first time.

  4. Structuring an Internal Evaluation Process.

    ERIC Educational Resources Information Center

    Gordon, Sheila C.; Heinemann, Harry N.

    1980-01-01

    The design of an internal program evaluation system requires (1) formulation of program, operational, and institutional objectives; (2) establishment of evaluation criteria; (3) choice of data collection and evaluation techniques; (4) analysis of results; and (5) integration of the system into the mainstream of operations. (SK)

  5. A bibliometric analysis of the 50 most cited papers in cleft lip and palate.

    PubMed

    Mahon, Nicola A; Joyce, Cormac W

    2015-02-01

    Citation analysis is an established bibliometric method which catalogues papers according to the number of times they have been referenced. It is believed that the total number of citations an article receives reflects its importance among its peers. Never before has a bibliometric analysis been performed in the area of Cleft Lip and Palate. Our citation analysis creates a comprehensive list of the 50 most influential papers in this field. Journals specializing in Cleft Palate, Craniofacial, Plastic Surgery, Maxillofacial Surgery, Aesthetics and Radiology were searched to establish which articles most enriched the specialty over the past 70 years. The results show an interesting collection of papers which reveal developing trends in surgical techniques. These landmark papers mould and influence management and decision-making today.

  6. [Three-dimensional finite element analysis of three conjunctive methods of free iliac bone graft for established mandibular body defects].

    PubMed

    Wang, Dong; Yang, Zhuang-qun; Hu, Xiao-yi

    2007-08-01

    To analyze the stress and displacement distribution of 3D-FE models of three conjunctive methods of vascularized iliac bone graft for established mandibular body defects. Using computer image processing techniques, a series of spiral CT images was imported into the Ansys preprocessing program to establish three 3D-FE models of different conjunctions of vascularized iliac bone graft for mandibular body defects. The distribution of Von Mises stress and displacement around the mandibular segment, grafted ilium, plates and screws was obtained. The on-lay conjunction was determined to be the optimal conjunctive shape.

  7. HPLC fingerprint analysis combined with chemometrics for pattern recognition of ginger.

    PubMed

    Feng, Xu; Kong, Weijun; Wei, Jianhe; Ou-Yang, Zhen; Yang, Meihua

    2014-03-01

    Ginger, the fresh rhizome of Zingiber officinale Rosc. (Zingiberaceae), has been used worldwide; however, for a long time, there has been no internationally approbated standard for its quality control. The objective was to establish an efficacious combined method and pattern recognition technique for the quality control of ginger. A simple, accurate and reliable method based on high-performance liquid chromatography with photodiode array (HPLC-PDA) detection was developed for establishing the chemical fingerprints of 10 batches of ginger from different markets in China. The method was validated in terms of precision, reproducibility and stability; the relative standard deviations were all less than 1.57%. On the basis of this method, the fingerprints of 10 batches of ginger samples were obtained, which showed 16 common peaks. Coupled with similarity evaluation software, the similarities between each fingerprint of the sample and the simulative mean chromatogram were in the range of 0.998-1.000. Then, the chemometric techniques, including similarity analysis, hierarchical clustering analysis and principal component analysis were applied to classify the ginger samples. Consistent results were obtained to show that ginger samples could be successfully classified into two groups. This study revealed that the HPLC-PDA method was simple, sensitive and reliable for fingerprint analysis, and moreover, for pattern recognition and quality control of ginger.
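
    A minimal sketch of the similarity evaluation step, assuming the common practice of scoring each sample fingerprint against the simulative mean chromatogram with a cosine (congruence) coefficient on the common-peak areas; the 10 x 16 peak-area matrix below is synthetic, not the ginger data.

      import numpy as np

      samples = np.abs(np.random.default_rng(6).normal(1.0, 0.02, size=(10, 16)))
      mean_chrom = samples.mean(axis=0)       # simulative mean chromatogram

      for i, s in enumerate(samples, 1):
          cos = s @ mean_chrom / (np.linalg.norm(s) * np.linalg.norm(mean_chrom))
          print(f"sample {i:2d}: similarity = {cos:.4f}")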

  8. Image analysis tools and emerging algorithms for expression proteomics

    PubMed Central

    English, Jane A.; Lisacek, Frederique; Morris, Jeffrey S.; Yang, Guang-Zhong; Dunn, Michael J.

    2012-01-01

    Since their origins in academic endeavours in the 1970s, computational analysis tools have matured into a number of established commercial packages that underpin research in expression proteomics. In this paper we describe the image analysis pipeline for the established 2-D Gel Electrophoresis (2-DE) technique of protein separation, and by first covering signal analysis for Mass Spectrometry (MS), we also explain the current image analysis workflow for the emerging high-throughput ‘shotgun’ proteomics platform of Liquid Chromatography coupled to MS (LC/MS). The bioinformatics challenges for both methods are illustrated and compared, whilst existing commercial and academic packages and their workflows are described from both a user’s and a technical perspective. Attention is given to the importance of sound statistical treatment of the resultant quantifications in the search for differential expression. Despite wide availability of proteomics software, a number of challenges have yet to be overcome regarding algorithm accuracy, objectivity and automation, generally due to deterministic spot-centric approaches that discard information early in the pipeline, propagating errors. We review recent advances in signal and image analysis algorithms in 2-DE, MS, LC/MS and Imaging MS. Particular attention is given to wavelet techniques, automated image-based alignment and differential analysis in 2-DE, Bayesian peak mixture models and functional mixed modelling in MS, and group-wise consensus alignment methods for LC/MS. PMID:21046614

  9. Automated quantification of the synchrogram by recurrence plot analysis.

    PubMed

    Nguyen, Chinh Duc; Wilson, Stephen James; Crozier, Stuart

    2012-04-01

    Recently, the concept of phase synchronization of two weakly coupled oscillators has attracted great research interest and has been applied to characterize synchronization phenomena in physiological data. Phase synchronization of cardiorespiratory coupling is often studied by a synchrogram analysis, a graphical tool investigating the relationship between the instantaneous phases of two signals. Although several techniques have been proposed to automatically quantify the synchrogram, most of them require a preselection of a phase-locking ratio by trial and error. One technique does not require this information; however, it is based on the power spectrum of the phase distribution in the synchrogram, which is vulnerable to noise. This study aims to introduce a new technique to automatically quantify the synchrogram by studying its dynamic structure. Our technique exploits recurrence plot analysis, which is a well-established tool for characterizing recurring patterns and nonstationarities in experiments. We applied our technique to detect synchronization in simulated and measured infants' cardiorespiratory data. Our results suggest that the proposed technique is able to systematically detect synchronization in noisy and chaotic data without preselecting the phase-locking ratio. By embedding phase information of the synchrogram into phase space, the phase-locking ratio is automatically unveiled as the number of attractors.
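
    A minimal sketch, under stated assumptions, of the recurrence-plot building block: form a binary recurrence matrix from a phase sequence and report the recurrence rate as one simple quantifier. The toy phases stand in for synchrogram data, and the distance used here is linear rather than circular for brevity; this is not the authors' implementation.

      import numpy as np

      def recurrence_matrix(x, eps):
          """Binary recurrence matrix R[i, j] = 1 if |x_i - x_j| < eps."""
          d = np.abs(x[:, None] - x[None, :])
          return (d < eps).astype(int)

      t = np.arange(500)
      phase = np.mod(0.07 * t + 0.3 * np.sin(0.01 * t), 1.0)  # toy phases
      R = recurrence_matrix(phase, eps=0.05)
      print("recurrence rate:", R.mean())   # one simple quantifier of the plot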

  10. PARENT Quick Blind Round-Robin Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braatz, Brett G.; Heasler, Patrick G.; Meyer, Ryan M.

    The U.S. Nuclear Regulatory Commission has established the Program to Assess the Reliability of Emerging Nondestructive Techniques (PARENT), whose goal is to investigate the effectiveness of current and novel nondestructive examination procedures and techniques in finding flaws in nickel-alloy welds and base materials. This is to be done by conducting a series of open and blind international round-robin tests on a set of piping components that include large-bore dissimilar metal welds, small-bore dissimilar metal welds, and bottom-mounted instrumentation penetration welds. The blind testing is being conducted in two segments, one called Quick-Blind and the other called Blind. The Quick-Blind testing and destructive analysis of the test blocks have been completed. This report describes the four Quick-Blind test blocks used, summarizes their destructive analysis, gives an overview of the nondestructive evaluation (NDE) techniques applied, provides an analysis of the inspection data, and presents the conclusions drawn.

  11. Identifying the relevant features of the National Digital Cadastral Database (NDCDB) for spatial analysis by using the Delphi Technique

    NASA Astrophysics Data System (ADS)

    Halim, N. Z. A.; Sulaiman, S. A.; Talib, K.; Ng, E. G.

    2018-02-01

    This paper explains the process carried out in identifying the relevant features of the National Digital Cadastral Database (NDCDB) for spatial analysis. The research was initially part of a larger research exercise to identify the significance of NDCDB from the legal, technical, role and land-based analysis perspectives. The research methodology of applying the Delphi technique is substantially discussed in this paper. A heterogeneous panel of 14 experts was created to determine the importance of NDCDB from the technical relevance standpoint. Three statements describing the relevant features of NDCDB for spatial analysis were established after three rounds of consensus building. They highlighted the NDCDB's characteristics, such as its spatial accuracy, functions, and criteria as a facilitating tool for spatial analysis. By recognising the relevant features of NDCDB for spatial analysis in this study, practical application of NDCDB for various analyses and purposes can be widely implemented.

  12. 20170913 - Systematic Approaches to Biological/Chemical Read-Across for Hazard Identification (EMGS)

    EPA Science Inventory

    Read-across is a well-established data gap filling technique used within chemical category and analogue approaches for regulatory purposes. The category/analogue workflow comprises a number of steps starting from decision context, data gap analysis through to analogue identificat...

  13. A volumetric conformal mapping approach for clustering white matter fibers in the brain

    PubMed Central

    Gupta, Vikash; Prasad, Gautam; Thompson, Paul

    2017-01-01

    The human brain may be considered as a genus-0 shape, topologically equivalent to a sphere. Various methods have been used in the past to transform the brain surface to that of a sphere using harmonic energy minimization methods used for cortical surface matching. However, very few methods have studied volumetric parameterization of the brain using a spherical embedding. Volumetric parameterization is typically used for complicated geometric problems like shape matching, morphing and isogeometric analysis. Using conformal mapping techniques, we can establish a bijective mapping between the brain and the topologically equivalent sphere. Our hypothesis is that shape analysis problems are simplified when the shape is defined in an intrinsic coordinate system. Our goal is to establish such a coordinate system for the brain. The efficacy of the method is demonstrated with a white matter clustering problem. Initial results show promise for future investigation of these parameterization techniques and their application to other problems related to computational anatomy, like registration and segmentation. PMID:29177252

  14. Analysis and Preliminary Design of an Advanced Technology Transport Flight Control System

    NASA Technical Reports Server (NTRS)

    Frazzini, R.; Vaughn, D.

    1975-01-01

    The analysis and preliminary design of an advanced technology transport aircraft flight control system using avionics and flight control concepts appropriate to the 1980-1985 time period are discussed. Specifically, the techniques and requirements of the flight control system were established, a number of candidate configurations were defined, and an evaluation of these configurations was performed to establish a recommended approach. Candidate configurations based on redundant integration of various sensor types, computational methods, servo actuator arrangements and data-transfer techniques were defined to the functional module and piece-part level. Life-cycle costs, for the flight control configurations, as determined in an operational environment model for 200 aircraft over a 15-year service life, were the basis of the optimum configuration selection tradeoff. The recommended system concept is a quad digital computer configuration utilizing a small microprocessor for input/output control, a hexad skewed set of conventional sensors for body rate and body acceleration, and triple integrated actuators.

  15. Crash Certification by Analysis - Are We There Yet?

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Fasanella, Edwin L.; Lyle, Karen H.

    2006-01-01

    This paper addresses the issue of crash certification by analysis. This broad topic encompasses many ancillary issues including model validation procedures, uncertainty in test data and analysis models, probabilistic techniques for test-analysis correlation, verification of the mathematical formulation, and establishment of appropriate qualification requirements. This paper will focus on certification requirements for crashworthiness of military helicopters; capabilities of the current analysis codes used for crash modeling and simulation, including some examples of simulations from the literature to illustrate the current approach to model validation; and future directions needed to achieve "crash certification by analysis."

  16. Laser power conversion system analysis, volume 1

    NASA Technical Reports Server (NTRS)

    Jones, W. S.; Morgan, L. L.; Forsyth, J. B.; Skratt, J. P.

    1979-01-01

    The orbit-to-orbit laser energy conversion system analysis established a mission model of satellites with various orbital parameters and average electrical power requirements ranging from 1 to 300 kW. The system analysis evaluated various conversion techniques, power system deployment parameters, power system electrical supplies and other critical subsystems relative to various combinations of the mission model. The analyses show that the laser power system would not be competitive with current satellite power systems from weight, cost and development risk standpoints.

  17. Emulation applied to reliability analysis of reconfigurable, highly reliable, fault-tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.

  18. Using foreground/background analysis to determine leaf and canopy chemistry

    NASA Technical Reports Server (NTRS)

    Pinzon, J. E.; Ustin, S. L.; Hart, Q. J.; Jacquemoud, S.; Smith, M. O.

    1995-01-01

    Spectral Mixture Analysis (SMA) has become a well-established procedure for analyzing imaging spectrometry data; however, the technique is relatively insensitive to minor sources of spectral variation (e.g., discriminating stressed from unstressed vegetation and variations in canopy chemistry). Other statistical approaches have been tried, e.g., stepwise multiple linear regression (SMLR) analysis to predict canopy chemistry. Grossman et al. reported that SMLR is sensitive to measurement error and that the predictions of minor chemical components are not independent of patterns observed in more dominant spectral components like water. Further, they observed that the relationships were strongly dependent on the mode of expressing reflectance (R, -log R) and whether chemistry was expressed on a weight (g/g) or area (g/sq m) basis. Thus, alternative multivariate techniques need to be examined. Smith et al. reported a revised SMA that they termed Foreground/Background Analysis (FBA), which permits directing the analysis along any axis of variance by identifying vectors through the n-dimensional spectral volume orthonormal to each other. Here, we report an application of the FBA technique for the detection of canopy chemistry using a modified form of the analysis.
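
    For context, a hedged sketch of the linear mixing model underlying SMA (not the FBA variant itself): a pixel spectrum is modeled as a nonnegative combination of endmember spectra and unmixed by nonnegative least squares. The endmember matrix, fractions, and noise below are synthetic assumptions.

      import numpy as np
      from scipy.optimize import nnls

      bands = 50
      E = np.abs(np.random.default_rng(2).normal(size=(bands, 3)))  # 3 endmembers
      true_f = np.array([0.6, 0.3, 0.1])                            # true fractions
      pixel = E @ true_f + 0.01 * np.random.default_rng(3).normal(size=bands)

      fractions, resid = nnls(E, pixel)   # nonnegative abundance estimates
      print("estimated fractions:", np.round(fractions / fractions.sum(), 2))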

  19. Recent Development in Optical Chemical Sensors Coupling with Flow Injection Analysis

    PubMed Central

    Ojeda, Catalina Bosch; Rojas, Fuensanta Sánchez

    2006-01-01

    Optical techniques for chemical analysis are well established and sensors based on these techniques are now attracting considerable attention because of their importance in applications such as environmental monitoring, biomedical sensing, and industrial process control. On the other hand, flow injection analysis (FIA) is advisable for the rapid analysis of microliter volume samples and can be interfaced directly to the chemical process. FIA has become a widespread automated analytical method for many reasons, mainly the simplicity and low cost of the setups, their versatility, and ease of assembly. In this paper, an overview of flow injection determinations by using optical chemical sensors is provided, and instrumentation, sensor design, and applications are discussed. This work summarizes the most relevant manuscripts from 1980 to date referring to analysis using optical chemical sensors in FIA.

  20. Acoustic emission and nondestructive evaluation of biomaterials and tissues.

    PubMed

    Kohn, D H

    1995-01-01

    Acoustic emission (AE) is an acoustic wave generated by the release of energy from localized sources in a material subjected to an externally applied stimulus. This technique may be used nondestructively to analyze tissues, materials, and biomaterial/tissue interfaces. Applications of AE include use as an early warning tool for detecting tissue and material defects and incipient failure, monitoring damage progression, predicting failure, characterizing failure mechanisms, and serving as a tool to aid in understanding material properties and structure-function relations. All these applications may be performed in real time. This review discusses general principles of AE monitoring and the use of the technique in 3 areas of importance to biomedical engineering: (1) analysis of biomaterials, (2) analysis of tissues, and (3) analysis of tissue/biomaterial interfaces. Focus in these areas is on detection sensitivity, methods of signal analysis in both the time and frequency domains, the relationship between acoustic signals and microstructural phenomena, and the uses of the technique in establishing a relationship between signals and failure mechanisms.

  1. Design study for a high reliability five-year spacecraft tape transport

    NASA Technical Reports Server (NTRS)

    Benn, G. S. L.; Eshleman, R. L.

    1971-01-01

    Following the establishment of the overall transport concept, all of the life-limiting constraints associated with the transport were analyzed using modeling techniques. These design techniques included: (1) a response analysis from which the performance of the transport could be determined under operating conditions for a variety of conceptual variations both in a new and aged condition; (2) an analysis of a double cone guidance technique which yielded an optimum design for maximum guidance with minimum tape degradation; (3) an analysis of the tape pack design to eliminate spoking caused by negative tangential stress within the pack; (4) an evaluation of the stress levels experienced by the magnetic tape throughout the system; (5) a general review of the bearing and lubrication technology as applied to satellite recorders and hence the recommendation for using standard load carrying antifriction ball bearings; and (6) a kinetic analysis to determine the change in kinetic properties of the transport during operation.

  2. Image analysis technique as a tool to identify morphological changes in Trametes versicolor pellets according to exopolysaccharide or laccase production.

    PubMed

    Tavares, Ana P M; Silva, Rui P; Amaral, António L; Ferreira, Eugénio C; Xavier, Ana M R B

    2014-02-01

    An image analysis technique was applied to identify morphological changes of pellets from the white-rot fungus Trametes versicolor in agitated submerged cultures during the production of exopolysaccharide (EPS) or ligninolytic enzymes. Batch tests with four different experimental conditions were carried out. Two different culture media were used, namely yeast medium or Trametes defined medium, and the addition of ligninolytic inducers such as xylidine or pulp and paper industrial effluent was evaluated. Laccase activity, EPS production, and final biomass contents were determined for the batch assays, and pellet morphology was assessed by image analysis techniques. The data obtained allowed the choice of metabolic pathways to be established according to the experimental conditions: laccase production in the Trametes defined medium, or EPS production in the rich yeast medium experiments. Furthermore, the image processing and analysis methodology allowed a better comprehension of the physiological phenomena with respect to the corresponding pellet morphological stages.

  3. Formal methods for modeling and analysis of hybrid systems

    NASA Technical Reports Server (NTRS)

    Tiwari, Ashish (Inventor); Lincoln, Patrick D. (Inventor)

    2009-01-01

    A technique based on the use of a quantifier elimination decision procedure for real closed fields and simple theorem proving to construct a series of successively finer qualitative abstractions of hybrid automata is taught. The resulting abstractions are always discrete transition systems which can then be used by any traditional analysis tool. The constructed abstractions are conservative and can be used to establish safety properties of the original system. The technique works on linear and non-linear polynomial hybrid systems: the guards on discrete transitions and the continuous flows in all modes can be specified using arbitrary polynomial expressions over the continuous variables. An exemplar tool in the SAL environment built over the theorem prover PVS is detailed. The technique scales well to large and complex hybrid systems.

  4. High efficiency processing for reduced amplitude zones detection in the HRECG signal

    NASA Astrophysics Data System (ADS)

    Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.

    2016-04-01

    Summary - This article presents part of more detailed research proposed in the medium to long term, with the intention of establishing a new philosophy of surface electrocardiogram analysis. This research aims to find indicators of cardiovascular disease in its early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of processing software which collects some existing techniques and incorporates novel methods for the detection of reduced amplitude zones (RAZ) in the high resolution electrocardiographic signal (HRECG). The algorithm consists of three stages: an efficient processing step for QRS detection, an averaging filter using correlation techniques, and a step for RAZ detection. Preliminary results show the efficiency of the system and point to the incorporation of new signal analysis techniques involving the 12 leads.

  5. Pattern detection in forensic case data using graph theory: application to heroin cutting agents.

    PubMed

    Terrettaz-Zufferey, Anne-Laure; Ratle, Frédéric; Ribaux, Olivier; Esseiva, Pierre; Kanevski, Mikhail

    2007-04-11

    Pattern recognition techniques can be very useful in forensic sciences to point to relevant sets of events and potentially encourage an intelligence-led style of policing. In this study, these techniques have been applied to categorical data corresponding to cutting agents found in heroin seizures. An application of graph theoretic methods has been performed, in order to highlight the possible relationships between the location of seizures and co-occurrences of particular heroin cutting agents. An analysis of the co-occurrences to establish several main combinations has been done. Results illustrate the practical potential of mathematical models in forensic data analysis.
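
    A minimal sketch (not the paper's method) of one graph-theoretic building block it implies: a co-occurrence graph whose nodes are cutting agents and whose edge weights count joint appearances in seizures. The seizure compositions below are invented examples.

      import itertools
      import networkx as nx

      seizures = [
          {"caffeine", "paracetamol"},
          {"caffeine", "paracetamol", "procaine"},
          {"caffeine", "lactose"},
      ]

      G = nx.Graph()
      for s in seizures:
          for a, b in itertools.combinations(sorted(s), 2):
              # accumulate an edge weight for each joint appearance
              w = G.get_edge_data(a, b, {"weight": 0})["weight"]
              G.add_edge(a, b, weight=w + 1)

      for a, b, d in G.edges(data=True):
          print(a, "--", b, "x", d["weight"])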

  6. Structural reliability assessment of the Oman India Pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Sharif, A.M.; Preston, R.

    1996-12-31

    Reliability techniques are increasingly finding application in design. The special design conditions for the deep water sections of the Oman India Pipeline dictate their use, since the experience basis for application of standard deterministic techniques is inadequate. The paper discusses the reliability analysis as applied to the Oman India Pipeline, including selection of a collapse model, characterization of the variability in the parameters that affect pipe resistance to collapse, and implementation of first and second order reliability analyses to assess the probability of pipe failure. The reliability analysis results are used as the basis for establishing the pipe wall thickness requirements for the pipeline.

  7. Light stable isotope analysis of meteorites by ion microprobe

    NASA Technical Reports Server (NTRS)

    Mcsween, Harry Y., Jr.

    1994-01-01

    The main goal was to develop the necessary secondary ion mass spectrometer (SIMS) techniques to use a Cameca ims-4f ion microprobe to measure light stable isotope ratios (H, C, O and S) in situ and in non-conducting mineral phases. The intended application of these techniques was the analysis of meteorite samples, although the techniques that have been developed are equally applicable to the investigation of terrestrial samples. The first year established techniques for the analysis of O isotope ratios (delta O-18 and delta O-17) in conducting mineral phases and the measurement of S isotope ratios (delta S-34) in a variety of sulphide phases. In addition, a technique was developed to measure delta S-34 values in sulphates, which are insulators. Other research undertaken in the first year resulted in SIMS techniques for the measurement of a wide variety of trace elements in carbonate minerals, with the aim of understanding the nature of alteration fluids in carbonaceous chondrites. In the second year we developed techniques for analyzing O isotope ratios in nonconducting mineral phases. These methods are potentially applicable to the measurement of other light stable isotopes such as H, C and S in insulators. Also, we have further explored the analytical techniques used for the analysis of S isotopes in sulphides by analyzing troilite in a number of L and H ordinary chondrites. This was done to see if there were any systematic differences with petrological type.

  8. [Comparative GC analysis of essential oil in imported sandalwood].

    PubMed

    Wang, Z; Hong, X

    1991-01-01

    The GC-fingerprint spectra of essential oils in imported sandalwood are established by the new technique of GC-relative retention value fingerprint spectrum (GC-FPS). According to the GC-FPS of samples, their chromatographic peaks, overlap ratio of peaks and eight strong peaks are studied comparatively.

  9. The Case for Open Source Software: The Interactional Discourse Lab

    ERIC Educational Resources Information Center

    Choi, Seongsook

    2016-01-01

    Computational techniques and software applications for the quantitative content analysis of texts are now well established, and many qualitative data software applications enable the manipulation of input variables and the visualization of complex relations between them via interactive and informative graphical interfaces. Although advances in…

  10. High-accuracy peak picking of proteomics data using wavelet techniques.

    PubMed

    Lange, Eva; Gröpl, Clemens; Reinert, Knut; Kohlbacher, Oliver; Hildebrandt, Andreas

    2006-01-01

    A new peak picking algorithm for the analysis of mass spectrometric (MS) data is presented. It is independent of the underlying machine or ionization method, and is able to resolve highly convoluted and asymmetric signals. The method uses the multiscale nature of spectrometric data by first detecting the mass peaks in the wavelet-transformed signal before a given asymmetric peak function is fitted to the raw data. In an optional third stage, the resulting fit can be further improved using techniques from nonlinear optimization. In contrast to currently established techniques (e.g. SNAP, Apex) our algorithm is able to separate overlapping peaks of multiply charged peptides in ESI-MS data of low resolution. Its improved accuracy with respect to peak positions makes it a valuable preprocessing method for MS-based identification and quantification experiments. The method has been validated on a number of different annotated test cases, where it compares favorably in both runtime and accuracy with currently established techniques. An implementation of the algorithm is freely available in our open source framework OpenMS.
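
    A toy version of the wavelet stage, using SciPy's CWT-based peak finder on a synthetic pair of overlapping peaks; this is not the OpenMS implementation, just the underlying idea:

    ```python
    # Wavelet-based peak picking on a noisy synthetic mass spectrum.
    import numpy as np
    from scipy.signal import find_peaks_cwt

    mz = np.linspace(1000.0, 1010.0, 2000)
    signal = (np.exp(-((mz - 1002.0) ** 2) / 0.002)
              + 0.6 * np.exp(-((mz - 1002.5) ** 2) / 0.002))  # overlapping peaks
    signal += np.random.default_rng(1).normal(scale=0.02, size=mz.size)

    # Search over a range of expected peak widths (in samples).
    peak_idx = find_peaks_cwt(signal, widths=np.arange(5, 40))
    print(mz[peak_idx])  # approximate positions of the detected peaks
    ```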

  11. Optimization of the tungsten oxide technique for measurement of atmospheric ammonia

    NASA Technical Reports Server (NTRS)

    Brown, Kenneth G.

    1987-01-01

    Hollow tubes coated with tungstic acid have been shown to be of value in the determination of ammonia and nitric acid in ambient air. Practical application of this technique was demonstrated utilizing an automated sampling system for in-flight collection and analysis of atmospheric samples. Due to time constraints these previous measurements were performed on tubes that had not been well characterized in the laboratory. As a result the experimental precision could not be accurately estimated. Since the technique was being compared to other techniques for measuring these compounds, it became necessary to perform laboratory tests which would establish the reliability of the technique. This report is a summary of these laboratory experiments as they are applied to the determination of ambient ammonia concentration.

  12. Propagation and Establishment of Native Plants for Vegetative Restoration of Aquatic Ecosystems

    DTIC Science & Technology

    2013-06-01

    diverse native plant communities in aquatic systems. We document the successful application of these techniques in a number of aquatic ecosystems...Aquatic Plant Control Research Program (APCRP) for establishing native aquatic plants in reservoirs and other water bodies. These techniques should...that some control techniques may negatively affect efforts to establish native vegetation. Establishing native aquatic vegetation is not an exact

  13. Establishment of a stable transfection system for genetic manipulation of Babesia gibsoni.

    PubMed

    Liu, Mingming; Adjou Moumouni, Paul Franck; Asada, Masahito; Hakimi, Hassan; Masatani, Tatsunori; Vudriko, Patrick; Lee, Seung-Hun; Kawazu, Shin-Ichiro; Yamagishi, Junya; Xuan, Xuenan

    2018-04-23

    Genetic manipulation techniques, such as transfection, have been previously reported in many protozoan parasites. In Babesia, stable transfection systems have only been established for bovine Babesia parasites. We recently reported a transient transfection system and the selection of promoter candidates for Babesia gibsoni. The establishment of a stable transfection system for B. gibsoni is considered to be urgent to improve our understanding of the basic biology of canine Babesia parasites for a better control of babesiosis. GFP-expressing parasites were observed by fluorescence microscopy as early as two weeks after drug selection, and consistently expressed GFP for more than 3 months without drug pressure. Genome integration was confirmed by PCR, sequencing and Southern blot analysis. We present the first successful establishment of a stable transfection system for B. gibsoni. This finding will facilitate functional analysis of Babesia genomes using genetic manipulation and will serve as a foundation for the development of tick-Babesia and host-Babesia infection models.

  14. [Atomic absorption fingerprint and identification studies of Da Huo Luo pill. I. Exploration of inorganic elements fingerprint for establishment of industrial standard].

    PubMed

    Zhang, Qi-Feng; Zhu, Long-Yin; Ding, Shu-Liang; Wang, Chen; Tu, Long-Fei

    2008-03-01

    The fingerprints for most Chinese medicines based on their organic compositions have been well established; very few known fingerprints, however, are based on inorganic elements. In order to distinguish Da Huo Luo Dan and its efficacy from other Chinese medicines, the authors attempted to set up a fingerprint based on the measurement of inorganic elements in Da Huo Luo Dan and other Chinese medicines. In the present study, the authors first employed 28 batches of Da Huo Luo Dan produced by the Zhang-Shu Pharmaceutical Company in Jiangxi Province to screen 12 kinds of inorganic elements measured by atomic absorption spectrophotometer, and established the atomic absorption fingerprints. Secondly, the authors tried to distinguish Da Huo Luo Dan from other Chinese medicines by using similarity analysis of vectors and the statistical analysis of compositional data. The results showed that the methods used here could distinguish the efficacy of Da Huo Luo Dan from others. The study also shows that establishing a standard for quality control by analysis of inorganic elements in Chinese medicines is feasible. The present study provides a new idea and a new technique serving the establishment of industrial standards for inorganic elements fingerprint analysis to explore the effects of Chinese medicines.
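
    One common way to realize such a vector similarity check is the cosine similarity between element-content fingerprints; the 12 element values below are purely illustrative, and the paper's actual similarity measure may differ.

    ```python
    # Cosine similarity between two 12-element inorganic fingerprints
    # (illustrative numbers only).
    import numpy as np

    reference = np.array([1.2, 0.8, 3.4, 0.1, 2.2, 0.5, 1.9, 0.3, 0.7, 4.1, 0.2, 1.0])
    test      = np.array([1.1, 0.9, 3.1, 0.1, 2.4, 0.4, 2.0, 0.3, 0.8, 3.9, 0.2, 1.1])

    cosine = reference @ test / (np.linalg.norm(reference) * np.linalg.norm(test))
    print(f"fingerprint similarity = {cosine:.4f}")  # close to 1 => same identity
    ```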

  15. Waveguide design, modeling, and optimization: from photonic nanodevices to integrated photonic circuits

    NASA Astrophysics Data System (ADS)

    Bordovsky, Michal; Catrysse, Peter; Dods, Steven; Freitas, Marcio; Klein, Jackson; Kotacka, Libor; Tzolov, Velko; Uzunov, Ivan M.; Zhang, Jiazong

    2004-05-01

    We present the state of the art for commercial design and simulation software in the 'front end' of photonic circuit design. One recent advance is to extend the flexibility of the software by using more than one numerical technique on the same optical circuit. There are a number of popular and proven techniques for analysis of photonic devices. Examples of these techniques include the Beam Propagation Method (BPM), the Coupled Mode Theory (CMT), and the Finite Difference Time Domain (FDTD) method. For larger photonic circuits, it may not be practical to analyze the whole circuit by any one of these methods alone, but often some smaller part of the circuit lends itself to at least one of these standard techniques. Later the whole problem can be analyzed on a unified platform. This kind of approach can enable analysis for cases that would otherwise be cumbersome, or even impossible. We demonstrate solutions for more complex structures ranging from the sub-component layout, through the entire device characterization, to the mask layout and its editing. We also present recent advances in the above well established techniques. This includes the analysis of nano-particles, metals, and non-linear materials by FDTD, photonic crystal design and analysis, and improved models for high concentration Er/Yb co-doped glass waveguide amplifiers.
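
    Of the techniques named, FDTD is the most direct to sketch: the electric and magnetic fields are leapfrogged in time on a staggered grid. A compact 1D vacuum example in normalized units (Courant number 0.5); real photonic device solvers add materials, higher dimensions, and absorbing boundaries.

    ```python
    # Minimal 1D FDTD (Yee scheme, vacuum, normalized units).
    import numpy as np

    nx, nt = 400, 800
    ez = np.zeros(nx)  # electric field
    hy = np.zeros(nx)  # magnetic field

    for t in range(nt):
        hy[:-1] += 0.5 * (ez[1:] - ez[:-1])             # update H from curl of E
        ez[1:]  += 0.5 * (hy[1:] - hy[:-1])             # update E from curl of H
        ez[nx // 4] += np.exp(-((t - 30) / 10.0) ** 2)  # soft Gaussian source

    print(f"field energy ~ {np.sum(ez**2 + hy**2):.3f}")
    ```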

  16. An incremental economic analysis of establishing early successional habitat for biodiversity

    Treesearch

    Slayton W. Hazard-Daniel; Patrick Hiesl; Susan C. Loeb; Thomas J. Straka

    2017-01-01

    Early successional habitat (ESH) is an important component of natural landscapes and is crucial to maintaining biodiversity. ESH also impacts endangered species. The extent of forest disturbances resulting in ESH has been diminishing, and foresters have developed timber management regimes using standard silvicultural techniques that...

  17. Fractal and Multifractal Models Applied to Porous Media - Editorial

    USDA-ARS?s Scientific Manuscript database

    Given the current high level of interest in the use of fractal geometry to characterize natural porous media, a special issue of the Vadose Zone Journal was organized in order to expose established fractal analysis techniques and cutting-edge new developments to a wider Earth science audience. The ...

  18. AN ALTERNATIVE METHOD FOR ESTABLISHING TEFS FOR DIOXIN-LIKE COMPOUNDS. PART 1. EVALUATION OF DECISION ANALYSIS METHODS FOR USE IN WEIGHTING RELATIVE POTENCY DATA

    EPA Science Inventory

    A number of investigators have recently examined the utility of applying probabilistic techniques in the derivation of toxic equivalency factors (TEFs) for polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs) and dioxin-like polychlorinated biphenyls (...

  19. From the ground up: aligning state freight plans to enhance state collaboration and establish regional and national harmonization of freight priorities.

    DOT National Transportation Integrated Search

    2016-08-01

    This project reviews MAFC state freight plans and current planning efforts and provides a catalogue of state practices, data and analysis techniques, stakeholder involvement and other planning elements. The project also identifies where states share ...

  20. Automated Quantitative Nuclear Cardiology Methods

    PubMed Central

    Motwani, Manish; Berman, Daniel S.; Germano, Guido; Slomka, Piotr J.

    2016-01-01

    Quantitative analysis of SPECT and PET has become a major part of nuclear cardiology practice. Current software tools can automatically segment the left ventricle, quantify function, establish myocardial perfusion maps and estimate global and local measures of stress/rest perfusion – all with minimal user input. State-of-the-art automated techniques have been shown to offer high diagnostic accuracy for detecting coronary artery disease, as well as predict prognostic outcomes. This chapter briefly reviews these techniques, highlights several challenges and discusses the latest developments. PMID:26590779

  1. Study of advanced techniques for determining the long-term performance of components

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A study was conducted of techniques having the capability of determining the performance and reliability of components for spacecraft liquid propulsion applications for long term missions. The study utilized two major approaches: improvement of existing technology and the evolution of new technology. The criteria established and methods evolved are applicable to valve components. Primary emphasis was placed on the oxygen difluoride and diborane propellant combination. The investigation included analysis, fabrication, and tests of experimental equipment to provide data and performance criteria.

  2. An evaluation of the use of near infrared (NIR) spectroscopy to identify water and oil-borne preservatives

    Treesearch

    Chi-Leung So; Stan T. Lebow; Leslie H. Groom; Todd F. Shupe

    2003-01-01

    In this research we experimented with a new and rapid way of analyzing wood. Near Infrared (NIR) spectroscopy together with multivariate analysis is becoming a widely used technique in the field of forest products, especially for property determination, and is already firmly established in the pulp and paper industry. This method is ideal for the chemical analysis of wood...

  3. Using time-frequency analysis to determine time-resolved detonation velocity with microwave interferometry.

    PubMed

    Kittell, David E; Mares, Jesus O; Son, Steven F

    2015-04-01

    Two time-frequency analysis methods based on the short-time Fourier transform (STFT) and continuous wavelet transform (CWT) were used to determine time-resolved detonation velocities with microwave interferometry (MI). The results were directly compared to well-established analysis techniques consisting of a peak-picking routine as well as a phase unwrapping method (i.e., quadrature analysis). The comparison is conducted on experimental data consisting of transient detonation phenomena observed in triaminotrinitrobenzene and ammonium nitrate-urea explosives, representing high and low quality MI signals, respectively. Time-frequency analysis proved much more capable of extracting useful and highly resolved velocity information from low quality signals than the phase unwrapping and peak-picking methods. Additionally, control of the time-frequency methods is mainly constrained to a single parameter which allows for a highly unbiased analysis method to extract velocity information. In contrast, the phase unwrapping technique introduces user based variability while the peak-picking technique does not achieve a highly resolved velocity result. Both STFT and CWT methods are proposed as improved additions to the analysis methods applied to MI detonation experiments, and may be useful in similar applications.
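
    A rough sketch of the STFT branch: in MI, the beat frequency scales with detonation velocity (v = f·λ/2, where λ is the microwave wavelength inside the explosive), so tracking the spectrogram ridge yields a time-resolved velocity. The chirp, sample rate, and λ below are invented for illustration.

    ```python
    # Extract a time-resolved velocity from a synthetic MI-like chirp via STFT.
    import numpy as np
    from scipy.signal import stft

    fs = 5e6                       # sample rate, Hz
    t = np.arange(0, 2e-3, 1 / fs)
    f_beat = 2.0e5 + 5.0e7 * t     # hypothetical accelerating beat frequency
    x = np.cos(2 * np.pi * np.cumsum(f_beat) / fs)

    f, tt, Z = stft(x, fs=fs, nperseg=1024)
    ridge = f[np.abs(Z).argmax(axis=0)]   # peak frequency in each time frame

    lam = 0.03                            # m, assumed wavelength in the explosive
    velocity = ridge * lam / 2.0          # m/s, time-resolved estimate
    print(velocity[:5])
    ```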

  4. Characterization of emission microscopy and liquid crystal thermography in IC fault localization

    NASA Astrophysics Data System (ADS)

    Lau, C. K.; Sim, K. S.

    2013-05-01

    This paper characterizes two fault localization techniques, Emission Microscopy (EMMI) and Liquid Crystal Thermography (LCT), using integrated circuit (IC) leakage failures. The majority of today's semiconductor failures do not reveal a clear visual defect on the die surface and therefore require fault localization tools to identify the fault location. Among the various fault localization tools, liquid crystal thermography and frontside emission microscopy are commonly used in most semiconductor failure analysis laboratories. Many people misunderstand the two techniques to be the same, both detecting hot spots in chips failing with shorts or leakage. As a result, analysts tend to use only LCT, since this technique involves a much simpler test setup than EMMI. The omission of EMMI as the alternative technique in fault localization leads to incomplete analysis when LCT fails to localize any hot spot on a failing chip. This research was therefore established to characterize and compare both techniques in terms of their sensitivity in detecting the fault location in common semiconductor failures. A new method, the backside LCT technique, was also proposed as an alternative. Both techniques successfully detected the defect locations resulting from the leakage failures. LCT was observed to be more sensitive than EMMI in the frontside analysis approach; on the other hand, EMMI performed better in the backside analysis approach. LCT was more sensitive in localizing ESD defect locations and EMMI was more sensitive in detecting non-ESD defect locations. Backside LCT was proven to work as effectively as frontside LCT and is ready to serve as an alternative to backside EMMI. The research confirmed that LCT detects heat generation whereas EMMI detects photon emission (recombination radiation). The analysis results also suggested that the two techniques complement each other in IC fault localization; it is necessary for a failure analyst to use both techniques when one of them produces no result.

  5. A diagnostic analysis of the VVP single-doppler retrieval technique

    NASA Technical Reports Server (NTRS)

    Boccippio, Dennis J.

    1995-01-01

    A diagnostic analysis of the VVP (volume velocity processing) retrieval method is presented, with emphasis on understanding the technique as a linear, multivariate regression. Similarities and differences to the velocity-azimuth display and extended velocity-azimuth display retrieval techniques are discussed, using this framework. Conventional regression diagnostics are then employed to quantitatively determine situations in which the VVP technique is likely to fail. An algorithm for preparation and analysis of a robust VVP retrieval is developed and applied to synthetic and actual datasets with high temporal and spatial resolution. A fundamental (but quantifiable) limitation to some forms of VVP analysis is inadequate sampling dispersion in the n space of the multivariate regression, manifest as a collinearity between the basis functions of some fitted parameters. Such collinearity may be present either in the definition of these basis functions or in their realization in a given sampling configuration. This nonorthogonality may cause numerical instability, variance inflation (decrease in robustness), and increased sensitivity to bias from neglected wind components. It is shown that these effects prevent the application of VVP to small azimuthal sectors of data. The behavior of the VVP regression is further diagnosed over a wide range of sampling constraints, and reasonable sector limits are established.
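
    The collinearity failure mode described here can be made concrete with a condition-number check on a toy design matrix: restricting sinusoidal basis functions to a narrow azimuthal sector makes the columns nearly collinear. A sketch under those simplified assumptions (not the full VVP basis):

    ```python
    # Condition number of a toy design matrix as a collinearity diagnostic.
    import numpy as np

    def condition_number(az_deg):
        az = np.radians(az_deg)
        X = np.column_stack([np.ones_like(az), np.sin(az), np.cos(az)])
        return np.linalg.cond(X)

    print(condition_number(np.linspace(0, 360, 100)))  # full circle: well conditioned
    print(condition_number(np.linspace(0, 10, 100)))   # 10-degree sector: ill conditioned
    ```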

  6. [Analysis of syndrome discipline of generalized anxiety disorder using data mining techniques].

    PubMed

    Tang, Qi-sheng; Sun, Wen-jun; Qu, Miao; Guo, Dong-fang

    2012-09-01

    To study the use of data mining techniques in analyzing the syndrome discipline of generalized anxiety disorder (GAD). From August 1, 2009 to July 31, 2010, 705 patients with GAD in 10 hospitals of Beijing were investigated over one year. Data mining techniques, such as Bayes net and cluster analysis, were used to analyze the syndrome discipline of GAD. A total of 61 symptoms of GAD were screened out. By using Bayes net, nine syndromes of GAD were abstracted based on the symptoms. Eight syndromes were abstracted by cluster analysis. After screening for duplicate syndromes and combining the experts' experience and traditional Chinese medicine theory, six syndromes of GAD were defined. These included depressed liver qi transforming into fire, phlegm-heat harassing the heart, liver depression and spleen deficiency, heart-kidney non-interaction, dual deficiency of the heart and spleen, and kidney deficiency and liver yang hyperactivity. Based on the results, the draft of Syndrome Diagnostic Criteria for Generalized Anxiety Disorder was developed. Data mining techniques such as Bayes net and cluster analysis have certain future potential for establishing syndrome models and analyzing syndrome discipline, thus they are suitable for the research of syndrome differentiation.

  7. Neutron spectrometry for UF 6 enrichment verification in storage cylinders

    DOE PAGES

    Mengesha, Wondwosen; Kiff, Scott D.

    2015-01-29

    Verification of declared UF 6 enrichment and mass in storage cylinders is of great interest in nuclear material nonproliferation. Nondestructive assay (NDA) techniques are commonly used for safeguards inspections to ensure accountancy of declared nuclear materials. Common NDA techniques used include gamma-ray spectrometry and both passive and active neutron measurements. In the present study, neutron spectrometry was investigated for verification of UF 6 enrichment in 30B storage cylinders based on an unattended and passive measurement approach. MCNP5 and Geant4 simulated neutron spectra, for selected UF 6 enrichments and filling profiles, were used in the investigation. The simulated neutron spectra were analyzed using principal component analysis (PCA). The PCA technique is a well-established technique and has a wide area of application including feature analysis, outlier detection, and gamma-ray spectral analysis. Results obtained demonstrate that neutron spectrometry supported by spectral feature analysis has potential for assaying UF 6 enrichment in storage cylinders. Thus the results from the present study also showed that difficulties associated with the UF 6 filling profile and observed in other unattended passive neutron measurements can possibly be overcome using the approach presented.
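
    A minimal PCA-by-SVD sketch of the spectral feature extraction step; the "spectra" here are random placeholders rather than MCNP5/Geant4 output.

    ```python
    # PCA via SVD on a stack of (placeholder) spectra.
    import numpy as np

    rng = np.random.default_rng(2)
    spectra = rng.random((20, 128))          # 20 spectra x 128 energy bins

    centered = spectra - spectra.mean(axis=0)
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)

    scores = centered @ Vt[:2].T             # projection onto first 2 components
    explained = S**2 / np.sum(S**2)
    print(f"variance explained by PC1+PC2: {explained[:2].sum():.2%}")
    ```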

  8. Crystal growth, structural, optical, dielectric and thermal studies of an amino acid based organic NLO material: L-phenylalanine L-phenylalaninium malonate.

    PubMed

    Prakash, M; Geetha, D; Caroline, M Lydia; Ramesh, P S

    2011-12-01

    Good transparent single crystals of L-phenylalanine L-phenylalaninium malonate (LPPMA) have been grown successfully by the slow evaporation technique from aqueous solution. A single-crystal X-ray diffractometer was utilized to measure the unit cell parameters and to confirm the crystal structure. The chemical structure of the compound was established by the FT-NMR technique. The vibrational modes of the molecules were elucidated from FTIR spectra. Its optical behaviour has been examined by UV-vis spectral analysis, which shows the absence of absorbance in the visible region. Thermal properties of the LPPMA crystal were investigated by thermogravimetric analysis (TGA) and differential thermal analysis (DTA) techniques, which indicate that the material does not decompose before melting. The melting point of the grown crystal was observed as 180°C with a melting point apparatus. The NLO property was confirmed by the powder technique of Kurtz and Perry. The dielectric behaviour of the sample was also studied for the first time. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. A homotopy analysis method for the nonlinear partial differential equations arising in engineering

    NASA Astrophysics Data System (ADS)

    Hariharan, G.

    2017-05-01

    In this article, we establish the homotopy analysis method (HAM) for solving several partial differential equations arising in engineering. The technique provides solutions as rapidly convergent series with computable terms, even for problems with strongly nonlinear terms in the governing differential equations. The convergence analysis of the proposed method is also discussed. Finally, we give some illustrative examples to demonstrate the validity and applicability of the proposed method.
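
    For reference, the construction at the heart of the standard HAM (Liao's formulation) is the zeroth-order deformation equation, which continuously deforms an initial guess u0 into the solution as the embedding parameter q runs from 0 to 1:

    ```latex
    % Zeroth-order deformation equation (L: auxiliary linear operator,
    % N: nonlinear operator, u_0: initial guess, hbar: convergence-control
    % parameter, q in [0,1]):
    (1 - q)\,\mathcal{L}\bigl[\phi(x;q) - u_0(x)\bigr]
      = q\,\hbar\,\mathcal{N}\bigl[\phi(x;q)\bigr]
    ```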

  10. Simulation of mixture microstructures via particle packing models and their direct comparison with real mixtures

    NASA Astrophysics Data System (ADS)

    Gulliver, Eric A.

    The objective of this thesis is to identify and develop techniques providing direct comparison between simulated and real packed-particle mixture microstructures containing submicron-sized particles. This entailed devising techniques for simulating powder mixtures, producing real mixtures with known powder characteristics, sectioning real mixtures, interrogating mixture cross-sections, evaluating and quantifying the mixture interrogation process, and comparing interrogation results between mixtures. A drop-and-roll-type particle-packing model was used to generate simulations of random mixtures. The simulated mixtures were then evaluated to establish that they were not segregated and were free from gross defects. A powder processing protocol was established to provide real mixtures for direct comparison and for use in evaluating the simulation. The powder processing protocol was designed to minimize differences between measured particle size distributions and the particle size distributions in the mixture. A sectioning technique was developed that was capable of producing distortion-free cross-sections of fine-scale particulate mixtures. Tessellation analysis was used to interrogate mixture cross-sections, and statistical quality control charts were used to evaluate different types of tessellation analysis and to establish the importance of differences between simulated and real mixtures. The particle-packing program generated crescent-shaped pores below large particles but realistic-looking mixture microstructures otherwise. Focused ion beam milling was the only technique capable of sectioning particle compacts in a manner suitable for stereological analysis. Johnson-Mehl and Voronoi tessellation of the same cross-sections produced tessellation tiles with different tile-area populations. Control chart analysis showed Johnson-Mehl tessellation measurements are superior to Voronoi tessellation measurements for detecting variations in mixture microstructure, such as altered particle-size distributions or mixture composition. Control charts based on tessellation measurements were used for direct, quantitative comparisons between real and simulated mixtures. Four sets of simulated and real mixtures were examined. Data from real mixtures matched simulated data when the samples were well mixed and the particle size distributions and volume fractions of the components were identical. Analysis of mixture components that occupied less than approximately 10 vol% of the mixture was not practical unless the particle size of the component was extremely small and excellent quality high-resolution compositional micrographs of the real sample were available. These methods of analysis should allow future researchers to systematically evaluate and predict the impact and importance of variables such as component volume fraction and component particle size distribution as they pertain to the uniformity of powder mixture microstructures.

  11. Experimental and data analysis techniques for deducing collision-induced forces from photographic histories of engine rotor fragment impact/interaction with a containment ring

    NASA Technical Reports Server (NTRS)

    Yeghiayan, R. P.; Leech, J. W.; Witmer, E. A.

    1973-01-01

    An analysis method termed TEJ-JET is described whereby measured transient elastic and inelastic deformations of an engine-rotor fragment-impacted structural ring are analyzed to deduce the transient external forces experienced by that ring as a result of fragment impact and interaction with the ring. Although the theoretical feasibility of the TEJ-JET concept was established, its practical feasibility when utilizing experimental measurements of limited precision and accuracy remains to be established. The experimental equipment and the techniques (high-speed motion photography) employed to measure the transient deformations of fragment-impacted rings are described. Sources of error and data uncertainties are identified. Techniques employed to reduce data reading uncertainties and to correct the data for optical-distortion effects are discussed. These procedures, including spatial smoothing of the deformed ring shape by Fourier series and timewise smoothing by Gram polynomials, are applied illustratively to recent measurements involving the impact of a single T58 turbine rotor blade against an aluminum containment ring. Plausible predictions of the fragment-ring impact/interaction forces are obtained by one branch of this TEJ-JET method; however, a second branch of this method, which provides an independent estimate of these forces, remains to be evaluated.

  12. Bridging the gap between high and low acceleration for planetary escape

    NASA Astrophysics Data System (ADS)

    Indrikis, Janis; Preble, Jeffrey C.

    With the exception of the often time-consuming analysis by numerical optimization, no single orbit transfer analysis technique exists that can be applied over a wide range of accelerations. Using the simple planetary escape (parabolic trajectory) mission, some of the more common techniques are considered as the limiting bastions at the high and the extremely low acceleration regimes. The brachistochrone, the minimum-time-of-flight path, is proposed as the technique to bridge the gap between the high and low acceleration regions, providing a smooth bridge over the entire acceleration spectrum. A smooth and continuous velocity requirement is established for the planetary escape mission. By using these results, it becomes possible to determine the effect of finite accelerations on mission performance and to target propulsion and power system designs which are consistent with a desired mission objective.

  13. Four lateral mass screw fixation techniques in lower cervical spine following laminectomy: a finite element analysis study of stress distribution.

    PubMed

    Song, Mingzhi; Zhang, Zhen; Lu, Ming; Zong, Junwei; Dong, Chao; Ma, Kai; Wang, Shouyu

    2014-08-09

    Lateral mass screw fixation (LSF) techniques have been widely used for reconstructing and stabilizing the cervical spine; however, complications may result depending on the choice of surgeon. There are only a few reports related to LSF applications, even though fracture fixation has become a severe complication. This study establishes the three-dimensional finite element model of the lower cervical spine, and compares the stress distribution of the four LSF techniques (Magerl, Roy-Camille, Anderson, and An), following laminectomy -- to explore the risks of rupture after fixation. CT scans were performed on a healthy adult female volunteer, and Digital imaging and communication in medicine (Dicom) data was obtained. Mimics 10.01, Geomagic Studio 12.0, Solidworks 2012, HyperMesh 10.1 and Abaqus 6.12 software programs were used to establish the intact model of the lower cervical spines (C3-C7), a postoperative model after laminectomy, and a reconstructive model after applying the LSF techniques. A compressive preload of 74 N combined with a pure moment of 1.8 Nm was applied to the intact and reconstructive model, simulating normal flexion, extension, lateral bending, and axial rotation. The stress distribution of the four LSF techniques was compared by analyzing the maximum von Mises stress. The three-dimensional finite element model of the intact C3-C7 vertebrae was successfully established. This model consists of 503,911 elements and 93,390 nodes. During flexion, extension, lateral bending, and axial rotation modes, the intact model's angular intersegmental range of motion was in good agreement with the results reported from the literature. The postoperative model after the three-segment laminectomy and the reconstructive model after applying the four LSF techniques were established based on the validated intact model. The stress distribution for the Magerl and Roy-Camille groups were more dispersive, and the maximum von Mises stress levels were lower than the other two groups in various conditions. The LSF techniques of Magerl and Roy-Camille are safer methods for stabilizing the lower cervical spine. Therefore, these methods potentially have a lower risk of fixation fracture.

  14. Four lateral mass screw fixation techniques in lower cervical spine following laminectomy: a finite element analysis study of stress distribution

    PubMed Central

    2014-01-01

    Background: Lateral mass screw fixation (LSF) techniques have been widely used for reconstructing and stabilizing the cervical spine; however, complications may result depending on the choice of surgeon. There are only a few reports related to LSF applications, even though fracture fixation has become a severe complication. This study establishes the three-dimensional finite element model of the lower cervical spine, and compares the stress distribution of the four LSF techniques (Magerl, Roy-Camille, Anderson, and An), following laminectomy -- to explore the risks of rupture after fixation. Method: CT scans were performed on a healthy adult female volunteer, and Digital imaging and communication in medicine (Dicom) data was obtained. Mimics 10.01, Geomagic Studio 12.0, Solidworks 2012, HyperMesh 10.1 and Abaqus 6.12 software programs were used to establish the intact model of the lower cervical spines (C3-C7), a postoperative model after laminectomy, and a reconstructive model after applying the LSF techniques. A compressive preload of 74 N combined with a pure moment of 1.8 Nm was applied to the intact and reconstructive model, simulating normal flexion, extension, lateral bending, and axial rotation. The stress distribution of the four LSF techniques was compared by analyzing the maximum von Mises stress. Result: The three-dimensional finite element model of the intact C3-C7 vertebrae was successfully established. This model consists of 503,911 elements and 93,390 nodes. During flexion, extension, lateral bending, and axial rotation modes, the intact model's angular intersegmental range of motion was in good agreement with the results reported from the literature. The postoperative model after the three-segment laminectomy and the reconstructive model after applying the four LSF techniques were established based on the validated intact model. The stress distribution for the Magerl and Roy-Camille groups were more dispersive, and the maximum von Mises stress levels were lower than the other two groups in various conditions. Conclusion: The LSF techniques of Magerl and Roy-Camille are safer methods for stabilizing the lower cervical spine. Therefore, these methods potentially have a lower risk of fixation fracture. PMID:25106498

  15. The Baselines Project: Establishing Reference Environmental Conditions for Marine Habitats in the Gulf of Mexico using Forecast Models and Satellite Data

    NASA Astrophysics Data System (ADS)

    Jolliff, J. K.; Gould, R. W.; deRada, S.; Teague, W. J.; Wijesekera, H. W.

    2012-12-01

    We provide an overview of the NASA-funded project, "High-Resolution Subsurface Physical and Optical Property Fields in the Gulf of Mexico: Establishing Baselines and Assessment Tools for Resource Managers." Data-assimilative models, analysis fields, and multiple satellite data streams were used to construct temperature and photon flux climatologies for the Flower Garden Banks National Marine Sanctuary (FGBNMS) and similar habitats in the northwestern Gulf of Mexico, where geologic features provide a platform for unique coral reef ecosystems. Comparison metrics of the products against in situ data collected during complementary projects are also examined. Similarly, high-resolution satellite data streams and advanced processing techniques were used to establish baseline suspended sediment load and turbidity conditions in selected northern Gulf of Mexico estuaries. The results demonstrate the feasibility of blending models and data into accessible web-based analysis products for resource managers, policy makers, and the public.

  16. Retrieval of complex χ(2) parts for quantitative analysis of sum-frequency generation intensity spectra

    PubMed Central

    Hofmann, Matthias J.; Koelsch, Patrick

    2015-01-01

    Vibrational sum-frequency generation (SFG) spectroscopy has become an established technique for in situ surface analysis. While spectral recording procedures and hardware have been optimized, unique data analysis routines have yet to be established. The SFG intensity is related to probing geometries and properties of the system under investigation such as the absolute square of the second-order susceptibility, |χ(2)|². A conventional SFG intensity measurement does not grant access to the complex parts of χ(2) unless further assumptions have been made. It is therefore difficult, sometimes impossible, to establish a unique fitting solution for SFG intensity spectra. Recently, interferometric phase-sensitive SFG or heterodyne detection methods have been introduced to measure real and imaginary parts of χ(2) experimentally. Here, we demonstrate that iterative phase-matching between complex spectra retrieved from maximum entropy method analysis and fitting of intensity SFG spectra (iMEMfit) leads to a unique solution for the complex parts of χ(2) and enables quantitative analysis of SFG intensity spectra. A comparison between complex parts retrieved by iMEMfit applied to intensity spectra and phase sensitive experimental data shows excellent agreement between the two methods. PMID:26450297
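
    The fitting ambiguity discussed here arises because the measured intensity is the modulus squared of a complex response. A minimal numpy version of the conventional model (nonresonant background plus Lorentzian resonances); all parameter values are illustrative, not from the paper.

    ```python
    # Conventional SFG intensity model: |chi2|^2 with nonresonant term
    # plus Lorentzian resonances (illustrative parameters).
    import numpy as np

    def sfg_intensity(omega, a_nr, phi, amps, omegas, gammas):
        chi2 = a_nr * np.exp(1j * phi)                  # nonresonant background
        for a, w0, g in zip(amps, omegas, gammas):
            chi2 = chi2 + a / (omega - w0 + 1j * g)     # resonant terms
        return np.abs(chi2) ** 2

    omega = np.linspace(2800.0, 3000.0, 500)            # cm^-1
    spectrum = sfg_intensity(omega, 0.2, 0.5, [5.0, 3.0], [2875.0, 2940.0], [8.0, 10.0])
    print(spectrum.max())
    ```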

  17. Image processing and analysis using neural networks for optometry area

    NASA Astrophysics Data System (ADS)

    Netto, Antonio V.; Ferreira de Oliveira, Maria C.

    2002-11-01

    In this work we describe the framework of a functional system for processing and analyzing images of the human eye acquired by the Hartmann-Shack technique (HS), in order to extract information to formulate a diagnosis of eye refractive errors (astigmatism, hypermetropia and myopia). The analysis is to be carried out using an Artificial Intelligence system based on Neural Nets, Fuzzy Logic and Classifier Combination. The major goal is to establish the basis of a new technology to effectively measure ocular refractive errors that is based on methods alternative those adopted in current patented systems. Moreover, analysis of images acquired with the Hartmann-Shack technique may enable the extraction of additional information on the health of an eye under exam from the same image used to detect refraction errors.

  18. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    PubMed Central

    Wang, Chuji; Sahay, Peeyush

    2009-01-01

    Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have high sensitivity and high selectivity, as equivalently offered by the MS-based techniques, but also have the advantageous features of near real-time response, low instrument costs, and POC function. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions and the detection limits achieved by the laser techniques range from parts per million to parts per billion levels. Sensors using these laser spectroscopic techniques for a few breath biomarkers (e.g., carbon dioxide and nitric oxide) are commercially available. This review presents an update on the latest developments in laser-based breath analysis. PMID:22408503

  19. A simple algorithm for quantifying DNA methylation levels on multiple independent CpG sites in bisulfite genomic sequencing electropherograms.

    PubMed

    Leakey, Tatiana I; Zielinski, Jerzy; Siegfried, Rachel N; Siegel, Eric R; Fan, Chun-Yang; Cooney, Craig A

    2008-06-01

    DNA methylation at cytosines is a widely studied epigenetic modification. Methylation is commonly detected using bisulfite modification of DNA followed by PCR and additional techniques such as restriction digestion or sequencing. These additional techniques are laborious, require specialized equipment, or are not quantitative. Here we describe a simple algorithm that yields quantitative results from analysis of conventional four-dye-trace sequencing. We call this method Mquant and we compare it with the established laboratory method of combined bisulfite restriction assay (COBRA). This analysis of sequencing electropherograms provides a simple, easily applied method to quantify DNA methylation at specific CpG sites.
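
    The quantification idea can be sketched as a peak-height ratio: after bisulfite conversion, unmethylated cytosines read as T, so the C fraction at a CpG estimates its methylation level. A toy version with hypothetical peak heights (the published Mquant procedure includes normalization details not shown here):

    ```python
    # Methylation level at a CpG site from C and T electropherogram peak heights.
    def methylation_fraction(peak_c: float, peak_t: float) -> float:
        """Fraction methylated at one CpG from the two trace peak heights."""
        return peak_c / (peak_c + peak_t)

    sites = [(820.0, 410.0), (150.0, 900.0), (505.0, 495.0)]  # hypothetical (C, T)
    for i, (c, t) in enumerate(sites, start=1):
        print(f"CpG {i}: {methylation_fraction(c, t):.1%} methylated")
    ```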

  20. Direct and ultrasonic measurements of macroscopic piezoelectricity in sintered hydroxyapatite

    NASA Astrophysics Data System (ADS)

    Tofail, S. A. M.; Haverty, D.; Cox, F.; Erhart, J.; Hána, P.; Ryzhenko, V.

    2009-03-01

    Macroscopic piezoelectricity in hydroxyapatite (HA) ceramic was measured by a direct quasistatic method and an ultrasonic interference technique. The effective symmetry of the polycrystalline aggregate was established, and a detailed theoretical analysis was carried out to determine by these two methods the shear piezoelectric coefficient, d14, of HA. The piezoelectric nature of HA was demonstrated qualitatively, although a specific quantitative value for the d14 coefficient could not be established. The ultrasonic method was also employed to determine anisotropic elastic constants, which agreed well with those calculated from first principles.

  1. Visualizing nD Point Clouds as Topological Landscape Profiles to Guide Local Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oesterling, Patrick; Heine, Christian; Weber, Gunther H.

    2012-05-04

    Analyzing high-dimensional point clouds is a classical challenge in visual analytics. Traditional techniques, such as projections or axis-based techniques, suffer from projection artifacts, occlusion, and visual complexity. We propose to split data analysis into two parts to address these shortcomings. First, a structural overview phase abstracts data by its density distribution. This phase performs topological analysis to support accurate and non-overlapping presentation of the high-dimensional cluster structure as a topological landscape profile. Utilizing a landscape metaphor, it presents clusters and their nesting as hills whose height, width, and shape reflect cluster coherence, size, and stability, respectively. A second local analysis phase utilizes this global structural knowledge to select individual clusters or point sets for further, localized data analysis. Focusing on structural entities significantly reduces visual clutter in established geometric visualizations and permits a clearer, more thorough data analysis. In conclusion, this analysis complements the global topological perspective and enables the user to study subspaces or geometric properties, such as shape.

  2. Estimating regional greenhouse gas fluxes: An uncertainty analysis of planetary boundary layer techniques and bottom-up inventories

    USDA-ARS?s Scientific Manuscript database

    Quantification of regional greenhouse gas (GHG) fluxes is essential for establishing mitigation strategies and evaluating their effectiveness. Here, we used multiple top-down approaches and multiple trace gas observations at a tall tower to estimate GHG regional fluxes and evaluate the GHG fluxes de...

  3. Interviewing a Silent (Radioactive) Witness through Nuclear Forensic Analysis.

    PubMed

    Mayer, Klaus; Wallenius, Maria; Varga, Zsolt

    2015-12-01

    Nuclear forensics is a relatively young discipline in science which aims at providing information on nuclear material of unknown origin. The determination of characteristic parameters through tailored analytical techniques enables establishing linkages to the material's processing history and hence provides hints on its place and date of production and on the intended use.

  4. Using the 16MM. Stop-Frame Projector to Teach Film Technique.

    ERIC Educational Resources Information Center

    Head, James

    1969-01-01

    English is concerned with language experience, and because much of today's "language" is experienced through electronic media--television, movies, radio--film courses fall within the English curriculum. A stop-frame projector is essential for classroom analysis of such film devices as framing, establishing shots, and scene composition. Framing is…

  5. The radiographic investigation of two Egyptian mummies.

    PubMed

    Fodor, J; Malott, J C; King, A Y

    1983-01-01

    Radiography is a well-recognized method of nondestructive analysis of art objects and ancient relics. The methods and techniques used in the examination of two ancient Egyptian mummies are presented here. Additionally, the use of radiographic findings to help substantiate alleged historical information and to establish sex, age, and pathology of each specimen is discussed.

  6. A Review of Current Methods for Analysis of Mycotoxins in Herbal Medicines

    PubMed Central

    Zhang, Lei; Dou, Xiao-Wen; Zhang, Cheng; Logrieco, Antonio F.; Yang, Mei-Hua

    2018-01-01

    The presence of mycotoxins in herbal medicines is an established problem throughout the entire world. The sensitive and accurate analysis of mycotoxin in complicated matrices (e.g., herbs) typically involves challenging sample pretreatment procedures and an efficient detection instrument. However, although numerous reviews have been published regarding the occurrence of mycotoxins in herbal medicines, few of them provided a detailed summary of related analytical methods for mycotoxin determination. This review focuses on analytical techniques including sampling, extraction, cleanup, and detection for mycotoxin determination in herbal medicines established within the past ten years. Dedicated sections of this article address the significant developments in sample preparation, and highlight the importance of this procedure in the analytical technology. This review also summarizes conventional chromatographic techniques for mycotoxin qualification or quantitation, as well as recent studies regarding the development and application of screening assays such as enzyme-linked immunosorbent assays, lateral flow immunoassays, aptamer-based lateral flow assays, and cytometric bead arrays. The present work provides a good insight regarding the advanced research that has been done and closes with an indication of future demand for the emerging technologies. PMID:29393905

  7. Application of Bayesian Approach in Cancer Clinical Trial

    PubMed Central

    Bhattacharjee, Atanu

    2014-01-01

    The application of the Bayesian approach in clinical trials is increasingly useful compared with classical methods, offering benefits from the design phase through the analysis phase. Straightforward statements about the drug treatment effect can be obtained through Bayesian reasoning, and complex computational problems are simple to handle with Bayesian techniques. The technique is only feasible when prior information about the data is available, and inference is established through posterior estimates. However, some limitations are present in this method. The objective of this work was to explore the several merits and demerits of the Bayesian approach in cancer research. The review of the technique will be helpful for clinical researchers involved in oncology to explore the limitations and power of Bayesian techniques. PMID:29147387
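
    As a concrete instance of inference through posterior estimates, a conjugate Beta-Binomial update for a response rate; the prior and trial counts are hypothetical.

    ```python
    # Beta-Binomial posterior update for a trial response rate.
    from scipy import stats

    prior_a, prior_b = 2.0, 8.0        # Beta prior: response rate believed low
    responders, n = 12, 30             # hypothetical trial outcome

    post = stats.beta(prior_a + responders, prior_b + n - responders)
    print(f"posterior mean response rate: {post.mean():.3f}")
    print(f"95% credible interval: {post.interval(0.95)}")
    ```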

  8. Novel casting processes for single-crystal turbine blades of superalloys

    NASA Astrophysics Data System (ADS)

    Ma, Dexin

    2018-03-01

    This paper presents a brief review of the current casting techniques for single-crystal (SC) blades, as well as an analysis of the solidification process in complex turbine blades. A series of novel casting methods based on the Bridgman process were presented to illustrate the development in the production of SC blades from superalloys. The grain continuator and the heat conductor techniques were developed to remove geometry-related grain defects. In these techniques, the heat barrier that hinders lateral SC growth from the blade airfoil into the extremities of the platform is minimized. The parallel heating and cooling system was developed to achieve symmetric thermal conditions for SC solidification in blade clusters, thus considerably decreasing the negative shadow effect and its related defects in the current Bridgman process. The dipping and heaving technique, in which thin-shell molds are utilized, was developed to enable the establishment of a high temperature gradient for SC growth and the freckle-free solidification of superalloy castings. Moreover, by applying the targeted cooling and heating technique, a novel concept for the three-dimensional and precise control of SC growth, a proper thermal arrangement may be dynamically established for the microscopic control of SC growth in the critical areas of large industrial gas turbine blades.

  9. Two-dimensional fuzzy fault tree analysis for chlorine release from a chlor-alkali industry using expert elicitation.

    PubMed

    Renjith, V R; Madhu, G; Nayagam, V Lakshmana Gomathi; Bhasi, A B

    2010-11-15

    The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of these hazards related to chemical industries. Fault tree analysis (FTA) is an established technique in hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from storage and filling facility of chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation. Copyright © 2010 Elsevier B.V. All rights reserved.
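
    The crisp (non-fuzzy) core of FTA quantification reduces to propagating independent basic-event probabilities through AND/OR gates; a toy tree with invented probabilities, unrelated to the chlorine study's actual fault tree:

    ```python
    # Top-event probability from independent basic events via OR/AND gates.
    def gate_or(*p):   # at least one input event occurs
        q = 1.0
        for pi in p:
            q *= (1.0 - pi)
        return 1.0 - q

    def gate_and(*p):  # all input events occur
        q = 1.0
        for pi in p:
            q *= pi
        return q

    valve_leak, gasket_fail = 1e-3, 5e-4      # illustrative basic events
    alarm_fail, operator_miss = 1e-2, 1e-1

    release = gate_and(gate_or(valve_leak, gasket_fail),
                       gate_or(alarm_fail, operator_miss))
    print(f"P(top event: release) = {release:.2e}")
    ```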

  10. Site specific probabilistic seismic hazard analysis at Dubai Creek on the west coast of UAE

    NASA Astrophysics Data System (ADS)

    Shama, Ayman A.

    2011-03-01

    A probabilistic seismic hazard analysis (PSHA) was conducted to establish the hazard spectra for a site located at Dubai Creek on the west coast of the United Arab Emirates (UAE). The PSHA considered all the seismogenic sources that affect the site, including plate boundaries such as the Makran subduction zone, the Zagros fold-thrust region and the transition fault system between them; and local crustal faults in UAE. PSHA indicated that local faults dominate the hazard. The peak ground acceleration (PGA) for the 475-year return period spectrum is 0.17 g and 0.33 g for the 2,475-year return period spectrum. The hazard spectra are then employed to establish rock ground motions using the spectral matching technique.
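
    The return periods quoted map onto exceedance probabilities over a design life through the usual Poisson assumption; for a 50-year exposure:

    ```latex
    % Return period T vs. probability of exceedance P in exposure time t:
    P = 1 - e^{-t/T}, \qquad
    1 - e^{-50/475} \approx 0.10, \quad
    1 - e^{-50/2475} \approx 0.02
    ```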

  11. Vector wind profile gust model

    NASA Technical Reports Server (NTRS)

    Adelfang, S. I.

    1979-01-01

    Work towards establishing a vector wind profile gust model for the Space Transportation System flight operations and trade studies is reported. To date, all the statistical and computational techniques required were established and partially implemented. An analysis of wind profile gust at Cape Kennedy within the theoretical framework is presented. The variability of theoretical and observed gust magnitude with filter type, altitude, and season is described. Various examples are presented which illustrate agreement between theoretical and observed gust percentiles. The preliminary analysis of the gust data indicates a strong variability with altitude, season, and wavelength regime. An extension of the analyses to include conditional distributions of gust magnitude given gust length, distributions of gust modulus, and phase differences between gust components has begun.

  12. Photoacoustic imaging of angiogenesis in a subcutaneous islet transplant site in a murine model

    NASA Astrophysics Data System (ADS)

    Shi, Wei; Pawlick, Rena; Bruni, Antonio; Rafiei, Yasmin; Pepper, Andrew R.; Gala-Lopez, Boris; Choi, Min; Malcolm, Andrew; Zemp, Roger J.; Shapiro, A. M. James

    2016-06-01

    Islet transplantation (IT) is an established clinical therapy for select patients with type-1 diabetes. Clinically, the hepatic portal vein serves as the site for IT. Despite numerous advances in clinical IT, limitations remain, including early islet cell loss posttransplant, procedural complications, and the inability to effectively monitor islet grafts. Hence, alternative sites for IT are currently being explored, with the subcutaneous space as one potential option. When left unmodified, the subcutaneous space routinely fails to promote successful islet engraftment. However, when employing the previously developed subcutaneous "deviceless" technique, a favorable microenvironment for islet survival and function is established. In this technique, an angiocatheter was temporarily implanted subcutaneously, which facilitated angiogenesis to promote subsequent islet engraftment. This technique has been employed in preclinical animal models, providing a sufficient means to develop techniques to monitor functional aspects of the graft such as angiogenesis. Here, we utilize photoacoustic imaging to track angiogenesis during the priming of the subcutaneous site by the implanted catheter at 1 to 4 weeks postcatheter. Quantitative analysis on vessel densities shows gradual growth of vasculature in the implant position. These results demonstrate the ability to track angiogenesis, thus facilitating a means to optimize and assess the pretransplant microenvironment.

  13. Establishment of a protocol for the gene expression analysis of laser microdissected rat kidney samples with affymetrix genechips

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stemmer, Kerstin; Ellinger-Ziegelbauer, Heidrun; Lotz, Kerstin

    2006-11-15

    Laser microdissection in conjunction with microarray technology allows selective isolation and analysis of specific cell populations, e.g., preneoplastic renal lesions. To date, only limited information is available on sample preparation and preservation techniques that result in both optimal histomorphological preservation of sections and high-quality RNA for microarray analysis. Furthermore, amplification of minute amounts of RNA from microdissected renal samples allowing analysis with genechips has only scantily been addressed to date. The objective of this study was therefore to establish a reliable and reproducible protocol for laser microdissection in conjunction with microarray technology using kidney tissue from Eker rats p.o. treated for 7 days and 6 months with 10 and 1 mg Aristolochic acid/kg bw, respectively. Kidney tissues were preserved in RNAlater or snap frozen. Cryosections were cut and stained with either H and E or cresyl violet for subsequent morphological and RNA quality assessment and laser microdissection. RNA quality was comparable in snap frozen and RNAlater-preserved samples, however, the histomorphological preservation of renal sections was much better following cryopreservation. Moreover, the different staining techniques in combination with sample processing time at room temperature can have an influence on RNA quality. Different RNA amplification protocols were shown to have an impact on gene expression profiles as demonstrated with Affymetrix Rat Genome 230 2.0 arrays. Considering all the parameters analyzed in this study, a protocol for RNA isolation from laser microdissected samples with subsequent Affymetrix chip hybridization was established that was also successfully applied to preneoplastic lesions laser microdissected from Aristolochic acid-treated rats.

  14. Applications of life cycle assessment and cost analysis in health care waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soares, Sebastiao Roberto, E-mail: soares@ens.ufsc.br; Finotti, Alexandra Rodrigues, E-mail: finotti@ens.ufsc.br; Prudencio da Silva, Vamilson, E-mail: vamilson@epagri.sc.gov.br

    Highlights: Three Health Care Waste (HCW) scenarios were assessed through environmental and cost analysis. HCW treatment using a microwave oven had the lowest environmental impacts and costs in comparison with autoclave and lime. Lime had the worst environmental and economic results for HCW treatment, in comparison with autoclave and microwave. - Abstract: The establishment of rules to manage Health Care Waste (HCW) is a challenge for the public sector. Regulatory agencies must ensure the safety of waste management alternatives for two very different profiles of generators: (1) hospitals, which concentrate the production of HCW and (2) small establishments, such as clinics, pharmacies and other sources, that generate dispersed quantities of HCW and are scattered throughout the city. To assist in developing sector regulations for the small generators, we evaluated three management scenarios using decision-making tools. They consisted of a disinfection technique (microwave, autoclave and lime) followed by landfilling, where transportation was also included. The microwave, autoclave and lime techniques were tested at the laboratory to establish the operating parameters to ensure their efficiency in disinfection. Using a life cycle assessment (LCA) and cost analysis, the decision-making tools aimed to determine the technique with the best environmental performance. This consisted of evaluating the eco-efficiency of each scenario. Based on the life cycle assessment, microwaving had the lowest environmental impact (12.64 Pt) followed by autoclaving (48.46 Pt). The cost analyses indicated values of US$ 0.12 kg⁻¹ for the waste treated with microwaves, US$ 1.10 kg⁻¹ for the waste treated by the autoclave and US$ 1.53 kg⁻¹ for the waste treated with lime. The microwave disinfection presented the best eco-efficiency performance among those studied and provided a feasible alternative to subsidize the formulation of the policy for small generators of HCW.

  15. Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods

    NASA Technical Reports Server (NTRS)

    Berry, J. K.; Tomlin, C. D.

    1982-01-01

    Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.
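
    To make the notion of a neighborhood-characterizing primitive concrete, here is a minimal sketch of one such map operation, a focal mean over a grid; the function name and the toy elevation grid are illustrative, not taken from the map analysis package itself.

```python
import numpy as np

def focal_mean(grid: np.ndarray, radius: int = 1) -> np.ndarray:
    """Characterize each cell's cartographic neighborhood by the mean of the
    cells within a square window; edges use only the cells actually available."""
    rows, cols = grid.shape
    out = np.empty_like(grid, dtype=float)
    for r in range(rows):
        for c in range(cols):
            r0, r1 = max(0, r - radius), min(rows, r + radius + 1)
            c0, c1 = max(0, c - radius), min(cols, c + radius + 1)
            out[r, c] = grid[r0:r1, c0:c1].mean()
    return out

elevation = np.array([[10, 12, 13],
                      [11, 15, 14],
                      [ 9, 10, 16]], dtype=float)
print(focal_mean(elevation))
```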

  16. ELICIT: An alternative imprecise weight elicitation technique for use in multi-criteria decision analysis for healthcare.

    PubMed

    Diaby, Vakaramoko; Sanogo, Vassiki; Moussa, Kouame Richard

    2016-01-01

    In this paper, readers are introduced to ELICIT, an imprecise weight elicitation technique for multi-criteria decision analysis for healthcare. The application of ELICIT consists of two steps: the rank ordering of evaluation criteria based on decision-makers' (DMs) preferences using principal component analysis; and the estimation of criteria weights and their descriptive statistics using variable interdependent analysis and the Monte Carlo method. The application of ELICIT is illustrated with a hypothetical case study involving the elicitation of weights for five criteria used to select the best device for eye surgery. The criteria were ranked from 1 to 5, based on a strict preference relationship established by the DMs. For each criterion, the deterministic weight was estimated, as well as the standard deviation and 95% credibility interval. ELICIT is appropriate in situations where only ordinal DMs' preferences are available to elicit decision criteria weights.
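
    A minimal sketch of the Monte Carlo side of such rank-based weight elicitation is shown below: it draws random weight vectors consistent with a strict ranking of five criteria and summarizes each rank's weight with a mean, standard deviation, and 95% interval. This generic rank-order sampling scheme is an assumption for illustration; ELICIT's actual estimation uses principal component analysis and variable interdependent analysis as described above.

```python
import numpy as np

rng = np.random.default_rng(42)
n_criteria, n_draws = 5, 100_000

# Draw uniform random weight vectors on the simplex and sort each one so that
# weight 1 >= weight 2 >= ... >= weight 5, matching a strict ranking.
draws = rng.dirichlet(np.ones(n_criteria), size=n_draws)
draws = -np.sort(-draws, axis=1)

for rank in range(n_criteria):
    w = draws[:, rank]
    lo, hi = np.percentile(w, [2.5, 97.5])
    print(f"criterion ranked {rank + 1}: weight={w.mean():.3f} "
          f"(sd={w.std():.3f}, 95% interval [{lo:.3f}, {hi:.3f}])")
```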

  17. Visualization of system dynamics using phasegrams

    PubMed Central

    Herbst, Christian T.; Herzel, Hanspeter; Švec, Jan G.; Wyman, Megan T.; Fitch, W. Tecumseh

    2013-01-01

    A new tool for visualization and analysis of system dynamics is introduced: the phasegram. Its application is illustrated with both classical nonlinear systems (logistic map and Lorenz system) and with biological voice signals. Phasegrams combine the advantages of sliding-window analysis (such as the spectrogram) with well-established visualization techniques from the domain of nonlinear dynamics. In a phasegram, time is mapped onto the x-axis, and various vibratory regimes, such as periodic oscillation, subharmonics or chaos, are identified within the generated graph by the number and stability of horizontal lines. A phasegram can be interpreted as a bifurcation diagram in time. In contrast to other analysis techniques, it can be automatically constructed from time-series data alone: no additional system parameter needs to be known. Phasegrams show great potential for signal classification and can act as the quantitative basis for further analysis of oscillating systems in many scientific fields, such as physics (particularly acoustics), biology or medicine. PMID:23697715
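
    The following sketch gives the flavor of a "bifurcation diagram in time" using the logistic map mentioned above: the control parameter is ramped slowly, and the values visited in each sliding window are plotted against time, so periodic regimes appear as a few horizontal lines and chaos as filled bands. Real phasegrams are built from Poincaré sections of a reconstructed phase space; plotting raw map values is a simplification for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

# Logistic map with a slowly increasing parameter r: the set of values seen
# in each analysis window traces the period-doubling route to chaos, which a
# phasegram renders as horizontal lines (or bands) over time.
n_steps, window = 60_000, 600
r_values = np.linspace(2.8, 4.0, n_steps)
x = np.empty(n_steps)
x[0] = 0.5
for i in range(1, n_steps):
    x[i] = r_values[i] * x[i - 1] * (1.0 - x[i - 1])

times, sections = [], []
for start in range(0, n_steps - window, window):
    seg = x[start + window // 2:start + window]  # discard transient half
    times.extend([start] * len(seg))
    sections.extend(seg)

plt.plot(times, sections, ",k")
plt.xlabel("time step")
plt.ylabel("x (values visited in window)")
plt.title("Bifurcation diagram in time (phasegram-like view)")
plt.show()
```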

  18. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. An analysis of the most important sources of variability in quantitative microbiological methods demonstrated that culture media and plate-count techniques had no effect on the estimation of the microbial count, while the highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
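
    A minimal sketch of the kind of factorial ANOVA described above, run on simulated plate-count data with two illustrative factors and their interaction (the factor names, levels, and effect sizes are all invented):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Simulated plate counts: two hypothetical factors (microorganism and product)
# plus replicate noise, analyzed with a two-way ANOVA including interaction.
rng = np.random.default_rng(0)
rows = []
for organism in ["E. coli", "S. aureus"]:
    for product in ["A", "B"]:
        base = 100 + (20 if organism == "E. coli" else 0) + (10 if product == "B" else 0)
        for _ in range(6):  # six replicate plates per cell
            rows.append({"organism": organism, "product": product,
                         "count": base + rng.normal(0, 8)})
df = pd.DataFrame(rows)

model = ols("count ~ C(organism) * C(product)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # main effects and interaction
```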

  19. Development of image processing techniques for applications in flow visualization and analysis

    NASA Technical Reports Server (NTRS)

    Disimile, Peter J.; Shoe, Bridget; Toy, Norman; Savory, Eric; Tahouri, Bahman

    1991-01-01

    A comparison between two flow visualization studies of an axi-symmetric circular jet issuing into still fluid, using two different experimental techniques, is described. In the first case laser induced fluorescence is used to visualize the flow structure, whilst smoke is utilized in the second. Quantitative information was obtained from these visualized flow regimes using two different digital imaging systems. Results are presented of the rate at which the jet expands in the downstream direction and these compare favorably with the more established data.

  20. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W.

    An account is given of the method used to quantify the risks accruing to the use of a radioisotope thermoelectric generator fueled by Pu-238 dioxide aboard the Space Shuttle-launched Ulysses mission. After using a Monte Carlo technique to develop probability distributions for the radiological consequences of a range of accident scenarios throughout the mission, factors affecting those consequences are identified in conjunction with their probability distributions. The functional relationship among all the factors is then established, and probability distributions for all factor effects are combined by means of a Monte Carlo technique.
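
    The core of such a Monte Carlo risk assessment is sampling each contributing factor from its probability distribution and propagating the samples through the functional relationship among the factors. The sketch below is a toy version with invented distributions and factor names, not the Ulysses analysis itself:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000  # Monte Carlo trials

# Toy factors (all distributions invented for illustration): an accident
# indicator, the fraction of fuel released, and a dose-conversion factor.
accident = rng.random(n) < 1e-3                  # Bernoulli accident indicator
release_fraction = rng.beta(2, 50, n)            # small releases most likely
dose_factor = rng.lognormal(mean=0.0, sigma=0.5, size=n)

# Combine the factor samples through their (here multiplicative) relationship
# to obtain the distribution of radiological consequences.
consequence = accident * release_fraction * dose_factor
print("mean consequence:", consequence.mean())
print("99.9th percentile:", np.percentile(consequence, 99.9))
```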

  1. [Introduction to Exploratory Factor Analysis (EFA)].

    PubMed

    Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón

    2012-03-01

    Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. The objectives are to present the main applications of this technique in a clear and concise manner, to determine the basic requirements for its use with a step-by-step description of its methodology, and to establish the elements that must be taken into account during its preparation in order to avoid erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology needed to achieve factor derivation, global adjustment evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
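
    For readers who want to experiment, here is a minimal EFA sketch on synthetic data using scikit-learn's FactorAnalysis (the varimax rotation option requires a reasonably recent scikit-learn); the loading structure and noise level are invented for illustration:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Two latent factors generate six observed variables (synthetic data);
# EFA should recover the block loading structure.
rng = np.random.default_rng(7)
n = 500
factors = rng.normal(size=(n, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
X = factors @ loadings.T + rng.normal(scale=0.3, size=(n, 6))

fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(X)
print(np.round(fa.components_.T, 2))  # estimated loadings (variables x factors)
```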

  2. Qualitative computer aided evaluation of dental impressions in vivo.

    PubMed

    Luthardt, Ralph G; Koch, Rainer; Rudolph, Heike; Walter, Michael H

    2006-01-01

    Clinical investigations dealing with the precision of different impression techniques are rare. The objective of the present study was to develop and evaluate a procedure for the qualitative analysis of three-dimensional impression precision based on an established in-vitro procedure. The null hypothesis to be tested was that the precision of impressions does not differ depending on the impression technique used (single-step, monophase, and two-step techniques) and on clinical variables. Digital surface data of patients' teeth prepared for crowns were gathered from standardized manufactured master casts after impressions with three different techniques were taken in a randomized order. Data sets were analyzed for each patient in comparison with the one-step impression chosen as the reference. The qualitative analysis was limited to data points within the 99.5% range. Based on the color-coded representation, areas with maximum deviations were determined (preparation margin and the mantle and occlusal surfaces). To qualitatively analyze the precision of the impression techniques, the hypothesis was tested in linear models for repeated-measures factors (p < 0.05). For the positive 99.5% deviations, no variables with significant influence were determined in the statistical analysis. In contrast, the impression technique and the position of the preparation margin significantly influenced the negative 99.5% deviations. The influence of clinical parameters on the deviations between impression techniques can be determined reliably using the 99.5th percentile of the deviations. An analysis regarding the areas with maximum deviations showed high clinical relevance. The preparation margin was identified as the weak spot of impression taking.

  3. Sonification of acoustic emission data

    NASA Astrophysics Data System (ADS)

    Raith, Manuel; Große, Christian

    2014-05-01

    While loading different specimens, acoustic emissions appear due to micro-crack formation or friction of already existing crack edges. These acoustic emissions can be recorded using suitable ultrasonic transducers and transient recorders. The analysis of acoustic emissions can be used to investigate the mechanical behavior of different specimens under load. Our working group has undertaken several experiments monitored with acoustic emission techniques. Different materials such as natural stone, concrete, wood, steel, carbon composites and bone were investigated. The experimental setup has also been varied: fire-spalling experiments on ultra-high performance concrete and pullout experiments on bonded anchors have been carried out, and uniaxial compression tests on natural stone and animal bone have been conducted. The analysis tools include not only the counting of events but also the analysis of full waveforms. Powerful localization algorithms and automatic onset-picking techniques (based on Akaike's Information Criterion) were established to handle the huge amount of data; up to several thousand events were recorded during experiments of a few minutes. More sophisticated techniques like moment tensor inversion have been established on this relatively small scale as well. Problems are related not only to the amount of data but also to signal-to-noise quality, boundary conditions (reflections), sensor characteristics, and the unknown and changing Green's functions of the media. Some of the acoustic emissions recorded during these experiments have been transferred into the audio range; the transformation was done using Matlab. The aim of the sonification is to establish a tool that is, on the one hand, able to help control the experiment in situ and possibly adjust the load parameters according to the number and intensity of the acoustic emissions. On the other hand, sonification can help to improve the understanding of acoustic emission techniques for training purposes (students, co-workers). One goal is to establish a real-time frequency transformation into the audio range to avoid time-consuming visual data processing during the experiments. It is also the intention to analyze the signals using psycho-acoustic methods with the help of specialists from electrical engineering. References: Raith, Manuel (2013). "Schallemissionsanalyse bei Pulloutexperimenten an Verbunddübeln" Masterarbeit. Technische Universität München, Lehrstuhl für Zerstörungsfreie Prüfung. Malm, Fabian (2012). "Schallemissionsanalyse am humanen Femur" Masterarbeit. Technische Universität München, Lehrstuhl für Zerstörungsfreie Prüfung. Richter, R. (2009): Einsatz der Schallemissionsanalyse zur Detektion des Riss- und Abplatzungsverhaltens von Beton unter Brandeinwirkung. Diplomarbeit. Materialprüfungsanstalt Universität Stuttgart. Keywords: Acoustic emission, bonded anchors, femur, pullout test, fire-spalling
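
    As an illustration of the automatic onset picking mentioned above, the sketch below implements the standard AIC picker widely used in acoustic emission analysis, where AIC(k) = k·log(var(x[1..k])) + (N−k−1)·log(var(x[k+1..N])) and the global minimum marks the most likely onset; the synthetic burst signal is invented for the demonstration.

```python
import numpy as np

def aic_onset(signal: np.ndarray) -> int:
    """AIC onset picker: AIC(k) = k*log(var(x[:k])) + (n-k-1)*log(var(x[k:]));
    the index of the global minimum is the estimated signal onset."""
    x = np.asarray(signal, dtype=float)
    n = len(x)
    aic = np.full(n, np.inf)
    for k in range(2, n - 2):
        v1, v2 = np.var(x[:k]), np.var(x[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return int(np.argmin(aic))

# Synthetic trace: noise followed by a decaying burst starting at sample 400.
rng = np.random.default_rng(3)
trace = rng.normal(0, 0.1, 1000)
t = np.arange(600)
trace[400:] += np.sin(0.3 * t) * np.exp(-t / 150)
print("picked onset:", aic_onset(trace))  # should be close to 400
```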

  4. Different techniques of multispectral data analysis for vegetation fraction retrieval

    NASA Astrophysics Data System (ADS)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. With respect to farmlands, the assessment of crop condition constitutes the basis of monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, leaf area index, etc. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time it is a defining factor of the soil-vegetation system spectral signatures. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from soil-crop pattern reflectance. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates and dominant wavelength). The objective is to reveal their potential, accuracy and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.
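
    Of the techniques listed, linear spectral unmixing is the most direct to sketch: a mixed-pixel spectrum is modeled as a fractional combination of endmember spectra and solved by non-negative least squares. The four-band spectra and the true fractions below are invented for illustration:

```python
import numpy as np
from scipy.optimize import nnls

# Endmember reflectance spectra (4 illustrative bands) for bare soil and
# vegetation, plus a noisy mixed-pixel spectrum to be decomposed.
soil = np.array([0.20, 0.25, 0.30, 0.35])
vegetation = np.array([0.05, 0.08, 0.45, 0.50])
endmembers = np.column_stack([soil, vegetation])

true_fractions = np.array([0.35, 0.65])
mixed = endmembers @ true_fractions + np.random.default_rng(5).normal(0, 0.005, 4)

# Non-negative least squares, then renormalize to enforce sum-to-one.
fractions, _ = nnls(endmembers, mixed)
fractions /= fractions.sum()
print("estimated vegetation fraction:", round(fractions[1], 3))
```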

  5. Evaluation of nondestructive testing techniques for the space shuttle nonmetallic thermal protection system

    NASA Technical Reports Server (NTRS)

    Tiede, D. A.

    1972-01-01

    A program was conducted to evaluate nondestructive analysis techniques for the detection of defects in rigidized surface insulation (a candidate material for the Space Shuttle thermal protection system). Uncoated, coated, and coated and bonded samples with internal defects (voids, cracks, delaminations, density variations, and moisture content), coating defects (holes, cracks, thickness variations, and loss of adhesion), and bondline defects (voids and unbonds) were inspected by X-ray radiography, acoustic, microwave, high-frequency ultrasonic, beta backscatter, thermal, holographic, and visual techniques. The detectability of each type of defect was determined for each technique (when applicable). A possible relationship between microwave reflection measurements (or X-ray-radiography density measurements) and the tensile strength was established. A possible approach for in-process inspection using a combination of X-ray radiography, acoustic, microwave, and holographic techniques was recommended.

  6. Analysis of Whole-Sky Imager Data to Determine the Validity of PCFLOS models

    DTIC Science & Technology

    1992-12-01

    included in the data sample. ... Data arrangement for a r x c contingency table ... ARIMA models estimated for each ... satellites. This model uses the multidimensional Boehm Sawtooth Wave Model to establish climatic probabilities through repetitive simulations of ... analysis techniques to develop an ARIMA model for each direction at the Columbia and Kirtland sites. Then, the models can be compared and analyzed to ...

  7. Quantitative Analysis of Situational Awareness (QUASA): Applying Signal Detection Theory to True/False Probes and Self-Ratings

    DTIC Science & Technology

    2004-06-01

    obtained. Further refinements of the technique based on recent research in experimental psychology are also considered. INTRODUCTION: The key ... an established line of research in psychology in which objective and subjective metrics are combined to analyse the degree of 'calibration' in ... (Creelman, 1991). A notable exception is the study by Kunimoto et al. (2001) in which confidence ratings were subjected to SDT analysis to evaluate the ...

  8. Exploratory and spatial data analysis (EDA-SDA) for determining regional background levels and anomalies of potentially toxic elements in soils from Catorce-Matehuala, Mexico

    USGS Publications Warehouse

    Chiprés, J.A.; Castro-Larragoitia, J.; Monroy, M.G.

    2009-01-01

    The threshold between geochemical background and anomalies can be influenced by the methodology selected for its estimation. Environmental evaluations, particularly those conducted in mineralized areas, must consider this when trying to determine the natural geochemical status of a study area, quantifying human impacts, or establishing soil restoration values for contaminated sites. Some methods in environmental geochemistry incorporate the premise that anomalies (natural or anthropogenic) and background data are characterized by their own probabilistic distributions. One of these methods uses exploratory data analysis (EDA) on regional geochemical data sets coupled with a geographic information system (GIS) to spatially understand the processes that influence the geochemical landscape, in a technique that can be called spatial data analysis (SDA). This EDA-SDA methodology was used to establish the regional background range for the area of Catorce-Matehuala in north-central Mexico. Probability plots of the data, particularly for those areas affected by human activities, show that the regional geochemical background population is composed of smaller subpopulations associated with factors such as soil type and parent material. This paper demonstrates that the EDA-SDA method offers more certainty in defining thresholds between geochemical background and anomaly than a purely numeric technique, making it a useful tool for regional geochemical landscape analysis and environmental geochemistry studies.
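
    A toy version of EDA-style background/anomaly thresholding is sketched below, using a robust median + 2·MAD rule on log-transformed concentrations; this particular rule and the synthetic data are illustrative assumptions, not the exact procedure of the paper, which additionally uses probability plots and GIS-based spatial analysis:

```python
import numpy as np

# Synthetic regional soil concentrations (mg/kg): a lognormal background
# population plus a small anomalous population.
rng = np.random.default_rng(11)
background = rng.lognormal(mean=3.0, sigma=0.4, size=950)
anomalies = rng.lognormal(mean=5.0, sigma=0.3, size=50)
data = np.concatenate([background, anomalies])

# Robust EDA-style threshold on log-transformed data: median + 2 * MAD,
# with the MAD scaled (x1.4826) to be consistent with a standard deviation.
logc = np.log(data)
mad = 1.4826 * np.median(np.abs(logc - np.median(logc)))
threshold = np.exp(np.median(logc) + 2 * mad)
print(f"background/anomaly threshold ~ {threshold:.1f} mg/kg")
print("flagged samples:", int((data > threshold).sum()))
```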

  9. Reliability analysis of a robotic system using hybridized technique

    NASA Astrophysics Data System (ADS)

    Kumar, Naveen; Komal; Lather, J. S.

    2017-09-01

    In this manuscript, the reliability of a robotic system has been analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads, as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, the computed reliability parameters have a wide range of predictions. Therefore, the decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique in which fuzzy set theory is utilized to quantify uncertainties, a fault tree is utilized for system modeling, the lambda-tau method is utilized to formulate mathematical expressions for the failure/repair rates of the system, and a genetic algorithm is utilized to solve the established nonlinear programming problem. Different reliability parameters of the robotic system are computed, and the results are compared with the existing technique. The components of the robotic system follow an exponential distribution, i.e., have constant failure rates. Sensitivity analysis is also performed, and the impact on the system mean time between failures (MTBF) of varying other reliability parameters is addressed. Based on the analysis, some influential suggestions are given to improve the system performance.
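
    The data fuzzification step described above can be illustrated with triangular fuzzy numbers and their alpha-cuts, the intervals that are propagated through the lambda-tau expressions. The failure-rate value and the ±15% spread below are invented for illustration, not taken from the paper:

```python
def tfn_alpha_cut(a: float, m: float, b: float, alpha: float) -> tuple:
    """Alpha-cut interval of a triangular fuzzy number (a, m, b), where a and
    b are the support bounds and m is the modal (crisp) value."""
    return (a + alpha * (m - a), b - alpha * (b - m))

# Failure rate of one component, fuzzified with a hypothetical +/-15% spread
# around the crisp value.
lam = 2.0e-4
a, m, b = 0.85 * lam, lam, 1.15 * lam
for alpha in (0.0, 0.5, 1.0):
    lo, hi = tfn_alpha_cut(a, m, b, alpha)
    print(f"alpha={alpha}: lambda in [{lo:.3e}, {hi:.3e}]")
```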

  10. Computer-based analysis of holography using ray tracing.

    PubMed

    Latta, J N

    1971-12-01

    The application of a ray-tracing methodology to holography is presented. Emphasis is placed on establishing a very general foundation from which to build a general computer-based implementation. As few restrictions as possible are placed on the recording and reconstruction geometry. The necessary equations are established from the construction and reconstruction parameters of the hologram. The aberrations are defined following H. H. Hopkins, and these aberration specification techniques are compared with those used previously to analyze holography. Representative of the flexibility of the ray-tracing approach, two examples are considered. The first compares the answers between a wavefront-matching and the ray-tracing analysis in the case of aberration balancing to compensate for chromatic aberrations. The results are very close and establish the basic utility of aberration balancing. Further indicative of the power of ray tracing, a thick-media analysis is included in the computer programs. This section is then used to perform a study of the effects of hologram emulsion shrinkage and methods for compensation. Compensating such holograms introduces aberrations, and these are considered in both reflection and transmission holograms.

  11. The Fourth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Fourth Annual Thermal and Fluids Analysis Workshop was held from August 17-21, 1992, at NASA Lewis Research Center. The workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with the information on widely used tools for thermal and fluids analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  12. Combined Bisulfite Restriction Analysis for brain tissue identification.

    PubMed

    Samsuwan, Jarunya; Muangsub, Tachapol; Yanatatsaneejit, Pattamawadee; Mutirangura, Apiwat; Kitkumthorn, Nakarin

    2018-05-01

    According to the tissue-specific methylation database (doi: 10.1016/j.gene.2014.09.060), methylation at CpG locus cg03096975 in EML2 has been preliminarily proven to be specific to brain tissue. In this study, we enlarged the sample size and developed a technique for identifying brain tissue in aged samples. The Combined Bisulfite Restriction Analysis for EML2 (COBRA-EML2) technique was established and validated in various organ samples obtained from 108 autopsies. In addition, this technique was also tested for its reliability, the minimal DNA concentration detected, and its use in aged samples and in samples obtained from specific brain compartments and spinal cord. COBRA-EML2 displayed 100% sensitivity and specificity for distinguishing brain tissue from other tissues, showed high reliability, was capable of detecting a minimal DNA concentration (0.015 ng/μl), and could be used for identifying brain tissue in aged samples. In summary, COBRA-EML2 is a technique to identify brain tissue. This analysis is useful in criminal cases since it can identify vital organ tissues from small samples acquired from crime scenes. The results from this analysis can serve as a medical and forensic marker supporting criminal investigations, and as evidence in court rulings. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. THE IDENTIFICATION OF THE WATER-BORNE PATHOGEN AEROMONAS USING WHOLE CELL ANALYSIS BY MATRIX ASSISTED LASER DESORPTION/IONIZATION-MASS

    EPA Science Inventory

    MALDI-MS has long been established as a tool by which microorganisms can be characterized and identified. The U.S. Environmental Protection Agency (EPA) is investigating the potential of using this technique as a way to rapidly identify Aeromonas species in drinking water. A nu...

  14. Pedagogical Approach to the Modeling and Simulation of Oscillating Chemical Systems with Modern Software: The Brusselator Model

    ERIC Educational Resources Information Center

    Lozano-Parada, Jaime H.; Burnham, Helen; Martinez, Fiderman Machuca

    2018-01-01

    A classical nonlinear system, the "Brusselator", was used to illustrate the modeling and simulation of oscillating chemical systems using stability analysis techniques with modern software tools such as Comsol Multiphysics, Matlab, and Excel. A systematic approach is proposed in order to establish a regime of parametric conditions that…

  15. Online Graduate Teacher Education: Establishing an EKG for Student Success Intervention

    ERIC Educational Resources Information Center

    Shelton, Brett E.; Hung, Jui-Long; Baughman, Sarah

    2016-01-01

    Predicting which students enrolled in graduate online education are at-risk for failure is an arduous yet important task for teachers and administrators alike. This research reports on a statistical analysis technique using both static and dynamic variables to determine which students are at-risk and when an intervention could be most helpful…

  16. Computational Control of Flexible Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Sharpe, Lonnie, Jr.; Shen, Ji Yao

    1994-01-01

    The main objective of this project is to establish a distributed parameter modeling technique for structural analysis, parameter estimation, vibration suppression, and control synthesis of large flexible aerospace structures. This report concentrates on the research outputs produced in the last two years of the project. The main accomplishments can be summarized as follows. A new version of the PDEMOD Code has been completed. A theoretical investigation of the NASA MSFC two-dimensional ground-based manipulator facility using the distributed parameter modeling technique has been conducted. A new mathematical treatment for dynamic analysis and control of large flexible manipulator systems has been conceived, which may provide an embryonic form of a more sophisticated mathematical model for future modified versions of the PDEMOD Codes.

  17. Combining Capillary Electrophoresis with Mass Spectrometry for Applications in Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, David C.; Smith, Richard D.

    2005-04-01

    Throughout the field of global proteomics, ranging from simple organism studies to human medical applications, the high sample complexity creates demands for improved separations and analysis techniques. Furthermore, with increased organism complexity, the correlation between proteome and genome becomes less certain due to extensive mRNA processing prior to translation. In this way, the same DNA sequence can potentially code for regions in a number of distinct proteins; quantitative differences in expression (or abundance) between these often-related species are of significant interest. Well-established proteomics techniques, which use genomic information to identify peptides that originate from protease digestion, often cannot easily distinguish between such gene products; intact protein-level analyses are required to complete the picture, particularly for identifying post-translational modifications. While chromatographic techniques are currently better suited to peptide analysis, capillary electrophoresis (CE) in combination with mass spectrometry (MS) may become important for intact protein analysis. This review focuses on CE/MS instrumentation and techniques showing promise for such applications, highlighting those with greatest potential. Reference will also be made to developments relevant to peptide-level analyses for use in time- or sample-limited situations.

  18. A Study on Integrated Community Based Flood Mitigation with Remote Sensing Technique in Kota Bharu, Kelantan

    NASA Astrophysics Data System (ADS)

    'Ainullotfi, A. A.; Ibrahim, A. L.; Masron, T.

    2014-02-01

    This study is conducted to establish a community-based flood management system that is integrated with remote sensing techniques. To understand local knowledge, the demographics of the local society are obtained using a survey approach. The local authorities are approached first to obtain information regarding the society in the study areas, such as the population, gender, and the tabulation of settlement. Information about age, religion, ethnicity, occupation, and years of experience facing floods in the area is recorded to understand more about how local knowledge emerges. Then geographic data are obtained, such as rainfall data, land use, land elevation, and river discharge data. This information is used to establish a hydrological model of flooding in the study area. Analyses were made from the survey approach to understand the pattern of society and how it reacts to floods, while the analysis of geographic data is used to analyse the water extent and damage done by the flood. The final result of this research is to produce a flood mitigation method with a community-based framework in the state of Kelantan. With flood mitigation that involves the community's understanding of floods, together with techniques to forecast heavy rainfall and flood occurrence using remote sensing, it is hoped that the casualties and damage that floods might cause to the society and infrastructure in the study area can be reduced.

  19. Interlake production established using quantitative hydrocarbon well-log analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lancaster, J.; Atkinson, A.

    1988-07-01

    Production was established in a new pay zone of the basal Interlake Formation adjacent to production in Midway field in Williams County, North Dakota. Hydrocarbon saturation, which was computed using hydrocarbon well-log (mud-log) data, and computed permeability encouraged the operator to run casing and test this zone. By use of drilling rig parameters, drilling mud properties, hydrocarbon-show data from the mud log, drilled rock and porosity descriptions, and wireline log porosity, this new technique computes oil saturation (percent of porosity) and permeability to the invading filtrate, using the Darcy equation. The Leonardo Fee well was drilled to test the Devonian Duperow, the Silurian upper Interlake, and the Ordovician Red River. The upper two objectives were penetrated downdip from Midway production and there were no hydrocarbon shows. It was determined that the Red River was tight, based on sample examination by well site personnel. The basal Interlake, however, liberated hydrocarbon shows that were analyzed by this new technology. The results of this evaluation accurately predicted this well would be a commercial success when placed in production. Where geophysical log analysis might be questionable, this new evaluation technique may provide answers to anticipated oil saturation and producibility. The encouraging results of hydrocarbon saturation and permeability, produced by this technique, may be largely responsible for the well being in production today.

  20. Application of discrete Fourier inter-coefficient difference for assessing genetic sequence similarity.

    PubMed

    King, Brian R; Aburdene, Maurice; Thompson, Alex; Warres, Zach

    2014-01-01

    Digital signal processing (DSP) techniques for biological sequence analysis continue to grow in popularity due to the inherent digital nature of these sequences. DSP methods have demonstrated early success for detection of coding regions in a gene. Recently, these methods are being used to establish DNA gene similarity. We present the inter-coefficient difference (ICD) transformation, a novel extension of the discrete Fourier transformation, which can be applied to any DNA sequence. The ICD method is a mathematical, alignment-free DNA comparison method that generates a genetic signature for any DNA sequence that is used to generate relative measures of similarity among DNA sequences. We demonstrate our method on a set of insulin genes obtained from an evolutionarily wide range of species, and on a set of avian influenza viral sequences, which represents a set of highly similar sequences. We compare phylogenetic trees generated using our technique against trees generated using traditional alignment techniques for similarity and demonstrate that the ICD method produces a highly accurate tree without requiring an alignment prior to establishing sequence similarity.
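
    To give a concrete feel for DSP-based sequence comparison, the sketch below computes the classic Voss-representation DFT power spectrum of a DNA string and a simple spectral distance between two sequences; it illustrates the general approach rather than the ICD transformation itself, and the two short sequences are arbitrary:

```python
import numpy as np

def dna_power_spectrum(seq: str) -> np.ndarray:
    """Voss representation: one binary indicator sequence per base, DFT of
    each, and the summed power spectrum as an alignment-free signature."""
    seq = seq.upper()
    power = np.zeros(len(seq))
    for base in "ACGT":
        indicator = np.array([1.0 if s == base else 0.0 for s in seq])
        power += np.abs(np.fft.fft(indicator)) ** 2
    return power

def spectrum_distance(s1: str, s2: str) -> float:
    """Euclidean distance between normalized spectra (equal-length inputs)."""
    p1, p2 = dna_power_spectrum(s1), dna_power_spectrum(s2)
    return float(np.linalg.norm(p1 / p1.sum() - p2 / p2.sum()))

a = "ATGGCCCTGTGGATGCGCCTCCTGCCCCTG"
b = "ATGGCCCTGTGGACGCGTCTGCTGCCACTG"
print("spectral distance:", round(spectrum_distance(a, b), 4))
```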

  1. Torque Transient of Magnetically Drive Flow for Viscosity Measurement

    NASA Technical Reports Server (NTRS)

    Ban, Heng; Li, Chao; Su, Ching-Hua; Lin, Bochuan; Scripa, Rosalia N.; Lehoczky, Sandor L.

    2004-01-01

    Viscosity is a good indicator of structural changes for complex liquids, such as semiconductor melts with chain or ring structures. This paper discusses the theoretical and experimental results of the transient torque technique for non-intrusive viscosity measurement. Such a technique is essential for the high-temperature viscosity measurement of high-pressure and toxic semiconductor melts. In this paper, our previous work on the oscillating cup technique was expanded to the transient process of a magnetically driven melt flow in a damped oscillation system. Based on the analytical solution for the fluid flow and cup oscillation, a semi-empirical model was established to extract the fluid viscosity. The analytical and experimental results indicated that such a technique has the advantages of short measurement time and straightforward data analysis procedures.

  2. Influences on corporate executive decision behavior in government acquisitions

    NASA Technical Reports Server (NTRS)

    Wetherington, J. R.

    1986-01-01

    This paper presents extensive exploratory research whose primary objective was the discovery and determination of major areas of concern exhibited by U.S. corporate executives in the preparation and submittal of proposals and bids to the Federal government. The existence of numerous unique concerns inherent in corporate strategies within the government market environment was established. A determination of the relationship of these concerns to each other was accomplished utilizing statistical factor analysis techniques, resulting in the identification of major groupings of management concerns. Finally, using analysis of variance, an analysis and discovery of the interrelationship of the factors to corporate demographics was accomplished. The existence of separate and distinct concerns exhibited by corporate executives when contemplating sales and operations in the government marketplace was established. It was also demonstrated that quantifiable relationships exist between such variables and that the decision behavior exhibited by the responsible executives has an interrelationship with their company's demographics.

  3. Single cell analysis of normal and leukemic hematopoiesis.

    PubMed

    Povinelli, Benjamin J; Rodriguez-Meira, Alba; Mead, Adam J

    2018-02-01

    The hematopoietic system is well established as a paradigm for the study of cellular hierarchies, their disruption in disease and therapeutic use in regenerative medicine. Traditional approaches to study hematopoiesis involve purification of cell populations based on a small number of surface markers. However, such population-based analysis obscures underlying heterogeneity contained within any phenotypically defined cell population. This heterogeneity can only be resolved through single cell analysis. Recent advances in single cell techniques allow analysis of the genome, transcriptome, epigenome and proteome in single cells at an unprecedented scale. The application of these new single cell methods to investigate the hematopoietic system has led to paradigm shifts in our understanding of cellular heterogeneity in hematopoiesis and how this is disrupted in disease. In this review, we summarize how single cell techniques have been applied to the analysis of hematopoietic stem/progenitor cells in normal and malignant hematopoiesis, with a particular focus on recent advances in single-cell genomics, including how these might be utilized for clinical application. Copyright © 2017. Published by Elsevier Ltd.

  4. State-plane analysis of zero-voltage-switching resonant dc/dc power converters

    NASA Astrophysics Data System (ADS)

    Kazimierczuk, Marian K.; Morse, William D.

    The state-plane analysis technique for the zero-voltage-switching resonant dc/dc power converter family of topologies, namely the buck, boost, buck-boost, and Cuk converters is established. The state plane provides a compression of information that allows the designer to uniquely examine the nonlinear dynamics of resonant converter operation. Utilizing the state plane, resonant converter modes of operation are examined and the switching frequencies are derived for the boundaries between these modes, including the boundary of energy conversion.

  5. Comparison of Nonlinear Random Response Using Equivalent Linearization and Numerical Simulation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Muravyov, Alexander A.

    2000-01-01

    A recently developed finite-element-based equivalent linearization approach for the analysis of random vibrations of geometrically nonlinear multiple degree-of-freedom structures is validated. The validation is based on comparisons with results from a finite element based numerical simulation analysis using a numerical integration technique in physical coordinates. In particular, results for the case of a clamped-clamped beam are considered for an extensive load range to establish the limits of validity of the equivalent linearization approach.

  6. Vulnerability-attention analysis for space-related activities

    NASA Technical Reports Server (NTRS)

    Ford, Donnie; Hays, Dan; Lee, Sung Yong; Wolfsberger, John

    1988-01-01

    Techniques for representing and analyzing trouble spots in structures and processes are discussed. Identification of vulnerable areas usually depends more on particular and often detailed knowledge than on algorithmic or mathematical procedures. In some cases, machine inference can facilitate the identification. The analysis scheme proposed first establishes the geometry of the process, then marks areas that are conditionally vulnerable. This provides a basis for advice on the kinds of human attention or machine sensing and control that can make the risks tolerable.

  7. Mixed strategies for energy conservation and alternative energy utilization (solar) in buildings. Final report. Volume II. Detailed results. [New York, Atlanta, Omaha, and Albuquerque

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1977-06-01

    The mixed-strategy analysis was a tradeoff analysis between energy-conservation methods and an alternative energy source (solar), considering technical and economic benefits. The objective of the analysis was to develop guidelines for: reducing energy requirements; reducing conventional fuel use; and identifying economic alternatives for building owners. The analysis was done with a solar system in place. This makes the study unique in that it determines the interaction of energy conservation with a solar system. The study, therefore, established guidelines as to how to minimize capital investment while reducing conventional fuel consumption through either a larger solar system or an energy-conserving technique. To focus the scope of energy-conservation techniques and alternative energy sources considered, five building types (houses, apartment buildings, commercial buildings, schools, and office buildings) were selected. Finally, the lists of energy-conservation techniques and alternative energy sources were reduced to lists of manageable size by using technical attributes to select the best candidates for further study. The resultant energy-conservation techniques were described in detail and installed costs determined. The list of alternative energy sources reduced to solar. Building construction characteristics were defined for each building for each of four geographic regions of the country. A mixed strategy consisting of an energy-conservation technique and a solar heating/hot water/cooling system was analyzed, using computer simulation to determine the interaction between energy conservation and the solar system. Finally, using FEA fuel-price scenarios and installed costs for the solar system and energy-conservation techniques, an economic analysis was performed to determine the cost effectiveness of the combination. (MCW)

  8. The potential use of cuticular hydrocarbons and multivariate analysis to age empty puparial cases of Calliphora vicina and Lucilia sericata.

    PubMed

    Moore, Hannah E; Pechal, Jennifer L; Benbow, M Eric; Drijfhout, Falko P

    2017-05-16

    Cuticular hydrocarbons (CHCs) have been successfully used in the field of forensic entomology for identifying and ageing forensically important blowfly species, primarily in the larval stages. However, in older scenes where all other entomological evidence is no longer present, Calliphoridae puparial cases can often be all that remains, and therefore being able to establish their age could give an indication of the PMI. This paper examined the CHCs present in the lipid wax layer of insects to determine the age of the cases over a period of nine months. The two forensically important species examined were Calliphora vicina and Lucilia sericata. The hydrocarbons were chemically extracted and analysed using Gas Chromatography-Mass Spectrometry. Statistical analysis was then applied in the form of non-metric multidimensional scaling (NMDS), permutational multivariate analysis of variance (PERMANOVA) and random forest models. This study was successful in determining age differences within the empty cases, which, to date, has not been established by any other technique.
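
    The random forest step of such an analysis can be sketched as follows on synthetic hydrocarbon profiles whose composition drifts with age class; the feature count, drift model, and class boundaries are all invented for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic CHC profiles: 40 relative-abundance features whose composition
# drifts with weathering age class (e.g., 0-3, 3-6, 6-9 months), mimicking
# the kind of data fed to the random forest models mentioned above.
rng = np.random.default_rng(8)
X, y = [], []
for age_class in range(3):
    center = rng.random(40) + 0.3 * age_class * np.linspace(0, 1, 40)
    for _ in range(30):  # thirty specimens per age class
        X.append(center + rng.normal(0, 0.1, 40))
        y.append(age_class)
X, y = np.array(X), np.array(y)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```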

  9. Novel Passive Clearing Methods for the Rapid Production of Optical Transparency in Whole CNS Tissue.

    PubMed

    Woo, Jiwon; Lee, Eunice Yoojin; Park, Hyo-Suk; Park, Jeong Yoon; Cho, Yong Eun

    2018-05-08

    Since the development of CLARITY, a bioelectrochemical clearing technique that allows for three-dimensional phenotype mapping within transparent tissues, a multitude of novel clearing methodologies including CUBIC (clear, unobstructed brain imaging cocktails and computational analysis), SWITCH (system-wide control of interaction time and kinetics of chemicals), MAP (magnified analysis of the proteome), and PACT (passive clarity technique), have been established to further expand the existing toolkit for the microscopic analysis of biological tissues. The present study aims to improve upon and optimize the original PACT procedure for an array of intact rodent tissues, including the whole central nervous system (CNS), kidneys, spleen, and whole mouse embryos. Termed psPACT (process-separate PACT) and mPACT (modified PACT), these novel techniques provide highly efficacious means of mapping cell circuitry and visualizing subcellular structures in intact normal and pathological tissues. In the following protocol, we provide a detailed, step-by-step outline on how to achieve maximal tissue clearance with minimal invasion of their structural integrity via psPACT and mPACT.

  10. Methodological issues in volumetric magnetic resonance imaging of the brain in the Edinburgh High Risk Project.

    PubMed

    Whalley, H C; Kestelman, J N; Rimmington, J E; Kelso, A; Abukmeil, S S; Best, J J; Johnstone, E C; Lawrie, S M

    1999-07-30

    The Edinburgh High Risk Project is a longitudinal study of brain structure (and function) in subjects at high risk of developing schizophrenia in the next 5-10 years for genetic reasons. In this article we describe the methods of volumetric analysis of structural magnetic resonance images used in the study. We also consider potential sources of error in these methods: the validity of our image analysis techniques; inter- and intra-rater reliability; possible positional variation; and thresholding criteria used in separating brain from cerebro-spinal fluid (CSF). Investigation with a phantom test object (of similar imaging characteristics to the brain) provided evidence for the validity of our image acquisition and analysis techniques. Both inter- and intra-rater reliability were found to be good in whole brain measures but less so for smaller regions. There were no statistically significant differences in positioning across the three study groups (patients with schizophrenia, high risk subjects and normal volunteers). A new technique for thresholding MRI scans longitudinally is described (the 'rescale' method) and compared with our established method (thresholding by eye). Few differences between the two techniques were seen at 3- and 6-month follow-up. These findings demonstrate the validity and reliability of the structural MRI analysis techniques used in the Edinburgh High Risk Project, and highlight methodological issues of general concern in cross-sectional and longitudinal studies of brain structure in healthy control subjects and neuropsychiatric populations.

  11. Analytical determination of selenium in medical samples, staple food and dietary supplements by means of total reflection X-ray fluorescence spectroscopy

    NASA Astrophysics Data System (ADS)

    Stosnach, Hagen

    2010-09-01

    Selenium is essential for many aspects of human health and, thus, the object of intensive medical research. This demands the use of analytical techniques capable of analysing selenium at low concentrations with high accuracy in widespread matrices and sometimes smallest sample amounts. In connection with the increasing importance of selenium, there is a need for rapid and simple on-site (or near-to-site) selenium analysis in food basics like wheat at processing and production sites, as well as for the analysis of this element in dietary supplements. Common analytical techniques like electrothermal atomic absorption spectroscopy (ETAAS) and inductively-coupled plasma mass spectrometry (ICP-MS) are capable of analysing selenium in medical samples with detection limits in the range from 0.02 to 0.7 μg/l. Since in many cases less complicated and expensive analytical techniques are required, TXRF has been tested regarding its suitability for selenium analysis in different medical, food basics and dietary supplement samples applying most simple sample preparation techniques. The reported results indicate that the accurate analysis of selenium in all sample types is possible. The detection limits of TXRF are in the range from 7 to 12 μg/l for medical samples and 0.1 to 0.2 mg/kg for food basics and dietary supplements. Although this sensitivity is low compared to established techniques, it is sufficient for the physiological concentrations of selenium in the investigated samples.

  12. Critical Trends and Events Affecting the Future of Texas Higher Education. Proceedings of the Texas Association for Institutional Research (TAIR) Preconference Workshop on Environmental Scanning (1995).

    ERIC Educational Resources Information Center

    Morrison, James L.

    This proceedings report describes exercises used in a workshop on environmental scanning, designed to assist institutional research officers to develop competency in establishing and maintaining an external analysis capability on their campuses. The workshop offered an opportunity for participants to experience several techniques used in…

  13. Analysis of the Implementation of a Dynamic Assessment Device of Processes Involved in Reading with Learning-Disabled Children

    ERIC Educational Resources Information Center

    Navarro, Juan-Jose; Mora, Joaquin

    2011-01-01

    The renewed interest in the dynamic assessment of specific domains has led to reconsideration of this theory and the technique's contribution to the learning-teaching process. In this article, we analyze some elements concerning the internal structure of a dynamic assessment device of processes involved in reading tasks, establishing some of the…

  14. Making an Impression: Portfolios as Instruments of Impression Management for Teachers in Early Childhood Education and Care Centres

    ERIC Educational Resources Information Center

    Knauf, Helen

    2017-01-01

    The study presented here examines the contribution of portfolios to the communication between parents and early childhood education and care centres. Using content analysis techniques, 2104 portfolio entries are examined with a view to establishing what impression they are intended to create. While the actual purpose of portfolios emphasizes the…

  15. Artificial Intelligence: An Analysis of the Technology for Training. Training and Development Research Center Project Number Fourteen.

    ERIC Educational Resources Information Center

    Sayre, Scott Alan

    The ultimate goal of the science of artificial intelligence (AI) is to establish programs that will use algorithmic computer techniques to imitate the heuristic thought processes of humans. Most AI programs, especially expert systems, organize their knowledge into three specific areas: data storage, a rule set, and a control structure. Limitations…

  16. Determination of cell metabolite VEGF₁₆₅ and dynamic analysis of protein-DNA interactions by combination of microfluidic technique and luminescent switch-on probe.

    PubMed

    Lin, Xuexia; Leung, Ka-Ho; Lin, Ling; Lin, Luyao; Lin, Sheng; Leung, Chung-Hang; Ma, Dik-Lung; Lin, Jin-Ming

    2016-05-15

    In this paper, we rationally design a novel G-quadruplex-selective luminescent iridium(III) complex for rapid detection of oligonucleotides and VEGF165 in microfluidics. This new probe is applied as a convenient biosensor for label-free quantitative analysis of VEGF165 protein from cell metabolism, as well as for studying the kinetics of the aptamer-protein interaction in combination with a microfluidic platform. As a result, we have successfully established a quantitative analysis of VEGF165 from cell metabolism. Furthermore, based on the principles of hydrodynamic focusing and diffusive mixing, different transient states during the kinetic process were monitored and recorded. Thus, the combination of the microfluidic technique and the G-quadruplex luminescent probe can potentially be applied in studies of intramolecular interactions and molecular recognition in the future. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. New Uses for Sensitivity Analysis: How Different Movement Tasks Effect Limb Model Parameter Sensitivity

    NASA Technical Reports Server (NTRS)

    Winters, J. M.; Stark, L.

    1984-01-01

    Original results for a newly developed eight-order nonlinear limb antagonistic muscle model of elbow flexion and extension are presented. A wider variety of sensitivity analysis techniques are used and a systematic protocol is established that shows how the different methods can be used efficiently to complement one another for maximum insight into model sensitivity. It is explicitly shown how the sensitivity of output behaviors to model parameters is a function of the controller input sequence, i.e., of the movement task. When the task is changed (for instance, from an input sequence that results in the usual fast movement task to a slower movement that may also involve external loading, etc.) the set of parameters with high sensitivity will in general also change. Such task-specific use of sensitivity analysis techniques identifies the set of parameters most important for a given task, and even suggests task-specific model reduction possibilities.
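
    A minimal sketch of task-dependent parameter sensitivity is shown below: normalized finite-difference sensitivities are computed for a toy two-parameter model under a "fast" and a "slow" task, showing how sensitivity magnitudes can change with the controller input. The model, parameter names, and tasks are stand-ins, not the eighth-order limb model of the paper:

```python
import numpy as np

def simulate(params: dict, fast_task: bool) -> float:
    """Stand-in for the limb model: returns a scalar output (e.g., peak
    velocity). Any toy function works for demonstrating the method."""
    k, b = params["stiffness"], params["damping"]
    drive = 2.0 if fast_task else 0.5  # the task sets the controller input level
    return drive * k / (1.0 + b * drive)

def sensitivities(params: dict, fast_task: bool, h: float = 1e-4) -> dict:
    """Normalized finite-difference sensitivity of the output to each
    parameter: (dy / y) / (dp / p)."""
    y0 = simulate(params, fast_task)
    out = {}
    for name, p in params.items():
        bumped = dict(params)
        bumped[name] = p * (1 + h)
        out[name] = ((simulate(bumped, fast_task) - y0) / y0) / h
    return out

params = {"stiffness": 3.0, "damping": 0.8}
print("fast task:", sensitivities(params, fast_task=True))
print("slow task:", sensitivities(params, fast_task=False))
```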

  18. A new criterion to evaluate water vapor interference in protein secondary structural analysis by FTIR spectroscopy.

    PubMed

    Zou, Ye; Ma, Gang

    2014-06-04

    Second derivative and Fourier self-deconvolution (FSD) are two commonly used techniques to resolve the overlapped component peaks from the often featureless amide I band in the Fourier transform infrared (FTIR) curve-fitting approach for protein secondary structural analysis. Yet, the reliability of these two techniques is greatly affected by the omnipresent water vapor in the atmosphere. Several criteria are currently in use as quality controls to ensure that the protein absorption spectrum is negligibly affected by water vapor interference. In this study, through a second derivative study of liquid water, we first argue that the previously established criteria cannot guarantee a reliable evaluation of water vapor interference due to a phenomenon that we refer to as the sample's absorbance-dependent water vapor interference. Then, through a comparative study of protein and liquid water, we show that a protein absorption spectrum can still be significantly affected by water vapor interference even though it satisfies the established criteria. Finally, we propose to use the comparison between the second derivative spectra of protein and liquid water as a new criterion to better evaluate water vapor interference for more reliable second derivative and FSD treatments on the protein amide I band.
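
    The second-derivative resolution enhancement, and its amplification of small interferences, can be demonstrated with a Savitzky-Golay filter on a synthetic two-component band plus a weak ripple standing in for water vapor lines; all peak positions and amplitudes below are invented:

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic "amide I band": two overlapped Gaussians plus a weak sharp ripple
# standing in for water vapor lines. The second derivative resolves the two
# component peaks (as minima) but also amplifies the ripple, which is the
# interference problem discussed above.
wavenumber = np.linspace(1600, 1700, 500)
band = (np.exp(-((wavenumber - 1635) / 10) ** 2)
        + 0.8 * np.exp(-((wavenumber - 1655) / 9) ** 2))
vapor = 0.002 * np.sin(2 * np.pi * wavenumber / 2.5)
spectrum = band + vapor

second_deriv = savgol_filter(spectrum, window_length=21, polyorder=3, deriv=2)

# Local minima of the second derivative are component-peak candidates;
# ripple-induced minima show up as spurious extras.
is_min = (second_deriv[1:-1] < second_deriv[:-2]) & (second_deriv[1:-1] < second_deriv[2:])
minima = wavenumber[np.r_[False, is_min, False]]
print("second-derivative minima (peak candidates):", np.round(minima, 1))
```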

  19. The moss Physcomitrella patens: methods and tools from cultivation to targeted analysis of gene function.

    PubMed

    Strotbek, Christoph; Krinninger, Stefan; Frank, Wolfgang

    2013-01-01

    To comprehensively understand the major processes in plant biology, it is necessary to study a diverse set of species that represent the complexity of plants. This research will help to comprehend common conserved mechanisms and principles, as well as to elucidate those mechanisms that are specific to a particular plant clade. Thereby, we will gain knowledge about the invention and loss of mechanisms and their biological impact causing the distinct specifications throughout the plant kingdom. Since the establishment of transgenic plants, these studies concentrate on the elucidation of gene functions applying an increasing repertoire of molecular techniques. In the last two decades, the moss Physcomitrella patens joined the established set of plant models based on its evolutionary position bridging unicellular algae and vascular plants and a number of specific features alleviating gene function analysis. Here, we want to provide an overview of the specific features of P. patens making it an interesting model for many research fields in plant biology, to present the major achievements in P. patens genetic engineering, and to introduce common techniques to scientists who intend to use P. patens as a model in their research activities.

  20. Accelerated Bayesian model-selection and parameter-estimation in continuous gravitational-wave searches with pulsar-timing arrays

    NASA Astrophysics Data System (ADS)

    Taylor, Stephen; Ellis, Justin; Gair, Jonathan

    2014-11-01

    We describe several new techniques which accelerate Bayesian searches for continuous gravitational-wave emission from supermassive black-hole binaries using pulsar-timing arrays. These techniques mitigate the problematic increase of search dimensionality with the size of the pulsar array which arises from having to include an extra parameter per pulsar as the array is expanded. This extra parameter corresponds to searching over the phase of the gravitational wave as it propagates past each pulsar so that we can coherently include the pulsar term in our search strategies. Our techniques make the analysis tractable with powerful evidence-evaluation packages like MultiNest. We find good agreement of our techniques with the parameter-estimation and Bayes factor evaluation performed with full signal templates and conclude that these techniques make excellent first-cut tools for detection and characterization of continuous gravitational-wave signals with pulsar-timing arrays. Crucially, at low to moderate signal-to-noise ratios the factor by which the analysis is sped up can be ≳100 , permitting rigorous programs of systematic injection and recovery of signals to establish robust detection criteria within a Bayesian formalism.

  1. Qualitative evaluation of water displacement in simulated analytical breaststroke movements.

    PubMed

    Martens, Jonas; Daly, Daniel

    2012-05-01

    One purpose of evaluating a swimmer is to establish the individualized optimal technique. A swimmer's particular body structure and the resulting movement pattern will cause the surrounding water to react in differing ways. Consequently, an assessment method based on flow visualization was developed, complementary to movement analysis and body structure quantification. A fluorescent dye was used to make the water displaced by the body visible on video. To examine the hypothesis on the propulsive mechanisms applied in breaststroke swimming, we analyzed the movements of the surrounding water during 4 analytical breaststroke movements using the flow visualization technique.

  2. Finite-element analysis and modal testing of a rotating wind turbine

    NASA Astrophysics Data System (ADS)

    Carne, T. G.; Lobitz, D. W.; Nord, A. R.; Watson, R. A.

    1982-10-01

    A finite element procedure, which includes geometric stiffening, and centrifugal and Coriolis terms resulting from the use of a rotating coordinate system, was developed to compute the mode shapes and frequencies of rotating structures. Special application of this capability was made to Darrieus, vertical axis wind turbines. In a parallel development effort, a technique for the modal testing of a rotating vertical axis wind turbine was established to measure modal parameters directly. Results from the predictive and experimental techniques for the modal frequencies and mode shapes are compared over a wide range of rotational speeds.
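
    To make the role of the rotating-frame terms concrete, here is a minimal toy sketch, not the authors' finite element code: a two-degree-of-freedom mass in a frame rotating at speed omega, where the Coriolis effect enters as a skew-symmetric matrix and rotation softens the stiffness. The modal frequencies come from a state-space eigenvalue problem; all values are illustrative.

```python
# Toy sketch (not the authors' finite element code): modal frequencies
# of a 2-DOF point mass in a coordinate frame rotating at speed omega.
import numpy as np

def rotating_modes_hz(m, k, omega):
    M = m * np.eye(2)
    G = 2.0 * m * omega * np.array([[0.0, -1.0], [1.0, 0.0]])  # Coriolis (skew)
    K = (k - m * omega**2) * np.eye(2)                         # centrifugal softening
    # First-order (state-space) form: eigenvalues appear as +/- i*w pairs.
    A = np.block([[np.zeros((2, 2)), np.eye(2)],
                  [-np.linalg.solve(M, K), -np.linalg.solve(M, G)]])
    return np.sort(np.abs(np.linalg.eigvals(A).imag)) / (2 * np.pi)

for rpm in (0, 300, 600):   # frequencies split into backward/forward branches
    print(rpm, "rpm:", rotating_modes_hz(m=1.0, k=1.0e4, omega=rpm * 2 * np.pi / 60))
```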

  3. Finite element analysis and modal testing of a rotating wind turbine

    NASA Astrophysics Data System (ADS)

    Carne, T. G.; Lobitz, D. W.; Nord, A. R.; Watson, R. A.

    A finite element procedure, which includes geometric stiffening, and centrifugal and Coriolis terms resulting from the use of a rotating coordinate system, has been developed to compute the mode shapes and frequencies of rotating structures. Special application of this capability has been made to Darrieus, vertical axis wind turbines. In a parallel development effort, a technique for the modal testing of a rotating vertical axis wind turbine has been established to measure modal parameters directly. Results from the predictive and experimental techniques for the modal frequencies and mode shapes are compared over a wide range of rotational speeds.

  4. PCR-based detection of micro-organisms in extreme environments during the EuroGeoMars MDRS campaign

    NASA Astrophysics Data System (ADS)

    Thiel, Cora S.; Ullrich, Oliver

    Deoxyribonucleic acid (DNA) is found in all known living organisms and some viruses on earth. The main function of DNA molecules is the long-term storage of genetic information. They are passed on from generation to generation as the hereditary material. The polymerase chain reaction (PCR) is a revolutionary technique which allows amplifying a single or few copies of DNA molecules across several orders of magnitude, generating millions of copies of the original DNA fragment and allowing detection of minimal traces of DNA. The compactness of modern PCR instruments makes routine sample analysis possible with only a minimum of laboratory space. Our goal was to establish a routine for detection of DNA from micro-organisms based on the effective but also robust and simple PCR technique during the EuroGeoMars simulation campaign at The Mars Society's Mars Desert Research Station (MDRS) in February 2009. During the MDRS simulation we were able to show that it is possible to establish a minimal molecular biology lab in the habitat for an immediate on-site analysis by PCR after sample collection. Soil and water samples were taken from different locations and soil depths. The sample analysis was started immediately after returning to the habitat and was completed during the following days. DNA was isolated from micro-organisms and was used as a template for PCR analysis of the highly conserved ribosomal DNA to identify representatives of the different groups of micro-organisms (archaea, bacteria, eukaryotes). PCR products were visualized by agarose gel electrophoresis and documented by UV-transilluminator and digital camera. For the first time it was possible to demonstrate a direct on-site DNA analysis by PCR at MDRS, situated in an extreme environment that functions as a model for preparation and optimization of techniques to be used for future Mars exploration.

  5. A variational conformational dynamics approach to the selection of collective variables in metadynamics.

    PubMed

    McCarty, James; Parrinello, Michele

    2017-11-28

    In this paper, we combine two powerful computational techniques, well-tempered metadynamics and time-lagged independent component analysis. The aim is to develop a new tool for studying rare events and exploring complex free energy landscapes. Metadynamics is a well-established and widely used enhanced sampling method whose efficiency depends on an appropriate choice of collective variables. Often the initial choice is not optimal, leading to slow convergence. However, by analyzing the dynamics generated in one such run with a time-lagged independent component analysis and the techniques recently developed in the area of conformational dynamics, we obtain much more efficient collective variables that are also better capable of illuminating the physics of the system. We demonstrate the power of this approach in two paradigmatic examples.
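
    As a hedged illustration of the time-lagged independent component analysis step (a generic TICA implementation, not the authors' code), the sketch below estimates slow linear combinations of candidate collective variables from a trajectory by solving the generalized eigenproblem C(tau) v = lambda C(0) v; the lag time and toy dynamics are assumptions.

```python
# Generic TICA sketch (illustrative, not the authors' implementation):
# slow linear combinations of candidate CVs from a trajectory X with
# shape (n_frames, n_features).
import numpy as np
from scipy.linalg import eigh

def tica(X, lag):
    X = X - X.mean(axis=0)
    A, B = X[:-lag], X[lag:]
    n = len(A)
    C0 = (A.T @ A + B.T @ B) / (2 * n)        # instantaneous covariance
    Ct = (A.T @ B + B.T @ A) / (2 * n)        # symmetrized lagged covariance
    evals, evecs = eigh(Ct, C0)               # generalized eigenproblem
    order = np.argsort(evals)[::-1]           # largest eigenvalue = slowest mode
    return evals[order], evecs[:, order]

rng = np.random.default_rng(0)
traj = np.zeros((5000, 3))
for t in range(1, 5000):                       # toy AR(1) dynamics, one slow mode
    traj[t] = np.array([0.99, 0.9, 0.5]) * traj[t - 1] + rng.normal(size=3)
evals, evecs = tica(traj, lag=50)
slow_cv = (traj - traj.mean(0)) @ evecs[:, 0]  # improved collective variable
```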

  6. A variational conformational dynamics approach to the selection of collective variables in metadynamics

    NASA Astrophysics Data System (ADS)

    McCarty, James; Parrinello, Michele

    2017-11-01

    In this paper, we combine two powerful computational techniques, well-tempered metadynamics and time-lagged independent component analysis. The aim is to develop a new tool for studying rare events and exploring complex free energy landscapes. Metadynamics is a well-established and widely used enhanced sampling method whose efficiency depends on an appropriate choice of collective variables. Often the initial choice is not optimal leading to slow convergence. However by analyzing the dynamics generated in one such run with a time-lagged independent component analysis and the techniques recently developed in the area of conformational dynamics, we obtain much more efficient collective variables that are also better capable of illuminating the physics of the system. We demonstrate the power of this approach in two paradigmatic examples.

  7. Method for Determination of Less Than 5 ppm Oxygen in Sodium Samples

    NASA Technical Reports Server (NTRS)

    Reid, R. S.; Martin, J. J.; Schmidt, G. L.

    2005-01-01

    Alkali metals used in pumped loops or heat pipes must be sufficiently free of nonmetallic impurities to ensure long heat rejection system life. Life issues are well established for alkali metal systems. Impurities can form ternary compounds between the container and working fluid, leading to corrosion. This Technical Memorandum discusses the consequences of impurities and candidate measurement techniques to determine whether impurities have been reduced to sufficiently low levels within a single-phase liquid metal loop or a closed two-phase heat transfer system, such as a heat pipe. These techniques include vanadium wire equilibration, neutron activation analysis, plug traps, distillation, and chemical analysis. Conceptual procedures for performing vanadium wire equilibration purity measurements on sodium contained in a heat pipe are discussed in detail.

  8. Techniques for fire detection

    NASA Technical Reports Server (NTRS)

    Bukowski, Richard W.

    1987-01-01

    An overview is given of the basis for an analysis of combustible materials and potential ignition sources in a spacecraft. First, the burning process is discussed in terms of the production of the fire signatures normally associated with detection devices. These include convected and radiated thermal energy, particulates, and gases. Second, the transport processes associated with the movement of these from the fire to the detector, along with the important phenomena which cause the level of these signatures to be reduced, are described. Third, the operating characteristics of the individual types of detectors which influence their response to signals are presented. Finally, vulnerability analysis using predictive fire modeling techniques is discussed as a means to establish the necessary response of the detection system to provide the level of protection required in the application.

  9. A data analysis expert system for large established distributed databases

    NASA Technical Reports Server (NTRS)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-01-01

    A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural, easy-to-use, error-free database query language; user ability to alter the query language vocabulary and data analysis heuristics; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.

  10. Performance analysis of the ascent propulsion system of the Apollo spacecraft

    NASA Technical Reports Server (NTRS)

    Hooper, J. C., III

    1973-01-01

    Activities involved in the performance analysis of the Apollo lunar module ascent propulsion system are discussed. A description of the ascent propulsion system, including hardware, instrumentation, and system characteristics, is included. The methods used to predict the inflight performance and to establish performance uncertainties of the ascent propulsion system are discussed. The techniques of processing the telemetered flight data and performing postflight performance reconstruction to determine actual inflight performance are discussed. Problems that have been encountered and results from the analysis of the ascent propulsion system performance during the Apollo 9, 10, and 11 missions are presented.

  11. Finite Element Analysis of Reverberation Chambers

    NASA Technical Reports Server (NTRS)

    Bunting, Charles F.; Nguyen, Duc T.

    2000-01-01

    The primary motivating factor behind the initiation of this work was to provide a deterministic means of establishing the validity of the statistical methods that are recommended for the determination of fields that interact in an avionics system. The application of finite element analysis to reverberation chambers is the initial step required to establish a reasonable course of inquiry in this particularly data-intensive study. The use of computational electromagnetics provides a high degree of control of the "experimental" parameters that can be utilized in a simulation of reverberating structures. As the work evolved there were four primary focus areas: 1. The eigenvalue problem for the source free problem. 2. The development of a complex efficient eigensolver. 3. The application of a source for the TE and TM fields for statistical characterization. 4. The examination of shielding effectiveness in a reverberating environment. One early purpose of this work was to establish the utility of finite element techniques in the development of an extended low frequency statistical model for reverberation phenomena. By employing finite element techniques, structures of arbitrary complexity can be analyzed due to the use of triangular shape functions in the spatial discretization. The effects of both frequency stirring and mechanical stirring are presented. It is suggested that for low frequency operation the typical tuner size is inadequate to provide a sufficiently random field and that frequency stirring should be used. The results of the finite element analysis of the reverberation chamber illustrate the potential utility of a 2D representation for enhancing the basic statistical characteristics of the chamber when operating in a low frequency regime. The basic field statistics are verified for frequency stirring over a wide range of frequencies. Mechanical stirring is shown to provide an effective frequency deviation.
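
    For the source-free eigenvalue problem, finite element eigensolvers for chambers of this kind are commonly validated against the closed-form resonances of an ideal rectangular cavity. The sketch below computes those closed-form modes as a validation aid, not the finite element analysis itself; the chamber dimensions are illustrative assumptions.

```python
# Validation aid (illustrative dimensions): closed-form resonances of an
# ideal rectangular cavity, against which FE eigensolvers are often checked.
import itertools
import numpy as np

C0 = 299_792_458.0   # speed of light, m/s

def cavity_modes(lx, ly, lz, max_index=3):
    modes = []
    for m, n, p in itertools.product(range(max_index + 1), repeat=3):
        if (m > 0) + (n > 0) + (p > 0) >= 2:   # valid modes need two nonzero indices
            f = 0.5 * C0 * np.sqrt((m / lx) ** 2 + (n / ly) ** 2 + (p / lz) ** 2)
            modes.append((f, (m, n, p)))
    return sorted(modes)

for f, mnp in cavity_modes(5.0, 4.0, 3.0)[:5]:   # lowest five modes
    print(f"{f / 1e6:7.2f} MHz  {mnp}")
```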

  12. Informatics for Metabolomics.

    PubMed

    Kusonmano, Kanthida; Vongsangnak, Wanwipa; Chumnanpuen, Pramote

    2016-01-01

    Metabolome profiling of biological systems has the powerful ability to provide biological understanding of their metabolic functional states in response to environmental factors or other perturbations. Large amounts of metabolomics data have thus accumulated since the pre-metabolomics era, driven directly by high-throughput analytical techniques, especially mass spectrometry (MS)- and nuclear magnetic resonance (NMR)-based techniques. In parallel, a significant number of informatics techniques for data processing, statistical analysis, and data mining have been developed. Tools and databases built for the metabolomics community provide useful information such as chemical structures, mass spectrum patterns for peak identification, metabolite profiles, biological functions, dynamic metabolite changes, and biochemical transformations of thousands of small molecules. In this chapter, we aim to introduce metabolomics studies from the pre- to the post-metabolomics era and their impact on society. Focusing on the post-metabolomics era, we provide a conceptual framework of informatics techniques for metabolomics and show useful examples of techniques, tools, and databases for metabolomics data analysis, from preprocessing through functional interpretation. This framework can further serve as a scaffold for translational biomedical research, which can in turn reveal new metabolite biomarkers, potential metabolic targets, or key metabolic pathways for future disease therapy.

  13. Non-Destructive Spectroscopic Techniques and Multivariate Analysis for Assessment of Fat Quality in Pork and Pork Products: A Review

    PubMed Central

    Kucha, Christopher T.; Liu, Li; Ngadi, Michael O.

    2018-01-01

    Fat is one of the most important traits determining the quality of pork. The composition of the fat greatly influences the quality of pork and its processed products, and contributes to defining the overall carcass value. However, establishing an efficient method for assessing fat quality parameters such as fatty acid composition, solid fat content, oxidative stability, iodine value, and fat color remains a challenge that must be addressed. Conventional methods such as visual inspection, mechanical methods, and chemical methods are used off the production line, which often results in an inaccurate representation of the process because the dynamics are lost due to the time required to perform the analysis. Consequently, rapid and non-destructive alternative methods are needed. In this paper, the traditional fat quality assessment techniques are discussed, with emphasis on spectroscopic techniques as an alternative. Potential spectroscopic techniques include infrared spectroscopy, nuclear magnetic resonance, and Raman spectroscopy. Hyperspectral imaging, an emerging advanced spectroscopy-based technology, is introduced and discussed in light of recent developments in the assessment of fat quality attributes. All techniques are described in terms of their operating principles and the research advances involving their application to pork fat quality parameters. Future trends for the non-destructive spectroscopic techniques are also discussed. PMID:29382092

  14. The measurement of bacterial translation by photon correlation spectroscopy.

    PubMed Central

    Stock, G B; Jenkins, T C

    1978-01-01

    Photon correlation spectroscopy is shown to be a practical technique for the accurate determination of translational speeds of bacteria. Though other attempts have been made to use light scattering as a probe of various aspects of bacterial motility, no other comprehensive studies to establish firmly the basic capabilities and limitations of the technique have been published. The intrinsic accuracy of the assay of translational speeds by photon correlation spectroscopy is investigated by analysis of synthetic autocorrelation data; consistently accurate estimates of the mean and second moment of the speed distribution can be calculated. Extensive analyses of experimental preparations of Salmonella typhimurium examine the possible sources of experimental difficulty with the assay. Cinematography confirms the bacterial speed estimates obtained by photon correlation techniques. PMID:346073
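
    As a hedged sketch of the underlying analysis (an idealized textbook model, not the authors' code), the field autocorrelation for monodisperse straight-line swimmers can be written g1(tau) = sin(q v tau)/(q v tau); fitting that form to synthetic data recovers a mean speed. The scattering vector, true speed, and noise level below are assumptions.

```python
# Idealized model (not the authors' code): field autocorrelation of
# monodisperse straight-line swimmers, g1(tau) = sin(q v tau)/(q v tau);
# q, the true speed and the noise level are assumptions.
import numpy as np
from scipy.optimize import curve_fit

q = 2.3e7                                   # scattering vector, 1/m (assumed)
tau = np.linspace(1e-5, 5e-3, 200)          # correlation delay times, s

def g1(tau, v):
    return np.sinc(q * v * tau / np.pi)     # np.sinc(x) = sin(pi x)/(pi x)

rng = np.random.default_rng(1)
data = g1(tau, 20e-6) + rng.normal(0.0, 0.01, tau.size)   # "measured" g1
(v_fit,), _ = curve_fit(g1, tau, data, p0=[10e-6])
print(f"fitted mean speed: {v_fit * 1e6:.1f} um/s")       # close to 20 um/s
```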

  15. Design of a submillimeter laser Thomson scattering system for measurement of ion temperature in SUMMA

    NASA Technical Reports Server (NTRS)

    Praddaude, H. C.; Woskoboinikow, P.

    1978-01-01

    A thorough discussion of submillimeter laser Thomson scattering for the measurement of ion temperature in plasmas is presented. This technique is very promising and work is being actively pursued on the high power lasers and receivers necessary for its implementation. In this report we perform an overall system analysis of the Thomson scattering technique aimed to: (1) identify problem areas; (2) establish specifications for the main components of the apparatus; (3) study signal processing alternatives and identify the optimum signal handling procedure. Because of its importance for the successful implementation of this technique, we also review the work presently being carried out on the optically pumped submillimeter CH3F and D2O lasers.

  16. Power flow as a complement to statistical energy analysis and finite element analysis

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1987-01-01

    Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly, while SEA methods can only predict an average response level. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. This power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies to verify the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.
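
    The power flow idea reduces, for a point force, to the time-averaged input power P = (1/2)|F|^2 Re{Y}, with Y the driving-point mobility. As a hedged sketch, the code below evaluates this for the textbook mobility of an infinite Euler-Bernoulli beam, an idealization chosen for illustration rather than the coupled two-beam model of the paper; the material and section properties are assumed.

```python
# Hedged sketch: input power from a point force via the driving-point
# mobility, P = 0.5 |F|^2 Re{Y}.  Y here is the textbook mobility of an
# infinite Euler-Bernoulli beam (an idealization; properties assumed).
import numpy as np

E, rho = 210e9, 7850.0            # steel (assumed)
b = h = 0.01                      # 10 mm square cross-section
A, I = b * h, b * h ** 3 / 12.0

def input_power(force_amp, freq_hz):
    w = 2.0 * np.pi * freq_hz
    cb = np.sqrt(w) * (E * I / (rho * A)) ** 0.25   # bending wave speed
    Y = (1.0 - 1.0j) / (4.0 * rho * A * cb)         # driving-point mobility
    return 0.5 * force_amp ** 2 * Y.real            # time-averaged input power

for f in (100.0, 1000.0, 10000.0):
    print(f"{f:7.0f} Hz   P_in = {input_power(1.0, f):.3e} W")
```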

  17. Re-Operationalizing Established Groups in Brainstorming: Validating Osborn's Claims

    ERIC Educational Resources Information Center

    Levine, Kenneth J.; Heuett, Kyle B.; Reno, Katie M.

    2017-01-01

    Since the introduction of brainstorming as an idea-generation technique to address organizational problems, researchers have struggled to replicate some of the claims around the technique. One major concern has been the differences in the number of ideas generated between established groups as found in industry versus the non-established groups…

  18. Hyperspectral wide gap second derivative analysis for in vivo detection of cervical intraepithelial neoplasia

    NASA Astrophysics Data System (ADS)

    Zheng, Wenli; Wang, Chaojian; Chang, Shufang; Zhang, Shiwu; Xu, Ronald X.

    2015-12-01

    The hyperspectral reflectance imaging technique has been used for in vivo detection of cervical intraepithelial neoplasia. However, the clinical outcome of this technique has been suboptimal owing to multiple limitations such as nonuniform illumination, a high-cost and bulky setup, and time-consuming data acquisition and processing. To overcome these limitations, we acquired the hyperspectral data cube over wavelengths ranging from 600 to 800 nm and processed it by a wide gap second derivative analysis method. This method effectively reduced the image artifacts caused by nonuniform illumination and background absorption. Furthermore, with second derivative analysis, only three specific wavelengths (620, 696, and 772 nm) are needed for tissue classification with optimal separability. Clinical feasibility of the proposed image analysis and classification method was tested in a clinical trial where cervical hyperspectral images from three patients were used for classification analysis. Our proposed method successfully classified the cervix tissue into three categories: normal, inflammation, and high-grade lesion. These classification results were consistent with those made by an experienced gynecologic oncologist after applying acetic acid. Our preliminary clinical study has demonstrated the technical feasibility of in vivo and noninvasive detection of cervical neoplasia without acetic acid. Further clinical research is needed in order to establish a large-scale diagnostic database and optimize the tissue classification technique.
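
    A "wide gap" second derivative is, in its simplest gap-segment form, a central difference taken across a gap of several bands rather than adjacent ones. The sketch below applies that form to a toy hyperspectral cube and extracts the three reported wavelengths; the gap size and data are illustrative assumptions, not the authors' exact algorithm.

```python
# Gap-segment style "wide gap" second derivative (gap size and data are
# illustrative; a common formulation, not the authors' exact code).
import numpy as np

def wide_gap_second_derivative(spectrum, gap):
    """d2 approximated as A(i - gap) - 2 A(i) + A(i + gap)."""
    d2 = np.full(spectrum.shape, np.nan)
    d2[gap:-gap] = spectrum[:-2 * gap] - 2 * spectrum[gap:-gap] + spectrum[2 * gap:]
    return d2

wavelengths = np.arange(600, 801)                  # 600-800 nm, 1 nm steps
cube = np.random.rand(64, 64, wavelengths.size)    # toy reflectance cube
d2_cube = np.apply_along_axis(wide_gap_second_derivative, 2, cube, gap=20)
# Classification would then use the d2 values at the reported wavelengths:
bands = [int(np.argmin(np.abs(wavelengths - w))) for w in (620, 696, 772)]
features = d2_cube[:, :, bands]                    # one 3-vector per pixel
```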

  19. Hyperspectral wide gap second derivative analysis for in vivo detection of cervical intraepithelial neoplasia.

    PubMed

    Zheng, Wenli; Wang, Chaojian; Chang, Shufang; Zhang, Shiwu; Xu, Ronald X

    2015-12-01

    The hyperspectral reflectance imaging technique has been used for in vivo detection of cervical intraepithelial neoplasia. However, the clinical outcome of this technique has been suboptimal owing to multiple limitations such as nonuniform illumination, a high-cost and bulky setup, and time-consuming data acquisition and processing. To overcome these limitations, we acquired the hyperspectral data cube over wavelengths ranging from 600 to 800 nm and processed it by a wide gap second derivative analysis method. This method effectively reduced the image artifacts caused by nonuniform illumination and background absorption. Furthermore, with second derivative analysis, only three specific wavelengths (620, 696, and 772 nm) are needed for tissue classification with optimal separability. Clinical feasibility of the proposed image analysis and classification method was tested in a clinical trial where cervical hyperspectral images from three patients were used for classification analysis. Our proposed method successfully classified the cervix tissue into three categories: normal, inflammation, and high-grade lesion. These classification results were consistent with those made by an experienced gynecologic oncologist after applying acetic acid. Our preliminary clinical study has demonstrated the technical feasibility of in vivo and noninvasive detection of cervical neoplasia without acetic acid. Further clinical research is needed in order to establish a large-scale diagnostic database and optimize the tissue classification technique.

  20. Laser ablation-inductively coupled plasma mass spectrometry for the characterization of pigments in prehistoric rock art.

    PubMed

    Resano, Martin; García-Ruiz, Esperanza; Alloza, Ramiro; Marzo, Maria P; Vandenabeele, Peter; Vanhaecke, Frank

    2007-12-01

    In this work, several red-colored paintings of post-Paleolithic schematic style found in 10 different shelters in the vicinity of the Vero River (Huesca) were sampled and subjected to analysis by means of scanning electron microscopy-energy-dispersive X-ray spectrometry (SEM-EDX), Raman spectroscopy, and laser ablation-inductively coupled plasma mass spectrometry (LA-ICPMS). The goal of this research was to obtain meaningful information on the samples' composition, in order to establish differences or similarities among them. The combined use of these techniques proved beneficial: Raman data permitted structural information on the compounds present to be obtained (hematite was identified as the main pigment, whereas calcite and gypsum are the main components of the substrate layer, as well as of the accretions that covered the pigments), while the quantitative values obtained by SEM were suitable for the use of Ca as an internal reference during LA-ICPMS analysis. However, it was this latter technique that provided the most relevant data for fingerprinting purposes. The potential of this technique for obtaining spatially resolved information allowed the multielement quantitative analysis of the pigment layer, in spite of the presence of superficial accretions. The sensitivity of the technique permitted the determination of more than 40 elements present in a wide concentration range (from microgram per gram to the 10% level) with minimum sample consumption (approximately 900 ng for each sample, corresponding to five replicates). Finally, in order to establish significant differences, only those elements showing a high correlation with Fe (As, Co, Mo, Sb, Tl, and Zr, in this case) were selected, as it is expected that these were truly present in the original pigment, while others could have migrated into the pigment layer over time. By using this information, it seems feasible to discriminate between various paint pots, as demonstrated for the samples under investigation.
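
    The element-selection step described at the end can be illustrated with a simple correlation screen: keep only elements whose concentrations track Fe across samples. The sketch below does this on synthetic data; the correlation cutoff and the simulated concentrations are assumptions, not the study's values.

```python
# Correlation screen on synthetic data (cutoff and concentrations assumed):
# retain only elements whose concentrations track Fe across samples.
import numpy as np

rng = np.random.default_rng(2)
n = 30                                              # number of pigment samples
fe = rng.lognormal(mean=2.0, size=n)                # Fe content per sample
candidates = {
    "As": fe * rng.lognormal(sigma=0.1, size=n),    # tracks Fe (pigment-borne)
    "Co": fe * rng.lognormal(sigma=0.1, size=n),    # tracks Fe (pigment-borne)
    "Ca": rng.lognormal(mean=3.0, size=n),          # substrate/accretion element
}
selected = [name for name, c in candidates.items()
            if abs(np.corrcoef(fe, c)[0, 1]) > 0.8]  # assumed cutoff
print("retained for fingerprinting:", selected)      # expected: ['As', 'Co']
```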

  1. FT-IR spectroscopy characterization of schwannoma: a case study

    NASA Astrophysics Data System (ADS)

    Ferreira, Isabelle; Neto, Lazaro P. M.; das Chagas, Maurilio José; Carvalho, Luís. Felipe C. S.; dos Santos, Laurita; Ribas, Marcelo; Loddi, Vinicius; Martin, Airton A.

    2016-03-01

    Schwannomas are rare benign neural neoplasms. Their clinical diagnosis could be improved if novel optical techniques were applied. Among these techniques, FT-IR is one currently being applied for sample discrimination, using biochemical information with minimal sample preparation. In this work, we report a case of a schwannoma in the cervical region. A histological examination described a benign process. An immunohistochemical examination demonstrated positivity to anti-S100 protein antibody, indicating a diagnosis of schwannoma. The aim of this analysis was to characterize the FT-IR spectrum of the neoplastic and normal tissue in the fingerprint (1000-1800 cm-1) and high wavenumber (2800-3600 cm-1) regions. The IR spectra were collected from tumor tissue and normal nerve samples by an FT-IR spectrophotometer (Spotlight Perkin Elmer 400, USA) with 64 scans and a resolution of 4 cm-1. A total of twenty spectra were recorded (10 from schwannoma and 10 from nerve). Multivariate analysis was used to classify the data. Through mean and standard deviation analysis we observed that the main spectral changes occur at ≈1600 cm-1 (amide I) and ≈1400 cm-1 (amide III) in the fingerprint region, and in the CH2/CH3 protein-lipid and OH-water vibrations in the high wavenumber region. In conclusion, FT-IR could be used as a technique for schwannoma analysis, helping to establish a specific diagnosis.

  2. Random vibration analysis of space flight hardware using NASTRAN

    NASA Technical Reports Server (NTRS)

    Thampi, S. K.; Vidyasagar, S. N.

    1990-01-01

    During liftoff and ascent flight phases, the Space Transportation System (STS) and payloads are exposed to the random acoustic environment produced by engine exhaust plumes and aerodynamic disturbances. The analysis of payloads for randomly fluctuating loads is usually carried out using Miles' relationship. This approximation technique computes an equivalent load factor as a function of the natural frequency of the structure, the power spectral density of the excitation, and the magnification factor at resonance. Due to the assumptions inherent in Miles' equation, random load factors are often over-estimated by this approach. In such cases, the estimates can be refined using alternate techniques such as time domain simulations or frequency domain spectral analysis. Described here is the use of NASTRAN to compute more realistic random load factors through spectral analysis. The procedure is illustrated using Spacelab Life Sciences (SLS-1) payloads, and certain unique features of this problem are described. The solutions are compared with Miles' results in order to establish trends of over- or under-prediction.
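
    Miles' relationship itself is compact enough to state directly: the equivalent RMS response of a single-degree-of-freedom system is sqrt((pi/2) f_n Q W(f_n)), with W the input acceleration PSD at the natural frequency, and design load factors are commonly taken at 3 sigma. A hedged sketch with illustrative payload values follows.

```python
# Hedged sketch of Miles' equation with illustrative payload values.
import math

def miles_rms_g(f_n_hz, q_factor, psd_g2_per_hz):
    """Equivalent RMS acceleration (g) of a 1-DOF system: input PSD value
    taken at the natural frequency, amplification Q at resonance."""
    return math.sqrt(math.pi / 2.0 * f_n_hz * q_factor * psd_g2_per_hz)

f_n, Q, W = 80.0, 10.0, 0.04       # Hz, dimensionless, g^2/Hz (all assumed)
rms = miles_rms_g(f_n, Q, W)
print(f"RMS = {rms:.2f} g, 3-sigma design load factor = {3 * rms:.2f} g")
```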

  3. ELICIT: An alternative imprecise weight elicitation technique for use in multi-criteria decision analysis for healthcare

    PubMed Central

    Diaby, Vakaramoko; Sanogo, Vassiki; Moussa, Kouame Richard

    2015-01-01

    Objective In this paper, the readers are introduced to ELICIT, an imprecise weight elicitation technique for multicriteria decision analysis for healthcare. Methods The application of ELICIT consists of two steps: the rank ordering of evaluation criteria based on decision-makers' (DMs) preferences using principal component analysis; and the estimation of criteria weights and their descriptive statistics using variable interdependent analysis and the Monte Carlo method. The application of ELICIT is illustrated with a hypothetical case study involving the elicitation of weights for five criteria used to select the best device for eye surgery. Results The criteria were ranked from 1 to 5, based on a strict preference relationship established by the DMs. For each criterion, the deterministic weight was estimated, as well as the standard deviation and 95% credibility interval. Conclusions ELICIT is appropriate in situations where only ordinal DMs' preferences are available to elicit decision criteria weights. PMID:26361235
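
    ELICIT's variable interdependent analysis is not reproduced here, but the general idea of Monte Carlo weight estimation under a strict ranking can be sketched generically: sample weights from the simplex, enforce the rank order, and summarize each criterion's distribution. The draw count and summary statistics below are illustrative; this is not the ELICIT algorithm itself.

```python
# Generic rank-constrained Monte Carlo (illustrative; NOT the ELICIT
# algorithm itself): sample the simplex, enforce w1 > w2 > ... > w5,
# then summarize each criterion's weight distribution.
import numpy as np

rng = np.random.default_rng(3)
n_criteria, n_draws = 5, 200_000
w = rng.dirichlet(np.ones(n_criteria), size=n_draws)
w = -np.sort(-w, axis=1)                    # impose the strict ranking per draw

mean = w.mean(axis=0)
lo, hi = np.percentile(w, [2.5, 97.5], axis=0)
for i in range(n_criteria):
    print(f"criterion {i + 1}: weight {mean[i]:.3f} "
          f"(95% interval {lo[i]:.3f}-{hi[i]:.3f})")
```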

  4. Provenance Establishment of Stingless Bee Honey Using Multi-element Analysis in Combination with Chemometrics Techniques.

    PubMed

    Shadan, Aidil Fahmi; Mahat, Naji A; Wan Ibrahim, Wan Aini; Ariffin, Zaiton; Ismail, Dzulkiflee

    2018-01-01

    As consumption of stingless bee honey has been gaining popularity in many countries including Malaysia, ability to identify accurately its geographical origin proves pertinent for investigating fraudulent activities for consumer protection. Because a chemical signature can be location-specific, multi-element distribution patterns may prove useful for provenancing such product. Using the inductively coupled-plasma optical emission spectrometer as well as principal component analysis (PCA) and linear discriminant analysis (LDA), the distributions of multi-elements in stingless bee honey collected at four different geographical locations (North, West, East, and South) in Johor, Malaysia, were investigated. While cross-validation using PCA demonstrated 87.0% correct classification rate, the same was improved (96.2%) with the use of LDA, indicating that discrimination was possible for the different geographical regions. Therefore, utilization of multi-element analysis coupled with chemometrics techniques for assigning the provenance of stingless bee honeys for forensic applications is supported. © 2017 American Academy of Forensic Sciences.
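
    As a hedged sketch of the chemometrics workflow (synthetic stand-in data, and a generic PCA/LDA pipeline rather than the authors' exact procedure), the following cross-validates two classifiers on simulated multi-element profiles; the group labels, element count and regional offsets are assumptions.

```python
# Hedged sketch of the PCA / LDA workflow on synthetic stand-in data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
X = rng.normal(size=(80, 12))               # 80 honeys x 12 elements
X[:20, 0] += 2.0                            # "North" enriched in element 0
X[20:40, 1] += 2.0                          # "West" enriched in element 1
X[40:60, 2] += 2.0                          # "East" enriched in element 2
y = np.repeat(["North", "West", "East", "South"], 20)

models = {
    "PCA (3 PCs) + LDA": make_pipeline(StandardScaler(), PCA(n_components=3),
                                       LinearDiscriminantAnalysis()),
    "LDA alone": make_pipeline(StandardScaler(), LinearDiscriminantAnalysis()),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: cross-validated accuracy {acc:.2f}")
```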

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasdekis, Andreas E.; Stephanopoulos, Gregory

    The sampling and manipulation of cells down to the individual has been of substantial interest since the very beginning of the Life Sciences. Herein, our objective is to highlight the most recent developments in single cell manipulation, as well as pioneering ones. First, flow-through methods will be discussed, namely methods in which the single cells flow continuously in an ordered manner during their analysis. This section will be followed by confinement techniques that enable cell isolation and confinement in one, two or three dimensions. Flow cytometry and droplet microfluidics are the two most common methods of flow-through analysis. While both are high-throughput techniques, their difference lies in the fact that the droplet-encapsulated cells experience a restricted and personal microenvironment, while in flow cytometry cells experience similar nutrient and stimuli initial concentrations. These methods are rather well established; however, they have recently enabled immense strides in single cell phenotypic analysis, namely the identification and analysis of metabolically distinct individuals from an isogenic population using both droplet microfluidics and flow cytometry.

  6. The analysis and forecasting of male cycling time trial records established within England and Wales.

    PubMed

    Dyer, Bryce; Hassani, Hossein; Shadi, Mehran

    2016-01-01

    The format of cycling time trials in England, Wales and Northern Ireland involves riders competing individually over several fixed race distances of 10-100 miles in length, and using time-constrained formats of 12 and 24 h in duration. Drawing on data provided by the national governing body that covers the regions of England and Wales, an analysis of six male competition record progressions was undertaken. Future forecasts are then projected through use of the Singular Spectrum Analysis technique. This method has not been applied to sport-based time series data before. All six records have seen a progressive improvement and are non-linear in nature. Five records saw their highest level of record change during the 1950-1969 period. Whilst the frequency of new records has generally declined since this period, the magnitude of performance improvement has generally increased. The Singular Spectrum Analysis technique successfully provided forecast projections in the short to medium term with a high level of fit to the time series data.
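
    A basic Singular Spectrum Analysis pipeline can be sketched generically: build the trajectory matrix, truncate its SVD, diagonally average back to a series, and forecast with the linear recurrence implied by the leading left singular vectors. The window length, rank, and synthetic record series below are illustrative assumptions, not the authors' settings.

```python
# Generic SSA sketch (basic reconstruction + recurrent forecasting;
# window, rank and data are illustrative, not the authors' procedure).
import numpy as np

def ssa_forecast(x, window, rank, horizon):
    n = len(x)
    k = n - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])  # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]                 # rank-r approximation
    rec = np.zeros(n)                                         # diagonal averaging
    counts = np.zeros(n)
    for i in range(window):
        for j in range(k):
            rec[i + j] += Xr[i, j]
            counts[i + j] += 1
    rec /= counts
    # Linear recurrence from the leading left singular vectors
    P, pi = U[:-1, :rank], U[-1, :rank]
    R = P @ pi / (1.0 - pi @ pi)
    series = list(rec)
    for _ in range(horizon):
        series.append(R @ series[-(window - 1):])
    return rec, np.array(series[n:])

years = np.arange(1930, 2010)                 # toy record progression
rng = np.random.default_rng(5)
records = 60 - 10 * np.log1p(years - 1929) + rng.normal(0, 0.3, years.size)
rec, forecast = ssa_forecast(records, window=20, rank=2, horizon=10)
print("next 10 forecast values:", np.round(forecast, 1))
```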

  7. PCR-based analysis of microbial communities during the EuroGeoMars campaign at Mars Desert Research Station, Utah

    NASA Astrophysics Data System (ADS)

    Thiel, Cora S.; Ehrenfreund, Pascale; Foing, Bernard; Pletser, Vladimir; Ullrich, Oliver

    2011-07-01

    The search for evidence of past or present life on Mars will require the detection of markers that indicate the presence of life. Because deoxyribonucleic acid (DNA) is found in all known living organisms, it is considered to be a ‘biosignature’ of life. The main function of DNA is the long-term storage of genetic information, which is passed on from generation to generation as hereditary material. The Polymerase Chain Reaction (PCR) is a revolutionary technique which allows a single fragment or a small number of fragments of a DNA molecule to be amplified millions of times, making it possible to detect minimal traces of DNA. The compactness of the contemporary PCR instruments makes routine sample analysis possible with a minimum amount of laboratory space. Furthermore the technique is effective, robust and straightforward. Our goal was to establish a routine for the detection of DNA from micro-organisms using the PCR technique during the EuroGeoMars simulation campaign. This took place at the Mars Society's Mars Desert Research Station (MDRS) in Utah in February 2009 (organized with the support of the International Lunar Exploration Working Group (ILEWG), NASA Ames and the European Space Research and Technology Centre (ESTEC)). During the MDRS simulation, we showed that it is possible to establish a minimal molecular biology lab in the habitat for the immediate on-site analysis of samples by PCR after sample collection. Soil and water samples were taken at different locations and soil depths. The sample analysis was started immediately after the crew returned to the habitat laboratory. DNA was isolated from micro-organisms and used as a template for PCR analysis of the highly conserved ribosomal DNA to identify representatives of the different groups of micro-organisms (bacteria, archaea and eukarya). The PCR products were visualized by agarose gel electrophoresis and documented by transillumination and digital imaging. The microbial diversity in the collected samples was analysed with respect to sampling depth and the presence or absence of vegetation. For the first time, we have demonstrated that it is possible to perform direct on-site DNA analysis by PCR at MDRS, a simulated planetary habitat in an extreme environment that serves as a model for preparation and optimization of techniques to be used for future Mars exploration.

  8. Application of the statistical process control method for prospective patient safety monitoring during the learning phase: robotic kidney transplantation with regional hypothermia (IDEAL phase 2a-b).

    PubMed

    Sood, Akshay; Ghani, Khurshid R; Ahlawat, Rajesh; Modi, Pranjal; Abaza, Ronney; Jeong, Wooju; Sammon, Jesse D; Diaz, Mireya; Kher, Vijay; Menon, Mani; Bhandari, Mahendra

    2014-08-01

    Traditional evaluation of the learning curve (LC) of an operation has been retrospective. Furthermore, LC analysis does not permit patient safety monitoring. To prospectively monitor patient safety during the learning phase of robotic kidney transplantation (RKT) and determine when it could be considered learned using the techniques of statistical process control (SPC). From January through May 2013, 41 patients with end-stage renal disease underwent RKT with regional hypothermia at one of two tertiary referral centers adopting RKT. Transplant recipients were classified into three groups based on the robotic training and kidney transplant experience of the surgeons: group 1, robot-trained with limited kidney transplant experience (n=7); group 2, robot-trained and kidney transplant experienced (n=20); and group 3, kidney transplant experienced with limited robot training (n=14). We employed prospective monitoring using SPC techniques, including cumulative summation (CUSUM) and Shewhart control charts, to perform LC analysis and patient safety monitoring, respectively. Outcomes assessed included post-transplant graft function and measures of surgical process (anastomotic and ischemic times). CUSUM and Shewhart control charts are time trend analytic techniques that allow comparative assessment of outcomes following a new intervention (RKT) relative to those achieved with established techniques (open kidney transplant; target value) in a prospective fashion. CUSUM analysis revealed an initial learning phase for group 3, whereas groups 1 and 2 had no to minimal learning time. The learning phase for group 3 varied depending on the parameter assessed. Shewhart control charts demonstrated no compromise in functional outcomes for groups 1 and 2. Graft function was compromised in one patient in group 3 (p<0.05) secondary to reasons unrelated to RKT. In multivariable analysis, robot training was significantly associated with improved task-completion times (p<0.01). Graft function was not adversely affected by either the lack of robotic training (p=0.22) or kidney transplant experience (p=0.72). The LC and patient safety of a new surgical technique can be assessed prospectively using CUSUM and Shewhart control chart analytic techniques. These methods allow determination of the duration of mentorship and identification of adverse events in a timely manner. A new operation can be considered learned when outcomes achieved with the new intervention are on par with outcomes following established techniques. Statistical process control techniques allowed for robust, objective, and prospective monitoring of robotic kidney transplantation and can similarly be applied to other new interventions during the introduction and adoption phase. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.
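
    The one-sided CUSUM used for this kind of prospective monitoring accumulates unfavorable deviations from a target and raises an alarm when a decision interval is crossed. A minimal sketch follows; the target, slack, threshold, and case times are assumed values, not the study's.

```python
# Minimal one-sided CUSUM for monitoring a surgical process measure
# (target, slack and decision interval are assumed, not the study's).
import numpy as np

def cusum_upper(x, target, slack):
    """Cumulative sum of unfavorable deviations above the target."""
    s = np.zeros(len(x) + 1)
    for i, xi in enumerate(x, start=1):
        s[i] = max(0.0, s[i - 1] + (xi - target - slack))
    return s[1:]

# Hypothetical anastomotic times (minutes) over consecutive cases:
times = np.array([52, 48, 45, 47, 41, 40, 38, 39, 36, 35.0])
s = cusum_upper(times, target=40.0, slack=2.0)
alarms = np.nonzero(s > 15.0)[0] + 1        # decision interval h = 15 (assumed)
print("CUSUM:", np.round(s, 1))
print("alarm raised at cases:", alarms)     # early learning phase flagged
```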

  9. Optimum element density studies for finite-element thermal analysis of hypersonic aircraft structures

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Olona, Timothy; Muramoto, Kyle M.

    1990-01-01

    Different finite element models previously set up for thermal analysis of the space shuttle orbiter structure are discussed and their shortcomings identified. Element density criteria are established for finite element thermal modeling of space shuttle orbiter-type large, hypersonic aircraft structures. These criteria are based on rigorous studies of solution accuracy using different finite element models, with different element densities, set up for one cell of the orbiter wing. Also, a method for optimizing the transient thermal analysis computer central processing unit (CPU) time is discussed. Based on the newly established element density criteria, the orbiter wing midspan segment was modeled to examine thermal analysis solution accuracy and the extent of the computation CPU time requirements. The results showed that the distributions of the structural temperatures and the thermal stresses obtained from this wing segment model were satisfactory and the computation CPU time was at an acceptable level. The studies suggest that modeling large, hypersonic aircraft structures with high-density elements for transient thermal analysis is feasible if a CPU optimization technique is used.

  10. New developments of X-ray fluorescence imaging techniques in laboratory

    NASA Astrophysics Data System (ADS)

    Tsuji, Kouichi; Matsuno, Tsuyoshi; Takimoto, Yuki; Yamanashi, Masaki; Kometani, Noritsugu; Sasaki, Yuji C.; Hasegawa, Takeshi; Kato, Shuichi; Yamada, Takashi; Shoji, Takashi; Kawahara, Naoki

    2015-11-01

    X-ray fluorescence (XRF) analysis is a well-established analytical technique with a long research history. Many applications have been reported in various fields, such as the environmental, archeological, biological, and forensic sciences, as well as in industry. This is because XRF has the unique advantage of being a nondestructive analytical tool with good precision for quantitative analysis. Recent advances in XRF analysis have been realized by the development of new x-ray optics and x-ray detectors. Advanced x-ray focusing optics enables the production of a micro x-ray beam, leading to micro-XRF analysis and XRF imaging. A confocal micro-XRF technique has been applied for the visualization of elemental distributions inside samples. This technique was applied to liquid samples and to monitoring chemical reactions such as the metal corrosion of steel samples in NaCl solutions. In addition, principal component analysis was applied to reduce the background intensity in XRF spectra obtained during XRF mapping, leading to improved spatial resolution of confocal micro-XRF images. In parallel, the authors have proposed a wavelength dispersive XRF (WD-XRF) imaging spectrometer for fast elemental imaging. A new two-dimensional x-ray detector, the Pilatus detector, was applied for WD-XRF imaging. Fast XRF imaging in 1 s or even less was demonstrated for Euro coins and industrial samples. In this review paper, these recent advances in laboratory-based XRF imaging are introduced.

  11. Kinematic analysis of total knee prosthesis designed for Asian population.

    PubMed

    Low, F H; Khoo, L P; Chua, C K; Lo, N N

    2000-01-01

    In designing a total knee replacement (TKR) prosthesis catering to the Asian population, 62 sets of femurs were harvested and analyzed. The morphometrical data obtained were found to be in good agreement with dimensions typical of the Asian knee and reaffirmed the fact that Caucasian knees are generally larger than Asian knees. Subsequently, these data, when treated using a multivariate statistical technique, resulted in the establishment of major design parameters for six different sizes of femoral implants. An extra-small implant size with established dimensions and geometrical shape emerged from the study. The differences between Asian knees and Caucasian knees are discussed. Employing the established femoral dimensions and the motion path of the knee joint, the articulating tibia profile was generated. All sizes of implants were modeled using a computer-aided design software package. These models, which accurately fit the local Asian knee, were then imported into a dynamic and kinematic analysis software package. The tibiofemoral joint was modeled successfully as a slide curve joint to study intuitively the motion of the femur when articulating on the tibia surface. An optimal tibia profile could be synthesized to mimic the natural knee path motion. Details of the analysis are presented and discussed.

  12. Establishing ¹H nuclear magnetic resonance based metabonomics fingerprinting profile for spinal cord injury: a pilot study.

    PubMed

    Jiang, Hua; Peng, Jin; Zhou, Zhi-yuan; Duan, Yu; Chen, Wei; Cai, Bin; Yang, Hao; Zhang, Wei

    2010-09-01

    Spinal cord injury (SCI) is a complex trauma that involves multiple pathological mechanisms, including cytotoxicity, oxidative stress, and immune-endocrine disturbances. This study aimed to establish a plasma metabonomics fingerprinting atlas for SCI using (1)H nuclear magnetic resonance (NMR) based metabonomics methodology and principal component analysis techniques. Nine Sprague-Dawley (SD) male rats were randomly divided into SCI, normal and sham-operation control groups. Plasma samples were collected for (1)H NMR spectroscopy 3 days after operation. The NMR data were analyzed using the principal component analysis technique with Matlab software. Metabonomics analysis was able to distinguish the three groups (SCI, normal control, sham-operation). The fingerprinting atlas indicated that, compared with those without SCI, the SCI group was characterized by the second principal component, which comprised fatty acids, myo-inositol, arginine, very low-density lipoprotein (VLDL), low-density lipoprotein (LDL), triglyceride (TG), glucose, and 3-methyl-histamine. The data indicated that SCI results in several significant early changes in plasma metabolism and that a metabonomics approach based on (1)H NMR spectroscopy can provide a metabolic profile comprising several metabolite classes and allow for relative quantification of such changes. The results also provided support for further development and application of metabonomics technologies for studying SCI and for the utilization of multivariate models for classifying the extent of trauma within an individual.

  13. Contextual analysis of immunological response through whole-organ fluorescent imaging.

    PubMed

    Woodruff, Matthew C; Herndon, Caroline N; Heesters, B A; Carroll, Michael C

    2013-09-01

    As fluorescence microscopy has developed, significant insights have been gained into the establishment of the immune response within secondary lymphoid organs, particularly in draining lymph nodes. While established techniques such as confocal imaging and intravital multi-photon microscopy have proven invaluable, they provide limited insight into the architectural and structural context in which these responses occur. To interrogate the role of the lymph node environment in the immune response effectively, a new set of imaging tools that takes broader architectural context into account must be brought to bear on emerging immunological questions. Using two different methods of whole-organ imaging, optical clearing and three-dimensional reconstruction of serially sectioned lymph nodes, fluorescent representations of whole lymph nodes can be acquired at cellular resolution. Using freely available post-processing tools, images of unlimited size and depth can be assembled into cohesive, contextual snapshots of immunological response. Through the implementation of robust iterative analysis techniques, these highly complex three-dimensional images can be reduced to sortable object data sets. These data can then be used to interrogate complex questions at the cellular level within the broader context of lymph node biology. By combining existing imaging technology with complex methods of sample preparation and capture, we have developed efficient systems for contextualizing immunological phenomena within lymphatic architecture. In combination with robust approaches to image analysis, these advances provide a path to integrating scientific understanding of basic lymphatic biology into the complex nature of immunological response.

  14. Protein identification and quantification from riverbank grape, Vitis riparia: Comparing SDS-PAGE and FASP-GPF techniques for shotgun proteomic analysis.

    PubMed

    George, Iniga S; Fennell, Anne Y; Haynes, Paul A

    2015-09-01

    Protein sample preparation optimisation is critical for establishing reproducible high throughput proteomic analysis. In this study, two different fractionation sample preparation techniques (in-gel digestion and in-solution digestion) for shotgun proteomics were used to quantitatively compare proteins identified in Vitis riparia leaf samples. The total number of proteins and peptides identified were compared between filter aided sample preparation (FASP) coupled with gas phase fractionation (GPF) and SDS-PAGE methods. There was a 24% increase in the total number of reproducibly identified proteins when FASP-GPF was used. FASP-GPF is more reproducible, less expensive and a better method than SDS-PAGE for shotgun proteomics of grapevine samples as it significantly increases protein identification across biological replicates. Total peptide and protein information from the two fractionation techniques is available in PRIDE with the identifier PXD001399 (http://proteomecentral.proteomexchange.org/dataset/PXD001399). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Synthesis, structural, optical, thermal and dielectric studies on new organic nonlinear optical crystal by solution growth technique.

    PubMed

    Prakash, M; Geetha, D; Lydia Caroline, M

    2013-04-15

    Single crystals of L-phenylalanine-benzoic acid (LPBA) were successfully grown from aqueous solution by the solvent evaporation technique. The purity of the crystals was increased by recrystallization. XRD analysis confirms that the crystal belongs to the monoclinic system with the noncentrosymmetric space group P21. The chemical structure of the compound was established by FT-NMR. The presence of functional groups was assessed qualitatively by Fourier transform infrared (FT-IR) analysis. Ultraviolet-visible spectral analyses showed that the crystal has a low UV cut-off at 254 nm combined with very good transparency of 90% over a wide range. The optical band gap was estimated to be 6.91 eV. Thermal behavior was studied with TGA/DTA analyses. The second harmonic generation (SHG) efficiency was found to be 0.56 times that of KDP. The dielectric behavior of the sample was also studied for the first time. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. Whole-Body Human Inverse Dynamics with Distributed Micro-Accelerometers, Gyros and Force Sensing †

    PubMed Central

    Latella, Claudia; Kuppuswamy, Naveen; Romano, Francesco; Traversaro, Silvio; Nori, Francesco

    2016-01-01

    Human motion tracking is a powerful tool used in a large range of applications that require human movement analysis. Although it is a well-established technique, its main limitation is the lack of real-time estimation of kinetic information such as forces and torques during motion capture. In this paper, we present a novel approach to soft wearable force tracking for the simultaneous estimation of whole-body forces along with the motion. The early stage of our framework encompasses traditional passive-marker-based methods and inertial and contact force sensor modalities, and harnesses a probabilistic computational technique for estimating dynamic quantities, originally proposed in the domain of humanoid robot control. We present an experimental analysis of subjects performing a two degrees-of-freedom bowing task, for which we estimate the motion and kinetic quantities. The results demonstrate the validity of the proposed method. We discuss the possible use of this technique in the design of a novel soft wearable force tracking device and its potential applications. PMID:27213394

  17. Theonellapeptolide IIIe, a new cyclic peptolide from the New Zealand deep water sponge, Lamellomorpha strongylata.

    PubMed

    Li, S; Dumdei, E J; Blunt, J W; Munro, M H; Robinson, W T; Pannell, L K

    1998-06-26

    The structure, stereochemistry, and conformation of theonellapeptolide IIIe (1), a new 36-membered ring cyclic peptolide from the New Zealand deep-water sponge Lamellomorpha strongylata, are described. The sequence of the cytotoxic peptolide was determined through a combination of NMR and MS-MS techniques and confirmed by X-ray crystal structure analysis, which, with chiral HPLC, established the absolute stereochemistry.

  18. Experimental evaluation of a new morphological approximation of the articular surfaces of the ankle joint.

    PubMed

    Belvedere, Claudio; Siegler, Sorin; Ensini, Andrea; Toy, Jason; Caravaggi, Paolo; Namani, Ramya; Giannini, Giulia; Durante, Stefano; Leardini, Alberto

    2017-02-28

    The mechanical characteristics of the ankle, such as its kinematics and load transfer properties, are influenced by the geometry of the articulating surfaces. A recent, image-based study found that these surfaces can be approximated by a saddle-shaped, skewed, truncated cone with its apex oriented laterally. The goal of this study was to establish a reliable experimental technique to study the relationship between the geometry of the articular surfaces of the ankle and its mobility and stability characteristics, and to use this technique to determine whether morphological approximations of the ankle surfaces based on recent discoveries produce close to normal behavior. The study was performed on ten cadavers. For each specimen, a process based on medical imaging, modeling and 3D printing was used to produce two subject-specific artificial implantable sets of ankle surfaces. One set was a replica of the natural surfaces. The second approximated the ankle surfaces as a saddle-shaped truncated cone with the apex oriented laterally. Testing under cyclic loading conditions was then performed on each specimen following a previously established technique to determine its mobility and stability characteristics under three different conditions: natural surfaces; artificial surfaces replicating the natural surface morphology; and an artificial approximation based on the saddle-shaped truncated cone concept. A repeated measures analysis of variance was then used to compare the three conditions. The results show that (1) the artificial surfaces replicating natural morphology produce close to natural mobility and stability behavior, thus establishing the reliability of the technique; and (2) the approximated surfaces based on the saddle-shaped truncated cone concept produce mobility and stability behavior close to that of the ankle with natural surfaces. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Fan fault diagnosis based on symmetrized dot pattern analysis and image matching

    NASA Astrophysics Data System (ADS)

    Xu, Xiaogang; Liu, Haixiao; Zhu, Hao; Wang, Songling

    2016-07-01

    To detect mechanical failures of fans, a new diagnostic method based on symmetrized dot pattern (SDP) analysis and image matching is proposed. Vibration signals for 13 kinds of running states are acquired on a centrifugal fan test bed and reconstructed by the SDP technique. SDP pattern templates for each running state are established. An image matching method is performed to diagnose the fault. In order to improve the diagnostic accuracy, single templates, multiple templates and clustered fault templates are used to perform the image matching.
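
    The SDP transform itself maps each sample to polar coordinates, with the normalized amplitude as radius and a lagged, gain-scaled amplitude as the angular offset, mirrored about several symmetry axes. The sketch below follows this common formulation; the lag, gain, and six-fold symmetry are conventional choices, and the signal is synthetic. Templates built from known running states would then be compared against a test pattern by image matching.

```python
# Symmetrized dot pattern (SDP) sketch following the common formulation;
# lag, gain and six-fold symmetry are conventional, illustrative choices.
import numpy as np

def sdp(x, lag=1, gain=np.deg2rad(40), n_mirrors=6):
    r = (x - x.min()) / (x.max() - x.min())        # normalized radius
    r0, r1 = r[:-lag], r[lag:]                     # sample and lagged sample
    points = []
    for m in range(n_mirrors):
        phi = 2 * np.pi * m / n_mirrors            # symmetry axis angle
        points.append((r0, phi + gain * r1))       # dot arm
        points.append((r0, phi - gain * r1))       # mirrored arm
    return points                                  # (radius, angle) array pairs

t = np.linspace(0.0, 1.0, 2000)
rng = np.random.default_rng(6)
signal = np.sin(2 * np.pi * 50 * t) + 0.3 * rng.normal(size=t.size)
pattern = sdp(signal)   # each running state yields a distinct petal shape
```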

  20. Operational experience in underwater photogrammetry

    NASA Astrophysics Data System (ADS)

    Leatherdale, John D.; John Turner, D.

    Underwater photogrammetry has become established as a cost-effective technique for inspection and maintenance of platforms and pipelines for the offshore oil industry. A commercial service based in Scotland operates in the North Sea, USA, Brazil, West Africa and Australia. 70 mm cameras and flash units are built for the purpose, and analytical plotters and computer graphics systems are used for photogrammetric measurement and analysis of damage, corrosion, weld failures and the redesign of underwater structures. Users are seeking simple, low-cost systems for photogrammetric analysis which their engineers can use themselves.

  1. PCR-based Analysis of Microbial Communities in Extreme Environment: Results from EuroGeoMars MDRS campaign

    NASA Astrophysics Data System (ADS)

    Thiel, C.; Wills, D.; Foing, B.; Wadham, J.; Cullen, D.; van Sluis, C.

    2009-04-01

    Deoxyribonucleic acid (DNA) is found in almost all living organisms. The main function of DNA molecules is the long-term storage of genetic information. They are passed on from generation to generation as the hereditary material. This molecular structure is often compared to a genetic blueprint, a fingerprint, which is unique to each organism and can therefore be used as a means of identification. In 1984 a revolutionary technique called polymerase chain reaction (PCR) was established, able to amplify a single or few copies of DNA molecules across several orders of magnitude, generating millions of copies of the original DNA fragment. PCR is nowadays a common technique used in medical and biological research laboratories for a large variety of applications, such as functional analysis of genes, DNA-based phylogeny, diagnosis of hereditary diseases, detection and diagnosis of infectious diseases, and identification of genetic fingerprints. This powerful tool gives us the opportunity to investigate whether there is, or was, life on Mars, since DNA fragments are highly stable, which allows amplification not only from living organisms but also from samples several thousand years old. If we assume that micro-organisms were exchanged between Mars and Earth via meteorites, it is conceivable that Martian life might also be based on DNA as the carrier of genetic information. Our goal is therefore to establish a routine for the detection of DNA from micro-organisms based on the effective but also robust and simple PCR technique, demonstrated during the EuroGeoMars simulation campaign at the Mars Desert Research Station (MDRS). We have already analysed some MDRS soil samples at the ESTEC ExoGeoLab facility. During the MDRS simulation we will show that it is possible to establish a minimal molecular biology lab in the habitat for immediate on-site analysis by PCR after sample collection. Samples will be taken from different locations and soil depths. The sample analysis will start immediately after returning to the habitat and will be finished during the following days. DNA will be isolated from micro-organisms with the PowerSoil DNA isolation kit and will serve as the template for PCR using oligonucleotides specific for ribosomal DNA to identify representatives of the different groups of micro-organisms: archaea, bacteria and eukaryotes. PCR products will be analysed by agarose gel electrophoresis and documented via a UV transilluminator and digital camera.

  2. Network meta-analysis: an introduction for pharmacists.

    PubMed

    Xu, Yina; Amiche, Mohamed Amine; Tadrous, Mina

    2018-05-21

    Network meta-analysis is a new tool used to summarize and compare studies of multiple interventions, irrespective of whether these interventions have been directly evaluated against each other. Network meta-analysis is quickly becoming the standard in conducting therapeutic reviews and clinical guideline development. However, little guidance is available to help pharmacists review network meta-analysis studies in their practice. Major institutions such as the Cochrane Collaboration, Agency for Healthcare Research and Quality, Canadian Agency for Drugs and Technologies in Health, and National Institute for Health and Care Excellence Decision Support Unit have endorsed utilizing network meta-analysis to establish therapeutic evidence and inform decision making. Our objective is to introduce this novel technique to pharmacy practitioners and highlight key assumptions behind network meta-analysis studies.
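
    The simplest building block behind such networks is the adjusted indirect comparison (Bucher-style): if treatments A and C have each been trialed against a common comparator B, an indirect A-versus-C estimate follows by differencing. The sketch below uses made-up log odds ratios purely for illustration.

```python
# Bucher adjusted indirect comparison: the effect sizes below are fabricated
# log odds ratios, not data from any real trials.
import math

d_AB, se_AB = -0.40, 0.15   # A vs B (negative favoring A, say)
d_CB, se_CB = -0.10, 0.20   # C vs B

# Indirect estimate of A vs C through the common comparator B.
d_AC = d_AB - d_CB
se_AC = math.sqrt(se_AB**2 + se_CB**2)   # variances add for independent trials

lo, hi = d_AC - 1.96 * se_AC, d_AC + 1.96 * se_AC
print(f"A vs C: {d_AC:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```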

  3. Synchronous in-field application of life-detection techniques in planetary analog missions

    NASA Astrophysics Data System (ADS)

    Amador, Elena S.; Cable, Morgan L.; Chaudry, Nosheen; Cullen, Thomas; Gentry, Diana; Jacobsen, Malene B.; Murukesan, Gayathri; Schwieterman, Edward W.; Stevens, Adam H.; Stockton, Amanda; Yin, Chang; Cullen, David C.; Geppert, Wolf

    2015-02-01

    Field expeditions that simulate the operations of robotic planetary exploration missions at analog sites on Earth can help establish best practices and are therefore a positive contribution to the planetary exploration community. There are many sites in Iceland that possess heritage as planetary exploration analog locations and whose environmental extremes make them suitable for simulating scientific sampling and robotic operations. We conducted a planetary exploration analog mission at two recent lava fields in Iceland, Fimmvörðuháls (2010) and Eldfell (1973), using a specially developed field laboratory. We tested the utility of in-field site-sampling down-selection and tiered-analysis operational capabilities with three life detection and characterization techniques: fluorescence microscopy (FM), adenosine triphosphate (ATP) bioluminescence assay, and quantitative polymerase chain reaction (qPCR) assay. The study made use of multiple cycles of sample collection at multiple distance scales and field laboratory analysis using the synchronous life-detection techniques to heuristically develop the continuing sampling and analysis strategy during the expedition. Here we report the operational lessons learned and provide brief summaries of scientific data. The full scientific data report will follow separately. We found that rapid in-field analysis to determine subsequent sampling decisions is operationally feasible, and that the chosen life detection and characterization techniques are suitable for a terrestrial life-detection field mission. In-field analysis enables the rapid acquisition of scientific data and thus facilitates the collection of the most scientifically relevant samples within a single field expedition, without the need for sample relocation to external laboratories. The operational lessons learned in this study could be applied to future terrestrial field expeditions employing other analytical techniques and to future robotic planetary exploration missions.

  4. Establishment of IDF-curves for precipitation in the tropical area of Central Africa - comparison of techniques and results

    NASA Astrophysics Data System (ADS)

    Mohymont, B.; Demarée, G. R.; Faka, D. N.

    2004-05-01

    The establishment of Intensity-Duration-Frequency (IDF) curves for precipitation remains a powerful tool in the risk analysis of natural hazards. Indeed, the IDF-curves allow for the estimation of the return period of an observed rainfall event or, conversely, of the rainfall amount corresponding to a given return period for different aggregation times. There is a great need for IDF-curves in the tropical region of Central Africa, but the adequate long-term data sets are frequently unavailable. The present paper assesses IDF-curves for precipitation for three stations in Central Africa, and more physically based models for the IDF-curves are proposed. The methodology used here was advanced by Koutsoyiannis et al. (1998), and an inter-station and inter-technique comparison is carried out. The IDF-curves for tropical Central Africa are an interesting tool for use in sewer system design to combat the frequently occurring inundations in the semi-urbanized and urbanized areas of the Kinshasa megacity.
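
    The general IDF form advocated by Koutsoyiannis et al. (1998) separates duration and return period as i(d, T) = a(T)/(d + θ)^η. A minimal sketch of evaluating such a curve, with a Gumbel-type return-period term and illustrative parameter values (not the fitted values for the Central African stations), might look like this:

```python
# IDF evaluation sketch: i(d, T) = a(T) / (d + theta)**eta, with a Gumbel
# reduced variate linking return period to the intensity quantile.
# All parameter values are illustrative assumptions.
import math

def idf_intensity(d_hours, T_years, lam=20.0, psi=8.0, theta=0.2, eta=0.75):
    """Rainfall intensity (mm/h) for duration d (hours) and return period T (years)."""
    a_T = lam + psi * (-math.log(-math.log(1.0 - 1.0 / T_years)))
    return a_T / (d_hours + theta) ** eta

for T in (2, 10, 50):
    print(T, [round(idf_intensity(d, T), 1) for d in (0.25, 1, 6, 24)])
```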

  5. [The implementation of polymerase chain reaction technique: the real time to reveal and differentiate the viruses of human papilloma of high carcinogenic risk].

    PubMed

    Andosova, L D; Kontorshchikova, K N; Blatova, O L; Kudel'kina, S Iu; Kuznetsova, I A; Belov, A V; Baĭkova, R A

    2011-07-01

    The real-time polymerase chain reaction technique was applied to evaluate the occurrence rate and infection ratio of various genotypes of human papillomavirus of high carcinogenic risk in virus-positive women and their contacts. The sample consisted of 738 women aged 17-50 years. The results established a high rate of infection with carcinogenic papillomaviruses: 546 patients (74%). Analysis of the detection rates of the various high-carcinogenic-risk genotypes established that types 56 and 16 are detected more often than the others, in 33% and 15.4% of cases respectively. In males, the most frequently occurring types of human papillomavirus were type 56 (n = 10; 33.3%), type 16 (n = 3; 10%), type 45 (n = 3; 10%) and type 51 (n = 3; 10%). The remaining genotypes were detected in 3-7% of cases.

  6. [The Morbidity of Students Conditioned by Diet Character in Modern Condition of Education].

    PubMed

    Novokhatskaya, E A; Yakovleva, T P; Kalitina, M A

    2017-09-01

    The article considers characteristics of nervous psychic adaptation, morbidity and diet among students of the Russian State Social University. The main incentives for combining university studies and work are analyzed, and the impact on health of combining studies and work, of dietary regimen and of diet quality is investigated. The psychological studies were implemented using computerized psychological testing techniques and blank-form data collection. Student morbidity was assessed using a questionnaire. It was established that students who combine studies and work have optimal indices of nervous psychic adaptation; however, their morbidity is twice that of students who do not combine studies and work. Analysis of the students' dietary regimen and diet structure established deviations in both. The ratio of proteins, fats and carbohydrates in the students' daily ration was imbalanced (1.0:1.4:6.1) owing to a surplus of fat and especially carbohydrates, which can subsequently result in the development of diseases related to irregular diet.

  7. Operational considerations for the application of remotely sensed forest data from LANDSAT or other airborne platforms

    NASA Technical Reports Server (NTRS)

    Baker, G. R.; Fethe, T. P.

    1975-01-01

    Research in the application of remotely sensed data from LANDSAT or other airborne platforms to the efficient management of a large timber-based forest industry was divided into three phases: (1) establishment of a photo/ground sample correlation, (2) investigation of techniques for multi-spectral digital analysis, and (3) development of a semi-automated multi-level sampling system. To properly verify results, three distinct test areas were selected: (1) Jacksonville Mill Region, Lower Coastal Plain, Flatwoods, (2) Pensacola Mill Region, Middle Coastal Plain, and (3) Mississippi Mill Region, Middle Coastal Plain. The following conclusions were reached: (1) the prospect of establishing an information base suitable for management requirements through a photo/ground double-sampling procedure, alleviating the ground sampling effort, is encouraging; (2) known classification techniques must be investigated to ascertain the level of precision possible in separating the many densities involved; and (3) the multi-level approach must be related to an information system that is executable and feasible.

  8. Extrinsic regime shifts drive abrupt changes in regeneration dynamics at upper treeline in the Rocky Mountains, U.S.A.

    PubMed

    Elliott, Grant P

    2012-07-01

    Given the widespread and often dramatic influence of climate change on terrestrial ecosystems, it is increasingly common for abrupt threshold changes to occur, yet explicit testing for climate and ecological regime shifts is lacking in climatically sensitive upper treeline ecotones. In this study, quantitative evidence based on empirical data is provided to support the key role of extrinsic, climate-induced thresholds in governing the spatial and temporal patterns of tree establishment in these high-elevation environments. Dendroecological techniques were used to reconstruct a 420-year history of regeneration dynamics within upper treeline ecotones along a latitudinal gradient (approximately 44-35 degrees N) in the Rocky Mountains. Correlation analysis was used to assess the possible influence of minimum and maximum temperature indices and cool-season (November-April) precipitation on regional age-structure data. Regime-shift analysis was used to detect thresholds in tree establishment during the entire period of record (1580-2000), in temperature variables significantly correlated with establishment during the 20th century, and in cool-season precipitation. Tree establishment was significantly correlated with minimum temperature during the spring (March-May) and the cool season. Regime-shift analysis identified an abrupt increase in regional tree establishment in 1950 (1950-1954 age class). Coincident with this period was a shift toward reduced cool-season precipitation. The alignment of these climate conditions apparently triggered an abrupt increase in establishment that was unprecedented during the period of record. Two main findings emerge from this research that underscore the critical role of climate in governing regeneration dynamics within upper treeline ecotones. (1) Regional climate variability is capable of exceeding bioclimatic thresholds, thereby initiating synchronous and abrupt changes in the spatial and temporal patterns of tree establishment at broad regional scales. (2) The importance of climate parameters exceeding critical threshold values and triggering a regime shift in tree establishment appears to be contingent on the alignment of favorable temperature and moisture regimes. This research suggests that threshold changes in the climate system can fundamentally alter regeneration dynamics within upper treeline ecotones and, through the use of regime-shift analysis, reveals important climate-vegetation linkages.
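
    Regime-shift detection of the kind used here can be illustrated with a simplified mean-shift scan; the sketch below compares the means of the windows before and after each candidate year with a t-test. It is a didactic stand-in for formal regime-shift methods (e.g. sequential t-test approaches), applied to a synthetic establishment series.

```python
# Simplified regime-shift scan on an annual tree-establishment series:
# flag the year where a before/after window comparison is most significant.
# Synthetic data; not the study's method or record.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
years = np.arange(1580, 2001)
counts = rng.poisson(2.0, years.size).astype(float)
counts[years >= 1950] += 4.0          # synthetic abrupt increase around 1950

win = 15                               # window length on each side (years)
best_year, best_p = None, 1.0
for i in range(win, years.size - win):
    before, after = counts[i - win:i], counts[i:i + win]
    _, p = stats.ttest_ind(before, after, equal_var=False)
    if p < best_p:
        best_year, best_p = years[i], p

print(f"strongest candidate shift: {best_year} (p = {best_p:.2e})")
```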

  9. Relationships between autofocus methods for SAR and self-survey techniques for SONAR. [Synthetic Aperture Radar (SAR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wahl, D.E.; Jakowatz, C.V. Jr.; Ghiglia, D.C.

    1991-01-01

    Autofocus methods in SAR and self-survey techniques in SONAR have a common mathematical basis in that they both involve estimation and correction of phase errors introduced by sensor position uncertainties. Time delay estimation and correlation methods have been shown to be effective in solving the self-survey problem for towed SONAR arrays. Since it can be shown that platform motion errors introduce similar time-delay estimation problems in SAR imaging, the question arises as to whether such techniques could be effectively employed for autofocus of SAR imagery. With a simple mathematical model for motion errors in SAR, we will show why such correlation/time-delay techniques are not nearly as effective as established SAR autofocus algorithms such as phase gradient autofocus or sub-aperture based methods. This analysis forms an important bridge between signal processing methodologies for SAR and SONAR. 5 refs., 4 figs.
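
    For readers unfamiliar with the phase gradient autofocus algorithm named above, the following compact sketch shows its core structure (center-shifting, transform to the phase-history domain, linear unbiased minimum-variance gradient estimate, integration). It is a toy illustration under simplifying assumptions, not a production implementation.

```python
# Phase gradient autofocus (PGA) kernel sketch for a complex SAR image
# img[range_bins, azimuth]; windowing and iteration are only noted in comments.
import numpy as np

def pga_phase_estimate(img):
    n_rg, n_az = img.shape
    # 1) Circularly shift the brightest scatterer of each range bin to center.
    shifted = np.empty_like(img)
    for r in range(n_rg):
        k = np.argmax(np.abs(img[r]))
        shifted[r] = np.roll(img[r], n_az // 2 - k)
    # 2) Transform each range bin back to the azimuth phase-history domain.
    G = np.fft.ifft(np.fft.ifftshift(shifted, axes=1), axis=1)
    # 3) Linear unbiased minimum-variance estimate of the phase gradient.
    num = np.sum(np.imag(np.conj(G[:, :-1]) * G[:, 1:]), axis=0)
    den = np.sum(np.abs(G[:, :-1]) ** 2, axis=0)
    dphi = num / den
    # 4) Integrate the gradient and remove the linear trend (pure image shift).
    phi = np.concatenate([[0.0], np.cumsum(dphi)])
    phi -= np.polyval(np.polyfit(np.arange(n_az), phi, 1), np.arange(n_az))
    return phi  # phase-error estimate; applied as exp(-1j*phi) and iterated
                # with a shrinking window in a full implementation

rng = np.random.default_rng(7)
img = rng.standard_normal((64, 128)) + 1j * rng.standard_normal((64, 128))
print(pga_phase_estimate(img).shape)   # (128,)
```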

  10. Strain Gauge Balance Uncertainty Analysis at NASA Langley: A Technical Review

    NASA Technical Reports Server (NTRS)

    Tripp, John S.

    1999-01-01

    This paper describes a method to determine the uncertainties of measured forces and moments from multi-component force balances used in wind tunnel tests. A multivariate regression technique is first employed to estimate the uncertainties of the six balance sensitivities and 156 interaction coefficients derived from established balance calibration procedures. These uncertainties are then employed to calculate the uncertainties of force-moment values computed from observed balance output readings obtained during tests. Confidence and prediction intervals are obtained for each computed force and moment as functions of the actual measurands. Techniques are discussed for separate estimation of balance bias and precision uncertainties.
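
    The calibration-then-prediction structure of this method can be illustrated on a toy single-component balance: fit an ordinary least-squares calibration, then report confidence and prediction intervals for forces computed from new readings. The data and the single-channel simplification below are assumptions, standing in for the six-component, 156-coefficient calibration described above.

```python
# Single-component stand-in for the balance calibration regression;
# synthetic data, illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
applied_force = np.linspace(0, 1000, 40)                    # known loads (N)
reading = 0.002 * applied_force + rng.normal(0, 0.05, 40)   # bridge output

X = sm.add_constant(reading)              # regress force on reading (inverse cal.)
fit = sm.OLS(applied_force, X).fit()

new = sm.add_constant(np.array([0.5, 1.5]))                 # new test readings
pred = fit.get_prediction(new)
# mean_ci columns: calibration (confidence) interval;
# obs_ci columns: prediction interval for a new observation.
print(pred.summary_frame(alpha=0.05))
```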

  11. Safer Liquid Natural Gas

    NASA Technical Reports Server (NTRS)

    1976-01-01

    After the Staten Island disaster of 1973, in which 40 people were killed while repairing a liquid natural gas storage tank, the New York Fire Commissioner requested NASA's help in drawing up a comprehensive plan covering the design, construction, and operation of liquid natural gas facilities. Two programs are underway. The first transfers comprehensive risk management techniques and procedures in the form of an instruction document that includes determining liquid-gas risks through engineering analysis and tests, controlling these risks by setting up redundant fail-safe techniques, and establishing criteria calling for decisions that eliminate or accept certain risks. The second program prepares a liquid gas safety manual (the first of its kind).

  12. Slow crack growth in spinel in water

    NASA Technical Reports Server (NTRS)

    Schwantes, S.; Elber, W.

    1983-01-01

    Magnesium aluminate spinel was tested in a water environment at room temperature to establish its slow crack-growth behavior. Ring specimens with artificial flaws on the outside surface were loaded hydraulically on the inside surface. The time to failure was measured. Various precracking techniques were evaluated and multiple precracks were used to minimize the scatter in the static fatigue tests. Statistical analysis techniques were developed to determine the strength and crack velocities for a single flaw. Slow crack-growth rupture was observed at stress intensities as low as 70 percent of the critical stress intensity K_c. A strengthening effect was observed in specimens that had survived long-time static fatigue tests.

  13. Monitoring microcirculation.

    PubMed

    Ocak, Işık; Kara, Atila; Ince, Can

    2016-12-01

    The clinical relevance of the microcirculation and its bedside observation started gaining importance in the 1990s with the introduction of hand-held video microscopes. Since then, this technology has been continuously developed, and its clinical relevance has been established in more than 400 studies. In this paper, we review the different types of video microscopes, their application techniques, the microcirculation of different organ systems, the analysis methods, and the software and scoring systems. The main focus of this review is the state-of-the-art technique, CytoCam incident dark-field imaging, and the most recent technological and technical updates concerning microcirculation monitoring. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Laser scanning confocal microscopy: history, applications, and related optical sectioning techniques.

    PubMed

    Paddock, Stephen W; Eliceiri, Kevin W

    2014-01-01

    Confocal microscopy is an established light microscopical technique for imaging fluorescently labeled specimens with significant three-dimensional structure. Applications of confocal microscopy in the biomedical sciences include the imaging of the spatial distribution of macromolecules in either fixed or living cells, the automated collection of 3D data, the imaging of multiple labeled specimens and the measurement of physiological events in living cells. The laser scanning confocal microscope continues to be chosen for most routine work although a number of instruments have been developed for more specific applications. Significant improvements have been made to all areas of the confocal approach, not only to the instruments themselves, but also to the protocols of specimen preparation, to the analysis, the display, the reproduction, sharing and management of confocal images using bioinformatics techniques.

  15. Development of Multiobjective Optimization Techniques for Sonic Boom Minimization

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.

    1996-01-01

    A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
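
    The Kreisselmeier-Steinhauser aggregation named above folds many objectives or constraints into one smooth envelope, KS = (1/rho) ln Σ exp(rho f_i), which bounds max f_i from above and approaches it as rho grows. A minimal sketch with made-up objective values:

```python
# Kreisselmeier-Steinhauser (KS) envelope function; the objective values and
# rho settings are illustrative, not the study's configuration.
import numpy as np

def ks(f_values, rho=50.0):
    f = np.asarray(f_values, dtype=float)
    fmax = f.max()                      # subtract max for numerical stability
    return fmax + np.log(np.sum(np.exp(rho * (f - fmax)))) / rho

objectives = [0.82, 0.95, 0.60]         # e.g. normalized drag, boom, weight
for rho in (5, 50, 500):
    print(rho, round(ks(objectives, rho), 4))   # approaches max(f) = 0.95
```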

  16. Analysis of population structures of the microalga Acutodesmus obliquus during lipid production using multi-dimensional single-cell analysis.

    PubMed

    Sandmann, Michael; Schafberg, Michaela; Lippold, Martin; Rohn, Sascha

    2018-04-19

    Microalgae bear great potential to produce lipids for biodiesel, feed, or even food applications. To understand the still poorly known single-cell dynamics during lipid production in microalgae, a novel single-cell analytical technology was applied to a well-established model experiment. Multidimensional single-cell dynamics were investigated with a non-supervised image analysis technique that utilizes data from epi-fluorescence microscopy. The reliability of this technique was verified against reference analyses. The technique was used to determine cell size, chlorophyll amount, neutral lipid amount, and derived properties at the single-cell level in cultures of the biotechnologically promising alga Acutodesmus obliquus. The results showed a high correlation between cell size and chlorophyll amount, but a very low and dynamic correlation between cell size, lipid amount, and lipid density. Under growth conditions with nitrogen starvation, cells with low chlorophyll content tended to start lipid production first, and the cell suspension differentiated into two subpopulations with significantly different lipid contents. Such quantitative characterization of the single-cell dynamics of lipid-synthesizing algae was performed for the first time, and this comparatively simple technology is highly relevant to other biotechnological applications and to deeper investigation of microalgal lipid accumulation.
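
    A generic version of such per-cell measurement from two-channel epi-fluorescence images can be sketched with standard tools: segment cells in the chlorophyll channel, then record size and lipid signal per cell. Everything below (channel roles, Otsu thresholding, synthetic images) is an illustrative assumption, not the authors' pipeline.

```python
# Per-cell feature extraction sketch with scikit-image; synthetic images
# stand in for real chlorophyll and neutral-lipid fluorescence channels.
import numpy as np
from skimage import filters, measure

rng = np.random.default_rng(3)
chlorophyll = rng.random((256, 256))
lipid = rng.random((256, 256))
chlorophyll[100:140, 100:140] += 2.0        # one fake bright "cell"
lipid[100:140, 100:140] += 1.0

mask = chlorophyll > filters.threshold_otsu(chlorophyll)
labels = measure.label(mask)
props = measure.regionprops_table(
    labels, intensity_image=lipid,
    properties=("label", "area", "mean_intensity"))
# props["area"] ~ cell size; props["mean_intensity"] ~ lipid signal per cell,
# from which per-cell correlations and subpopulations can be examined.
print(props)
```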

  17. Metabolomic Strategies Involving Mass Spectrometry Combined with Liquid and Gas Chromatography.

    PubMed

    Lopes, Aline Soriano; Cruz, Elisa Castañeda Santa; Sussulini, Alessandra; Klassen, Aline

    2017-01-01

    Amongst all omics sciences, there is no doubt that metabolomics has undergone the most important growth in the last decade. Advances in analytical techniques and data analysis tools are the main factors that have made possible the development and establishment of metabolomics as a significant research field in systems biology. As metabolomic analysis demands high sensitivity for detecting metabolites present in low concentrations in biological samples, high resolving power for identifying the metabolites, and a wide dynamic range to detect metabolites with variable concentrations in complex matrices, mass spectrometry is the most extensively used analytical technique for fulfilling these requirements. Mass spectrometry alone can be used in a metabolomic analysis; however, issues such as ion suppression may hinder the quantification/identification of metabolites with lower concentrations or of metabolite classes that do not ionise as well as others. The best choice is coupling separation techniques, such as gas or liquid chromatography, to mass spectrometry in order to improve the sensitivity and resolving power of the analysis, besides obtaining extra information (retention time) that facilitates the identification of the metabolites, especially in untargeted metabolomic strategies. In this chapter, the main aspects of mass spectrometry (MS), liquid chromatography (LC) and gas chromatography (GC) are discussed, and recent clinical applications of LC-MS and GC-MS are also presented.

  18. Fabrication of thermal-resistant gratings for high-temperature measurements using geometric phase analysis.

    PubMed

    Zhang, Q; Liu, Z; Xie, H; Ma, K; Wu, L

    2016-12-01

    Grating fabrication techniques are crucial to the success of grating-based deformation measurement methods because the quality of the grating directly affects the measurement results. Deformation measurements at high temperatures entail heating and may oxidize the grating, and the contrast of the grating lines may change during the heating process. Thus, the thermal resistance of the grating is a point of great concern before taking measurements. This study proposes a method that combines a laser-engraving technique with particle spraying and sintering processes for fabricating thermal-resistant gratings. The grating fabrication technique is introduced and discussed in detail. A numerical simulation with geometric phase analysis (GPA) is performed for a homogeneous deformation case, and a scheme for selecting the grating pitch is then suggested. The validity of the proposed technique is verified by fabricating a thermal-resistant grating on a ZrO2 specimen and measuring its thermal strain at high temperatures (up to 1300 °C). Images of the grating before and after deformation are used to obtain the thermal-strain field by GPA and to compare the results with well-established reference data. The experimental results indicate that the proposed technique is feasible and offers good prospects for further applications.
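
    The GPA step itself reduces, in one dimension, to isolating the grating's fundamental spatial frequency, extracting the phase of the filtered image, and differentiating it. The sketch below demonstrates this on a synthetic, uniformly strained grating; real use requires careful masking, phase unwrapping and a reference region.

```python
# 1-D geometric phase analysis (GPA) sketch on a synthetic strained grating.
import numpy as np

N, pitch = 512, 16                     # pixels, grating pitch in pixels
x = np.arange(N)
strain_true = 1e-3
img = np.cos(2 * np.pi * x * (1 + strain_true) / pitch)   # strained grating

F = np.fft.fft(img)
g0 = N // pitch                        # reference (unstrained) frequency index
mask = np.zeros(N)
mask[g0 - 3:g0 + 4] = 1.0              # isolate the fundamental reflection
phase = np.unwrap(np.angle(np.fft.ifft(F * mask)))        # geometric phase

# Subtract the reference carrier phase; the residual slope gives the strain:
# d(phase - 2*pi*g0*x/N)/dx = (2*pi*g0/N) * strain.
dphi_dx = np.gradient(phase - 2 * np.pi * g0 * x / N)
print("recovered strain ~", np.mean(dphi_dx) / (2 * np.pi * g0 / N))
```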

  19. TELEVISION ADVERTISING OF SELECTED MEDICINAL PRODUCTS IN POLAND AND IN THE UNITED STATES - A COMPARATIVE ANALYSIS OF SELECTED TELEVISION COMMERCIALS.

    PubMed

    Wiśniewska, Ewa; Czerw, Aleksandra; Makowska, Marta; Fronczak, Adam

    2016-07-01

    The aim of the analysis was to establish the differences between television commercials for OTC drugs broadcast in Poland and in the U.S. The study covered 100 commercials for medicinal products from various producers, used to treat a variety of symptoms and diseases. The analysis demonstrated that there are both similarities and differences. The differences concerned, for example, spot length, the timing of brand-name placement and the diversity of advertising slogans. The most significant similarities concerned the manipulation techniques applied, the locations featured in the commercials and the choice of actors.

  20. The Fifth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Fifth Annual Thermal and Fluids Analysis Workshop was held at the Ohio Aerospace Institute, Brook Park, Ohio, cosponsored by NASA Lewis Research Center and the Ohio Aerospace Institute, 16-20 Aug. 1993. The workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with the information on widely used tools for thermal and fluid analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  1. Application of advanced techniques for the assessment of bio-stability of biowaste-derived residues: A minireview.

    PubMed

    Lü, Fan; Shao, Li-Ming; Zhang, Hua; Fu, Wen-Ding; Feng, Shi-Jin; Zhan, Liang-Tong; Chen, Yun-Min; He, Pin-Jing

    2018-01-01

    Bio-stability is a key feature for the utilization and final disposal of biowaste-derived residues, such as aerobic compost or vermicompost of food waste, bio-dried waste, anaerobic digestate or landfilled waste. The present paper reviews conventional methods and advanced techniques used for the assessment of bio-stability. The conventional methods are reclassified into two categories. Advanced techniques, including spectroscopic (fluorescence, ultraviolet-visible, infrared, Raman, nuclear magnetic resonance), thermogravimetric and thermochemolysis analyses, are emphasized for their application to bio-stability assessment in recent years. Their principles, pros and cons are critically discussed. These advanced techniques are found to be convenient in sample preparation and to supply diversified information. However, the viability of these techniques as potential indicators of bio-stability ultimately lies in establishing their relationship with the conventional methods, especially those based on biotic response. Furthermore, some misuses in data interpretation should be noted. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. New Micro-Method for Prediction of Vapor Pressure of Energetic Materials

    DTIC Science & Technology

    2014-07-01

    [Fragmentary record excerpt] ...temperature is recorded as the extrapolated onset temperature (11-12). Gas chromatography (GC) headspace analysis requires the establishment of an... Cited works include: ...J. L.; Shinde, K.; Moran, J. Determination of the Vapor Density of Triacetone Triperoxide (TATP) Using a Gas Chromatography Headspace Technique. Propellants Explos. Pyrotech. 2005, 30 (2), 127-130; and Chickos, J. S. Sublimation Vapor Pressures as Evaluated by Correlation-Gas Chromatography. J...

  3. Stability analysis of a Vlasov-Wave system describing particles interacting with their environment

    NASA Astrophysics Data System (ADS)

    De Bièvre, Stephan; Goudon, Thierry; Vavasseur, Arthur

    2018-06-01

    We study a kinetic equation of the Vlasov-Wave type, which arises in the description of the behavior of a large number of particles interacting weakly with an environment, composed of an infinite collection of local vibrational degrees of freedom, modeled by wave equations. We use variational techniques to establish the existence of large families of stationary states for this system, and analyze their stability.
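
    For orientation, a schematic of the type of coupled system meant here is given below; the precise coupling kernels and constants are assumptions on our part, following the form used in the authors' related work.

```latex
% Schematic Vlasov-Wave coupling (assumed form): particles with phase-space
% density f(t,x,v) feel an external potential V and a self-consistent
% potential \Phi generated by vibrational fields \psi(t,x,z), which obey a
% wave equation driven by the particle density \rho.
\begin{align*}
  &\partial_t f + v\cdot\nabla_x f
    - \nabla_x\bigl(V(x) + \Phi(t,x)\bigr)\cdot\nabla_v f = 0,\\
  &\Phi(t,x) = \iint \sigma_1(x-y)\,\sigma_2(z)\,\psi(t,y,z)\,\mathrm{d}z\,\mathrm{d}y,\\
  &\bigl(\partial_{tt} - c^2\Delta_z\bigr)\psi(t,x,z)
    = -\sigma_2(z)\,\bigl(\sigma_1 * \rho(t,\cdot)\bigr)(x),
  \qquad \rho(t,x) = \int f(t,x,v)\,\mathrm{d}v.
\end{align*}
```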

  4. Brainstorm: A User-Friendly Application for MEG/EEG Analysis

    PubMed Central

    Tadel, François; Baillet, Sylvain; Mosher, John C.; Pantazis, Dimitrios; Leahy, Richard M.

    2011-01-01

    Brainstorm is a collaborative open-source application dedicated to magnetoencephalography (MEG) and electroencephalography (EEG) data visualization and processing, with an emphasis on cortical source estimation techniques and their integration with anatomical magnetic resonance imaging (MRI) data. The primary objective of the software is to connect MEG/EEG neuroscience investigators with both the best-established and cutting-edge methods through a simple and intuitive graphical user interface (GUI). PMID:21584256

  5. Status of Fuel Development and Manufacturing for Space Nuclear Reactors at BWX Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmack, W.J.; Husser, D.L.; Mohr, T.C.

    2004-02-04

    New advanced nuclear space propulsion systems will soon seek a high temperature, stable fuel form. BWX Technologies Inc (BWXT) has a long history of fuel manufacturing. UO2, UCO, and UCx have been fabricated at BWXT for various US and international programs. Recent efforts at BWXT have focused on establishing the manufacturing techniques and analysis capabilities needed to provide a high quality, high power, compact nuclear reactor for use in space nuclear powered missions. To support the production of a space nuclear reactor, uranium nitride has recently been manufactured by BWXT. In addition, analytical chemistry and analysis techniques have been developed to provide verification and qualification of the uranium nitride production process. The fabrication of a space nuclear reactor will require the ability to place an unclad fuel form into a clad structure for assembly into a reactor core configuration. To this end, BWX Technologies has reestablished its capability for machining, GTA welding, and EB welding of refractory metals. Specifically, BWX Technologies has demonstrated GTA welding of niobium flat plate and EB welding of niobium and Nb-1Zr tubing. In performing these demonstration activities, BWX Technologies has established the necessary infrastructure to manufacture UO2, UCx, or UNx fuel, components, and complete reactor assemblies in support of space nuclear programs.

  6. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    PubMed

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in computer vision, is quickly increasing due to the availability of new techniques. Images acquired by mobile mapping technology, oblique photogrammetric cameras or unmanned aerial vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques traditionally used in photogrammetry are usually inefficient for these applications, as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for badly textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie-point extraction and approximate DSM (Digital Surface Model) generation. First, the performance of the SIFT operator has been compared with that of the feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed in order to improve the performance of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A² SIFT) has been validated on several aerial images, with particular attention to large-scale aerial images acquired using mini-UAV systems.
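
    A hedged sketch of the SIFT tie-point extraction step, using OpenCV rather than the authors' own implementation, with synthetic images and Lowe's customary 0.75 ratio test:

```python
# SIFT feature extraction and matching sketch with OpenCV; a shifted copy of
# a synthetic texture stands in for an overlapping aerial image pair.
import cv2
import numpy as np

rng = np.random.default_rng(6)
base = (rng.random((480, 640)) * 255).astype(np.uint8)
img1 = cv2.GaussianBlur(base, (5, 5), 0)
img2 = np.roll(img1, 40, axis=1)        # simulated overlap: shifted copy

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des1, des2, k=2)

# Lowe's ratio test rejects ambiguous correspondences.
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
tie_points = [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in good]
print(len(tie_points), "candidate tie points")
```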

  7. Micromechanical Characterization and Texture Analysis of Direct Cast Titanium Alloys Strips

    NASA Technical Reports Server (NTRS)

    2000-01-01

    This research was conducted to determine a post-processing technique to optimize the mechanical and material properties of a number of titanium-based alloys and aluminides processed via the Melt Overflow Rapid Solidification Technique (MORST). This technique was developed by NASA for producing thin-sheet titanium and titanium aluminides used in high-temperature applications. The materials investigated in this study included conventional titanium alloy strips and foils, Ti-1100, Ti-24Al-11Nb (Alpha-2), and Ti-48Al-2Ta (Gamma). The methodology included micro-characterization, heat treatment, mechanical processing and mechanical testing. Characterization techniques included optical and electron microscopy and X-ray texture analysis. The processing included heat treatment and mechanical deformation through cold rolling. The initial as-cast materials were evaluated for their microstructure and mechanical properties. Different heat-treatment and rolling steps were chosen to process these materials. The properties were evaluated further, and a processing relationship was established in order to obtain an optimum processing condition. The results showed that the as-cast material exhibited a Widmanstätten (fine-grain) microstructure that developed into a microstructure with larger grains through the processing steps. The texture intensity showed little change for all processing performed in this investigation.

  8. Evaluation of thresholding techniques for segmenting scaffold images in tissue engineering

    NASA Astrophysics Data System (ADS)

    Rajagopalan, Srinivasan; Yaszemski, Michael J.; Robb, Richard A.

    2004-05-01

    Tissue engineering attempts to address the ever-widening gap between the demand for and supply of organ and tissue transplants using natural and biomimetic scaffolds. The regeneration of specific tissues aided by synthetic materials depends on the structural and morphometric properties of the scaffold. These properties can be derived non-destructively using quantitative analysis of high-resolution microCT scans of scaffolds. Thresholding of the scanned images into polymeric and porous phases is central to the outcome of the subsequent structural and morphometric analysis. Visual thresholding of scaffolds produced using stochastic processes is inaccurate. Depending on the algorithmic assumptions made, automatic thresholding might also be inaccurate. Hence there is a need to analyze the performance of different techniques and propose alternate ones, if needed. This paper provides a quantitative comparison of different thresholding techniques for segmenting scaffold images. The thresholding algorithms examined include those that exploit spatial information, locally adaptive characteristics, histogram entropy information, histogram shape information, and clustering of gray-level information. The performance of the different techniques was evaluated using established criteria, including misclassification error, edge mismatch, relative foreground error, and region non-uniformity. Algorithms that exploit local image characteristics seem to perform much better than those using global information.
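
    One of the criteria named above, the misclassification error, is simple to state: the fraction of pixels whose binary class disagrees with a ground-truth segmentation. A minimal sketch, pairing it with Otsu thresholding on a synthetic scaffold-like image:

```python
# Misclassification error of an Otsu segmentation against a known ground
# truth; synthetic image, illustrative only.
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(4)
truth = np.zeros((128, 128), bool)
truth[32:96, 32:96] = True                     # "polymer" phase ground truth
img = np.where(truth, 0.7, 0.3) + rng.normal(0, 0.1, truth.shape)

seg = img > threshold_otsu(img)

def misclassification_error(gt, seg):
    agree = np.count_nonzero(gt & seg) + np.count_nonzero(~gt & ~seg)
    return 1.0 - agree / gt.size

print("ME =", round(misclassification_error(truth, seg), 4))
```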

  9. A comparative study of progressive versus successive spectrophotometric resolution techniques applied for pharmaceutical ternary mixtures

    NASA Astrophysics Data System (ADS)

    Saleh, Sarah S.; Lotfy, Hayam M.; Hassan, Nagiba Y.; Salem, Hesham

    2014-11-01

    This work represents a comparative study of a novel progressive spectrophotometric resolution technique namely, amplitude center method (ACM), versus the well-established successive spectrophotometric resolution techniques namely; successive derivative subtraction (SDS); successive derivative of ratio spectra (SDR) and mean centering of ratio spectra (MCR). All the proposed spectrophotometric techniques consist of several consecutive steps utilizing ratio and/or derivative spectra. The novel amplitude center method (ACM) can be used for the determination of ternary mixtures using single divisor where the concentrations of the components are determined through progressive manipulation performed on the same ratio spectrum. Those methods were applied for the analysis of the ternary mixture of chloramphenicol (CHL), dexamethasone sodium phosphate (DXM) and tetryzoline hydrochloride (TZH) in eye drops in the presence of benzalkonium chloride as a preservative. The proposed methods were checked using laboratory-prepared mixtures and were successfully applied for the analysis of pharmaceutical formulation containing the cited drugs. The proposed methods were validated according to the ICH guidelines. A comparative study was conducted between those methods regarding simplicity, limitation and sensitivity. The obtained results were statistically compared with those obtained from the official BP methods, showing no significant difference with respect to accuracy and precision.
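
    The principle shared by these ratio-spectra methods can be shown in a few lines for a binary mixture: dividing the mixture spectrum by one component's unit spectrum turns that component into an additive constant, which differentiation removes. The Gaussian bands below are synthetic stand-ins for real absorption spectra.

```python
# Derivative-of-ratio-spectra principle for a two-component mixture;
# synthetic Gaussian bands, illustrative concentrations cA = 0.8, cB = 1.5.
import numpy as np

wl = np.linspace(200, 400, 1001)
band = lambda c, w: np.exp(-0.5 * ((wl - c) / w) ** 2)
eps_A, eps_B = band(260, 15), band(300, 20)    # unit spectra of A and B
mix = 0.8 * eps_A + 1.5 * eps_B

ratio = mix / eps_B                            # = cA*(eps_A/eps_B) + cB
deriv = np.gradient(ratio, wl)                 # the constant cB drops out

idx = np.argmin(np.abs(wl - 260))
# deriv is proportional to cA alone, so one calibrated wavelength recovers it.
print("recovered cA ~", deriv[idx] / np.gradient(eps_A / eps_B, wl)[idx])
```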

  10. Identification of pesticide varieties by detecting characteristics of Chlorella pyrenoidosa using Visible/Near infrared hyperspectral imaging and Raman microspectroscopy technology.

    PubMed

    Shao, Yongni; Li, Yuan; Jiang, Linjun; Pan, Jian; He, Yong; Dou, Xiaoming

    2016-11-01

    The main goal of this research was to examine the feasibility of applying Visible/Near-infrared hyperspectral imaging (Vis/NIR-HSI) and Raman microspectroscopy for the non-destructive identification of pesticide varieties (glyphosate and butachlor). Both technologies were explored to investigate how internal elements or characteristics of Chlorella pyrenoidosa change when pesticides are applied and, at the same time, to identify the pesticide varieties during this procedure. The successive projections algorithm (SPA) was used to identify the seven most effective wavelengths. With the wavelengths suggested by SPA, a linear discriminant analysis (LDA) model was established to classify the pesticide varieties, and the correct classification rate of the SPA-LDA model reached 100%. For the Raman technique, several partial least squares discriminant analysis models were established with different preprocessing methods, from which the preprocessing approach achieving the optimal result was identified. Sensitive wavelengths (SWs) related to the algae's pigments were chosen, and an LDA model was established with correct identification reaching 90.0%. The results showed that both Vis/NIR-HSI and Raman microspectroscopy are capable of identifying pesticide varieties in an indirect but effective way, and that SPA is an effective wavelength extraction method. The SWs corresponding to microalgal pigments, which were influenced by the pesticides, could also help to characterize different pesticide varieties and benefit variety identification. Copyright © 2016 Elsevier Ltd. All rights reserved.
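
    The SPA-LDA classification step has a standard shape; the sketch below trains a linear discriminant on reflectance values at a handful of wavelengths. The random data and the assumption that seven effective wavelengths have already been selected by SPA are illustrative only.

```python
# LDA classification on SPA-selected wavelengths; random data stand in for
# real hyperspectral measurements of treated and untreated algae.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n_per_class = 60
# Three classes: control, glyphosate-treated, butachlor-treated; 7 wavelengths.
X = np.vstack([rng.normal(loc=m, scale=0.05, size=(n_per_class, 7))
               for m in (0.40, 0.45, 0.50)])
y = np.repeat([0, 1, 2], n_per_class)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
lda = LinearDiscriminantAnalysis().fit(Xtr, ytr)
print("correct classification rate:", lda.score(Xte, yte))
```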

  11. A methodology for commonality analysis, with applications to selected space station systems

    NASA Technical Reports Server (NTRS)

    Thomas, Lawrence Dale

    1989-01-01

    The application of commonality in a system represents an attempt to reduce costs by reducing the number of unique components. A formal method for conducting commonality analysis has not been established. In this dissertation, commonality analysis is characterized as a partitioning problem. The cost impacts of commonality are quantified in an objective function, and the solution is that partition which minimizes this objective function. Clustering techniques are used to approximate a solution, and sufficient conditions are developed which can be used to verify the optimality of the solution. This method for commonality analysis is general in scope. It may be applied to the various types of commonality analysis required in the conceptual, preliminary, and detail design phases of the system development cycle.
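
    The partitioning view can be made concrete with a toy objective: merging items onto one common component saves a development cost but incurs an over-design penalty when requirements differ. The greedy pairwise merging below is a generic clustering approximation in the spirit described, with made-up numbers.

```python
# Commonality analysis as partitioning: minimize development cost plus
# over-design penalty by greedily merging groups. All values are fabricated.
from itertools import combinations

requirements = {"pump_A": 10.0, "pump_B": 11.0, "pump_C": 18.0}  # e.g. kW
DEV_COST = 5.0          # cost of developing one unique component
PENALTY = 1.0           # cost per unit of over-design within a group

def cost(partition):
    total = 0.0
    for group in partition:
        peak = max(requirements[i] for i in group)   # common part meets the max
        total += DEV_COST + PENALTY * sum(peak - requirements[i] for i in group)
    return total

partition = [{i} for i in requirements]              # start fully unique
improved = True
while improved:
    improved = False
    for a, b in combinations(range(len(partition)), 2):
        trial = [g for k, g in enumerate(partition) if k not in (a, b)]
        trial.append(partition[a] | partition[b])
        if cost(trial) < cost(partition):            # accept cost-reducing merge
            partition, improved = trial, True
            break

print(partition, cost(partition))   # pumps A and B merge; C stays unique
```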

  12. DXS106 and DXS559 flank the X-linked dystonia-parkinsonism syndrome locus (DYT3)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, U.; Haberhausen, G.; Wagner, T.

    1994-09-01

    The locus (DYT3) underlying the X-linked dystonia-parkinsonism syndrome (XDP) was delineated within proximal Xq12-Xq13.1 by analysis of linkage, allelic association, and haplotypes. Short tandem repeat polymorphisms at loci DXS227, DXS559, DXS453, DXS106, DXS339, and DXS135 were studied. The occurrence of a recombination within a three-generation family established DXS559 as the distal flanking marker of DYT3. φ and Δ values were determined as indicators of the degree of allelic association between DYT3 and the six marker loci. In addition, haplotype analysis was performed at the loci studied. The findings establish DXS106 as the proximal flanking marker of DYT3. Given an approximate distance between DXS106 and DXS559 of 3.0 Mb, isolation of DYT3 is now feasible by positional cloning techniques. 21 refs., 2 figs., 3 tabs.

  13. Improved production of transgenic Dioscorea zingiberensis (Dioscoreaceae) by Agrobacterium tumefaciens-mediated transformation.

    PubMed

    Shi, L; Fan, J Q; Hu, C G; Luo, J; Yao, J L

    2012-02-03

    The establishment of high-efficiency Agrobacterium-mediated transformation techniques could improve the production of Dioscorea zingiberensis, a medicinal species with a high diosgenin content. We co-cultivated embryogenic calli induced from mature seeds with A. tumefaciens strain EHA105. A binary vector, pCAMBIA1381, which contains the gfp and hpt genes under the control of the ubiquitin promoter and the CaMV 35S promoter, respectively, was used for transformation. Pre-culture, basic medium, acetosyringone, and bacterial density were evaluated to establish the most efficient protocol. The optimal conditions consisted of MS medium without CaCl(2) for pre- and co-cultivation, three days for pre-culture, addition of 200 μM AS, and an OD(600) of 0.5. The transgenic plants grown under selection were confirmed by PCR analysis and Southern blot analysis. This protocol produced transgenic D. zingiberensis plants in seven months, with a transformation efficiency of 6%.

  14. Evaluation of massively parallel sequencing for forensic DNA methylation profiling.

    PubMed

    Richards, Rebecca; Patel, Jayshree; Stevenson, Kate; Harbison, SallyAnn

    2018-05-11

    Epigenetics is an emerging area of interest in forensic science. DNA methylation, a type of epigenetic modification, can be applied to chronological age estimation, identical twin differentiation and body fluid identification. However, there is not yet an agreed, established methodology for targeted detection and analysis of DNA methylation markers in forensic research. Recently a massively parallel sequencing-based approach has been suggested. The use of massively parallel sequencing is well established in clinical epigenetics and is emerging as a new technology in the forensic field. This review investigates the potential benefits, limitations and considerations of this technique for the analysis of DNA methylation in a forensic context. The importance of a robust protocol, regardless of the methodology used, that minimises potential sources of bias is highlighted. This article is protected by copyright. All rights reserved.

  15. Determination of thickness of thin turbid painted over-layers using micro-scale spatially offset Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Conti, Claudia; Realini, Marco; Colombo, Chiara; Botteon, Alessandra; Bertasa, Moira; Striova, Jana; Barucci, Marco; Matousek, Pavel

    2016-12-01

    We present a method for estimating the thickness of thin turbid layers using defocusing micro-spatially offset Raman spectroscopy (micro-SORS). The approach, applicable to highly turbid systems, enables one to probe depths in excess of those accessible with conventional Raman microscopy. The technique can be used, for example, to establish the paint-layer thickness on cultural heritage objects, such as panel paintings, canvases, mural paintings, painted statues and decorated objects. Other applications include analysis in polymer, biological and biomedical disciplines, and in catalytic and forensic sciences, where highly turbid overlayers are often present and where invasive probing may not be possible or is undesirable. The method comprises two stages: (i) a calibration step for training the method on a well-characterized sample set of known thickness, and (ii) a prediction step in which the layer thickness is predicted non-invasively on samples of unknown thickness with the same chemical and physical makeup as the calibration set. An illustrative example of a practical deployment of this method is the analysis of larger areas of paintings. In this case, a calibration would first be performed on a fragment of painting of known thickness (e.g. derived from cross-sectional analysis), and the thickness across larger areas of the painting could then be determined non-invasively. The performance of the method is compared with that of the more established optical coherence tomography (OCT) technique on an identical sample set. This article is part of the themed issue "Raman spectroscopy in art and archaeology".
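
    The two-stage scheme can be sketched with a simple attenuation model: calibrate the sublayer-to-overlayer signal ratio against known thicknesses, then invert the fit for new measurements. The exponential form and all values below are illustrative assumptions, not the paper's calibration.

```python
# Calibration-then-prediction sketch for overlayer thickness; synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def model(thickness_um, a, k):
    return a * np.exp(-k * thickness_um)   # signal ratio decays with depth

# (i) Calibration set: known thicknesses (e.g. from cross sections) and ratios.
thick = np.array([20, 40, 60, 80, 120, 160.0])
ratio = np.array([0.82, 0.55, 0.37, 0.25, 0.11, 0.05])
(a, k), _ = curve_fit(model, thick, ratio, p0=(1.0, 0.02))

# (ii) Prediction: invert the fitted model for a new, non-invasive measurement.
measured_ratio = 0.30
predicted = np.log(a / measured_ratio) / k
print(f"predicted thickness: {predicted:.0f} um")
```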

  16. Probing the self-assembled nanostructures of functional polymers with synchrotron grazing incidence X-ray scattering.

    PubMed

    Ree, Moonhor

    2014-05-01

    For advanced functional polymers such as biopolymers, biomimetic polymers, brush polymers, star polymers, dendritic polymers, and block copolymers, information about their surface structures, morphologies, and atomic structures is essential for understanding their properties and investigating their potential applications. Grazing incidence X-ray scattering (GIXS) has become established over the last 15 years as the most powerful, versatile, and nondestructive tool for determining these structural details when performed with the aid of an advanced third-generation synchrotron radiation source with high flux, high energy resolution, energy tunability, and a small beam size. One particular merit of this technique is that GIXS data can be obtained facilely for material specimens of any size, type, or shape. However, GIXS data analysis requires an understanding of GIXS theory and of refraction and reflection effects, and for any given material specimen, the best methods for extracting the form factor and the structure factor from the data need to be established. GIXS theory is reviewed here from the perspective of practical GIXS measurements and quantitative data analysis. In addition, schemes are discussed for the detailed analysis of GIXS data for the various self-assembled nanostructures of functional homopolymers, brush, star, and dendritic polymers, and block copolymers. Moreover, enhancements to the GIXS technique are discussed that can significantly improve its structure analysis by using new synchrotron radiation sources such as third-generation X-ray sources with picosecond pulses and partial coherence and fourth-generation X-ray laser sources with femtosecond pulses and full coherence. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Acousto-optic signature analysis for inspection of the orbiter thermal protection tile bonds

    NASA Technical Reports Server (NTRS)

    Rodriguez, Julio G.; Tow, D. M.; Barna, B. A.

    1990-01-01

    The goal of this research is to develop a viable NDE technique for the inspection of orbiter thermal protection system (TPS) tile bonds. Phase 2, discussed here, concentrated on developing an empirical understanding of the bonded and unbonded vibration signatures of acreage tiles. Controlled experiments in the laboratory have provided useful information on the dynamic response of TPS tiles. It has been shown that several signatures are common to all the pedigree tiles. This degree of consistency in the tile-SIP (strain isolation pad) dynamic response proves that an unbond can be detected for a known tile and establishes the basis for extending the analysis capability to arbitrary tiles for which there are no historical data. The field tests of the noncontacting laser acoustic sensor system, conducted at the Kennedy Space Center (KSC), investigated the vibrational environment of the Orbiter Processing Facility (OPF) and its effect on the measurement and analysis techniques being developed. The data collected showed that for orbiter locations such as the body flap and elevon, the data analysis scheme and/or the sensor will require modification to accommodate the ambient motion. Several methods were identified for accomplishing this, and a solution is seen as readily achievable. It was established that the tile response was similar to that observed in the laboratory. Most important, however, is that the field environment will not affect the physics of the dynamic response that is related to bond condition. All of this information is fundamental to any future design and development of a prototype system.

  18. Overview: MURI Center on spectroscopic and time domain detection of trace explosives in condensed and vapor phases

    NASA Astrophysics Data System (ADS)

    Spicer, James B.; Dagdigian, Paul; Osiander, Robert; Miragliotta, Joseph A.; Zhang, Xi-Cheng; Kersting, Roland; Crosley, David R.; Hanson, Ronald K.; Jeffries, Jay

    2003-09-01

    The research center established by the Army Research Office under the Multidisciplinary University Research Initiative program pursues a multidisciplinary approach to investigate and advance the use of complementary analytical techniques for sensing explosives and/or explosive-related compounds as they occur in the environment. The techniques being investigated include terahertz (THz) imaging and spectroscopy, Laser-Induced Breakdown Spectroscopy (LIBS), Cavity Ring-Down Spectroscopy (CRDS) and Resonance Enhanced Multiphoton Ionization (REMPI). This suite of techniques encompasses a diversity of sensing approaches that can be applied to the detection of explosives in condensed phases, such as species adsorbed in soil, or used for vapor-phase detection above the source. Some techniques allow for remote detection while others have highly specific and sensitive analysis capabilities. This program is addressing a range of fundamental technical issues associated with trace detection of explosive-related compounds using these techniques. For example, while both LIBS and THz can be used to carry out remote analysis of condensed-phase analyte from a distance in excess of several meters, the sensitivities of these techniques to surface-adsorbed explosive-related compounds are not currently known. In current implementations, both CRDS and REMPI require sample collection techniques that have not been optimized for environmental applications. Early program elements will pursue the fundamental advances required for these techniques, including signature identification for explosive-related compounds/interferents and trace analyte extraction. Later program tasks will explore the simultaneous application of two or more techniques to assess the benefits of sensor fusion.

  19. In vivo stationary flux analysis by 13C labeling experiments.

    PubMed

    Wiechert, W; de Graaf, A A

    1996-01-01

    Stationary flux analysis is an invaluable tool for metabolic engineering. In recent years the metabolite balancing technique has become well established in the bioengineering community. On the other hand, metabolic tracer experiments using 13C isotopes have long been used for intracellular flux determination. Only recently have both techniques been fully combined to form a considerably more powerful flux analysis method. This paper concentrates on modeling and data analysis for the evaluation of such stationary 13C labeling experiments. After reviewing recent experimental developments, the basic equations for modeling carbon labeling in metabolic systems, i.e. metabolite, carbon label and isotopomer balances, are introduced and discussed in some detail. Then the basics of flux estimation from measured extracellular fluxes combined with carbon labeling data are presented, and, finally, the method is illustrated with an example from C. glutamicum. The main emphasis is on the extra information that can be obtained with tracer experiments compared with the metabolite balancing technique alone. As a principal result, it is shown that the combined flux analysis method can dispense with some rather doubtful assumptions on energy balancing, and that the forward and backward flux rates of bidirectional reaction steps can be simultaneously determined in certain situations. Finally, it is demonstrated that the variant of fractional isotopomer measurement is even more powerful than fractional labeling measurement but requires much higher numerical effort to solve the balance equations.
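
    The metabolite-balancing step alone can be written in a few lines: at metabolic steady state the stoichiometric matrix S and flux vector v satisfy S·v = 0, and measured extracellular rates pin down some entries of v. The toy network below is an assumption for illustration; balancing of this kind determines only net fluxes, which is why the 13C data discussed above are needed to resolve bidirectional steps.

```python
# Metabolite balancing sketch: steady-state balances S @ v = 0 stacked with
# measurement equations, solved by least squares. Toy network, fabricated rates.
import numpy as np

# Rows: intracellular metabolites A, B; columns: fluxes v1..v4
# v1: uptake -> A, v2: A -> B, v3: B -> C (secreted), v4: B -> D (secreted)
S = np.array([[ 1, -1,  0,  0],     # balance on A
              [ 0,  1, -1, -1]])    # balance on B

v_uptake, v_C = 10.0, 6.0           # measured extracellular rates

A_mat = np.vstack([S,
                   [1, 0, 0, 0],    # v1 = measured uptake
                   [0, 0, 1, 0]])   # v3 = measured secretion of C
b = np.array([0, 0, v_uptake, v_C])
v, *_ = np.linalg.lstsq(A_mat, b, rcond=None)
print("fluxes v1..v4:", np.round(v, 2))   # v4 follows as 10 - 6 = 4
```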

  20. MEG-SIM: a web portal for testing MEG analysis methods using realistic simulated and empirical data.

    PubMed

    Aine, C J; Sanfratello, L; Ranken, D; Best, E; MacArthur, J A; Wallace, T; Gilliam, K; Donahue, C H; Montaño, R; Bryant, J E; Scott, A; Stephen, J M

    2012-04-01

    MEG and EEG measure electrophysiological activity in the brain with exquisite temporal resolution. Because of this unique strength relative to noninvasive hemodynamic-based measures (fMRI, PET), the complementary nature of hemodynamic and electrophysiological techniques is becoming more widely recognized (e.g., Human Connectome Project). However, the available analysis methods for solving the inverse problem for MEG and EEG have not been compared and standardized to the extent that they have for fMRI/PET. A number of factors, including the non-uniqueness of the solution to the inverse problem for MEG/EEG, have led to multiple analysis techniques which have not been tested on consistent datasets, making direct comparisons of techniques challenging (or impossible). Since each of the methods is known to have their own set of strengths and weaknesses, it would be beneficial to quantify them. Toward this end, we are announcing the establishment of a website containing an extensive series of realistic simulated data for testing purposes ( http://cobre.mrn.org/megsim/ ). Here, we present: 1) a brief overview of the basic types of inverse procedures; 2) the rationale and description of the testbed created; and 3) cases emphasizing functional connectivity (e.g., oscillatory activity) suitable for a wide assortment of analyses including independent component analysis (ICA), Granger Causality/Directed transfer function, and single-trial analysis.
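
    As a concrete illustration of one of the basic inverse procedures such a testbed can benchmark, here is a hedged sketch of a regularized minimum-norm estimate; the lead-field matrix and data below are random placeholders, not MEG-SIM datasets:

```python
# Minimal sketch of a classic MEG inverse procedure, the minimum-norm
# estimate. L (sensors x sources) is a hypothetical lead-field matrix;
# y is one time sample of sensor data; lambda_ regularizes against noise.
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_sources = 64, 500
L = rng.standard_normal((n_sensors, n_sources))   # invented lead field
y = rng.standard_normal(n_sensors)                # invented measurement

lambda_ = 1.0
# Minimum-norm estimate: x = L^T (L L^T + lambda I)^(-1) y
G = L @ L.T + lambda_ * np.eye(n_sensors)
x_hat = L.T @ np.linalg.solve(G, y)
print("estimated source amplitudes (first 5):", x_hat[:5])
```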

  1. MEG-SIM: A Web Portal for Testing MEG Analysis Methods using Realistic Simulated and Empirical Data

    PubMed Central

    Aine, C. J.; Sanfratello, L.; Ranken, D.; Best, E.; MacArthur, J. A.; Wallace, T.; Gilliam, K.; Donahue, C. H.; Montaño, R.; Bryant, J. E.; Scott, A.; Stephen, J. M.

    2012-01-01

    MEG and EEG measure electrophysiological activity in the brain with exquisite temporal resolution. Because of this unique strength relative to noninvasive hemodynamic-based measures (fMRI, PET), the complementary nature of hemodynamic and electrophysiological techniques is becoming more widely recognized (e.g., Human Connectome Project). However, the available analysis methods for solving the inverse problem for MEG and EEG have not been compared and standardized to the extent that they have for fMRI/PET. A number of factors, including the non-uniqueness of the solution to the inverse problem for MEG/EEG, have led to multiple analysis techniques which have not been tested on consistent datasets, making direct comparisons of techniques challenging (or impossible). Since each of the methods is known to have their own set of strengths and weaknesses, it would be beneficial to quantify them. Toward this end, we are announcing the establishment of a website containing an extensive series of realistic simulated data for testing purposes (http://cobre.mrn.org/megsim/). Here, we present: 1) a brief overview of the basic types of inverse procedures; 2) the rationale and description of the testbed created; and 3) cases emphasizing functional connectivity (e.g., oscillatory activity) suitable for a wide assortment of analyses including independent component analysis (ICA), Granger Causality/Directed transfer function, and single-trial analysis. PMID:22068921

  2. Analysis of the differentially expressed low molecular weight peptides in human serum via an N-terminal isotope labeling technique combining nano-liquid chromatography/matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Leng, Jiapeng; Zhu, Dong; Wu, Duojiao; Zhu, Tongyu; Zhao, Ningwei; Guo, Yinlong

    2012-11-15

    Peptidomics analysis of human serum is challenging due to the low abundance of serum peptides and interference from the complex matrix. This study analyzed the differentially expressed (DE) low molecular weight peptides in human serum by integrating a DMPITC-based N-terminal isotope labeling technique with nano-liquid chromatography and matrix-assisted laser desorption/ionization mass spectrometry (nano-LC/MALDI-MS). The workflow introduced a [d(6)]-4,6-dimethoxypyrimidine-2-isothiocyanate (DMPITC)-labeled mixture of aliquots from test samples as the internal standard. The spiked [d(0)]-DMPITC-labeled samples were separated by nano-LC and then spotted on the MALDI target. Both quantitative and qualitative studies of serum peptides were achieved based on the isotope-labeled peaks. The DMPITC labeling technique combined with nano-LC/MALDI-MS not only minimized the errors in peptide quantitation, but also allowed convenient recognition of the labeled peptides due to the 6 Da mass difference. The data showed that the entire research procedure as well as the subsequent data analysis method were effective, reproducible, and sensitive for the analysis of DE serum peptides. This study successfully established a research model for DE serum peptides using DMPITC-based N-terminal isotope labeling and nano-LC/MALDI-MS. Application of the DMPITC-based N-terminal labeling technique is expected to provide a promising tool for the investigation of peptides in vivo, especially for the analysis of DE peptides under different biological conditions. Copyright © 2012 John Wiley & Sons, Ltd.
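
    The quantitation logic rests on the 6 Da d0/d6 spacing. Below is a hypothetical sketch (invented peak lists, singly charged ions assumed, and a made-up helper pair_and_ratio) of how labeled pairs might be matched and ratioed:

```python
# Hypothetical sketch: DMPITC labeling produces d0/d6 peak pairs 6 Da
# apart (for singly charged ions); the intensity ratio of a sample peak
# (d0) to its spiked internal-standard partner (d6) gives relative
# abundance. Peak lists and the tolerance are invented.
def pair_and_ratio(peaks, mass_shift=6.0, tol=0.01):
    """peaks: list of (m/z, intensity); returns [(m/z_d0, d0/d6 ratio)]."""
    results = []
    for mz, inten in peaks:
        for mz2, inten2 in peaks:
            if abs((mz2 - mz) - mass_shift) <= tol and inten2 > 0:
                results.append((mz, inten / inten2))
    return results

peaks = [(843.42, 1.8e4), (849.42, 1.2e4), (1021.55, 9.0e3), (1027.55, 9.1e3)]
for mz, ratio in pair_and_ratio(peaks):
    print(f"peptide at m/z {mz:.2f}: d0/d6 ratio = {ratio:.2f}")
```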

  3. Quantitative sensory testing response patterns to capsaicin- and ultraviolet-B–induced local skin hypersensitization in healthy subjects: a machine-learned analysis

    PubMed Central

    Lötsch, Jörn; Geisslinger, Gerd; Heinemann, Sarah; Lerch, Florian; Oertel, Bruno G.; Ultsch, Alfred

    2018-01-01

    Abstract The comprehensive assessment of pain-related human phenotypes requires combinations of nociceptive measures that produce complex high-dimensional data, posing challenges to bioinformatic analysis. In this study, we assessed established experimental models of heat hyperalgesia of the skin, consisting of local ultraviolet-B (UV-B) irradiation or capsaicin application, in 82 healthy subjects using a variety of noxious stimuli. We extended the original heat stimulation by applying cold and mechanical stimuli and assessing the hypersensitization effects with a clinically established quantitative sensory testing (QST) battery (German Research Network on Neuropathic Pain). This study provided a 246 × 10-sized data matrix (82 subjects assessed at baseline, following UV-B application, and following capsaicin application) with respect to 10 QST parameters, which we analyzed using machine-learning techniques. We observed statistically significant effects of the hypersensitization treatments in 9 different QST parameters. Supervised machine-learned analysis implemented as random forests followed by ABC analysis pointed to heat pain thresholds as the most relevantly affected QST parameter. However, decision tree analysis indicated that UV-B additionally modulated sensitivity to cold. Unsupervised machine-learning techniques, implemented as emergent self-organizing maps, hinted at subgroups responding to topical application of capsaicin. The distinction among subgroups was based on sensitivity to pressure pain, which could be attributed to sex differences, with women being more sensitive than men. Thus, while UV-B and capsaicin share a major component of heat pain sensitization, they differ in their effects on QST parameter patterns in healthy subjects, suggesting a lack of redundancy between these models. PMID:28700537
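
    A minimal sketch of the supervised step described above, assuming simulated stand-in data rather than the study's 246 × 10 QST matrix: a random forest is trained to separate treatment conditions, and its impurity-based feature importances rank the QST parameters:

```python
# Hedged sketch of random-forest feature ranking on QST-like data.
# Data are simulated placeholders; parameter 0 plays the role of the
# heat pain threshold that the study found most relevant.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n_per_group, n_qst = 80, 10
baseline = rng.standard_normal((n_per_group, n_qst))
treated = rng.standard_normal((n_per_group, n_qst))
treated[:, 0] += 1.5            # pretend QST parameter 0 shifts with treatment

X = np.vstack([baseline, treated])
y = np.array([0] * n_per_group + [1] * n_per_group)

forest = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
ranking = np.argsort(forest.feature_importances_)[::-1]
print("QST parameters ranked by importance:", ranking)
```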

  4. Synthesis, growth, structure and nonlinear optical properties of a semiorganic 2-carboxy pyridinium dihydrogen phosphate single crystal

    NASA Astrophysics Data System (ADS)

    Nagapandiselvi, P.; Baby, C.; Gopalakrishnan, R.

    2015-09-01

    A new semiorganic compound, 2-carboxy pyridinium dihydrogen phosphate (2CPDP), was synthesised and grown as single crystals by the slow evaporation solution growth technique. Single-crystal XRD showed that 2CPDP belongs to the monoclinic crystal system with space group P21/n. The molecular structure was further confirmed by spectroscopic techniques such as FT-NMR (1H, 13C and 31P), FT-IR, UV-Vis-NIR and fluorescence. The UV-Vis-NIR analysis revealed the suitability of the crystal for nonlinear optical applications. The photoactive nature of the material is established from fluorescence studies. TG-DSC analysis showed that 2CPDP is thermally stable up to 170 °C. The dependence of the dielectric properties on frequency and temperature was also studied. Nonlinear optical absorption, determined from open-aperture Z-scan analysis employing a picosecond Nd:YAG laser, revealed that 2CPDP can serve as a promising candidate for optical limiting applications.

  5. Recent advances of liquid chromatography-(tandem) mass spectrometry in clinical and forensic toxicology - An update.

    PubMed

    Remane, Daniela; Wissenbach, Dirk K; Peters, Frank T

    2016-09-01

    Liquid chromatography (LC) coupled to mass spectrometry (MS) or tandem mass spectrometry (MS/MS) is a well-established and widely used technique in clinical and forensic toxicology as well as doping control especially for quantitative analysis. In recent years, many applications for so-called multi-target screening and/or quantification of drugs, poisons, and or their metabolites in biological matrices have been developed. Such methods have proven particularly useful for analysis of so-called new psychoactive substances that have appeared on recreational drug markets throughout the world. Moreover, the evolvement of high resolution MS techniques and the development of data-independent detection modes have opened new possibilities for applications of LC-(MS/MS) in systematic toxicological screening analysis in the so called general unknown setting. The present paper will provide an overview and discuss these recent developments focusing on the literature published after 2010. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  6. Inference of the ring current ion composition by means of charge exchange decay

    NASA Technical Reports Server (NTRS)

    Smith, P. H.; Bewtra, N. K.; Hoffman, R. A.

    1978-01-01

    By analyzing the measured ion fluxes during the several-day storm recovery period, and assuming that ions other than hydrogen were present and that the decays were exponential in nature, it was possible to establish three separate lifetimes for the ions. These fitted decay lifetimes are in excellent agreement with the expected charge exchange decay lifetimes for H(+), O(+), and He(+) in the energy and L-value range of the data. This inference technique thus establishes the presence of measurable and appreciable quantities of oxygen and helium ions, as well as protons, in the storm-time ring current. Indications that He(+) may also be present under these same conditions were found.
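
    The lifetime estimation described here amounts to fitting exponential decays to the recovery-phase fluxes. A sketch with synthetic data (the lifetime and noise level are invented):

```python
# Sketch of a decay-lifetime fit: flux decaying as exp(-t/tau) during
# storm recovery, fitted per ion species to recover the charge-exchange
# lifetime. The data below are synthetic stand-ins.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, f0, tau):
    return f0 * np.exp(-t / tau)

t = np.linspace(0, 5, 30)        # days into recovery
true_tau = 1.8                   # invented lifetime (days)
noise = np.exp(0.05 * np.random.default_rng(2).standard_normal(t.size))
flux = decay(t, 1e6, true_tau) * noise

(f0_fit, tau_fit), _ = curve_fit(decay, t, flux, p0=(flux[0], 1.0))
print(f"fitted lifetime: {tau_fit:.2f} days")
```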

  7. Profiling of Arabidopsis secondary metabolites by capillary liquid chromatography coupled to electrospray ionization quadrupole time-of-flight mass spectrometry.

    PubMed

    von Roepenack-Lahaye, Edda; Degenkolb, Thomas; Zerjeski, Michael; Franz, Mathias; Roth, Udo; Wessjohann, Ludger; Schmidt, Jürgen; Scheel, Dierk; Clemens, Stephan

    2004-02-01

    Large-scale metabolic profiling is expected to develop into an integral part of functional genomics and systems biology. The metabolome of a cell or an organism is chemically highly complex. Therefore, comprehensive biochemical phenotyping requires a multitude of analytical techniques. Here, we describe a profiling approach that combines separation by capillary liquid chromatography with the high resolution, high sensitivity, and high mass accuracy of quadrupole time-of-flight mass spectrometry. About 2000 different mass signals can be detected in extracts of Arabidopsis roots and leaves. Many of these originate from Arabidopsis secondary metabolites. Detection based on retention times and exact masses is robust and reproducible. The dynamic range is sufficient for the quantification of metabolites. Assessment of the reproducibility of the analysis showed that biological variability exceeds technical variability. Tools were optimized or established for automatic data deconvolution and processing. Subtle differences between samples can be detected, as tested with the chalcone synthase-deficient tt4 mutant. The accuracy of time-of-flight mass analysis allows elemental compositions to be calculated and metabolites to be tentatively identified. In-source fragmentation and tandem mass spectrometry can be used to gain structural information. This approach has the potential to contribute significantly to establishing the metabolome of Arabidopsis and other model systems. The principles of separation and mass analysis of this technique, together with its sensitivity and resolving power, greatly expand the range of metabolic profiling.

  8. Estimating psycho-physiological state of a human by speech analysis

    NASA Astrophysics Data System (ADS)

    Ronzhin, A. L.

    2005-05-01

    Adverse effects of intoxication, fatigue and boredom can degrade the performance of highly trained operators of complex technical systems, with potentially catastrophic consequences. Existing physiological fitness-for-duty tests are time-consuming, costly, invasive, and highly unpopular. Known non-physiological tests constitute a secondary task and interfere with the busy workload of the tested operator. Various attempts to assess the current status of the operator by processing of "normal operational data" often lead to an excessive amount of computation, poorly justified metrics, and ambiguous results. At the same time, speech analysis presents a natural, non-invasive approach based upon well-established, efficient data processing. In addition, it supports both behavioral and physiological biometrics. This paper presents an approach facilitating a robust speech analysis/understanding process in spite of natural speech variability and background noise. Automatic speech recognition is suggested as a technique for the detection of changes in the psycho-physiological state of a human that typically manifest themselves in changed characteristics of the vocal tract and in the semantic-syntactic connectivity of conversation. Preliminary tests have confirmed that a statistically significant correlation between the error rate of automatic speech recognition and the extent of alcohol intoxication does exist. In addition, the data obtained allowed some interesting correlations to be explored and some quantitative models to be established. It is proposed to utilize this approach as part of a fitness-for-duty test and to compare its efficiency with analyses of iris, face geometry, thermography and other popular non-invasive biometric techniques.

  9. Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chang Jae; Han, Seung; Yun, Jae Hee

    2015-07-01

    Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached at which protective action must begin. In keeping with the nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during a design basis event, whether an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both design basis events and beyond-design-basis events. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are hard to find. In the cases of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has the drawback that it is difficult to demonstrate the completeness of the timing test. The analysis technique has the demerit of yielding extreme times that are not actually possible. Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter, demonstrating that the total analyzed response time does not exceed the requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test, demonstrating that the total test result does not exceed the requirement. The total response time should be tested in a single test that covers the instrument channel from the sensor to the final actuation device. When the total channel is not tested in a single test, separate tests on groups of components or single components covering the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and the final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. The proposed methodology therefore plays a crucial role in systematically guaranteeing the safety of nuclear power plants by satisfying one of the two critical requirements from the safety analysis. (authors)
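
    The analysis technique reduces to a response time budget along the critical signal path. A toy illustration with invented component names and times, not actual APR1400/OPR1000 values:

```python
# Toy illustration of the analysis technique: allocate a response time
# to each component on the critical signal path and verify the summed
# total against the requirement from the safety analysis. All names and
# numbers are invented.
requirement_ms = 900.0

allocations_ms = {
    "pressure transmitter":  350.0,
    "signal conditioning":    50.0,
    "bistable processor":    200.0,
    "logic/actuation relay": 150.0,
}

total = sum(allocations_ms.values())
verdict = "meets" if total <= requirement_ms else "violates"
print(f"total analyzed response time: {total:.0f} ms "
      f"({verdict} the {requirement_ms:.0f} ms requirement)")
```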

  10. Generic Techniques for the Calibration of Robots with Application of the 3-D Fixtures and Statistical Technique on the PUMA 500 and ARID Robots

    NASA Technical Reports Server (NTRS)

    Tawfik, Hazem

    1991-01-01

    A relatively simple, inexpensive, and generic technique that could be used both in laboratories and in some operational site environments is introduced at the Robotics Applications and Development Laboratory (RADL) at Kennedy Space Center (KSC). In addition, this report gives a detailed explanation of the setup procedure, data collection, and analysis using this new technique, which was developed at the State University of New York at Farmingdale. The technique was used to evaluate the repeatability, accuracy, and overshoot of the Unimate Industrial Robot, PUMA 500. The data were statistically analyzed to provide insight into the performance of the systems and components of the robot. Also, the same technique was used to check the forward kinematics against the inverse kinematics of RADL's PUMA robot. Recommendations were made for RADL to use this technique for laboratory calibration of the currently existing robots such as the ASEA, the high-speed controller, and the Automated Radiator Inspection Device (ARID). Recommendations were also made to develop and establish other calibration techniques more suitable for on-site calibration environments and robot certification.

  11. Efficient Analysis of Complex Structures

    NASA Technical Reports Server (NTRS)

    Kapania, Rakesh K.

    2000-01-01

    The various accomplishments achieved during this project are: (1) a survey of Neural Network (NN) applications using the MATLAB NN Toolbox in structural engineering, especially on equivalent continuum models (Appendix A); (2) application of NNs and genetic algorithms (GAs) to simulate and synthesize substructures: 1-D and 2-D beam problems (Appendix B); (3) development of an equivalent plate-model analysis method (EPA) for static and vibration analysis of general trapezoidal built-up wing structures composed of skins, spars and ribs, with calculation of a range of test cases and comparison with measurements or FEA results (Appendix C); (4) basic work on using second-order sensitivities to simulate wing modal response, with discussion of sensitivity evaluation approaches and some results (Appendix D); (5) establishment of a general methodology for simulating modal responses by direct application of NNs and by sensitivity techniques, in a design space composed of a number of design points, with the two methods compared through examples (Appendix E); and (6) establishment of a general methodology for efficient analysis of complex wing structures by indirect application of NNs: the NN-aided Equivalent Plate Analysis, with training of the networks in several design-space cases applicable to actual design of complex wings (Appendix F).

  12. Quality by design case study: an integrated multivariate approach to drug product and process development.

    PubMed

    Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder

    2009-12-01

    To facilitate an in-depth process understanding, and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate effects of the design factors on manufacturability and final product CQAs, and establish design space to ensure desired CQAs. Two types of analyses were performed to extract maximal information, DOE effect & response surface analysis and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time), on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies application of QbD principles and tools to drug product and process development.
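
    As a hedged sketch of the multivariate modeling step, the following fits a PLS model relating the three design factors to a single response; the data are fabricated for illustration only:

```python
# Sketch of a PLS model relating DOE factors (water amount, wet massing
# time, lubrication time) to a response such as tablet dissolution.
# The 20 "batches" below are invented, not the study's data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(20, 3))          # 20 DOE batches, 3 factors
y = 60 + 25 * X[:, 0] - 10 * X[:, 1] + rng.normal(0, 1, 20)  # dissolution %

pls = PLSRegression(n_components=2).fit(X, y)
print("R^2 on the DOE batches:", round(pls.score(X, y), 3))
print("regression coefficients:", pls.coef_.ravel().round(2))
```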

  13. [Applications of three-dimensional fluorescence spectrum of dissolved organic matter to identification of red tide algae].

    PubMed

    Lü, Gui-Cai; Zhao, Wei-Hong; Wang, Jiang-Tao

    2011-01-01

    Identification techniques for 10 species of red tide algae often found in the coastal areas of China were developed by combining the three-dimensional fluorescence spectra of fluorescent dissolved organic matter (FDOM) from cultured red tide algae with principal component analysis. Based on the results of the principal component analysis, the first principal component loading spectrum of the three-dimensional fluorescence spectrum was chosen as the identification characteristic spectrum for red tide algae, and the phytoplankton fluorescence characteristic spectrum band was established. The 10 algal species were then tested using Bayesian discriminant analysis, with a correct identification rate of more than 92% for Pyrrophyta at the species level and more than 75% for Bacillariophyta at the genus level, within which the correct identification rates were more than 90% for Phaeodactylum and Chaetoceros. The results showed that identification techniques for the 10 species of red tide algae based on the three-dimensional fluorescence spectra of FDOM from cultured red tide algae and principal component analysis can work well.
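
    A minimal sketch of such a pipeline, assuming simulated spectra: leading principal components are extracted, and a Gaussian discriminant (scikit-learn's LDA standing in for the Bayesian discriminant analysis used in the study) classifies the species:

```python
# Hedged sketch: PCA scores from simulated "fluorescence spectra" feed a
# linear (Gaussian, Bayes-type) discriminant classifier. All spectra are
# random placeholders, not measured FDOM data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(4)
n_species, n_samples, n_channels = 3, 30, 200
X, y = [], []
for s in range(n_species):
    template = rng.standard_normal(n_channels)          # species "spectrum"
    X.append(template + 0.3 * rng.standard_normal((n_samples, n_channels)))
    y += [s] * n_samples
X = np.vstack(X); y = np.array(y)

scores = PCA(n_components=5).fit_transform(X)
clf = LinearDiscriminantAnalysis().fit(scores, y)
print("training identification rate:", clf.score(scores, y))
```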

  14. Review of methods to probe single cell metabolism and bioenergetics

    DOE PAGES

    Vasdekis, Andreas E.; Stephanopoulos, Gregory

    2014-10-31

    The sampling and manipulation of cells down to the individual has been of substantial interest since the very beginning of the life sciences. Herein, our objective is to highlight the most recent developments in single cell manipulation, as well as pioneering ones. First, flow-through methods will be discussed, namely methods in which the single cells flow continuously in an ordered manner during their analysis. This section will be followed by confinement techniques that enable cell isolation and confinement in one, two or three dimensions. Flow cytometry and droplet microfluidics are the two most common methods of flow-through analysis. While both are high-throughput techniques, their difference lies in the fact that droplet-encapsulated cells experience a restricted and personal microenvironment, while in flow cytometry cells experience similar initial nutrient and stimuli concentrations. These methods are rather well established; however, they have recently enabled immense strides in single cell phenotypic analysis, namely the identification and analysis of metabolically distinct individuals from an isogenic population using both droplet microfluidics and flow cytometry.

  15. Continuous Wavelet Transform, a powerful alternative to Derivative Spectrophotometry in analysis of binary and ternary mixtures: A comparative study.

    PubMed

    Elzanfaly, Eman S; Hassan, Said A; Salem, Maissa Y; El-Zeany, Badr A

    2015-12-05

    A comparative study of two signal processing techniques was carried out, presenting the theoretical algorithm of each method and comparing the two to indicate their advantages and limitations. The methods under study are Numerical Differentiation (ND) and Continuous Wavelet Transform (CWT). These methods were studied as spectrophotometric resolution tools for the simultaneous analysis of binary and ternary mixtures. For the comparison, the two methods were applied to the resolution of Bisoprolol (BIS) and Hydrochlorothiazide (HCT) in their binary mixture, and to the analysis of Amlodipine (AML), Aliskiren (ALI) and Hydrochlorothiazide (HCT) as an example of ternary mixtures. By comparing the results for laboratory-prepared mixtures, it was shown that the CWT technique is more efficient and advantageous than ND in the analysis of mixtures with severely overlapped spectra. The CWT was applied for quantitative determination of the drugs in their pharmaceutical formulations and validated according to the ICH guidelines, where accuracy, precision, repeatability and robustness were found to be within the acceptable limits. Copyright © 2015 Elsevier B.V. All rights reserved.
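
    To illustrate the two signal processing tools being compared, here is a brief sketch on a synthetic pair of overlapped Gaussian bands; the CWT is implemented directly as convolution with a Mexican-hat wavelet, which is one common choice and not necessarily the wavelet used by the authors:

```python
# Minimal comparison on synthetic "spectra": numerical differentiation
# (ND, second derivative) versus a one-scale continuous wavelet
# transform (CWT) built by convolution with a Mexican-hat wavelet.
import numpy as np

x = np.linspace(0, 10, 1000)
spectrum = (np.exp(-(x - 4.8) ** 2 / 0.5) +        # two overlapped bands
            0.8 * np.exp(-(x - 5.6) ** 2 / 0.5))

# ND: the second derivative sharpens overlapped bands but amplifies noise.
second_deriv = np.gradient(np.gradient(spectrum, x), x)

# CWT at one scale: differentiates while smoothing, so it is more robust.
def mexican_hat(n, a):
    t = (np.arange(n) - n // 2) / a
    return (1 - t ** 2) * np.exp(-t ** 2 / 2)

cwt_coeffs = np.convolve(spectrum, mexican_hat(200, 25.0), mode="same")
print("ND extremum near x =", x[np.argmin(second_deriv)])
print("CWT maximum near x =", x[np.argmax(cwt_coeffs)])
```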

  16. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.

    1991-01-01

    The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source term probability distributions from the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor determined, and the functional relationship among all the factors established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDFs) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
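
    The CCDF construction itself is straightforward once Monte Carlo samples of consequences are in hand; a sketch with an invented lognormal consequence distribution:

```python
# Sketch of building an empirical CCDF, P(consequence >= c), from Monte
# Carlo samples. The lognormal stand-in distribution is invented.
import numpy as np

rng = np.random.default_rng(5)
consequences = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

c = np.sort(consequences)
ccdf = 1.0 - np.arange(1, c.size + 1) / c.size   # empirical P(X >= c)

for level in (1.0, 5.0, 10.0):
    print(f"P(consequence >= {level}): {ccdf[np.searchsorted(c, level)]:.4f}")
```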

  17. Post-coronagraphic tip-tilt sensing for vortex phase masks: The QACITS technique

    NASA Astrophysics Data System (ADS)

    Huby, E.; Baudoz, P.; Mawet, D.; Absil, O.

    2015-12-01

    Context. Small inner working angle coronagraphs, such as the vortex phase mask, are essential to exploit the full potential of ground-based telescopes in the context of exoplanet detection and characterization. However, the drawback of this attractive feature is a high sensitivity to pointing errors, which degrades the performance of the coronagraph. Aims: We propose a tip-tilt retrieval technique based on the analysis of the final coronagraphic image, hereafter called Quadrant Analysis of Coronagraphic Images for Tip-tilt Sensing (QACITS). Methods: Under the assumption of small phase aberrations, we show that the behavior of the vortex phase mask can be simply described from the entrance pupil to the Lyot stop plane with Zernike polynomials. This convenient formalism is used to establish the theoretical basis of the QACITS technique. We performed simulations to demonstrate the validity and limits of the technique, including the case of a centrally obstructed pupil. Results: The QACITS technique principle is validated with experimental results in the case of an unobstructed circular aperture, as well as simulations in the presence of a central obstruction. The typical configuration of the Keck telescope (24% central obstruction) has been simulated with additional high order aberrations. In these conditions, our simulations show that the QACITS technique is still adapted to centrally obstructed pupils and performs tip-tilt retrieval with a precision of 5 × 10⁻² λ/D when wavefront errors amount to λ/14 rms, and 10⁻² λ/D for λ/70 rms errors (with λ the wavelength and D the pupil diameter). Conclusions: We have developed and demonstrated a tip-tilt sensing technique for vortex coronagraphs. The implementation of the QACITS technique is based on the analysis of the scientific image and does not require any modification of the original setup. Current facilities equipped with a vortex phase mask can thus directly benefit from this technique to improve the contrast performance close to the axis.
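
    In spirit, QACITS reads tip-tilt from flux asymmetries in the coronagraphic image. The following is a loose sketch of such a quadrant asymmetry measurement, with a placeholder image and a hypothetical linear gain in place of the calibrated QACITS estimator:

```python
# Loose sketch of quadrant-based tip-tilt sensing: the normalized flux
# asymmetry between image halves gives a signed pointing estimate.
# Image and gain are invented, not the calibrated QACITS estimator.
import numpy as np

rng = np.random.default_rng(6)
img = rng.poisson(10, size=(64, 64)).astype(float)
img[:32, :] += 2.0                      # fake vertical asymmetry

total = img.sum()
dx = (img[:, 32:].sum() - img[:, :32].sum()) / total   # right minus left
dy = (img[32:, :].sum() - img[:32, :].sum()) / total   # bottom minus top

gain = 1.0   # hypothetical lam/D per unit asymmetry, set by calibration
print(f"tip-tilt estimate: ({gain * dx:.3f}, {gain * dy:.3f}) lam/D")
```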

  18. Procedures for woody vegetation surveys in the Kazgail rural council area, Kordofan, Sudan

    USGS Publications Warehouse

    Falconer, Allan; Cross, Matthew D.; Orr, Donald G.

    1990-01-01

    Efforts to reforest parts of the Kordofan Province of Sudan are receiving support from international development agencies. These efforts include planning and implementing reforestation activities that require the collection of natural resource and socioeconomic data, and the preparation of base maps. A combination of remote sensing, geographic information system and global positioning system procedures is used in this study to meet these requirements. Remote sensing techniques were used to provide base maps and to guide the compilation of vegetation resource maps. These techniques provided a rapid and efficient method for documenting available resources. Pocket-sized global positioning system units were used to establish the location of field data collected for mapping and resource analysis. A microcomputer data management system tabulated and displayed the field data. The resulting system for data analysis, management, and planning has been adopted for the mapping and inventory of the Gum Belt of Sudan.

  19. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    PubMed

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck, because the massive number of entities makes the space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salud, J.; Lopez, D.O.; Barrio, M.

    The experimental two-component phase diagram between the orientationally disordered crystals 2-amino-2-methyl-1,3-propanediol (AMP) and 1,1,1-tris(hydroxymethyl)propane (PG) has been established from room temperature to the liquid state using thermal analysis and X-ray powder diffraction techniques. The intermolecular interactions in the orientationally disordered mixed crystals of the mentioned system and other related two-component systems are discussed by analyzing the evolution of the packing coefficient as a function of the composition. A thermodynamic analysis of the presented phase diagram and the redetermined AMP/NPG (2,2-dimethyl-1,3-propanediol) is reported on the basis of the enthalpy-entropy compensation theory.

  1. A new technique for Auger analysis of surface species subject to electron-induced desorption

    NASA Technical Reports Server (NTRS)

    Pepper, S. V.

    1973-01-01

    A method is presented for observing surface species subject to electron-induced desorption by Auger electron spectroscopy. The surface to be examined is moved under the electron beam at constant velocity, establishing a time-independent condition and eliminating the time response of the electron spectrometer as a limiting factor. The dependence of the Auger signal on the surface velocity, incident electron current, beam diameter, and desorption cross section is analyzed. The method is illustrated by the Auger analysis of PTFE, in which the fluorine is removed by electron-induced desorption.

  2. SERS quantitative urine creatinine measurement of human subject

    NASA Astrophysics Data System (ADS)

    Wang, Tsuei Lian; Chiang, Hui-hua K.; Lu, Hui-hsin; Hung, Yung-da

    2005-03-01

    The SERS method for biomolecular analysis has several potential advantages over traditional biochemical approaches, including less specimen contact, being non-destructive to the specimen, and multiple-component analysis. Urine is an easily available body fluid for monitoring the metabolites and renal function of the human body. We developed a surface-enhanced Raman scattering (SERS) technique using 50 nm gold colloidal particles for quantitative human urine creatinine measurement. This paper shows that the SERS band of creatinine (104 mg/dl) in artificial urine lies between 1400 cm⁻¹ and 1500 cm⁻¹; this region was analyzed for quantitative creatinine measurement. Ten human urine samples were obtained from ten healthy persons and analyzed by the SERS technique. A partial least squares cross-validation (PLSCV) method was utilized to obtain the estimated creatinine concentration in the clinically relevant concentration range (55.9 mg/dl to 208 mg/dl). The root-mean-square error of cross validation (RMSECV) is 26.1 mg/dl. This research demonstrates the feasibility of using SERS for human urine creatinine detection, and establishes the SERS platform technique for bodily fluid measurement.
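
    A hedged sketch of the PLSCV evaluation named above: leave-one-out cross-validation of a PLS model, reporting the RMSECV. The spectra and concentrations are simulated placeholders for the SERS data:

```python
# Sketch of leave-one-out cross-validation of a PLS calibration model,
# reporting the root-mean-square error of cross validation (RMSECV).
# Spectra and concentrations below are simulated stand-ins.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(7)
n, p = 10, 150                                  # 10 urine samples
conc = rng.uniform(55.9, 208.0, n)              # mg/dl, invented
spectra = np.outer(conc, rng.random(p)) + rng.normal(0, 5, (n, p))

errors = []
for train, test in LeaveOneOut().split(spectra):
    model = PLSRegression(n_components=2).fit(spectra[train], conc[train])
    pred = model.predict(spectra[test]).ravel()[0]
    errors.append(pred - conc[test][0])

rmsecv = np.sqrt(np.mean(np.square(errors)))
print(f"RMSECV: {rmsecv:.1f} mg/dl")
```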

  3. Laser microprobe characterization of C species in Interplanetary Dust Particles (IDP)

    NASA Technical Reports Server (NTRS)

    Dibrozolo, F. R.; Bunch, T. E.; Chang, S.; Brownlee, D. E.

    1986-01-01

    Preliminary results are presented from a study whose aim is the characterization of carbon (C) species in microvolumes of materials by means of laser ionization mass spectrometry (LIMS). The LIMS instrument employs a pulsed UV laser to produce nearly instantaneous vaporization and ionization of materials, followed by acceleration and time-of-flight analysis of the ions produced. LIMS provides a survey technique with nearly simultaneous acquisition of mass spectra covering the entire elemental range. The main limitation of the LIMS technique at present is its limited ability to perform quantitative analysis, due in part to insufficient knowledge of the mechanism of laser-solid interaction. However, considerable effort is now being directed at making LIMS a more quantitative technique. A variety of different C samples, both natural and man-made, were analyzed to establish the ability of LIMS to differentiate among the various C phases. The results of preliminary analyses performed on meteoritic and interplanetary dust samples are also presented. The C standards selected for the LIMS characterization range from essentially amorphous soot to diamond, which exhibits the highest degree of ordering.

  4. [Determination of benzo(alpha)pyrene in food with microwave-assisted extraction].

    PubMed

    Zhou, Na; Luo, He-Dong; Li, Na; Li, Yao-Qun

    2014-03-01

    Coupling the derivative technique with constant-energy synchronous fluorescence scanning, a method for determining benzo(alpha)pyrene in foods by second-derivative constant-energy synchronous spectrofluorimetry after microwave-assisted treatment of samples was established using a domestic microwave oven. The main factors influencing the efficiency of microwave extraction were examined, including the extraction solvent type and amount, the microwave extraction time, the microwave radiation power, and the cooling time, and a comparison with ultrasonic extraction was made. Low-fat food samples, microwave-extracted with mixed solvents, could be analyzed immediately by the spectrofluorimetric technique. For high-fat food samples, microwave-assisted saponification and extraction were performed at the same time, thus simplifying the operation steps and reducing sample analysis time, so that the whole sample analysis process could be completed within one hour. This method is simple, rapid and inexpensive. Consequently, it was applied to determine benzo(alpha)pyrene in food with good reproducibility; the recoveries of benzo(alpha)pyrene ranged from 90.0% to 105.0% for the low-fat samples and from 83.3% to 94.6% for the high-fat samples.

  5. Advances in carbonate exploration and reservoir analysis

    USGS Publications Warehouse

    Garland, J.; Neilson, J.; Laubach, S.E.; Whidden, Katherine J.

    2012-01-01

    The development of innovative techniques and concepts, and the emergence of new plays in carbonate rocks are creating a resurgence of oil and gas discoveries worldwide. The maturity of a basin and the application of exploration concepts have a fundamental influence on exploration strategies. Exploration success often occurs in underexplored basins by applying existing established geological concepts. This approach is commonly undertaken when new basins ‘open up’ owing to previous political upheavals. The strategy of using new techniques in a proven mature area is particularly appropriate when dealing with unconventional resources (heavy oil, bitumen, stranded gas), while the application of new play concepts (such as lacustrine carbonates) to new areas (i.e. ultra-deep South Atlantic basins) epitomizes frontier exploration. Many low-matrix-porosity hydrocarbon reservoirs are productive because permeability is controlled by fractures and faults. Understanding basic fracture properties is critical in reducing geological risk and therefore reducing well costs and increasing well recovery. The advent of resource plays in carbonate rocks, and the long-standing recognition of naturally fractured carbonate reservoirs means that new fracture and fault analysis and prediction techniques and concepts are essential.

  6. Training staff serving clients with intellectual disabilities: a meta-analysis of aspects determining effectiveness.

    PubMed

    van Oorsouw, Wietske M W J; Embregts, Petri J C M; Bosman, Anna M T; Jahoda, Andrew

    2009-01-01

    The last decades have seen increased emphasis on the quality of training for direct-care staff serving people with intellectual disabilities. Nevertheless, it is unclear what the key aspects of effective training are. Therefore, the aim of the present meta-analysis was to establish the ingredients (i.e., goals, format, and techniques) for staff training that are related to improvements of staff behaviour. Our literature search concentrated on studies that were published in a period of 20 years. Fifty-five studies met the criteria, resulting in 502 single-subject designs and 13 n>1 designs. Results revealed important information relevant to further improvement of clinical practice: (a) the combination of in-service with coaching-on-the-job is the most powerful format, (b) in in-service formats, one should use multiple techniques, and verbal feedback is particularly recommended, and (c) in coaching-on-the-job formats, verbal feedback should be part of the program, as well as praise and correction. To maximize effectiveness, program developers should carefully prepare training goals, training format, and training techniques, which will yield a profit for clinical practice.

  7. Peri-procedural protocols for interventional pain management techniques: a survey of US pain centers.

    PubMed

    Ahmed, Shihab U; Tonidandel, William; Trella, Jason; Martin, Nicole M; Chang, Yuchiao

    2005-04-01

    Interventional techniques are now an integral part of chronic pain management. As new procedures arise at a rapid pace, decisions regarding patient safety and comfort are becoming more challenging. No peri-procedural consensus protocol currently addresses issues such as (1) nulla per os (NPO) status, (2) sedation, (3) monitoring, or (4) recovery. Establishing safety guidelines for interventional pain procedures requires knowledge of current peri-procedural protocols. To survey interventional pain practices and to obtain current peri-procedural protocols, we faxed a one-page questionnaire to 105 United States pain practices identified using the directory of the American Pain Society. Fifty-seven academic and private pain practices (54%) responded and were included in the analysis. Monitoring devices such as electrocardiogram (EKG), blood pressure, and pulse oximetry are not universally employed for cervical or lumbar spinal procedures. Even procedures that are often performed by anesthesiologists in operating rooms, such as Bier blocks, are not monitored in a uniform manner when performed in pain clinics. Establishment of intravenous access for procedures also varies among practitioners. Most (72%) practices had treated patients with vasovagal reactions over the past 12 months, but only 42% had simulated cardiac arrests to prepare for these situations. While various trends in peri-procedural care are observable, standards of care are not well established. In order to minimize complications associated with interventional pain management techniques, the pain management community should agree on safety guidelines for all procedures, much as those advocated by the American Society of Anesthesiologists for surgical anesthetic care.

  8. Time Management and the Military Decision Making Process

    DTIC Science & Technology

    1992-12-18

    This monograph analyzes the military decision making process in terms of time management in order to determine if a timeline will expedite the... process. The monograph begins by establishing the importance of time and time management in planning. This section provides a general discussion of time, an... Perhaps using some of the techniques that other armies use will facilitate time management. Keywords: Time management, Decision making, Timeline, Mission analysis, Wargaming, Courses of action, OPORD, Brigade OPS.

  9. Role of endoscopic ultrasonography in the diagnosis of acute and chronic pancreatitis.

    PubMed

    Stevens, Tyler

    2013-10-01

    Endoscopic ultrasonography (EUS) can be a useful tool for detecting underlying causes of acute pancreatitis and establishing the severity of fibrosis in chronic pancreatitis. Ancillary techniques include fine needle aspiration and core biopsy, bile collection for crystal analysis, pancreatic function testing, and celiac plexus block. This review focuses on the role of EUS in the diagnosis of acute and chronic pancreatitis. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul, I.R.

    A pilot study in two states led to the establishment of the Dental Exposure Normalization Technique (DENT) program. This, in brief, is an exposure reduction and quality assurance program for radiological health agencies. The health agency sends X-ray exposure cards to dental X-ray facilities. These are exposed by the dentist and returned for analysis. Facilities which show excessive exposure are then visited to demonstrate the changes in exposure and processing necessary to produce diagnostic quality radiographs with minimum patient exposure.

  11. Extending the knowledge in histochemistry and cell biology.

    PubMed

    Heupel, Wolfgang-Moritz; Drenckhahn, Detlev

    2010-01-01

    Central to modern Histochemistry and Cell Biology stands the need for visualization of cellular and molecular processes. In the past several years, a variety of techniques has emerged that bridges traditional light microscopy, fluorescence microscopy and electron microscopy with powerful software-based post-processing and computer modeling. Researchers now have various tools available to investigate problems of interest from a bird's-eye down to a worm's-eye view, focusing on tissues, cells, proteins or, finally, single molecules. Applications of new approaches in combination with well-established traditional techniques of mRNA, DNA or protein analysis have led to enlightening and prudent studies that have paved the way toward a better understanding of not only physiological but also pathological processes in the field of cell biology. This review is intended to summarize articles standing for the progress made in "histo-biochemical" techniques and their manifold applications.

  12. Drill hole logging with infrared spectroscopy

    USGS Publications Warehouse

    Calvin, W.M.; Solum, J.G.

    2005-01-01

    Infrared spectroscopy has been used to identify rocks and minerals for over 40 years. The technique is sensitive to primary silicates as well as alteration products. Minerals can be uniquely identified based on multiple absorption features at wavelengths from the visible to the thermal infrared. We are currently establishing methods and protocols in order to use the technique for rapid assessment of downhole lithology on samples obtained during drilling operations. Initial work performed includes spectral analysis of chip cuttings and core sections from drill sites around Desert Peak, NV. In this paper, we report on a survey of 10,000 feet of drill cuttings, at 100 foot intervals, from the San Andreas Fault Observatory at Depth (SAFOD). Data from Blue Mountain geothermal wells will also be acquired. We will describe the utility of the technique for rapid assessment of lithologic and mineralogic discrimination.

  13. Cesarean sections, perfecting the technique and standardizing the practice: an analysis of the book Obstetrícia, by Jorge de Rezende.

    PubMed

    Nakano, Andreza Rodrigues; Bonan, Claudia; Teixeira, Luiz Antônio

    2016-01-01

    This article discusses the development of techniques for cesarean sections by doctors in Brazil during the 20th century, by analyzing the chapter "Operação Cesárea" (Cesarean Section) in three editions of the textbook Obstetrícia, by Jorge de Rezende. His prominence as an author in obstetrics and his particular style of working created the groundwork for the normalization of the practice of cesarean sections. The networks of meaning practiced within this scientific community included a "provision for feeling and for action" (Fleck) that established the C-section as a "normal" delivery: setting standards that exclude the unpredictability, chaos, and dangers associated with the physiology of childbirth, and meeting the demand for control, discipline and safety, qualities associated with the practices, techniques and technologies of biomedicine.

  14. On the uniqueness of measuring elastoplastic properties from indentation: The indistinguishable mystical materials

    NASA Astrophysics Data System (ADS)

    Chen, Xi; Ogasawara, Nagahisa; Zhao, Manhong; Chiba, Norimasa

    2007-08-01

    Indentation is widely used to extract material elastoplastic properties from the measured force-displacement curves. One of the most well-established indentation techniques utilizes dual (or plural) sharp indenters (which have different apex angles) to deduce key parameters such as the elastic modulus, yield stress, and work-hardening exponent for materials that obey the power-law constitutive relationship. However, the uniqueness of such analysis is not yet systematically studied or challenged. Here we show the existence of "mystical materials", which have distinct elastoplastic properties yet they yield almost identical indentation behaviors, even when the indenter angle is varied in a large range. These mystical materials are, therefore, indistinguishable by many existing indentation analyses unless extreme (and often impractical) indenter angles are used. Explicit procedures of deriving these mystical materials are established, and the general characteristics of the mystical materials are discussed. In many cases, for a given indenter angle range, a material would have infinite numbers of mystical siblings, and the existence maps of the mystical materials are also obtained. Furthermore, we propose two alternative techniques to effectively distinguish these mystical materials. The study in this paper addresses the important question of the uniqueness of indentation test, as well as providing useful guidelines to properly use the indentation technique to measure material elastoplastic properties.
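
    For reference, the power-law constitutive relationship referred to above is commonly written as follows (a standard form; the paper's exact parameterization may differ):

```latex
% Standard power-law elastoplastic model often assumed in indentation
% analysis; continuity at the yield point fixes the hardening coefficient R.
\sigma =
\begin{cases}
  E\,\varepsilon, & \varepsilon \le \sigma_y / E \quad \text{(elastic)}\\[4pt]
  R\,\varepsilon^{n}, & \varepsilon > \sigma_y / E \quad \text{(power-law hardening)}
\end{cases}
\qquad
R = \sigma_y \left(\frac{E}{\sigma_y}\right)^{n}
```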

  15. Mission Assurance Modeling and Simulation: A Cyber Security Roadmap

    NASA Technical Reports Server (NTRS)

    Gendron, Gerald; Roberts, David; Poole, Donold; Aquino, Anna

    2012-01-01

    This paper proposes a cyber security modeling and simulation roadmap to enhance mission assurance governance and establish risk reduction processes within constrained budgets. The term mission assurance stems from risk management work by Carnegie Mellon's Software Engineering Institute in the late 1990s. By 2010, the Defense Information Systems Agency had revised its cyber strategy and established the Program Executive Officer-Mission Assurance. This highlights a shift from simply protecting data to balancing risk, and begins a necessary dialogue to establish a cyber security roadmap. The Military Operations Research Society has recommended a cyber community of practice, recognizing that there are too few professionals having both cyber and analytic experience. The authors characterize the limited body of knowledge in this symbiotic relationship. This paper identifies operational and research requirements for mission assurance M&S supporting defense and homeland security. M&S techniques are needed for enterprise oversight of cyber investments, test and evaluation, policy, training, and analysis.

  16. Optimizing Tissue Sampling for the Diagnosis, Subtyping, and Molecular Analysis of Lung Cancer

    PubMed Central

    Ofiara, Linda Marie; Navasakulpong, Asma; Beaudoin, Stephane; Gonzalez, Anne Valerie

    2014-01-01

    Lung cancer has entered the era of personalized therapy with histologic subclassification and the presence of molecular biomarkers becoming increasingly important in therapeutic algorithms. At the same time, biopsy specimens are becoming increasingly smaller as diagnostic algorithms seek to establish diagnosis and stage with the least invasive techniques. Here, we review techniques used in the diagnosis of lung cancer including bronchoscopy, ultrasound-guided bronchoscopy, transthoracic needle biopsy, and thoracoscopy. In addition to discussing indications and complications, we focus our discussion on diagnostic yields and the feasibility of testing for molecular biomarkers such as epidermal growth factor receptor and anaplastic lymphoma kinase, emphasizing the importance of a sufficient tumor biopsy. PMID:25295226

  17. Assessing statistical differences between parameters estimates in Partial Least Squares path modeling.

    PubMed

    Rodríguez-Entrena, Macario; Schuberth, Florian; Gelhard, Carsten

    2018-01-01

    Structural equation modeling using partial least squares (PLS-SEM) has become a mainstream modeling approach in various disciplines. Nevertheless, the prior literature still lacks practical guidance on how to properly test for differences between parameter estimates. Whereas existing techniques such as parametric and non-parametric approaches in PLS multi-group analysis only allow the assessment of differences between parameters estimated for different subpopulations, the study at hand introduces a technique that also makes it possible to assess whether two parameter estimates derived from the same sample are statistically different. To illustrate this advancement to PLS-SEM, we refer in particular to a reduced version of the well-established technology acceptance model.
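
    The underlying idea can be sketched generically: bootstrap the sample, re-estimate both parameters from each resample, and examine the percentile interval of their difference. In this sketch, ordinary regression coefficients stand in for PLS-SEM path coefficients:

```python
# Generic bootstrap sketch for comparing two parameters estimated from
# the SAME sample: resample rows, re-fit, and build a percentile CI for
# the difference. Data and model are invented stand-ins for PLS-SEM.
import numpy as np

rng = np.random.default_rng(8)
n = 200
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
y = 0.5 * x1 + 0.3 * x2 + rng.standard_normal(n)
X = np.column_stack([x1, x2])

diffs = []
for _ in range(2000):
    idx = rng.integers(0, n, n)                       # resample with replacement
    beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    diffs.append(beta[0] - beta[1])

lo, hi = np.percentile(diffs, [2.5, 97.5])
print(f"95% bootstrap CI for (b1 - b2): [{lo:.3f}, {hi:.3f}]")
```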

  18. Modeling of electron time variations in the radiation belts

    NASA Technical Reports Server (NTRS)

    Chan, K. W.; Teague, M. J.; Schofield, N. J.; Vette, J. I.

    1979-01-01

    A review of the temporal variation in the trapped electron population of the inner and outer radiation zones is presented. Techniques presently used for modeling these zones are discussed and their deficiencies identified. An intermediate region is indicated between the zones in which the present modeling techniques are inadequate due to the magnitude and frequency of magnetic storms. Future trends are examined, and it is suggested that modeling of individual magnetic storms may be required in certain L bands. An analysis of seven magnetic storms is presented, establishing the independence of the depletion time of the storm flux and the storm magnitude. Provisional correlation between the storm magnitude and the Dst index is demonstrated.

  19. On the integration of reinforcement learning and approximate reasoning for control

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1991-01-01

    The author discusses the importance of strengthening the knowledge representation characteristic of reinforcement learning techniques using methods such as approximate reasoning. The ARIC (approximate reasoning-based intelligent control) architecture is an example of such a hybrid approach in which the fuzzy control rules are modified (fine-tuned) using reinforcement learning. ARIC also demonstrates that it is possible to start with an approximately correct control knowledge base and learn to refine this knowledge through further experience. On the other hand, techniques such as the TD (temporal difference) algorithm and Q-learning establish stronger theoretical foundations for their use in adaptive control and also in stability analysis of hybrid reinforcement learning and approximate reasoning-based controllers.
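
    Below is a minimal tabular Q-learning sketch showing the temporal-difference update the passage refers to; the two-state environment is invented purely for illustration:

```python
# Minimal tabular Q-learning with an epsilon-greedy policy on a toy
# two-state environment. The TD update is the line marked below.
import numpy as np

n_states, n_actions = 2, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1
rng = np.random.default_rng(9)

def step(s, a):
    # toy dynamics: action 1 in state 0 pays off and moves to state 1
    return (1, 1.0) if (s == 0 and a == 1) else (0, 0.0)

s = 0
for _ in range(5000):
    a = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[s]))
    s_next, r = step(s, a)
    # Q-learning temporal-difference update
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
    s = s_next

print(np.round(Q, 2))
```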

  20. Mixture toxicity revisited from a toxicogenomic perspective.

    PubMed

    Altenburger, Rolf; Scholz, Stefan; Schmitt-Jansen, Mechthild; Busch, Wibke; Escher, Beate I

    2012-03-06

    The advent of new genomic techniques has raised expectations that central questions of mixture toxicology, such as the mechanisms of low-dose interactions, can now be answered. This review provides an overview of experimental studies from the past decade that address diagnostic and/or mechanistic questions regarding the combined effects of chemical mixtures using toxicogenomic techniques. From 2002 to 2011, 41 studies were published with a focus on mixture toxicity assessment. Primarily, multiplexed quantification of gene transcripts was performed, though metabolomic and proteomic analyses of joint exposures have also been undertaken. It is now standard to explicitly state criteria for selecting concentrations and to provide insight into data transformation and statistical treatment with respect to minimizing sources of undue variability. Bioinformatic analysis of toxicogenomic data, by contrast, is still a field with diverse and rapidly evolving tools. The reported combined effect assessments are discussed in the light of established toxicological dose-response and mixture toxicity models. Receptor-based assays seem to be the most advanced toward establishing quantitative relationships between exposure and biological responses. Transcriptomic responses are often discussed based on the presence or absence of signals, where the interpretation may remain ambiguous due to methodological problems. The majority of mixture studies were designed to compare the recorded mixture outcome against responses for individual components only. This stands in stark contrast to our existing understanding of joint biological activity at the levels of chemical target interactions and apical combined effects. By joining established mixture effect models with toxicokinetic and -dynamic thinking, we suggest a conceptual framework that may help to overcome the current limitation of providing mainly anecdotal evidence on mixture effects. To achieve this we suggest (i) designing studies to establish quantitative relationships between dose and time dependency of responses and (ii) adopting mixture toxicity models. Moreover, (iii) utilization of novel bioinformatic tools and (iv) stress response concepts could be productive to translate multiple responses into hypotheses on the relationships between general stress and specific toxicity reactions of organisms.
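
    For reference, the two established mixture models invoked here, concentration addition (CA) and independent action (IA), reduce to simple algebra; the sketch below applies both to two hypothetical chemicals with assumed log-logistic dose-response curves (all parameter values are invented, and the CA shortcut assumes similarly shaped curves):

    ```python
    # Hedged sketch of concentration addition (CA) and independent action (IA)
    # predictions for a two-component mixture with invented parameters.
    import numpy as np

    def effect(c, ec50, slope):
        """Log-logistic dose-response: fraction of maximal effect at dose c."""
        return 1.0 / (1.0 + (ec50 / np.maximum(c, 1e-12)) ** slope)

    ec50 = np.array([1.0, 4.0])     # hypothetical EC50s of the two chemicals
    slope = np.array([2.0, 1.5])
    p = np.array([0.5, 0.5])        # mixture ratio (fractions summing to 1)

    # CA: the mixture EC50 follows the reciprocal-sum rule.
    ec50_mix_ca = 1.0 / np.sum(p / ec50)

    # IA: combined effect is one minus the product of the non-effects.
    c_total = 2.0
    e_components = effect(p * c_total, ec50, slope)
    e_mix_ia = 1.0 - np.prod(1.0 - e_components)

    print(f"CA mixture EC50 ~ {ec50_mix_ca:.2f}")
    print(f"IA effect at total dose {c_total}: {e_mix_ia:.2f}")
    ```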

  1. Teaching audience analysis to the technical student

    NASA Technical Reports Server (NTRS)

    Debs, M. B.; Brillhart, L. V.

    1981-01-01

    Teaching audience analysis, as practiced in a technical writing course for engineering students, is discussed. Audience analysis is described as the task of defining the audience for a particular piece of writing and determining those characteristics of the audience which constrain the writer and affect reception of the message. A mature technical writing style that shows the tension produced when a text is written to be read and understood is considered in terms of audience analysis. Techniques include: (1) conveying to students the concept that a reader with certain expectations exists, (2) team teaching to preserve the context of a given technical discipline, and (3) assigning a technical report that addresses a variety of readers, thus establishing the complexity of audience-oriented writing.

  2. Investigation of historical metal objects using Laser Induced Breakdown Spectroscopy (LIBS) technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdel-Kareem, O.; Ghoneim, M.; Harith, M. A.

    2011-09-22

    Analysis of metal objects is a necessary step in establishing an appropriate conservation treatment for an object or in following up the results of suggested treatments. The main considerations in selecting a method for the investigation and analysis of metal objects are diagnostic power, representative sampling, reproducibility, the destructive nature/invasiveness of the analysis, and access to the appropriate instrument. This study aims at evaluating the usefulness of the Laser Induced Breakdown Spectroscopy (LIBS) technique for the analysis of historical metal objects. In this study, various historical metal objects collected from different museums and excavations in Egypt were investigated using the LIBS technique. To evaluate the usefulness of the suggested analytical protocol, the same metal objects were also investigated by other methods, such as Scanning Electron Microscopy with an energy-dispersive X-ray analyzer (SEM-EDX) and X-ray Diffraction (XRD). This study confirms that LIBS is a very useful technique that can be used safely for investigating historical metal objects. LIBS analysis can quickly provide information on the qualitative and semi-quantitative elemental content of different metal objects and support their characterization and classification. It is a practically non-destructive technique with the critical advantage of being applicable in situ, thereby avoiding sampling and sample preparation. It can be a dependable, satisfactory, and effective method for the low-cost study of archaeological and historical metals. However, it must be taken into consideration that the corrosion of metal leads to material alteration and the possible loss of certain metals in the form of soluble salts; certain corrosion products are known to leach out of the object, and their low content therefore does not necessarily reflect the composition of the metal at the time of manufacture. A further consideration is the heterogeneity of metal alloy objects, which often results from poor mixing of the alloy constituents. Further research is needed to determine the most appropriate and effective approaches and methods for the conservation of these metal objects.

  3. OB glue paste technique for establishing nude mouse human gastric cancer orthotopic transplantation models

    PubMed Central

    Shi, Jun; Wei, Pin-Kang; Zhang, Shen; Qin, Zhi-Feng; Li, Jun; Sun, Da-Zhi; Xiao, Yan; Yu, Zhi-Hong; Lin, Hui-Ming; Zheng, Guo-Jing; Su, Xiao-Mei; Chen, Ya-Lin; Liu, Yan-Fang; Xu, Ling

    2008-01-01

    AIM: To establish nude mouse human gastric cancer orthotopic transplantation models using the OB glue paste technique. METHODS: Using the OB glue paste technique, orthotopic transplantation models were established by implanting SGC-7901 and MKN-45 human gastric cancer cell strains into the gastric wall of nude mice. Biological features, growth of the implanted tumors, the success rate of transplantation and the rate of auto-metastasis of the two models were observed. RESULTS: The success rates of orthotopic transplantation of the two models were 94.20% and 96%. The rates of hepatic metastasis, pulmonary metastasis, peritoneal metastasis, lymphatic metastasis and splenic metastasis were 42.13% and 94.20%, 48.43% and 57.97%, 30.83% and 36.96%, 67.30% and 84.06%, and 59.75% and 10.53%, respectively. The occurrence rates of ascites were 47.80% and 36.96%. CONCLUSION: The OB glue paste technique is easy to follow. The biological behaviors of nude mouse human gastric cancer orthotopic transplantation models established with this technique are similar to the natural processes of growth and metastasis of human gastric cancer; the models can therefore serve as an ideal platform for experimental research on the proliferative metastasis of tumors. PMID:18720543

  4. Safety Analysis and Protection Measures of the Control System of the Pulsed High Magnetic Field Facility in WHMFC

    NASA Astrophysics Data System (ADS)

    Shi, J. T.; Han, X. T.; Xie, J. F.; Yao, L.; Huang, L. T.; Li, L.

    2013-03-01

    A Pulsed High Magnetic Field Facility (PHMFF) has been established at the Wuhan National High Magnetic Field Center (WHMFC), and various protection measures are applied in its control system. In order to improve the reliability and robustness of the control system, a safety analysis of the PHMFF was carried out based on the Fault Tree Analysis (FTA) technique. The function and realization of five protection systems are given: the sequence experiment operation system, safety assistant system, emergency stop system, fault detecting and processing system, and accident isolating protection system. Tests and operation indicate that these measures improve the safety of the facility and ensure the safety of personnel.
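
    To make the FTA step concrete, the sketch below evaluates the top-event probability of a small invented fault tree with independent basic events using standard OR/AND gate algebra (the tree and all probabilities are hypothetical, not the WHMFC analysis):

    ```python
    # Illustrative fault-tree evaluation with independent basic events.
    def or_gate(probs):
        # P(A or B or ...) = 1 - prod(1 - P_i) for independent events.
        out = 1.0
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out

    def and_gate(probs):
        # P(A and B and ...) = prod(P_i) for independent events.
        out = 1.0
        for p in probs:
            out *= p
        return out

    # Hypothetical basic-event probabilities per pulse.
    p_sensor_fail = 1e-3
    p_operator_error = 1e-2
    p_interlock_fail = 1e-4
    p_estop_fail = 1e-3

    # Top event: damage requires losing detection AND both protection layers.
    p_detection_lost = or_gate([p_sensor_fail, p_operator_error])
    p_protection_lost = and_gate([p_interlock_fail, p_estop_fail])
    p_top = and_gate([p_detection_lost, p_protection_lost])
    print(f"top-event probability ~ {p_top:.2e}")
    ```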

  5. Exploring Surface Analysis Techniques for the Detection of Molecular Contaminants on Spacecraft

    NASA Technical Reports Server (NTRS)

    Rutherford, Gugu N.; Seasly, Elaine; Thornblom, Mark; Baughman, James

    2016-01-01

    Molecular contamination is a known area of concern for spacecraft. To mitigate this risk, projects involving space flight hardware set requirements in a contamination control plan that establishes an allocation budget for the exposure of critical surfaces to non-volatile residues (NVR). This work focuses on non-contact surface analysis and in situ monitoring to mitigate molecular contamination on space flight hardware. By using Scanning Electron Microscopy and Energy Dispersive Spectroscopy (SEM-EDS) with Raman Spectroscopy, an unlikely contaminant was identified on space flight hardware. Using traditional and surface analysis methods together provided a broader view of the contamination sources, allowing for best-fit solutions to prevent future exposure.

  6. HDX Workbench: Software for the Analysis of H/D Exchange MS Data

    NASA Astrophysics Data System (ADS)

    Pascal, Bruce D.; Willis, Scooter; Lauer, Janelle L.; Landgraf, Rachelle R.; West, Graham M.; Marciano, David; Novick, Scott; Goswami, Devrishi; Chalmers, Michael J.; Griffin, Patrick R.

    2012-09-01

    Hydrogen/deuterium exchange mass spectrometry (HDX-MS) is an established method for the interrogation of protein conformation and dynamics. While the data analysis challenge of HDX-MS has been addressed by a number of software packages, new computational tools are needed to keep pace with the improved methods and throughput of this technique. To address these needs, we report an integrated desktop program titled HDX Workbench, which facilitates automation, management, visualization, and statistical cross-comparison of large HDX data sets. Using the software, validated data analysis can be achieved at the rate of data generation. The application is available at the project home page http://hdx.florida.scripps.edu.

  7. Meta-analyses are no substitute for registered replications: a skeptical perspective on religious priming

    PubMed Central

    van Elk, Michiel; Matzke, Dora; Gronau, Quentin F.; Guan, Maime; Vandekerckhove, Joachim; Wagenmakers, Eric-Jan

    2015-01-01

    According to a recent meta-analysis, religious priming has a positive effect on prosocial behavior (Shariff et al., 2015). We first argue that this meta-analysis suffers from a number of methodological shortcomings that limit the conclusions that can be drawn about the potential benefits of religious priming. Next we present a re-analysis of the religious priming data using two different meta-analytic techniques. A Precision-Effect Testing–Precision-Effect-Estimate with Standard Error (PET-PEESE) meta-analysis suggests that the effect of religious priming is driven solely by publication bias. In contrast, an analysis using Bayesian bias correction suggests the presence of a religious priming effect, even after controlling for publication bias. These contradictory statistical results demonstrate that meta-analytic techniques alone may not be sufficiently robust to firmly establish the presence or absence of an effect. We argue that a conclusive resolution of the debate about the effect of religious priming on prosocial behavior – and about theoretically disputed effects more generally – requires a large-scale, preregistered replication project, which we consider to be the sole remedy for the adverse effects of experimenter bias and publication bias. PMID:26441741
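
    As a sketch of the PET-PEESE logic on synthetic data (not the religious-priming studies): effect sizes are meta-regressed on their standard errors (PET) or variances (PEESE) with inverse-variance weights, and the intercept, the effect predicted at zero sampling error, serves as the bias-corrected estimate:

    ```python
    # Hedged PET-PEESE sketch; the data below encode a pure small-study-bias
    # pattern (true effect zero), so the PET intercept should sit near 0.
    import numpy as np

    rng = np.random.default_rng(42)
    k = 40
    se = rng.uniform(0.05, 0.4, k)              # per-study standard errors
    effects = 1.2 * se + rng.normal(0.0, se)    # bias-only pattern

    def wls_intercept(x, y, w):
        """Weighted least squares of y on x; returns (intercept, slope)."""
        X = np.column_stack([np.ones_like(x), x])
        return np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))

    w = 1.0 / se**2
    pet = wls_intercept(se, effects, w)         # PET: regress on SE
    peese = wls_intercept(se**2, effects, w)    # PEESE: regress on variance
    print(f"PET intercept:   {pet[0]:+.3f}")    # near 0 under bias-only data
    print(f"PEESE intercept: {peese[0]:+.3f}")
    ```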

  8. 3D Image Analysis of Geomaterials using Confocal Microscopy

    NASA Astrophysics Data System (ADS)

    Mulukutla, G.; Proussevitch, A.; Sahagian, D.

    2009-05-01

    Confocal microscopy is one of the most significant advances in optical microscopy of the last century. It is widely used in the biological sciences, but its application to geomaterials has lagged due to a number of technical problems. Potentially, the technique can perform non-invasive testing on a laser-illuminated sample that fluoresces, using a unique optical sectioning capability that rejects out-of-focus light reaching the confocal aperture. Fluorescence in geomaterials is commonly induced using epoxy doped with a fluorochrome that is impregnated into the sample to enable discrimination of various features such as void space or material boundaries. However, for many geomaterials this method cannot be used, because they do not naturally fluoresce and because epoxy cannot be impregnated into inaccessible parts of the sample due to lack of permeability. As a result, confocal images of most geomaterials that have not undergone extensive sample preparation are of poor quality and lack the image and edge contrast necessary to apply commonly used segmentation techniques for quantitative study of features such as vesicularity, internal structure, etc. In our present work, we are developing a methodology for quantitative 3D analysis of confocal images of geomaterials with minimal prior sample preparation and no added fluorescence. Two sample geomaterials, a volcanic melt sample and a crystal chip containing fluid inclusions, are used to assess the feasibility of the method. A step-by-step process of image analysis includes image filtration to enhance edges or material interfaces and is based on two segmentation techniques: geodesic active contours and region competition. Both techniques have been applied extensively to the analysis of medical MRI images to segment anatomical structures. Preliminary analysis suggests that there is distortion in the shapes of the segmented vesicles, vapor bubbles, and void spaces due to the optical measurements, so corrective actions are being explored. This will establish a practical and reliable framework for an adaptive 3D image processing technique for the analysis of geomaterials using confocal microscopy.
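
    Of the two segmentation approaches named, geodesic active contours are readily available in open-source tooling; a minimal sketch using scikit-image's morphological variant on a synthetic "vesicle" image follows (keyword names vary slightly across scikit-image versions, hence the positional iteration count):

    ```python
    # Hedged sketch: morphological geodesic active contours on synthetic data,
    # standing in for the confocal imagery described above.
    import numpy as np
    from skimage.segmentation import (inverse_gaussian_gradient,
                                      morphological_geodesic_active_contour)

    # Synthetic image: a bright circular "vesicle" on a noisy background.
    yy, xx = np.mgrid[0:128, 0:128]
    image = ((xx - 64)**2 + (yy - 64)**2 < 30**2).astype(float)
    image += np.random.default_rng(0).normal(0, 0.2, image.shape)

    # Edge-stopping map: small values near strong edges halt the contour.
    gimage = inverse_gaussian_gradient(image)

    # Initial level set: a large disk that shrinks onto the vesicle edge.
    init = (((xx - 64)**2 + (yy - 64)**2) < 50**2).astype(np.int8)

    mask = morphological_geodesic_active_contour(gimage, 200,
                                                 init_level_set=init,
                                                 smoothing=1, balloon=-1)
    print("segmented area (pixels):", int(mask.sum()))
    ```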

  9. A new methodology based on functional principal component analysis to study postural stability post-stroke.

    PubMed

    Sánchez-Sánchez, M Luz; Belda-Lois, Juan-Manuel; Mena-Del Horno, Silvia; Viosca-Herrero, Enrique; Igual-Camacho, Celedonia; Gisbert-Morant, Beatriz

    2018-05-05

    A major goal in stroke rehabilitation is the establishment of more effective physical therapy techniques to recover postural stability. Functional Principal Component Analysis provides greater insight into recovery trends. However, when missing values exist, obtaining functional data presents some difficulties. The purpose of this study was to reveal an alternative technique for obtaining the Functional Principal Components without requiring the conversion to functional data beforehand and to investigate this methodology to determine the effect of specific physical therapy techniques in balance recovery trends in elderly subjects with hemiplegia post-stroke. A randomized controlled pilot trial was developed. Thirty inpatients post-stroke were included. Control and target groups were treated with the same conventional physical therapy protocol based on functional criteria, but specific techniques were added to the target group depending on the subjects' functional level. Postural stability during standing was quantified by posturography. The assessments were performed once a month from the moment the participants were able to stand up to six months post-stroke. The target group showed a significant improvement in postural control recovery trend six months after stroke that was not present in the control group. Some of the assessed parameters revealed significant differences between treatment groups (P < 0.05). The proposed methodology allows Functional Principal Component Analysis to be performed when data is scarce. Moreover, it allowed the dynamics of recovery of two different treatment groups to be determined, showing that the techniques added in the target group increased postural stability compared to the base protocol. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Accuracy of selected techniques for estimating ice-affected streamflow

    USGS Publications Warehouse

    Walker, John F.

    1991-01-01

    This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories - subjective and analytical - depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques are used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers have independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.

  11. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  12. Study on rapid valid acidity evaluation of apple by fiber optic diffuse reflectance technique

    NASA Astrophysics Data System (ADS)

    Liu, Yande; Ying, Yibin; Fu, Xiaping; Jiang, Xuesong

    2004-03-01

    Some issues related to nondestructive evaluation of valid acidity in intact apples by means of the Fourier transform near infrared (FT-NIR, 800-2631 nm) method were addressed. A relationship was established between the diffuse reflectance spectra recorded with a bifurcated optic fiber and the valid acidity. The data were analyzed by multivariate calibration methods such as partial least squares (PLS) analysis and the principal component regression (PCR) technique. A total of 120 Fuji apples were tested, and 80 of them were used to form a calibration data set. The influence of data preprocessing and different spectral treatments was also investigated. Models based on smoothed spectra were slightly worse than models based on derivative spectra, and the best result was obtained with a segment length of 5 and a gap size of 10. Depending on data preprocessing and the multivariate calibration technique, the best prediction model, obtained by PLS analysis, had a correlation coefficient of 0.871, a low RMSEP (0.0677), a low RMSEC (0.056), and a small difference between RMSEP and RMSEC. The results point out the feasibility of FT-NIR spectral analysis for predicting fruit valid acidity non-destructively. The ratio of the data standard deviation to the root mean square error of prediction (SDR) should preferably exceed 3 for calibration models; the models here remained below that level, so the results cannot yet meet the demands of practical application, and further study is required for better calibration and prediction.
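
    The multivariate calibration step maps directly onto standard chemometrics tooling; a hedged sketch with synthetic "spectra" (not the apple data) using Savitzky-Golay derivative preprocessing followed by PLS regression:

    ```python
    # Hedged chemometrics sketch: derivative preprocessing + PLS calibration.
    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_samples, n_wavelengths = 120, 200
    spectra = rng.normal(size=(n_samples, n_wavelengths)).cumsum(axis=1)
    acidity = spectra[:, 50] * 0.01 + rng.normal(0, 0.05, n_samples)

    # First-derivative Savitzky-Golay preprocessing (5-point window).
    d_spectra = savgol_filter(spectra, window_length=5, polyorder=2,
                              deriv=1, axis=1)

    X_train, X_test, y_train, y_test = train_test_split(
        d_spectra, acidity, test_size=40, random_state=1)
    pls = PLSRegression(n_components=8).fit(X_train, y_train)
    pred = pls.predict(X_test).ravel()
    rmsep = np.sqrt(np.mean((pred - y_test) ** 2))
    print(f"R = {np.corrcoef(pred, y_test)[0, 1]:.3f}, RMSEP = {rmsep:.4f}")
    ```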

  13. Metabolic changes associated with papillary thyroid carcinoma: A nuclear magnetic resonance-based metabolomics study.

    PubMed

    Li, Yanyun; Chen, Minjian; Liu, Cuiping; Xia, Yankai; Xu, Bo; Hu, Yanhui; Chen, Ting; Shen, Meiping; Tang, Wei

    2018-05-01

    Papillary thyroid carcinoma (PTC) is the most common thyroid cancer. The nuclear magnetic resonance (NMR)-based metabolomic technique is the gold standard in metabolite structural elucidation and can provide different coverage of information compared with other metabolomic techniques. Here, we conducted the first NMR-based metabolomics study of the detailed metabolic changes, especially metabolic pathway changes, related to PTC pathogenesis. The 1H NMR-based metabolomic technique was adopted in conjunction with multivariate analysis to analyze matched tumor and normal thyroid tissues obtained from 16 patients. The results were further annotated with the Kyoto Encyclopedia of Genes and Genomes (KEGG) and the Human Metabolome Database, and then analyzed using the pathway analysis and enrichment analysis modules of MetaboAnalyst 3.0. Based on these analytical techniques, we established models of principal component analysis (PCA), partial least squares-discriminant analysis (PLS-DA), and orthogonal partial least-squares discriminant analysis (OPLS-DA) which could discriminate PTC from normal thyroid tissue, and found 15 robust differentiated metabolites from two OPLS-DA models. We identified 8 KEGG pathways and 3 pathways of the Small Molecule Pathway Database that were significantly related to PTC by using pathway analysis and enrichment analysis, respectively, through which we identified metabolisms related to PTC including branched-chain amino acid metabolism (leucine and valine), other amino acid metabolism (glycine and taurine), glycolysis (lactate), the tricarboxylic acid cycle (citrate), choline metabolism (choline, ethanolamine and glycerophosphocholine) and lipid metabolism (very-low-density lipoprotein and low-density lipoprotein). In conclusion, PTC was characterized by increased glycolysis and an inhibited tricarboxylic acid cycle, increased oncogenic amino acids, and abnormal choline and lipid metabolism. The findings in this study provide new insights into the detailed metabolic changes of PTC and hold great potential for the treatment of PTC.

  14. 100 Most Influential Publications in Scoliosis Surgery.

    PubMed

    Zhou, James Jun; Koltz, Michael T; Agarwal, Nitin; Tempel, Zachary J; Kanter, Adam S; Okonkwo, David O; Hamilton, D Kojo

    2017-03-01

    Bibliometric analysis. To apply the established technique of citation analysis to identify the 100 most influential articles in scoliosis surgery research published between 1900 and 2015. Previous studies have applied the technique of citation analysis to other areas of study. This is the first article to apply this technique to the field of scoliosis surgery. A two-step search of the Thomson Reuters Web of Science was conducted to identify all articles relevant to the field of scoliosis surgery. The top 100 articles with the most citations were identified based on analysis of titles and abstracts. Further statistical analysis was conducted to determine whether measures of author reputation and overall publication influence affected the rate at which publications were recognized and incorporated by other researchers in the field. Total citations for the final 100 publications included in the list ranged from 82 to 509. The period for publication ranged from 1954 to 2010. Most studies were published in the journal Spine (n = 63). The most frequently published topics of study were surgical techniques (n = 35) and outcomes (n = 35). Measures of author reputation (number of total studies in the top 100, number of first-author studies in the top 100) were found to have no effect on the rate at which studies were adopted by other researchers (number of years until first citation, and number of years until maximum citations). The number of citations/year a publication received was found to be negatively correlated with the rate at which it was adopted by other researchers, indicating that more influential manuscripts attained more rapid recognition by the scientific community at large. In assembling this publication, we have strived to identify and recognize the 100 most influential articles in scoliosis surgery research from 1900 to 2015. N/A.

  15. MIBPB: a software package for electrostatic analysis.

    PubMed

    Chen, Duan; Chen, Zhan; Chen, Changjun; Geng, Weihua; Wei, Guo-Wei

    2011-03-01

    The Poisson-Boltzmann equation (PBE) is an established model for the electrostatic analysis of biomolecules. The development of advanced computational techniques for the solution of the PBE has been an important topic in the past two decades. This article presents a matched interface and boundary (MIB)-based PBE software package, the MIBPB solver, for electrostatic analysis. The MIBPB has a unique feature in that it is the first interface technique-based PBE solver that rigorously enforces the solution and flux continuity conditions at the dielectric interface between the biomolecule and the solvent. For protein molecular surfaces, which may possess troublesome geometrical singularities, the MIB scheme makes the MIBPB by far the only existing PBE solver that is able to deliver second-order convergence, that is, the accuracy increases four times when the mesh size is halved. The MIBPB method is also equipped with a Dirichlet-to-Neumann mapping technique that builds a Green's function approach to analytically resolve the singular charge distribution in biomolecules in order to obtain reliable solutions at meshes as coarse as 1 Å, whereas it usually takes other traditional PB solvers 0.25 Å to reach a similar level of reliability. This work further accelerates the rate of convergence of linear equation systems resulting from the MIBPB by using Krylov subspace (KS) techniques. Condition numbers of the MIBPB matrices are significantly reduced by using appropriate KS solver and preconditioner combinations. Both linear and nonlinear PBE solvers in the MIBPB package are tested by protein-solvent solvation energy calculations and analysis of salt effects on protein-protein binding energies, respectively. Copyright © 2010 Wiley Periodicals, Inc.

  16. Comparison of intraoral scanning and conventional impression techniques using 3-dimensional superimposition.

    PubMed

    Rhee, Ye-Kyu; Huh, Yoon-Hyuk; Cho, Lee-Ra; Park, Chan-Jin

    2015-12-01

    The aim of this study was to evaluate the appropriate impression technique by analyzing the superimposition of 3D digital models, comparing the accuracy of conventional impression techniques and digital impressions. Twenty-four patients who had no periodontitis or temporomandibular joint disease were selected for analysis. As the reference model, digital impressions were taken with a digital impression system. As test models, conventional dual-arch and full-arch impression techniques using addition-type polyvinylsiloxane were applied to fabricate casts, which were then scanned with a 3D laser scanner. The 25 STL datasets were grouped into three pairwise comparisons and imported into the inspection software. The three-dimensional differences were illustrated in a color-coded map. For three-dimensional quantitative analysis, 4 specified contact locations (buccal and lingual cusps of the second premolar and molar) were established. For two-dimensional quantitative analysis, sections from the buccal cusp to the lingual cusp of the second premolar and molar were acquired along the tooth axis. In the color-coded map, the biggest difference was seen between intraoral scanning and the dual-arch impression (P<.05). In three-dimensional analysis, the biggest difference was seen between intraoral scanning and the dual-arch impression, and the smallest difference between the dual-arch and full-arch impressions. The two- and three-dimensional deviations between the intraoral scanner and the dual-arch impression were bigger than those between the full-arch and dual-arch impressions (P<.05). The second premolar showed bigger three-dimensional deviations than the second molar (P>.05).

  17. Inverse analysis of aerodynamic loads from strain information using structural models and neural networks

    NASA Astrophysics Data System (ADS)

    Wada, Daichi; Sugimoto, Yohei

    2017-04-01

    Aerodynamic loads on aircraft wings are one of the key parameters to be monitored for reliable and effective aircraft operations and management. Flight data on the aerodynamic loads would be used onboard to control the aircraft, and accumulated data would be used for condition-based maintenance and as feedback for fatigue and critical-load modeling. Effective sensing techniques such as fiber optic distributed sensing have been developed and have demonstrated promising capability for monitoring structural responses, i.e., strains on the surface of aircraft wings. Using these techniques, load identification methods for structural health monitoring are expected to be established. The typical inverse analysis for load identification from strains calculates the loads in a discrete form of concentrated forces; however, the distributed form of the loads is essential for accurate and reliable estimation of the critical stress at structural parts. In this study, we demonstrate an inverse analysis to identify distributed loads from measured strain information. The introduced inverse analysis technique calculates aerodynamic loads not in a discrete but in a distributed manner based on a finite element model. In order to verify the technique through numerical simulations, we apply static aerodynamic loads to a flat panel model and conduct the inverse identification of the load distributions. We take two approaches to build the inverse system between loads and strains: the first uses structural models and the second uses neural networks. We compare the performance of the two approaches and discuss the effect of the amount of strain sensing information.
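
    The core of such an inverse analysis can be sketched as a regularized least-squares problem; everything below (the influence matrix S, problem sizes, noise level) is invented for illustration rather than taken from the paper:

    ```python
    # Hedged sketch: recover a distributed load vector from noisy strain
    # measurements via Tikhonov/ridge-regularized least squares, which tames
    # the ill-conditioning typical of inverse load identification.
    import numpy as np

    rng = np.random.default_rng(3)
    n_sensors, n_loads = 30, 10
    S = rng.normal(size=(n_sensors, n_loads))   # stand-in strain-per-load matrix
    true_loads = np.sin(np.linspace(0, np.pi, n_loads))   # smooth distribution
    strains = S @ true_loads + rng.normal(0, 0.01, n_sensors)  # noisy data

    lam = 1e-2   # regularization weight (tuned by L-curve or cross-validation)
    loads_hat = np.linalg.solve(S.T @ S + lam * np.eye(n_loads), S.T @ strains)
    print("max abs error:", np.abs(loads_hat - true_loads).max())
    ```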

  18. MIBPB: A software package for electrostatic analysis

    PubMed Central

    Chen, Duan; Chen, Zhan; Chen, Changjun; Geng, Weihua; Wei, Guo-Wei

    2010-01-01

    The Poisson-Boltzmann equation (PBE) is an established model for the electrostatic analysis of biomolecules. The development of advanced computational techniques for the solution of the PBE has been an important topic in the past two decades. This paper presents a matched interface and boundary (MIB) based PBE software package, the MIBPB solver, for electrostatic analysis. The MIBPB has a unique feature that it is the first interface technique based PBE solver that rigorously enforces the solution and flux continuity conditions at the dielectric interface between the biomolecule and the solvent. For protein molecular surfaces which may possess troublesome geometrical singularities, the MIB scheme makes the MIBPB by far the only existing PBE solver that is able to deliver the second order convergence, i.e., the accuracy increases four times when the mesh size is halved. The MIBPB method is also equipped with a Dirichlet-to-Neumann mapping (DNM) technique, that builds a Green's function approach to analytically resolve the singular charge distribution in biomolecules in order to obtain reliable solutions at meshes as coarse as 1Å — while it usually takes other traditional PB solvers 0.25Å to reach similar level of reliability. The present work further accelerates the rate of convergence of linear equation systems resulting from the MIBPB by utilizing the Krylov subspace (KS) techniques. Condition numbers of the MIBPB matrices are significantly reduced by using appropriate Krylov subspace solver and preconditioner combinations. Both linear and nonlinear PBE solvers in the MIBPB package are tested by protein-solvent solvation energy calculations and analysis of salt effects on protein-protein binding energies, respectively. PMID:20845420

  19. Comparison of intraoral scanning and conventional impression techniques using 3-dimensional superimposition

    PubMed Central

    Rhee, Ye-Kyu

    2015-01-01

    PURPOSE The aim of this study was to evaluate the appropriate impression technique by analyzing the superimposition of 3D digital models, comparing the accuracy of conventional impression techniques and digital impressions. MATERIALS AND METHODS Twenty-four patients who had no periodontitis or temporomandibular joint disease were selected for analysis. As the reference model, digital impressions were taken with a digital impression system. As test models, conventional dual-arch and full-arch impression techniques using addition-type polyvinylsiloxane were applied to fabricate casts, which were then scanned with a 3D laser scanner. The 25 STL datasets were grouped into three pairwise comparisons and imported into the inspection software. The three-dimensional differences were illustrated in a color-coded map. For three-dimensional quantitative analysis, 4 specified contact locations (buccal and lingual cusps of the second premolar and molar) were established. For two-dimensional quantitative analysis, sections from the buccal cusp to the lingual cusp of the second premolar and molar were acquired along the tooth axis. RESULTS In the color-coded map, the biggest difference was seen between intraoral scanning and the dual-arch impression (P<.05). In three-dimensional analysis, the biggest difference was seen between intraoral scanning and the dual-arch impression, and the smallest difference between the dual-arch and full-arch impressions. CONCLUSION The two- and three-dimensional deviations between the intraoral scanner and the dual-arch impression were bigger than those between the full-arch and dual-arch impressions (P<.05). The second premolar showed bigger three-dimensional deviations than the second molar (P>.05). PMID:26816576

  20. Emplacement of Basaltic Lava Flows: the Legacy of GPL Walker

    NASA Astrophysics Data System (ADS)

    Cashman, K. V.

    2005-12-01

    Through his early field measurements of lava flow morphology, G.P.L. Walker established a framework for examination of the dynamics of lava flow emplacement that is still in place today. I will examine this legacy as established by three early papers: (1) his 1967 paper, where he defined a relationship between the thickness of recent Etna lava flows and the slope over which they flowed, a relationship that he ascribed to lava viscosity; (2) his 1971 paper, which defined a relationship between lava flux and the formation of simple and compound flow units that he used to infer high effusion rates for the emplacement of some flood basalt lavas; and (3) his often-cited 1973 paper, which related the length of lava flows to their average effusion rate. These three papers, all similar in their basic approach of using field measurements of lava flow morphology to extract fundamental relationships between eruption conditions (magma flux and rheology) and emplacement style (flow length and thickness), firmly established the relationship between flow morphology and emplacement dynamics that has since been widely applied not only to subaerial lava flows, but also to the interpretation of flows in submarine and planetary environments. Important extensions of these concepts have been provided by improved field observation methods, particularly for analysis of flowing lava, by laboratory measurements of lava rheology, by the application of analog experiments to lava flow dynamics, and by steady improvement of numerical techniques to model the flow of lava over complex terrain. The real legacy of G.P.L. Walker's field measurement approach, however, may lie in the future, as new topographic measurement techniques such as LIDAR hold exciting promise for truly quantitative analysis of lava flow morphologies and their relationship to flow dynamics.

  1. Multi-temporal change image inference towards false alarms reduction for an operational photogrammetric rockfall detection system

    NASA Astrophysics Data System (ADS)

    Partsinevelos, Panagiotis; Kallimani, Christina; Tripolitsiotis, Achilleas

    2015-06-01

    Rockfall incidents affect civil security and hamper the sustainable growth of hard-to-access mountainous areas due to casualties, injuries and infrastructure loss. Rockfall occurrences cannot be easily prevented, and previous studies of multi-sensor rockfall early-detection systems have focused on large-scale incidents. However, even a single rock may cause the loss of a human life along transportation routes; it is therefore highly important to establish methods for the early detection of small-scale rockfall incidents. Terrestrial photogrammetric techniques are prone to a series of errors leading to false alarms, including vegetation, wind, and irrelevant change in the scene under consideration. In this study, photogrammetric monitoring of rockfall-prone slopes is established, and the resulting multi-temporal change imagery is processed in order to minimize false alarms. Remote sensing image analysis techniques are integrated to enhance early detection of a rockfall. Experimental data demonstrated that an operational system able to identify a 10-cm rock movement within a 10% false alarm rate is technically feasible.
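
    A minimal sketch of the change-image filtering idea on synthetic frames (thresholds and sizes invented): difference two co-registered images, threshold, then apply a morphological opening so that single-pixel flicker (vegetation, noise) is dropped while a contiguous rock-sized change survives:

    ```python
    # Hedged change-detection sketch with false-alarm suppression.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(7)
    before = rng.normal(0.5, 0.05, (256, 256))
    after = before.copy()
    after[100:110, 120:130] += 0.5             # synthetic 10-px "rock" change
    after[50, 50] += 0.4                       # single-pixel flicker (false alarm)
    after[200, 30] += 0.4                      # another isolated spike
    after += rng.normal(0, 0.02, after.shape)  # frame-to-frame noise

    diff = np.abs(after - before)
    candidate = diff > 0.25                    # raw change mask
    cleaned = ndimage.binary_opening(candidate, structure=np.ones((3, 3)))

    labels, n = ndimage.label(cleaned)
    print(f"{n} change region(s) kept after false-alarm filtering")
    ```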

  2. Job-related stress in psychiatric nurses in Japan caring for elderly patients with dementia.

    PubMed

    Yada, Hironori; Abe, Hiroshi; Lu, Xi; Wakizaki, Yuko; Omori, Hisamitsu; Matsuo, Hisae; Ishida, Yasushi; Katoh, Takahiko

    2014-11-01

    We investigated the specificity and structures of job-related stress in psychiatric dementia nurses (PDNs) caring for elderly patients with serious behavioral and psychological symptoms of dementia who required substantial assistance with activities of daily living, in order to obtain fundamental knowledge toward providing mental health care for these nurses. Subjects were 244 nurses [63 PDNs and 181 other psychiatric nurses (OPNs)]. Analysis of covariance to examine the specificity of job-related stress in PDNs revealed physical workload and work environment to be more significant stressors, and irritability and anxiety to be more significant stress reactions, in PDNs than in OPNs. An examination of PDNs' job-related stress structures, established in a structural equation model with the two stress reactions confirmed as specific outcomes for PDNs, revealed a significant positive influence of work environment on irritability; utilization of techniques for anxiety and physical workload influenced both stress reactions. Our findings highlight the importance of reducing physical workload, improving the work environment, and establishing a structure for nursing techniques in psychiatric dementia wards to improve the mental health of PDNs.

  3. Hyphenated analytical techniques for materials characterisation

    NASA Astrophysics Data System (ADS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectrometry and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that the combination of complementary techniques for chemical analysis was among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, used to examine physical properties (mechanical, electrical and thermal) in addition to variations in composition, rather than methods intended solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the practical issues that arise in combining different techniques. We will consider how the complementary and varied information obtained by combining these techniques may be interpreted together to better understand the sample in greater detail than was possible before, and also how combining different techniques can simplify sample preparation and ensure reliable comparisons are made between multiple analyses on the same samples, a topic of particular importance as nanoscale technologies become more prevalent in applied and industrial research and development (R&D). The review will conclude with a brief outline of the emerging state of the art in the research laboratory, and a suggested approach to using hyphenated techniques, whether in the teaching, quality control or R&D laboratory.

  4. Environmental epigenomics: Current approaches to assess epigenetic effects of endocrine disrupting compounds (EDCs) on human health.

    PubMed

    Tapia-Orozco, Natalia; Santiago-Toledo, Gerardo; Barrón, Valeria; Espinosa-García, Ana María; García-García, José Antonio; García-Arrazola, Roeb

    2017-04-01

    Environmental Epigenomics is a developing field that studies the epigenetic effects on human health of exposure to environmental factors. Endocrine disrupting chemicals have been detected primarily in pharmaceutical drugs, personal care products, food additives, and food containers. Exposure to endocrine-disrupting chemicals (EDCs) has been associated with a high incidence and prevalence of many endocrine-related disorders in humans. Nevertheless, further evidence is needed to establish a correlation between exposure to EDCs and human disorders. Conventional detection of EDCs is based on analysis of the chemical structure and concentration in samples. However, substantial evidence has emerged suggesting that cell exposure to EDCs leads to epigenetic changes, independently of chemical structure, with non-monotonic low-dose responses. Consequently, a paradigm shift in the toxicology assessment of EDCs is proposed, based on a comprehensive review of the analytical techniques used to evaluate epigenetic effects. Fundamental insights reported elsewhere are compared in order to establish DNA methylation analysis as a viable method for assessing endocrine disruptors beyond the conventional study approach of chemical structure and concentration analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Craniofacial Anomalies And Biostereometrics

    NASA Astrophysics Data System (ADS)

    Christiansen, Richard L.

    1980-07-01

    Man's oral-facial structures are vital for the functions of breathing, mastication, swallowing, vision, and communication. When defective development of these tissues occurs, function becomes impaired and the anatomic features of the afflicted individual will frequently deviate from the norm. This error of form and function classifies the individual as physically and psychosocially handicapped. The successful habilitation regimen of the handicapped person depends on accurate analysis of both the craniofacial anatomy and physiology of these individuals, as well as the psychological implications of the disfigurement. Biostereometrics can contribute to the establishment of operationally valid measures for assessing the severity of the handicapping conditions. The heterogeneous nature of diverse disfigurements suggests that an improved classification of malformations would be beneficial. Three-dimensional analysis may also have a significant influence on the accuracy of diagnosis and the establishment of a biologically sound treatment plan. Biostereometrics will contribute more fully if three-dimensional surface analysis can be coordinated with a study of (1) the underlying skeletal structures, and (2) the operational musculature. Increased communication between stereometric experts and biological scientists should accelerate the application of this technique to this health problem.

  6. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process-performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap-analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
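
    A hedged sketch of the kind of Monte Carlo baseline-and-sensitivity model described (driver names, weights, and distributions are invented; they are not the official ACSI specification):

    ```python
    # Hedged Monte Carlo sketch: propagate uncertainty in hypothetical process
    # drivers through a weighted score to baseline a satisfaction-style index.
    import numpy as np

    rng = np.random.default_rng(11)
    n = 100_000

    # Hypothetical driver scores (0-100) with assumed means and spreads.
    quality = rng.normal(82, 5, n)
    expectations = rng.normal(75, 7, n)
    value = rng.normal(78, 6, n)
    weights = np.array([0.5, 0.2, 0.3])   # assumed, not the official ACSI weights

    index = weights @ np.vstack([quality, expectations, value])
    print(f"baseline index: {index.mean():.1f}, 90% interval "
          f"[{np.percentile(index, 5):.1f}, {np.percentile(index, 95):.1f}]")

    # One-at-a-time sensitivity: a +1-point shift in a driver mean moves the
    # expected index by that driver's weight (linear model).
    for name, row in [("quality", 0), ("expectations", 1), ("value", 2)]:
        print(f"+1 in {name} -> +{weights[row]:.2f} index points")
    ```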

  7. Cyclic Symmetry Finite Element Forced Response Analysis of a Distortion-Tolerant Fan with Boundary Layer Ingestion

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Reddy, T. S. R.; Bakhle, M. A.; Coroneos, R. M.; Stefko, G. L.; Provenza, A. J.; Duffy, K. P.

    2018-01-01

    Accurate prediction of blade vibration stress is required to determine the overall durability of a fan blade design under Boundary Layer Ingestion (BLI) distorted flow environments. Traditional single-blade modeling techniques cannot accurately model the entire rotor blade system subject to the complex dynamic loading behaviors and vibrations of distorted flow conditions. A particular objective of our work was to develop a high-fidelity full-rotor aeromechanics analysis capability for a system subjected to distorted inlet flow by applying a cyclic symmetry finite element modeling methodology. This reduction method allows computationally efficient analysis using a small periodic section of the full rotor blade system. Experimental testing in the 8-foot by 6-foot Supersonic Wind Tunnel facility at NASA Glenn Research Center was also carried out for the system designated as the Boundary Layer Ingesting Inlet/Distortion-Tolerant Fan (BLI2DTF) technology development. The results obtained from the present numerical modeling technique were evaluated against those of the wind tunnel experiment, toward establishing a computationally efficient aeromechanics analysis tool for full rotor blade systems subjected to distorted inlet flow conditions. Fairly good correlations were achieved; hence, our computational modeling techniques were fully demonstrated. The analysis also showed that the safety margin set in the BLI2DTF fan blade design was sufficient across the operating speed range.

  8. Whole-body impedance--what does it measure?

    PubMed

    Foster, K R; Lukaski, H C

    1996-09-01

    Although the bioelectrical impedance technique is widely used in human nutrition and clinical research, an integrated summary of the biophysical and bioelectrical bases of this approach is lacking. We summarize the pertinent electrical phenomena relevant to the application of the impedance technique in vivo and discuss the relations between electrical measurements and biological conductor volumes. Key terms in the derivation of bioelectrical impedance analysis are described and the relation between the electrical properties of tissues and tissue structure is discussed. The relation between the impedance of an object and its geometry, scale, and intrinsic electrical properties is also discussed. Correlations between whole-body impedance measurements and various bioconductor volumes, such as total body water and fat-free mass, are experimentally well established; however, the reason for the success of the impedance technique is much less clear. The bioengineering basis for the technique is critically presented and considerations are proposed that might help to clarify the method and potentially improve its sensitivity.
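
    The relation underlying the technique treats the body as a uniform cylindrical conductor of length L and impedance Z, giving a volume V = rho * L^2 / Z; a toy numeric illustration with assumed, non-clinical values:

    ```python
    # Hedged numeric illustration of the standard BIA volume relation
    # V = rho * L**2 / Z for a uniform cylindrical conductor.
    rho = 4.0   # assumed specific resistivity, ohm*m (illustrative only)
    L = 1.75    # conductor length ~ body height, m
    Z = 500.0   # measured whole-body impedance, ohm

    V = rho * L**2 / Z                                       # volume in m^3
    print(f"estimated conductor volume: {V * 1000:.1f} L")   # ~24.5 L, a TBW-like scale
    ```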

  9. Effect of land-applied biosolids on surface-water nutrient yields and groundwater quality in Orange County, North Carolina

    USGS Publications Warehouse

    Wagner, Chad R.; Fitzgerald, Sharon A.; McSwain, Kristen Bukowski; Harden, Stephen L.; Gurley, Laura N.; Rogers, Shane W.

    2015-01-01

    The data, analysis, and conclusions associated with this study can be used by regulatory agencies, resource managers, and wastewater-treatment operators to (1) better understand the quantity and characteristics of nutrients, bacteria, metals, and contaminants of emerging concern that are transported away from biosolids land-application fields to surface water and groundwater under current regulations for the purposes of establishing effective total maximum daily loads (TMDLs) and restoring impaired water resources, (2) assess how well existing regulations protect waters of the State and potentially recommend effective changes to regulations or land-application procedures, and (3) establish a framework for developing guidance on effective techniques for monitoring and regulatory enforcement of permitted biosolids land-application fields.

  10. Bounded extremum seeking with discontinuous dithers

    DOE PAGES

    Scheinker, Alexander; Scheinker, David

    2016-03-21

    The analysis of discontinuous extremum seeking (ES) controllers, e.g. those applicable to digital systems, has historically been more complicated than that of continuous controllers. We establish a simple and general extension of a recently developed bounded form of ES to a general class of oscillatory functions, including functions discontinuous with respect to time, such as triangle or square waves with dead time. We establish our main results by combining a novel idea for oscillatory control with an extension of functional analytic techniques originally utilized by Kurzweil, Jarnik, Sussmann, and Liu in the late 80s and early 90s and recently studied by Durr et al. Lastly, we demonstrate the value of the result with an application to inverter switching control.
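
    A minimal numeric sketch of a bounded ES update of the form xdot = sqrt(a*w) * cos(w*t + k*V(x)), which on average descends the cost V without measuring its gradient (toy quadratic cost; a plain cosine dither stands in for the square/triangle dithers the paper treats):

    ```python
    # Hedged bounded extremum-seeking sketch on a toy quadratic cost.
    import math

    V = lambda x: (x - 2.0) ** 2          # unknown cost, minimum at x = 2
    a, w, k = 0.5, 200.0, 5.0             # dither amplitude, frequency, gain
    dt, T = 1e-4, 20.0

    x, t = 0.0, 0.0
    while t < T:
        # Update rate is bounded by sqrt(a*w) regardless of the cost value.
        x += dt * math.sqrt(a * w) * math.cos(w * t + k * V(x))
        t += dt
    print(f"x after averaging-based descent: {x:.2f}")   # should approach 2
    ```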

  11. Application of non-attenuating frequency radars for prediction of rain attenuation and space diversity performance

    NASA Technical Reports Server (NTRS)

    Goldhirsh, J.

    1979-01-01

    In order to establish transmitter power and receiver sensitivity levels at frequencies above 10 GHz, the designers of earth-satellite telecommunication systems are interested in cumulative rain fade statistics at variable path orientations, elevation angles, climatological regions, and frequencies. They are also interested in establishing optimum space diversity performance parameters. This work examines the many elements involved in employing single non-attenuating-frequency radars to arrive at the desired information. The elements examined include radar techniques and requirements, phenomenological assumptions, path attenuation formulations and procedures, as well as error budgeting and calibration analysis. Included are the pertinent results of previous investigators who have used radar for rain attenuation modeling. Suggestions are made for improving present methods.

  12. OASIS: Organics Analyzer for Sampling Icy Surfaces

    NASA Technical Reports Server (NTRS)

    Getty, S. A.; Dworkin, J. P.; Glavin, D. P.; Martin, M.; Zheng, Y.; Balvin, M.; Southard, A. E.; Ferrance, J.; Malespin, C.

    2012-01-01

    Liquid chromatography mass spectrometry (LC-MS) is a well-established laboratory technique for detecting and analyzing organic molecules. This approach has been especially fruitful in the analysis of nucleobases and amino acids and in establishing chiral ratios [1-3]. We are developing OASIS, the Organics Analyzer for Sampling Icy Surfaces, for future in situ landed missions to astrochemically important icy bodies, such as asteroids, comets, and icy moons. The OASIS design employs a microfabricated, on-chip analytical column to chromatographically separate liquid analytes using known LC stationary-phase chemistries. The elution products are then interfaced through electrospray ionization (ESI) and analyzed by a time-of-flight mass spectrometer (TOF-MS). A particular advantage of this design is its suitability for microgravity environments, such as those of primitive small bodies.

  13. A pleiotropy-informed Bayesian false discovery rate adapted to a shared control design finds new disease associations from GWAS summary statistics.

    PubMed

    Liley, James; Wallace, Chris

    2015-02-01

    Genome-wide association studies (GWAS) have been successful in identifying single nucleotide polymorphisms (SNPs) associated with many traits and diseases. However, at existing sample sizes, these variants explain only part of the estimated heritability. Leverage of GWAS results from related phenotypes may improve detection without the need for larger datasets. The Bayesian conditional false discovery rate (cFDR) constitutes an upper bound on the expected false discovery rate (FDR) across a set of SNPs whose p values for two diseases are both less than two disease-specific thresholds. Calculation of the cFDR requires only summary statistics and has several advantages over traditional GWAS analysis. However, existing methods require distinct control samples between studies. Here, we extend the technique to allow for some or all controls to be shared, increasing applicability. Several different SNP sets can be defined with the same cFDR value, and we show that the expected FDR across the union of these sets may exceed the expected FDR in any single set. We describe a procedure to establish an upper bound for the expected FDR among the union of such sets of SNPs. We apply our technique to pairwise analysis of p values from ten autoimmune diseases with variable sharing of controls, enabling discovery of 59 SNP-disease associations which do not reach GWAS significance after genomic control in individual datasets. Most of the SNPs we highlight have previously been confirmed using replication studies or larger GWAS, a useful validation of our technique; we report eight SNP-disease associations across five diseases not previously declared. Our technique extends and strengthens the previous algorithm, and establishes robust limits on the expected FDR. This approach can improve SNP detection in GWAS, and give insight into shared aetiology between phenotypically related conditions.
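
    For readers unfamiliar with the quantity, the empirical cFDR estimator has a simple counting form. The sketch below implements the generic estimator cFDR(p, q) ≈ p · #{j : q_j ≤ q} / #{j : p_j ≤ p and q_j ≤ q} on toy null data; it does not include the authors' shared-control adjustment.

        # Empirical conditional FDR for the principal trait at thresholds (p, q),
        # given paired GWAS p-values for two traits. Toy data, pure null.
        import numpy as np

        def cfdr(p, q, p_all, q_all):
            n_q = np.sum(q_all <= q)                    # SNPs passing conditional threshold
            n_pq = np.sum((p_all <= p) & (q_all <= q))  # SNPs passing both thresholds
            return p * n_q / max(n_pq, 1)

        rng = np.random.default_rng(0)
        p_all = rng.uniform(size=10000)                 # toy p-values, trait 1
        q_all = rng.uniform(size=10000)                 # toy p-values, trait 2
        print(cfdr(0.01, 0.05, p_all, q_all))           # ~1 on null data: no enrichment

    On real data with shared aetiology, SNPs enriched at stringent q push the denominator up and the cFDR down, which is what licenses a relaxed significance threshold for the principal trait.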

  14. Analysis of space tug operating techniques (study 2.4). Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The costs of tug refurbishment were studied, using existing cost estimating relationships, to establish the cost of maintaining the reusable third stage of the space transportation system. Refurbishment operations sheets, which describe the actual tasks necessary to keep the equipment functioning properly and contain the pertinent descriptive information for each of the major vehicle areas, were used. Tug refurbishment costs per mission are tabulated.

  15. Space shuttle propulsion systems on-board checkout and monitoring system development study (extension). Volume 1: Summary and technical results

    NASA Technical Reports Server (NTRS)

    1972-01-01

    An analysis was conducted of the space shuttle propulsion systems to define the onboard checkout and monitoring function. A baseline space shuttle vehicle and mission were used to establish the techniques and approach for defining the requirements. The requirements were analyzed to formulate criteria for implementing the functions of preflight checkout, performance monitoring, fault isolation, emergency detection, display, data storage, postflight evaluation, and maintenance retest.

  16. [Application of Finite Element Method in Thoracolumbar Spine Traumatology].

    PubMed

    Zhang, Min; Qiu, Yong-gui; Shao, Yu; Gu, Xiao-feng; Zeng, Ming-wei

    2015-04-01

    The finite element method (FEM) is a mathematical technique that uses modern computer technology for stress analysis. It has gradually been adopted for simulating human body structures in biomechanics and is now widely used in research on thoracolumbar spine traumatology. This paper reviews the establishment of thoracolumbar spine FEMs, their verification, and the current state of thoracolumbar spine FEM research in different fields, and discusses the method's prospects and value in forensic thoracolumbar traumatology.

  17. FIESTA ROC: A new finite element analysis program for solar cell simulation

    NASA Technical Reports Server (NTRS)

    Clark, Ralph O.

    1991-01-01

    The Finite Element Semiconductor Three-dimensional Analyzer by Ralph O. Clark (FIESTA ROC) is a computational tool for investigating in detail the performance of arbitrary solar cell structures. As its name indicates, it uses the finite element technique to solve the fundamental semiconductor equations in the cell. It may be used for predicting the performance (thereby dictating the design parameters) of a proposed cell or for investigating the limiting factors in an established design.

  18. New Techniques in Time-Frequency Analysis: Adaptive Band, Ultra-Wide Band and Multi-Rate Signal Processing

    DTIC Science & Technology

    2016-03-02

    We look to develop a structure for the tiling of frequency spaces in both Euclidean and non-Euclidean domains. In particular, we establish Nyquist tiles and sampling groups in Euclidean geometry, and discuss the extension of these concepts to hyperbolic and spherical geometry.

  19. Resonant loading of aircraft secondary structure panels for use with thermoelastic stress analysis and digital image correlation

    NASA Astrophysics Data System (ADS)

    Waugh, Rachael C.; Dulieu-Barton, Janice M.; Quinn, S.

    2015-03-01

    Thermoelastic stress analysis (TSA) is an established active thermographic approach which uses the thermoelastic effect to correlate the temperature change that occurs as a material is subjected to elastic cyclic loading to the sum of the principal stresses on the surface of the component. Digital image correlation (DIC) tracks features on the surface of a material to establish a displacement field of a component subjected to load, which can then be used to calculate the strain field. The application of both DIC and TSA on a composite plate representative of aircraft secondary structure subject to resonant frequency loading using a portable loading device, i.e. `remote loading', is described. Laboratory based loading for TSA and DIC is typically imparted using a test machine; however, in the current work a vibration loading system is used which is able to excite the component of interest at resonant frequency, enabling TSA and DIC to be carried out. The accuracy of the measurements made with both optical techniques under remote loading is discussed. The data are compared to extract complementary information from the two techniques. This work forms a step towards a combined strain based non-destructive evaluation procedure able to identify and quantify the effect of defects more fully, particularly when examining component performance in service applications.
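
    Although the paper's processing chain is not reproduced here, the lock-in correlation underlying TSA is easy to illustrate: each pixel's temperature trace is demodulated at the loading frequency to recover the small thermoelastic amplitude from much larger noise. All values below are synthetic and illustrative only.

        # Lock-in demodulation of a single pixel's temperature signal at the
        # loading reference frequency; the recovered amplitude is proportional
        # to the sum of the principal surface stresses.
        import numpy as np

        fs, f_load, n = 200.0, 10.0, 2000        # frame rate (Hz), load frequency, frames
        t = np.arange(n) / fs
        rng = np.random.default_rng(7)
        signal = 0.05 * np.sin(2 * np.pi * f_load * t + 0.6) + rng.normal(0, 0.2, n)

        ref_s = np.sin(2 * np.pi * f_load * t)
        ref_c = np.cos(2 * np.pi * f_load * t)
        I = 2 * np.mean(signal * ref_s)          # in-phase component
        Q = 2 * np.mean(signal * ref_c)          # quadrature component
        print(np.hypot(I, Q))                    # ~0.05 despite noise 4x larger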

  20. Chapter 7. Cloning and analysis of natural product pathways.

    PubMed

    Gust, Bertolt

    2009-01-01

    The identification of gene clusters of natural products has led to an enormous wealth of information about their biosynthesis and its regulation, and about self-resistance mechanisms. Well-established routine techniques are now available for the cloning and sequencing of gene clusters. The subsequent functional analysis of the complex biosynthetic machinery requires efficient genetic tools for manipulation. Until recently, techniques for the introduction of defined changes into Streptomyces chromosomes were very time-consuming. In particular, manipulation of large DNA fragments has been challenging due to the absence of suitable restriction sites for restriction- and ligation-based techniques. The homologous recombination approach called recombineering (referred to as Red/ET-mediated recombination in this chapter) has greatly facilitated targeted genetic modifications of complex biosynthetic pathways from actinomycetes by eliminating many of the time-consuming and labor-intensive steps. This chapter describes techniques for the cloning and identification of biosynthetic gene clusters, for the generation of gene replacements within such clusters, for the construction of integrative library clones and their expression in heterologous hosts, and for the assembly of entire biosynthetic gene clusters from the inserts of individual library clones. A systematic approach toward insertional mutation of a complete Streptomyces genome is shown by the use of an in vitro transposon mutagenesis procedure.

  1. A comparative study of progressive versus successive spectrophotometric resolution techniques applied for pharmaceutical ternary mixtures.

    PubMed

    Saleh, Sarah S; Lotfy, Hayam M; Hassan, Nagiba Y; Salem, Hesham

    2014-11-11

    This work represents a comparative study of a novel progressive spectrophotometric resolution technique, namely the amplitude center method (ACM), versus the well-established successive spectrophotometric resolution techniques, namely successive derivative subtraction (SDS), successive derivative of ratio spectra (SDR), and mean centering of ratio spectra (MCR). All the proposed spectrophotometric techniques consist of several consecutive steps utilizing ratio and/or derivative spectra. The novel amplitude center method (ACM) can be used for the determination of ternary mixtures using a single divisor, where the concentrations of the components are determined through progressive manipulation performed on the same ratio spectrum. These methods were applied for the analysis of the ternary mixture of chloramphenicol (CHL), dexamethasone sodium phosphate (DXM) and tetryzoline hydrochloride (TZH) in eye drops in the presence of benzalkonium chloride as a preservative. The proposed methods were checked using laboratory-prepared mixtures and were successfully applied for the analysis of a pharmaceutical formulation containing the cited drugs. The proposed methods were validated according to the ICH guidelines. A comparative study was conducted between those methods regarding simplicity, limitations and sensitivity. The obtained results were statistically compared with those obtained from the official BP methods, showing no significant difference with respect to accuracy and precision. Copyright © 2014 Elsevier B.V. All rights reserved.
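
    The shared building block of these techniques, the ratio spectrum, can be sketched in a few lines: dividing a mixture spectrum by a divisor component's spectrum turns that component into a constant, which mean centering (MCR) or differentiation (SDS/SDR) then removes. The Gaussian bands below are synthetic stand-ins for real absorption spectra.

        # Toy ratio-spectrum manipulation for a binary mixture; real methods
        # evaluate the ratio only where the divisor absorbance is appreciable.
        import numpy as np

        wl = np.linspace(240, 320, 401)                 # wavelength grid, nm
        band = lambda c, w: np.exp(-0.5 * ((wl - c) / w) ** 2)
        X, Y = band(250, 12), band(280, 15)             # unit spectra of two components
        mix = 0.8 * X + 0.3 * Y                         # synthetic binary mixture

        ratio = mix / Y                                 # ratio spectrum, divisor = Y
        mc = ratio - ratio.mean()                       # mean-centered ratio spectrum
        # The constant 0.3 contributed by Y has vanished; what remains is
        # proportional to the X concentration and is read off against a
        # calibration set built the same way.
        print(mc.max())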

  2. Grid generation for the solution of partial differential equations

    NASA Technical Reports Server (NTRS)

    Eiseman, Peter R.; Erlebacher, Gordon

    1989-01-01

    A general survey of grid generators is presented with a concern for understanding why grids are necessary, how they are applied, and how they are generated. After an examination of the need for meshes, the overall applications setting is established with a categorization of the various connectivity patterns. This is split between structured grids and unstructured meshes. Altogether, the categorization establishes the foundation upon which grid generation techniques are developed. The two primary categories are algebraic techniques and partial differential equation techniques. These are each split into basic parts, and accordingly are individually examined in some detail. In the process, the interrelations between the various parts are accented. From the established background in the primary techniques, consideration is shifted to the topic of interactive grid generation and then to adaptive meshes. The setting for adaptivity is established with a suitable means to monitor severe solution behavior. Adaptive grids are considered first and are followed by adaptive triangular meshes. Then the consideration shifts to the temporal coupling between grid generators and PDE-solvers. To conclude, a reflection upon the discussion, herein, is given.
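
    As a flavor of the algebraic category surveyed here, the sketch below implements transfinite interpolation, which blends four boundary curves into an interior structured grid; the boundary curves are invented for illustration.

        # Transfinite interpolation (TFI): an algebraic grid generation technique.
        import numpy as np

        def tfi(bottom, top, left, right):
            """bottom/top: (n,2) curves over xi; left/right: (m,2) curves over eta."""
            n, m = len(bottom), len(left)
            xi = np.linspace(0, 1, n)[None, :, None]     # shape (1, n, 1)
            eta = np.linspace(0, 1, m)[:, None, None]    # shape (m, 1, 1)
            return ((1 - eta) * bottom[None, :, :] + eta * top[None, :, :]
                    + (1 - xi) * left[:, None, :] + xi * right[:, None, :]
                    - (1 - xi) * (1 - eta) * bottom[0] - xi * (1 - eta) * bottom[-1]
                    - (1 - xi) * eta * top[0] - xi * eta * top[-1])

        t = np.linspace(0, 1, 41)
        bottom = np.stack([t, 0.1 * np.sin(np.pi * t)], axis=1)   # curved lower wall
        top = np.stack([t, np.ones_like(t)], axis=1)
        s = np.linspace(0, 1, 21)
        left = np.stack([np.zeros_like(s), s], axis=1)
        right = np.stack([np.ones_like(s), s], axis=1)
        print(tfi(bottom, top, left, right).shape)        # (21, 41, 2) grid of (x, y)

    The bilinear corner terms subtracted at the end prevent the corner points from being counted twice, which is the essential trick of the method.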

  3. Grid generation for the solution of partial differential equations

    NASA Technical Reports Server (NTRS)

    Eiseman, Peter R.; Erlebacher, Gordon

    1987-01-01

    A general survey of grid generators is presented with a concern for understanding why grids are necessary, how they are applied, and how they are generated. After an examination of the need for meshes, the overall applications setting is established with a categorization of the various connectivity patterns. This is split between structured grids and unstructured meshes. Altogether, the categorization establishes the foundation upon which grid generation techniques are developed. The two primary categories are algebraic techniques and partial differential equation techniques. These are each split into basic parts, and accordingly are individually examined in some detail. In the process, the interrelations between the various parts are accented. From the established background in the primary techniques, consideration is shifted to the topic of interactive grid generation and then to adaptive meshes. The setting for adaptivity is established with a suitable means to monitor severe solution behavior. Adaptive grids are considered first and are followed by adaptive triangular meshes. Then the consideration shifts to the temporal coupling between grid generators and PDE-solvers. To conclude, a reflection upon the discussion, herein, is given.

  4. An investigation into NVC characteristics of vehicle behaviour using modal analysis

    NASA Astrophysics Data System (ADS)

    Hanouf, Zahir; Faris, Waleed F.; Ahmad, Kartini

    2017-03-01

    NVC characterization of vehicle behavior is an essential part of development targets in the automotive industry. Understanding the dynamic behavior of each structural part of the vehicle is therefore a major requirement in improving the NVC characteristics of a vehicle. The main focus of this research is to investigate the structural dynamic behavior of a passenger car using a part-by-part modal analysis technique and to apply this method to derive the interior noise sources. In the first part of this work, computational part-by-part modal analysis tests were carried out to identify the dynamic parameters of the passenger car. Finite element models of the different parts of the car were constructed using VPG 3.2 software. Ls-Dyna pre- and post-processing was used to identify and analyze the dynamic behavior of each car component panel. These tests successfully produced the natural frequencies and associated mode shapes of panels such as the trunk, hood, roof and door panels. In the second part of this research, experimental part-by-part modal analysis was performed on the selected car panels to extract the modal parameters, namely frequencies and mode shapes. The study establishes step-by-step procedures for carrying out experimental modal analysis on car structures using the single-input excitation and multi-output response (SIMO) technique. To verify the results obtained by this method, an inverse test was performed by fixing the response and moving the excitation; the results were identical. Finally, comparison between the results obtained from both analyses showed good agreement in both frequencies and mode shapes. The conclusion drawn from this part of the study is that part-by-part modal analysis can reliably establish the dynamic characteristics of the whole car. Furthermore, the developed method can also be used to show the relationship between structural vibration of the car panels and passengers' noise comfort inside the cabin.
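
    At its core, each computational modal analysis above solves the generalized eigenproblem K·φ = ω²·M·φ for natural frequencies and mode shapes. A toy 3-DOF spring-mass chain illustrates the computation (the matrices are invented, not taken from the car model):

        # Natural frequencies and mode shapes from mass and stiffness matrices.
        import numpy as np
        from scipy.linalg import eigh

        m, k = 1.0, 1.0e4
        M = np.diag([m, m, m])
        K = k * np.array([[ 2, -1,  0],
                          [-1,  2, -1],
                          [ 0, -1,  2]], dtype=float)

        w2, phi = eigh(K, M)                       # eigenvalues are omega^2
        freqs_hz = np.sqrt(w2) / (2 * np.pi)       # natural frequencies in Hz
        print(freqs_hz)                            # columns of phi are the mode shapes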

  5. Digital PCR quantification of MGMT methylation refines prediction of clinical benefit from alkylating agents in glioblastoma and metastatic colorectal cancer.

    PubMed

    Barault, L; Amatu, A; Bleeker, F E; Moutinho, C; Falcomatà, C; Fiano, V; Cassingena, A; Siravegna, G; Milione, M; Cassoni, P; De Braud, F; Rudà, R; Soffietti, R; Venesio, T; Bardelli, A; Wesseling, P; de Witt Hamer, P; Pietrantonio, F; Siena, S; Esteller, M; Sartore-Bianchi, A; Di Nicolantonio, F

    2015-09-01

    O(6)-methyl-guanine-methyl-transferase (MGMT) silencing by promoter methylation may identify cancer patients responding to the alkylating agents dacarbazine or temozolomide. We evaluated the prognostic and predictive value of MGMT methylation testing both in tumor and cell-free circulating DNA (cfDNA) from plasma samples using an ultra-sensitive two-step digital PCR technique (methyl-BEAMing). Results were compared with two established techniques, methylation-specific PCR (MSP) and Bs-pyrosequencing. Thresholds for MGMT methylated status for each technique were established in a training set of 98 glioblastoma (GBM) patients. The prognostic and the predictive value of MGMT methylated status was validated in a second cohort of 66 GBM patients treated with temozolomide in which methyl-BEAMing displayed a better specificity than the other techniques. Cutoff values of MGMT methylation specific for metastatic colorectal cancer (mCRC) tissue samples were established in a cohort of 60 patients treated with dacarbazine. In mCRC, both quantitative assays methyl-BEAMing and Bs-pyrosequencing outperformed MSP, providing better prediction of treatment response and improvement in progression-free survival (PFS) (P < 0.001). Ability of methyl-BEAMing to identify responding patients was validated in a cohort of 23 mCRC patients treated with temozolomide and preselected for MGMT methylated status according to MSP. In mCRC patients treated with dacarbazine, exploratory analysis of cfDNA by methyl-BEAMing showed that MGMT methylation was associated with better response and improved median PFS (P = 0.008). Methyl-BEAMing showed high reproducibility, specificity and sensitivity and was applicable to formalin-fixed paraffin-embedded tissues and cfDNA. This study supports the quantitative assessment of MGMT methylation for clinical purposes since it could refine prediction of response to alkylating agents. © The Author 2015. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  6. Effect of Processing Conditions on the Anelastic Behavior of Plasma Sprayed Thermal Barrier Coatings

    NASA Astrophysics Data System (ADS)

    Viswanathan, Vaishak

    2011-12-01

    Plasma sprayed ceramic materials contain an assortment of micro-structural defects, including pores, cracks, and interfaces arising from the droplet based assemblage of the spray deposition technique. The defective architecture of the deposits introduces a novel "anelastic" response in the coatings, namely their non-linear and hysteretic stress-strain relationship under mechanical loading. It has been established that this anelasticity can be attributed to the relative movement of the embedded defects under varying stresses. While the non-linear response of the coatings arises from the opening/closure of defects, hysteresis is produced by the frictional sliding among defect surfaces. Recent studies have indicated that anelastic behavior of coatings can be a unique descriptor of their mechanical behavior and related to the defect configuration. In this dissertation, a multi-variable study employing systematic processing strategies was conducted to augment the understanding of various aspects of the reported anelastic behavior. A bi-layer curvature measurement technique was adapted to measure the anelastic properties of plasma sprayed ceramic. The quantification of anelastic parameters was done using a non-linear model proposed by Nakamura et al. An error analysis was conducted on the technique to determine the available margins for both experimental and computational errors. The error analysis was extended to evaluate its sensitivity to different coating microstructures. For this purpose, three coatings with significantly different microstructures were fabricated via tuning of process parameters. The three coatings were then subjected systematically to different strain ranges in order to understand the origin and evolution of anelasticity in different microstructures. The last segment of this thesis addresses the processing front and evaluates and establishes a correlation between processing conditions and the anelastic parameters.

  7. Particle Tracking Facilitates Real Time Capable Motion Correction in 2D or 3D Two-Photon Imaging of Neuronal Activity.

    PubMed

    Aghayee, Samira; Winkowski, Daniel E; Bowen, Zachary; Marshall, Erin E; Harrington, Matt J; Kanold, Patrick O; Losert, Wolfgang

    2017-01-01

    The application of 2-photon laser scanning microscopy (TPLSM) techniques to measure the dynamics of cellular calcium signals in populations of neurons is an extremely powerful technique for characterizing neural activity within the central nervous system. The use of TPLSM on awake and behaving subjects promises new insights into how neural circuit elements cooperatively interact to form sensory perceptions and generate behavior. A major challenge in imaging such preparations is unavoidable animal and tissue movement, which leads to shifts in the imaging location (jitter). The presence of image motion can lead to artifacts, especially since quantification of TPLSM images involves analysis of fluctuations in fluorescence intensities for each neuron, determined from small regions of interest (ROIs). Here, we validate a new motion correction approach to compensate for motion of TPLSM images in the superficial layers of auditory cortex of awake mice. We use a nominally uniform fluorescent signal as a secondary signal to complement the dynamic signals from genetically encoded calcium indicators. We tested motion correction for single plane time lapse imaging as well as multiplane (i.e., volume) time lapse imaging of cortical tissue. Our procedure of motion correction relies on locating the brightest neurons and tracking their positions over time using established techniques of particle finding and tracking. We show that our tracking based approach provides subpixel resolution without compromising speed. Unlike most established methods, our algorithm also captures deformations of the field of view and thus can compensate e.g., for rotations. Object tracking based motion correction thus offers an alternative approach for motion correction, one that is well suited for real time spike inference analysis and feedback control, and for correcting for tissue distortions.
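
    The essence of the tracking-based correction can be sketched briefly: find bright cell bodies, match them between frames, and average their displacements. The sketch below handles only a rigid translation on synthetic data; the authors' algorithm additionally captures deformations of the field of view.

        # Brightest-feature tracking for frame registration in time-lapse imaging.
        import numpy as np
        from scipy import ndimage

        def bright_spots(frame, n=15, sigma=2.0):
            """(row, col) coordinates of the n brightest local maxima."""
            smooth = ndimage.gaussian_filter(frame, sigma)
            maxima = (smooth == ndimage.maximum_filter(smooth, size=9))
            coords = np.argwhere(maxima)
            order = np.argsort(smooth[maxima])[::-1]
            return coords[order[:n]].astype(float)

        def estimate_shift(ref_pts, pts):
            """Match each point to its nearest reference point, average displacements."""
            d = np.linalg.norm(pts[:, None, :] - ref_pts[None, :, :], axis=2)
            nearest = ref_pts[np.argmin(d, axis=1)]
            return (pts - nearest).mean(axis=0)

        rng = np.random.default_rng(1)
        ref = rng.poisson(5, (256, 256)).astype(float)
        ref[64, 64] = ref[180, 90] = ref[120, 200] = 500.0   # fake bright neurons
        jittered = ndimage.shift(ref, (3.4, -2.1))           # simulated motion
        shift = estimate_shift(bright_spots(ref), bright_spots(jittered))
        corrected = ndimage.shift(jittered, -shift)          # register back to reference
        print(shift)    # close to the applied (3.4, -2.1) jitter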

  8. Particle Tracking Facilitates Real Time Capable Motion Correction in 2D or 3D Two-Photon Imaging of Neuronal Activity

    PubMed Central

    Aghayee, Samira; Winkowski, Daniel E.; Bowen, Zachary; Marshall, Erin E.; Harrington, Matt J.; Kanold, Patrick O.; Losert, Wolfgang

    2017-01-01

    The application of 2-photon laser scanning microscopy (TPLSM) techniques to measure the dynamics of cellular calcium signals in populations of neurons is an extremely powerful technique for characterizing neural activity within the central nervous system. The use of TPLSM on awake and behaving subjects promises new insights into how neural circuit elements cooperatively interact to form sensory perceptions and generate behavior. A major challenge in imaging such preparations is unavoidable animal and tissue movement, which leads to shifts in the imaging location (jitter). The presence of image motion can lead to artifacts, especially since quantification of TPLSM images involves analysis of fluctuations in fluorescence intensities for each neuron, determined from small regions of interest (ROIs). Here, we validate a new motion correction approach to compensate for motion of TPLSM images in the superficial layers of auditory cortex of awake mice. We use a nominally uniform fluorescent signal as a secondary signal to complement the dynamic signals from genetically encoded calcium indicators. We tested motion correction for single plane time lapse imaging as well as multiplane (i.e., volume) time lapse imaging of cortical tissue. Our procedure of motion correction relies on locating the brightest neurons and tracking their positions over time using established techniques of particle finding and tracking. We show that our tracking based approach provides subpixel resolution without compromising speed. Unlike most established methods, our algorithm also captures deformations of the field of view and thus can compensate e.g., for rotations. Object tracking based motion correction thus offers an alternative approach for motion correction, one that is well suited for real time spike inference analysis and feedback control, and for correcting for tissue distortions. PMID:28860973

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casperson, R. J.; Burke, J. T.; Hughes, R. O.

    Directly measuring (n,2n) cross sections on short-lived actinides presents a number of experimental challenges. The surrogate reaction technique is an experimental method for measuring cross sections on short-lived isotopes, and it provides a unique solution for measuring (n,2n) cross sections. This technique involves measuring a charged-particle reaction cross section, where the reaction populates the same compound nucleus as the reaction of interest. To perform these surrogate (n,2n) cross section measurements, a silicon telescope array has been placed along a beam line at the Texas A&M University Cyclotron Institute, surrounded by a large tank of gadolinium-doped liquid scintillator which acts as a neutron detector. The combination of the charged-particle and neutron-detector arrays is referred to as NeutronSTARS. In the analysis procedure for calculating the (n,2n) cross section, the neutron detection efficiency and time structure play an important role. Due to the lack of availability of isotropic, mono-energetic neutron sources, modeling is an important component in establishing this efficiency and time structure. This report describes the NeutronSTARS array, which was designed and commissioned during this project. It also describes the surrogate reaction technique, specifically referencing a 235U(n,2n) commissioning measurement that was fielded during the past year. Advanced multiplicity analysis techniques have been developed for this work, which should allow for efficient analysis of 241Pu(n,2n) and 239Pu(n,2n) cross section measurements.
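
    One ingredient of such multiplicity analysis can be illustrated compactly: with a neutron detection efficiency ε, the observed multiplicity distribution is a binomial smearing of the true one, and unfolding reduces to a triangular linear solve. The efficiency and counts below are invented, not NeutronSTARS values.

        # Binomial efficiency unfolding of a neutron multiplicity distribution:
        # P_obs(k) = sum_n C(n,k) * eps**k * (1-eps)**(n-k) * P_true(n).
        import numpy as np
        from math import comb

        eps, nmax = 0.45, 4
        R = np.array([[comb(n, k) * eps**k * (1 - eps)**(n - k) if n >= k else 0.0
                       for n in range(nmax + 1)] for k in range(nmax + 1)])
        observed = np.array([0.52, 0.33, 0.12, 0.025, 0.005])  # toy observed fractions
        true_mult = np.linalg.solve(R, observed)               # unfolded multiplicities
        print(true_mult)

    Because the response matrix is upper triangular with nonzero diagonal entries ε^k, the solve is always well posed, though in practice statistical uncertainties on the observed counts must be propagated through it.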

  10. Enhanced characterization of singly protonated phosphopeptide ions by femtosecond laser-induced ionization/dissociation tandem mass spectrometry (fs-LID-MS/MS).

    PubMed

    Smith, Scott A; Kalcic, Christine L; Safran, Kyle A; Stemmer, Paul M; Dantus, Marcos; Reid, Gavin E

    2010-12-01

    To develop an improved understanding of the regulatory role that post-translational modifications (PTMs) involving phosphorylation play in the maintenance of normal cellular function, tandem mass spectrometry (MS/MS) strategies coupled with ion activation techniques such as collision-induced dissociation (CID) and electron-transfer dissociation (ETD) are typically employed to identify the presence and site-specific locations of the phosphate moieties within a given phosphoprotein of interest. However, the ability of these techniques to obtain sufficient structural information for unambiguous phosphopeptide identification and characterization is highly dependent on the ion activation method employed and the properties of the precursor ion that is subjected to dissociation. Herein, we describe the application of a recently developed alternative ion activation technique for phosphopeptide analysis, termed femtosecond laser-induced ionization/dissociation (fs-LID). In contrast to CID and ETD, fs-LID is shown to be particularly suited to the analysis of singly protonated phosphopeptide ions, yielding a wide range of product ions including a, b, c, x, y, and z sequence ions, as well as ions that are potentially diagnostic of the positions of phosphorylation (e.g., 'a(n)+1-98'). Importantly, the lack of phosphate moiety losses or phosphate group 'scrambling' provides unambiguous information for sequence identification and phosphorylation site characterization. Therefore, fs-LID-MS/MS can serve as a complementary technique to established methodologies for phosphoproteomic analysis. Copyright © 2010. Published by Elsevier Inc.

  11. Angular photogrammetric analysis of the soft-tissue facial profile of Indian adults.

    PubMed

    Pandian, K Saravana; Krishnan, Sindhuja; Kumar, S Aravind

    2018-01-01

    Soft-tissue analysis has become an important component of orthodontic diagnosis and treatment planning. Photographic evaluation of an orthodontic patient is a very close representation of the appearance of the person. Previously established norms for soft-tissue analysis vary across ethnic groups, so there is a need to develop soft-tissue facial profile norms pertaining to Indian ethnic groups. The aim of this study is to establish angular photogrammetric standards of the soft-tissue facial profile for Indian males and females and to assess the sexual dimorphism between them. Lateral profile photographs of 300 random participants (150 males and 150 females) between ages 18 and 25 years were taken and analyzed using FACAD tracing software. Inclusion criteria were Angle's Class I molar occlusion with acceptable crowding and proclination, normal growth and development with well-aligned dental arches, and a full complement of permanent teeth irrespective of third molar status. This study was conducted in an Indian population, with samples taken from various cities across India. Descriptive statistical analysis was carried out, and sexual dimorphism was evaluated by Student's t-test between males and females. The results of the present study showed statistically significant (P < 0.05) gender differences in 5 of 12 parameters in the Indian population. In the present study, soft-tissue facial measurements were established by means of photogrammetric analysis to help orthodontists carry out more quantitative evaluations and make disciplined decisions. The mean values obtained can be used for comparison with records of participants with the same characteristics by following this photogrammetric technique.

  12. Solar Cell Calibration and Measurement Techniques

    NASA Technical Reports Server (NTRS)

    Bailey, Sheila; Brinker, Dave; Curtis, Henry; Jenkins, Phillip; Scheiman, Dave

    1997-01-01

    The increasing complexity of space solar cells and the increasing international markets for both cells and arrays has resulted in workshops jointly sponsored by NASDA, ESA and NASA. These workshops are designed to obtain international agreement on standardized values for the AMO spectrum and constant, recommend laboratory measurement practices and establish a set of protocols for international comparison of laboratory measurements. A working draft of an ISO standard, WD 15387, 'Requirements for Measurement and Calibration Procedures for Space Solar Cells', was discussed with a focus on the scope of the document, a definition of primary standard cell, and required error analysis for all measurement techniques. Working groups addressed the issues of Air Mass Zero (AMO) solar constant and spectrum, laboratory measurement techniques, and the international round robin methodology. A summary is presented of the current state of each area and the formulation of the ISO document.

  13. Solar Cell Calibration and Measurement Techniques

    NASA Technical Reports Server (NTRS)

    Bailey, Sheila; Brinker, Dave; Curtis, Henry; Jenkins, Phillip; Scheiman, Dave

    2004-01-01

    The increasing complexity of space solar cells and the increasing international markets for both cells and arrays has resulted in workshops jointly sponsored by NASDA, ESA and NASA. These workshops are designed to obtain international agreement on standardized values for the AMO spectrum and constant, recommend laboratory measurement practices and establish a set of protocols for international comparison of laboratory measurements. A working draft of an ISO standard, WD 15387, "Requirements for Measurement and Calibration Procedures for Space Solar Cells", was discussed with a focus on the scope of the document, a definition of primary standard cell, and required error analysis for all measurement techniques. Working groups addressed the issues of Air Mass Zero (AMO) solar constant and spectrum, laboratory measurement techniques, and the international round robin methodology. A summary is presented of the current state of each area and the formulation of the ISO document.

  14. Machine learning, medical diagnosis, and biomedical engineering research - commentary.

    PubMed

    Foster, Kenneth R; Koprowski, Robert; Skufca, Joseph D

    2014-07-05

    A large number of papers are appearing in the biomedical engineering literature that describe the use of machine learning techniques to develop classifiers for detection or diagnosis of disease. However, the usefulness of this approach in developing clinically validated diagnostic techniques so far has been limited and the methods are prone to overfitting and other problems which may not be immediately apparent to the investigators. This commentary is intended to help sensitize investigators as well as readers and reviewers of papers to some potential pitfalls in the development of classifiers, and suggests steps that researchers can take to help avoid these problems. Building classifiers should be viewed not simply as an add-on statistical analysis, but as part and parcel of the experimental process. Validation of classifiers for diagnostic applications should be considered as part of a much larger process of establishing the clinical validity of the diagnostic technique.
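
    One of the recommended safeguards is easy to show in code: evaluate the classifier only through cross-validation, with all preprocessing inside the pipeline so nothing is ever fit on held-out data. On random labels, an honestly validated classifier scores near chance, which is exactly the check that exposes overfitting. A minimal sketch with scikit-learn:

        # Cross-validated evaluation with preprocessing inside the pipeline,
        # so scaling is refit within each fold and cannot leak information.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(120, 500))      # few samples, many features: overfit risk
        y = rng.integers(0, 2, size=120)     # random labels -> no real signal

        clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
        scores = cross_val_score(clf, X, y, cv=10)
        print(scores.mean())    # hovers near 0.5, exposing the absence of signal

    Fitting the same model on all 120 samples and scoring it on the training data would report near-perfect accuracy, which is precisely the pitfall the commentary warns against.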

  15. Evaluation and study of advanced optical contamination, deposition, measurement, and removal techniques. [including computer programs and ultraviolet reflection analysis

    NASA Technical Reports Server (NTRS)

    Linford, R. M. F.; Allen, T. H.; Dillow, C. F.

    1975-01-01

    A program is described to design, fabricate and install an experimental work chamber assembly (WCA) to provide a wide range of experimental capability. The WCA incorporates several techniques for studying the kinetics of contaminant films and their effect on optical surfaces. It incorporates the capability for depositing both optical and contaminant films on temperature-controlled samples, and for in-situ measurements of the vacuum ultraviolet reflectance. Ellipsometer optics are mounted on the chamber for film thickness determinations, and other features include access ports for radiation sources and instrumentation. Several supporting studies were conducted to define specific chamber requirements, to determine the sensitivity of the measurement techniques to be incorporated in the chamber, and to establish procedures for handling samples prior to their installation in the chamber. A bibliography and literature survey of contamination-related articles is included.

  16. XPS Protocol for the Characterization of Pristine and Functionalized Single Wall Carbon Nanotubes

    NASA Technical Reports Server (NTRS)

    Sosa, E. D.; Allada, R.; Huffman, C. B.; Arepalli, S.

    2009-01-01

    Recent interest in developing new applications for carbon nanotubes (CNT) has fueled the need to use accurate macroscopic and nanoscopic techniques to characterize and understand their chemistry. X-ray photoelectron spectroscopy (XPS) has proved to be a useful analytical tool for nanoscale surface characterization of materials including carbon nanotubes. Recent nanotechnology research at NASA Johnson Space Center (NASA-JSC) helped to establish a characterization protocol for quality assessment of single wall carbon nanotubes (SWCNTs). Here, a review of some of the major factors of the XPS technique that can influence the quality of analytical data is presented, along with suggestions for methods to maximize the quality of data obtained by XPS and the development of a protocol for XPS characterization as a complementary technique for analyzing the purity and surface characteristics of SWCNTs. The XPS protocol is then applied to a number of experiments, including impurity analysis and the study of chemical modifications of SWCNTs.

  17. Auxiliary principle technique and iterative algorithm for a perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems.

    PubMed

    Rahaman, Mijanur; Pang, Chin-Tzong; Ishtyak, Mohd; Ahmad, Rais

    2017-01-01

    In this article, we introduce a perturbed system of generalized mixed quasi-equilibrium-like problems involving multi-valued mappings in Hilbert spaces. To calculate the approximate solutions of the perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems, firstly we develop a perturbed system of auxiliary generalized multi-valued mixed quasi-equilibrium-like problems, and then by using the celebrated Fan-KKM technique, we establish the existence and uniqueness of solutions of the perturbed system of auxiliary generalized multi-valued mixed quasi-equilibrium-like problems. By deploying an auxiliary principle technique and an existence result, we formulate an iterative algorithm for solving the perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems. Lastly, we study the strong convergence analysis of the proposed iterative sequences under monotonicity and some mild conditions. These results are new and generalize some known results in this field.

  18. FEM Techniques for High Stress Detection in Accelerated Fatigue Simulation

    NASA Astrophysics Data System (ADS)

    Veltri, M.

    2016-09-01

    This work presents the theory and a numerical validation study in support of a novel method for a priori identification of fatigue critical regions, with the aim of accelerating durability design in large FEM problems. The investigation is placed in the context of modern full-body structural durability analysis, where a computationally intensive dynamic solution could be required to identify areas with potential for fatigue damage initiation. The early detection of fatigue critical areas can drive a simplification of the problem size, leading to considerable improvement in solution time and model handling while allowing processing of the critical areas in higher detail. The proposed technique is applied to a real life industrial case in a comparative assessment with established practices. Synthetic damage prediction quantification and visualization techniques allow for a quick and efficient comparison between methods, outlining potential application benefits and boundaries.

  19. Laser induced autofluorescence for diagnosis of non-melanoma skin cancer

    NASA Astrophysics Data System (ADS)

    Drakaki, E.; Makropoulou, M.; Serafetinides, A. A.; Merlemis, N.; Kalatzis, I.; Sianoudis, I. A.; Batsi, O.; Christofidou, E.; Stratigos, A. J.; Katsambas, A. D.; Antoniou, Ch.

    2015-01-01

    Non-melanoma skin cancer is one of the most frequent malignant tumors in humans. A non-invasive technique with high sensitivity and high specificity would be the most suitable method for diagnosing basal cell carcinoma (BCC) and other malignancies, instead of the well-established biopsy and histopathology examination. In recent decades a non-invasive spectroscopic diagnostic method was introduced, laser induced fluorescence (LIF), which can generate image contrast between different states of skin tissue. The method is non-invasive in that it does not require excision of a tissue sample, which is necessary for the histopathological characterization and biochemical analysis of skin tissue samples used worldwide as the evaluation gold standard. The object of this study is to establish the ability of a relatively portable laser-induced skin autofluorescence system to differentiate malignant from non-malignant skin lesions. Unstained human skin samples, excised from patients undergoing biopsy examination, were irradiated with a Nd:YAG-3ω laser (λ=355 nm, 6 ns) used as the excitation source for the autofluorescence measurements. A portable fiber-based spectrometer was used to record fluorescence spectra of the sites of interest. The ex vivo results obtained with this spectroscopic technique were correlated with the histopathology results. After analyzing the fluorescence spectra of almost 60 skin tissue areas, we developed an algorithm to distinguish different types of malignant lesions, including inflammatory areas. Optimization of the data analysis and the potential use of LIF spectroscopy with 355 nm Nd:YAG laser excitation of tissue autofluorescence for clinical applications are discussed.

  20. Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion

    NASA Astrophysics Data System (ADS)

    Li, Z.; Ghaith, M.

    2017-12-01

    Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Sequentially, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy to complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
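
    A minimal sketch of the probabilistic-collocation step described above: expand a model output in probabilists' Hermite polynomials of a standard normal variable and fit the coefficients by least squares at sampled collocation points. The "model" here is a toy nonlinear function standing in for a hydrological simulator.

        # Hermite polynomial chaos expansion fitted by probabilistic collocation.
        import numpy as np
        from numpy.polynomial.hermite_e import hermevander

        def model(theta):                    # hypothetical model output vs. parameter
            return np.exp(0.3 * theta) + 0.1 * theta**2

        order, n_pts = 4, 50
        xi = np.random.default_rng(2).normal(size=n_pts)  # collocation points
        A = hermevander(xi, order)           # probabilists' Hermite design matrix
        coef, *_ = np.linalg.lstsq(A, model(xi), rcond=None)

        # The PCE is now a cheap surrogate: evaluate it in place of the model.
        # Its mean equals coef[0], since E[He_n(xi)] = 0 for n >= 1.
        xi_new = np.linspace(-3, 3, 7)
        print(hermevander(xi_new, order) @ coef - model(xi_new))  # residuals stay small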

  1. The Evolving MCART Multimodal Imaging Core: Establishing a protocol for Computed Tomography and Echocardiography in the Rhesus macaque to perform longitudinal analysis of radiation-induced organ injury

    PubMed Central

    de Faria, Eduardo B.; Barrow, Kory R.; Ruehle, Bradley T.; Parker, Jordan T.; Swartz, Elisa; Taylor-Howell, Cheryl; Kieta, Kaitlyn M.; Lees, Cynthia J.; Sleeper, Meg M.; Dobbin, Travis; Baron, Adam D.; Mohindra, Pranshu; MacVittie, Thomas J.

    2015-01-01

    Computed Tomography (CT) and Echocardiography (EC) are two imaging modalities that produce critical longitudinal data that can be analyzed for radiation-induced organ-specific injury to the lung and heart. The Medical Countermeasures Against Radiological Threats (MCART) consortium has a well-established animal model research platform that includes nonhuman primate (NHP) models of the acute radiation syndrome and the delayed effects of acute radiation exposure. These models call for a definition of the latency, incidence, severity, duration, and resolution of different organ-specific radiation-induced subsyndromes. The pulmonary subsyndromes and cardiac effects are a pair of inter-dependent syndromes impacted by exposure to potentially lethal doses of radiation. Establishing a connection between these will reveal important information about their interaction and progression of injury and recovery. Herein, we demonstrate the use of CT and EC data in the rhesus macaque models to define delayed organ injury thereby establishing: a) consistent and reliable methodology to assess radiation-induced damage to the lung and heart, b) an extensive database in normal age-matched NHP for key primary and secondary endpoints, c) identified problematic variables in imaging techniques and proposed solutions to maintain data integrity and d) initiated longitudinal analysis of potentially lethal radiation-induced damage to the lung and heart. PMID:26425907

  2. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...

  3. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...

  4. Evaluation of Fractional Regional Ventilation Using 4D-CT and Effects of Breathing Maneuvers on Ventilation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mistry, Nilesh N., E-mail: nmistry@som.umaryland.edu; Diwanji, Tejan; Shi, Xiutao

    2013-11-15

    Purpose: Current implementations of methods based on Hounsfield units to evaluate regional lung ventilation do not directly incorporate tissue-based mass changes that occur over the respiratory cycle. To overcome this, we developed a 4-dimensional computed tomography (4D-CT)-based technique to evaluate fractional regional ventilation (FRV) that uses an individualized ratio of tidal volume to end-expiratory lung volume for each voxel. We further evaluated the effect of different breathing maneuvers on regional ventilation. The results from this work will help elucidate the relationship between global and regional lung function. Methods and Materials: Eight patients underwent 3 sets of 4D-CT scans during 1 session using free-breathing, audiovisual guidance, and active breathing control. FRV was estimated using a density-based algorithm with mass correction. Internal validation between global and regional ventilation was performed by use of the imaging data collected during the use of active breathing control. The impact of breathing maneuvers on FRV was evaluated comparing the tidal volume from 3 breathing methods. Results: Internal validation through comparison between the global and regional changes in ventilation revealed a strong linear correlation (slope of 1.01, R^2 of 0.97) between the measured global lung volume and the regional lung volume calculated by use of the “mass corrected” FRV. A linear relationship was established between the tidal volume measured with the automated breathing control system and FRV based on 4D-CT imaging. Consistently larger breathing volumes were observed when coached breathing techniques were used. Conclusions: The technique presented improves density-based evaluation of lung ventilation and establishes a link between global and regional lung ventilation volumes. Furthermore, the results obtained are comparable with those of other techniques of functional evaluation such as spirometry and hyperpolarized-gas magnetic resonance imaging. These results were demonstrated on retrospective analysis of patient data, and further research using prospective data is under way to validate this technique against established clinical tests.
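
    A heavily hedged sketch of a generic density-based ventilation estimate of the kind described: treat the air fraction of a voxel as -HU/1000 and form a fractional ventilation as the change in air content between registered end-inspiration and end-expiration images, relative to end-expiration. This is an illustrative HU-based formulation only, not the authors' mass-corrected FRV algorithm.

        # Generic HU-based fractional ventilation per voxel (illustration only).
        import numpy as np

        hu_exp = np.array([-780., -820., -650.])   # end-expiration HU, 3 registered voxels
        hu_insp = np.array([-830., -870., -700.])  # end-inspiration HU, same voxels

        air_exp = -hu_exp / 1000.0                 # approximate air fraction per voxel
        air_insp = -hu_insp / 1000.0
        frv = (air_insp - air_exp) / air_exp       # fractional regional ventilation
        print(frv)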

  5. Boiler Tube Corrosion Characterization with a Scanning Thermal Line

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Jacobstein, Ronald; Reilly, Thomas

    2001-01-01

    Wall thinning due to corrosion in utility boiler water wall tubing is a significant operational concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. Unfortunately, ultrasonic inspection is very manpower-intensive and slow. Therefore, thickness measurements are typically taken over a relatively small percentage of the total boiler wall and statistical analysis is used to determine the overall condition of the boiler tubing. Other inspection techniques, such as electromagnetic acoustic transducer (EMAT), have recently been evaluated; however, they provide only a qualitative evaluation, identifying areas or spots where corrosion has significantly reduced the wall thickness. NASA Langley Research Center, in cooperation with ThermTech Services, has developed a thermal NDE technique designed to quantitatively measure the wall thickness and thus determine the amount of material thinning present in steel boiler tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source coupled with the analysis technique represents a significant improvement in the inspection speed and accuracy for large structures such as boiler water walls. A theoretical basis for the technique will be presented to establish its quantitative nature. Further, a dynamic calibration system will be presented that allows the extraction of thickness information from the temperature data. Additionally, the results of the application of this technology to actual water wall tubing samples and in-situ inspections will be presented.

  6. Development of an automation technique for the establishment of functional lipid bilayer arrays

    NASA Astrophysics Data System (ADS)

    Hansen, J. S.; Perry, M.; Vogel, J.; Vissing, T.; Hansen, C. R.; Geschke, O.; Emnéus, J.; Nielsen, C. H.

    2009-02-01

    In the present work, a technique was developed for establishing multiple black lipid membranes (BLMs) in arrays of microstructured ethylene tetrafluoroethylene (ETFE) films supported by a microporous material. Rectangular 8 × 8 arrays with apertures having diameters of 301 ± 5 µm were fabricated in ETFE Teflon film by laser ablation using a carbon dioxide laser. Multiple lipid membranes could be formed across the microstructured 8 × 8 array ETFE partitions. Success rates for the establishment of cellulose-supported BLMs across the multiple aperture arrays were above 95%. However, the time course of the membrane thinning process was found to vary considerably between multiple aperture bilayer experiments. An airbrush partition pretreatment technique was developed to increase the reproducibility of bilayer formation over the time course from establishment of the lipid membranes to formation of the bilayers. The results showed that multiple lipid bilayers could be reproducibly formed across the airbrush-pretreated 8 × 8 rectangular arrays. The ionophoric peptide valinomycin was incorporated into established membrane arrays, resulting in ionic currents that could be effectively blocked by tetraethylammonium. This shows that functional bimolecular lipid membranes were established, and furthermore demonstrates that the established lipid membrane arrays could host functional membrane-spanning molecules.

  7. Real-time continuous visual biofeedback in the treatment of speech breathing disorders following childhood traumatic brain injury: report of one case.

    PubMed

    Murdoch, B E; Pitt, G; Theodoros, D G; Ward, E C

    1999-01-01

    The efficacy of traditional and physiological biofeedback methods for modifying abnormal speech breathing patterns was investigated in a child with persistent dysarthria following severe traumatic brain injury (TBI). An A-B-A-B single-subject experimental research design was utilized to provide the subject with two exclusive periods of therapy for speech breathing, based on traditional therapy techniques and physiological biofeedback methods, respectively. Traditional therapy techniques included establishing optimal posture for speech breathing, explanation of the movement of the respiratory muscles, and a hierarchy of non-speech and speech tasks focusing on establishing an appropriate level of sub-glottal air pressure, and improving the subject's control of inhalation and exhalation. The biofeedback phase of therapy utilized variable inductance plethysmography (or Respitrace) to provide real-time, continuous visual biofeedback of ribcage circumference during breathing. As in traditional therapy, a hierarchy of non-speech and speech tasks were devised to improve the subject's control of his respiratory pattern. Throughout the project, the subject's respiratory support for speech was assessed both instrumentally and perceptually. Instrumental assessment included kinematic and spirometric measures, and perceptual assessment included the Frenchay Dysarthria Assessment, Assessment of Intelligibility of Dysarthric Speech, and analysis of a speech sample. The results of the study demonstrated that real-time continuous visual biofeedback techniques for modifying speech breathing patterns were not only effective, but superior to the traditional therapy techniques for modifying abnormal speech breathing patterns in a child with persistent dysarthria following severe TBI. These results show that physiological biofeedback techniques are potentially useful clinical tools for the remediation of speech breathing impairment in the paediatric dysarthric population.

  8. A survey of compiler development aids. [concerning lexical, syntax, and semantic analysis

    NASA Technical Reports Server (NTRS)

    Buckles, B. P.; Hodges, B. C.; Hsia, P.

    1977-01-01

    A theoretical background was established for the compilation process by dividing it into five phases and explaining the concepts and algorithms that underpin each. The five selected phases were lexical analysis, syntax analysis, semantic analysis, optimization, and code generation. Graph theoretical optimization techniques were presented, and approaches to code generation were described for both one-pass and multipass compilation environments. Following the initial tutorial sections, more than 20 tools that were developed to aid in the process of writing compilers were surveyed. Eight of the more recent compiler development aids were selected for special attention - SIMCMP/STAGE2, LANG-PAK, COGENT, XPL, AED, CWIC, LIS, and JOCIT. The impact of compiler development aids was assessed, some of their shortcomings were noted, and some of the areas of research currently in progress were examined.
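
    To make the first phase concrete, here is a toy lexical analyzer in the spirit of the scanner tools surveyed (illustrative code, not material from the report): a regular-expression scanner that turns source text into (kind, value) tokens.

        # Minimal regex-driven lexer: one named group per token class.
        import re

        TOKEN_SPEC = [
            ("NUMBER", r"\d+"),
            ("IDENT",  r"[A-Za-z_]\w*"),
            ("OP",     r"[-+*/=()]"),
            ("SKIP",   r"\s+"),
        ]
        MASTER = re.compile("|".join(f"(?P<{k}>{p})" for k, p in TOKEN_SPEC))

        def tokenize(src):
            for m in MASTER.finditer(src):
                if m.lastgroup != "SKIP":
                    yield (m.lastgroup, m.group())

        print(list(tokenize("area = width * 12")))
        # [('IDENT', 'area'), ('OP', '='), ('IDENT', 'width'), ('OP', '*'), ('NUMBER', '12')]

    The token stream produced here is exactly what the syntax-analysis phase consumes next.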

  9. A Systematic Review of Techniques and Sources of Big Data in the Healthcare Sector.

    PubMed

    Alonso, Susel Góngora; de la Torre Díez, Isabel; Rodrigues, Joel J P C; Hamrioui, Sofiane; López-Coronado, Miguel

    2017-10-14

    The main objective of this paper is to present a review of existing research in the literature referring to Big Data sources and techniques in the health sector and to identify which of these techniques are the most used in the prediction of chronic diseases. Academic databases and systems such as IEEE Xplore, Scopus, PubMed and Science Direct were searched, considering publication dates from 2006 to the present. Several search criteria were established, such as 'techniques' OR 'sources' AND 'Big Data' AND 'medicine' OR 'health', 'techniques' AND 'Big Data' AND 'chronic diseases', etc., selecting papers of interest for their description of the techniques and sources of Big Data in healthcare. A total of 110 articles on techniques and sources of Big Data in health were found, of which only 32 were identified as relevant work. Many of the articles describe the Big Data platforms, sources and databases used, and identify the techniques most used in the prediction of chronic diseases. From the review of the analyzed research articles, it can be noted that the sources and techniques of Big Data used in the health sector represent a relevant factor in terms of effectiveness, since they allow the application of predictive analysis techniques to tasks such as identifying patients at risk of readmission, preventing hospital or chronic disease infections, and obtaining predictive models of quality.

  10. Monitoring the defoliation of hardwood forests in Pennsylvania using LANDSAT. [gypsy moth surveys

    NASA Technical Reports Server (NTRS)

    Dottavio, C. L.; Nelson, R. F.; Williams, D. L. (Principal Investigator)

    1983-01-01

    An automated system for conducting annual gypsy moth defoliation surveys using LANDSAT MSS data and digital processing techniques is described. A two-step preprocessing procedure was developed that uses multitemporal data sets representing forest canopy conditions before and after defoliation to create a digital image in which all nonforest cover types are eliminated or masked out of a LANDSAT image that exhibits insect defoliation. A temporal window for defoliation assessment was identified and a statewide data base was established. A data management system to interface image analysis software with the statewide data base was developed and a cost benefit analysis of this operational system was conducted.

  11. An investigation of correlation between pilot scanning behavior and workload using stepwise regression analysis

    NASA Technical Reports Server (NTRS)

    Waller, M. C.

    1976-01-01

    An electro-optical device called an oculometer which tracks a subject's lookpoint as a time function has been used to collect data in a real-time simulation study of instrument landing system (ILS) approaches. The data describing the scanning behavior of a pilot during the instrument approaches have been analyzed by use of a stepwise regression analysis technique. A statistically significant correlation between pilot workload, as indicated by pilot ratings, and scanning behavior has been established. In addition, it was demonstrated that parameters derived from the scanning behavior data can be combined in a mathematical equation to provide a good representation of pilot workload.
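
    For readers unfamiliar with the method, the following is a minimal sketch of forward stepwise regression (greedy selection of the predictor that most improves R^2), one common variant of the stepwise technique named above; the scanning-behavior predictors and workload ratings are synthetic stand-ins, not the study's data.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    def forward_stepwise(X, y, max_terms=3):
        """Greedy forward selection: at each step add the predictor that
        most improves the R^2 of an ordinary least-squares fit."""
        remaining = list(range(X.shape[1]))
        selected, best_r2 = [], -np.inf
        model = LinearRegression()
        for _ in range(max_terms):
            scores = []
            for j in remaining:
                cols = selected + [j]
                r2 = model.fit(X[:, cols], y).score(X[:, cols], y)
                scores.append((r2, j))
            r2, j = max(scores)
            if r2 <= best_r2 + 1e-4:        # stop when the gain is negligible
                break
            selected.append(j)
            remaining.remove(j)
            best_r2 = r2
        return selected, best_r2

    # Hypothetical scanning-behavior parameters (dwell fractions, transition
    # rates, ...) as columns of X; pilot workload ratings as y.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 6))
    y = 2.0 * X[:, 1] - 1.5 * X[:, 4] + rng.normal(scale=0.3, size=40)
    print(forward_stepwise(X, y))   # expected to pick columns 1 and 4
    ```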

  12. Characterization and classification of oral tissues using excitation and emission matrix: a statistical modeling approach

    NASA Astrophysics Data System (ADS)

    Kanniyappan, Udayakumar; Gnanatheepam, Einstein; Prakasarao, Aruna; Dornadula, Koteeswaran; Singaravelu, Ganesan

    2017-02-01

    Cancer is one of the most common human threats around the world, and diagnosis based on optical spectroscopy, especially the fluorescence technique, has been established as a standard approach among scientists to explore biochemical and morphological changes in tissues. In this regard, the present work aims to extract the spectral signatures of the various fluorophores present in oral tissues using parallel factor analysis (PARAFAC). Subsequently, statistical analysis was performed to show its diagnostic potential in distinguishing malignant and premalignant from normal oral tissues. Hence, the present study may lead to a possible and/or alternative tool for oral cancer diagnosis.
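
    To illustrate the decomposition step, the sketch below applies PARAFAC to a synthetic excitation-emission data cube using the tensorly library; the library choice, data, and rank are assumptions for illustration, since the paper does not specify its implementation.

    ```python
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac

    # Hypothetical EEM cube: (samples, excitation wavelengths, emission wavelengths).
    rng = np.random.default_rng(1)
    eem = tl.tensor(rng.random((20, 30, 40)))

    # Trilinear decomposition into `rank` fluorophore components; non-negative
    # variants exist in tensorly, but the plain CP fit is shown here.
    weights, (scores, excitation, emission) = parafac(eem, rank=3,
                                                      normalize_factors=True)

    # Columns of `excitation` and `emission` approximate the spectral signatures
    # of each recovered fluorophore; `scores` holds the per-sample loadings.
    print(scores.shape, excitation.shape, emission.shape)  # (20, 3) (30, 3) (40, 3)
    ```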

  13. VLBI Analysis with the Multi-Technique Software GEOSAT

    NASA Technical Reports Server (NTRS)

    Kierulf, Halfdan Pascal; Andersen, Per-Helge; Boeckmann, Sarah; Kristiansen, Oddgeir

    2010-01-01

    GEOSAT is a multi-technique geodetic analysis software package developed at Forsvarets Forsknings Institutt (the Norwegian defence research establishment). The Norwegian Mapping Authority has now installed the software and has, together with Forsvarets Forsknings Institutt, adapted it to deliver datum-free normal equation systems in SINEX format. The goal is to be accepted as an IVS Associate Analysis Center and to contribute to the IVS EOP combination on a routine basis. GEOSAT is based on an upper-diagonal factorized Kalman filter, which allows time-variable parameters such as the troposphere and clocks to be estimated as stochastic parameters. The tropospheric delays in various directions are mapped to tropospheric zenith delay using ray-tracing. Meteorological data from ECMWF with a resolution of six hours are used to perform the ray-tracing, which depends on both elevation and azimuth. Other models follow the IERS and IVS conventions. The Norwegian Mapping Authority has submitted test SINEX files produced with GEOSAT to IVS, and the results have been compared with the existing IVS combined products. In this paper the outcome of these comparisons is presented.

  14. Direct mass spectrometry approaches to characterize polyphenol composition of complex samples.

    PubMed

    Fulcrand, Hélène; Mané, Carine; Preys, Sébastien; Mazerolles, Gérard; Bouchut, Claire; Mazauric, Jean-Paul; Souquet, Jean-Marc; Meudec, Emmanuelle; Li, Yan; Cole, Richard B; Cheynier, Véronique

    2008-12-01

    Lower molecular weight polyphenols including proanthocyanidin oligomers can be analyzed after HPLC separation on either reversed-phase or normal phase columns. However, these techniques are time consuming and can have poor resolution as polymer chain length and structural diversity increase. The detection of higher molecular weight compounds, as well as the determination of molecular weight distributions, remain major challenges in polyphenol analysis. Approaches based on direct mass spectrometry (MS) analysis that are proposed to help overcome these problems are reviewed. Thus, direct flow injection electrospray ionization mass spectrometry analysis can be used to establish polyphenol fingerprints of complex extracts such as in wine. This technique enabled discrimination of samples on the basis of their phenolic (i.e. anthocyanin, phenolic acid and flavan-3-ol) compositions, but larger oligomers and polymers were poorly detectable. Detection of higher molecular weight proanthocyanidins was also restricted with matrix-assisted laser desorption ionization (MALDI) MS, suggesting that they are difficult to desorb as gas-phase ions. The mass distribution of polymeric fractions could, however, be determined by analyzing the mass distributions of bovine serum albumin/proanthocyanidin complexes using MALDI-TOF-MS.

  15. Progress on Fault Mechanisms for Gear Transmissions in Coal Cutting Machines: From Macro to Nano Models.

    PubMed

    Jiang, Yu; Zhang, Xiaogang; Zhang, Chao; Li, Zhixiong; Sheng, Chenxing

    2017-04-01

    Numerical modeling has been recognized as an indispensable tool for mechanical fault mechanism analysis. Techniques, ranging from the macro to the nano level, include finite element modeling, boundary element modeling, modular dynamic modeling, nano dynamic modeling, and so forth. This work first reviewed the progress on fault mechanism analysis for gear transmissions from the tribological and dynamic aspects. The literature review indicates that tribological and dynamic properties have been investigated separately to explore the fault mechanisms of gear transmissions; however, very limited work has addressed the links between the tribological and dynamic properties, and scarce research has been done for coal cutting machines. For this reason, the tribo-dynamic coupled model was introduced to bridge the gap between the tribological and dynamic models in fault mechanism analysis for gear transmissions in coal cutting machines. The modular dynamic modeling and nano dynamic modeling techniques are expected to establish the links between the tribological and dynamic models. Possible future research directions using the tribo-dynamic coupled model were summarized to provide potential references for researchers in the field.

  16. Achievements and perspectives of top-down proteomics.

    PubMed

    Armirotti, Andrea; Damonte, Gianluca

    2010-10-01

    Over the last few years, top-down (TD) MS has gained a remarkable space in proteomics, rapidly crossing the boundary between a promising approach and a solid, established technique. Several research groups worldwide have implemented TD analysis in their routine proteomics work, deriving structural information on proteins with a level of accuracy that is impossible to achieve with classical bottom-up approaches. Complete maps of PTMs and assessment of single amino acid polymorphisms are only a few of the results that can be obtained with this technique. Despite some existing technical and economic limitations, TD analysis is at present the most powerful instrument for MS-based proteomics, and its implementation in routine workflows is a rapidly approaching turning point in proteomics. In this review article, the state of the art of the TD approach is described along with its major advantages and drawbacks, and the most recent trends in TD analysis are discussed. References for all the covered topics are reported in the text, with the aim of supporting both newcomers and mass spectrometrists already introduced to TD proteomics.

  17. Alchemy in the underworld - recent progress and future potential of organic geochemistry applied to speleothems.

    NASA Astrophysics Data System (ADS)

    Blyth, Alison

    2016-04-01

    Speleothems are well used archives for chemical records of terrestrial environmental change, and the integration of records from a range of isotopic, inorganic, and organic geochemical techniques offers significant power in reconstructing both changes in past climates and identifying the resultant response in the overlying terrestrial ecosystems. The use of organic geochemistry in this field offers the opportunity to recover new records of vegetation change (via biomarkers and compound specific isotopes), temperature change (via analysis of glycerol dialkyl glycerol tetraethers, a compound group derived from microbes and varying in structure in response to temperature and pH), and changes in soil microbial behaviour (via combined carbon isotope analysis). However, to date the use of organic geochemical techniques has been relatively limited, due to issues relating to sample size, concerns about contamination, and unanswered questions about the origins of the preserved organic matter and rates of transport. Here I will briefly review recent progress in the field, and present a framework for the future research needed to establish organic geochemical analysis in speleothems as a robust palaeo-proxy approach.

  18. Micro-thermocouple probe for measurement of cellular thermal responses.

    PubMed

    Watanabe, M; Kakuta, N; Mabuchi, K; Yamada, Y

    2005-01-01

    We have produced micro-thermocouple probes for the measurement of cellular thermal responses. Cells generate heat through their metabolism, and additional heat when reacting to physical or chemical stimulation; analysis of these cellular thermal responses would therefore provide new physiological information. However, a real-time technique for thermal measurement of a single cell has not been established. In this study, glass micropipettes, which are widely used in bioengineering and medicine, are used as the base of the thermocouple probes. Using microfabrication techniques, the junction of two different metal films is formed at the micropipette tip, which has a diameter of 1 μm. This probe can inject a chemical substance into a cell and simultaneously detect the subsequent temperature changes.

  19. The magnifying glass - A feature space local expansion for visual analysis. [and image enhancement

    NASA Technical Reports Server (NTRS)

    Juday, R. D.

    1981-01-01

    The Magnifying Glass Transformation (MGT) technique is proposed as a multichannel spectral operation yielding visual imagery that is enhanced in a specified spectral vicinity, guided by the statistics of training samples. An application example is one in which the discrimination among spectral neighbors within an interactive display is increased without altering distant object appearances or the overall interpretation. A direct histogram specification technique is applied to the channels within the multispectral image so that a subset of the spectral domain occupies an increased fraction of the domain. The transformation is carried out by obtaining the training information, establishing the condition of the covariance matrix, determining the influenced solid, and initializing the lookup table. Finally, the image is transformed.

  20. Helicopter flight test demonstration of differential GPS

    NASA Technical Reports Server (NTRS)

    Denaro, R. P.; Beser, J.

    1985-01-01

    An off-line post-mission processing facility is being established by NASA Ames Research Center to analyze differential GPS flight tests. The current and future differential systems are described, comprising an airborne segment in an SH-3 helicopter, a GPS ground reference station, and a tracking system. The post-mission processing system provides for extensive measurement analysis and differential computation. Both differential range residual corrections and navigation corrections are possible. Some preliminary flight tests were conducted in a landing approach scenario and statically. Initial findings indicate the possible need for filter matching between airborne and ground systems (if used in a navigation correction technique), the advisability of correction smoothing before airborne incorporation, and the insensitivity of accuracy to either of the differential techniques or to update rates.

  1. In situ focus characterization by ablation technique to enable optics alignment at an XUV FEL source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerasimova, N.; Dziarzhytski, S.; Weigelt, H.

    2013-06-15

    In situ focus characterization at an extreme ultraviolet (XUV) free-electron laser source is demonstrated using an ablation technique. The design of the instrument reported here achieves a resolution of a few micrometres while maintaining ultrahigh vacuum conditions, and ensures high-contrast visibility of ablative imprints on optically transparent samples, e.g., PMMA. This enables on-line monitoring of beam profile changes and thus makes in situ alignment of the XUV focusing optics possible. Good agreement between focal characterizations retrieved from in situ inspection of ablative imprint contours and from well-established, accurate ex situ analysis with a Nomarski microscope has been observed for a typical micro-focus experiment.

  2. Detonation velocity in poorly mixed gas mixtures

    NASA Astrophysics Data System (ADS)

    Prokhorov, E. S.

    2017-10-01

    A technique for computing the average velocity of a plane detonation wave front in a poorly mixed mixture of gaseous hydrocarbon fuel and oxygen is proposed. It is assumed that, along the direction of detonation propagation, the chemical composition of the mixture exhibits periodic fluctuations caused, for example, by layered stratification of the gas charge. The technique is based on analysis of the functional dependence of the ideal (Chapman-Jouguet) detonation velocity on the mole fraction of the fuel. It is shown that the average detonation velocity can be significantly (by more than 10%) less than the ideal detonation velocity. A dependence is established that permits estimation of the degree of mixing of the gas mixture from measurements of the average detonation velocity.
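
    A minimal numeric sketch of this averaging follows, under the assumption that the front crosses each layer at its local Chapman-Jouguet velocity, so that the average velocity is the harmonic mean of D_CJ over one fluctuation period; the fluctuation profile below is hypothetical.

    ```python
    import numpy as np
    from scipy.integrate import trapezoid

    def average_detonation_velocity(d_cj, x):
        """Average front velocity over one fluctuation period, assuming the
        wave crosses each layer at its local CJ velocity:
        V_avg = L / integral(dx / D_cj(x)), the harmonic mean of D_cj."""
        transit_time = trapezoid(1.0 / d_cj, x)
        return (x[-1] - x[0]) / transit_time

    # Hypothetical sinusoidal fluctuation of D_cj around 2400 m/s caused by
    # layered stratification of the fuel fraction.
    x = np.linspace(0.0, 1.0, 1001)                 # one period, metres
    d_cj = 2400.0 + 600.0 * np.sin(2 * np.pi * x)   # local CJ velocity, m/s
    print(average_detonation_velocity(d_cj, x))     # ~2324 m/s, below 2400
    ```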

  3. New Techniques for the Generation and Analysis of Tailored Microbial Systems on Surfaces.

    PubMed

    Furst, Ariel L; Smith, Matthew J; Francis, Matthew B

    2018-05-17

    The interactions between microbes and surfaces provide critically important cues that control the behavior and growth of the cells. As our understanding of complex microbial communities improves, there is a growing need for experimental tools that can establish and control the spatial arrangements of these cells in a range of contexts. Recent improvements in methods to attach bacteria and yeast to nonbiological substrates, combined with an expanding set of techniques available to study these cells, position this field for many new discoveries. Improving methods for controlling the immobilization of bacteria provides powerful experimental tools for testing hypotheses regarding microbiome interactions, studying the transfer of nutrients between bacterial species, and developing microbial communities for green energy production and pollution remediation.

  4. Application Of Laser Induced Breakdown Spectroscopy (LIBS) Technique In Investigation Of Historical Metal Threads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdel-Kareem, O.; Khedr, A.; Abdelhamid, M.

    Analysis of the composition of an object is a necessary step in documenting the properties of that object and estimating its condition. It is also an important task when establishing an appropriate conservation treatment or following up the results of applied treatments. There has been an important evolution in the methods used for the analysis of metal threads since the second half of the twentieth century. Today, the main considerations in selecting a method are diagnostic power, representative sampling, reproducibility, the destructive nature/invasiveness of the analysis, and access to the appropriate instrument. This study aims at evaluating the usefulness of the Laser Induced Breakdown Spectroscopy (LIBS) technique for the analysis of historical metal threads. Various historical metal threads collected from different museums were investigated using the LIBS technique. To evaluate the suggested analytical protocol, the same metal thread samples were also examined with a Scanning Electron Microscope with an energy-dispersive X-ray analyzer (SEM-EDX), reported in the conservation field as the best method for determining the chemical composition and corrosion of metal threads. The results show that all the investigated metal threads are heavily soiled, strongly damaged, and corroded with different types of corrosion products. The LIBS technique proved safe for investigating historical metal threads and is, in fact, a very useful noninvasive analytical tool. The first few laser shots probe the corrosion and dirt layer, the following shots are effective for investigating the coating layer, and a higher number of laser shots reveals the main composition of the metal thread. Further research is needed to determine the most appropriate and effective approaches for the conservation of these metal threads.

  5. Method for establishing the presence of salmonella bacteria in eggs

    DOEpatents

    Johnston, Roger G.; Sinha, Dipen N.

    1995-01-01

    Measurement of the acoustical resonances in eggs is shown to provide a rapid, noninvasive technique for establishing the presence of Salmonella bacteria. The technique is also sensitive to yolk puncture, shell cracks, and may be sensitive to other yolk properties and to egg freshness. Remote characterization, potentially useful for characterizing large numbers of eggs, has been demonstrated.

  6. 15 CFR 923.42 - State establishment of criteria and standards for local implementation-Technique A.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., DEPARTMENT OF COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM REGULATIONS... specified in subsection 306(d)(11) of the Act for control of land uses and water uses within the coastal zone. The first such control technique, at subsection 306(d)(11)(A) of the Act, is state establishment...

  7. Saturn S-2 production operations techniques: Production welding. Volume 1: Bulkhead welding

    NASA Technical Reports Server (NTRS)

    Abel, O. G.

    1970-01-01

    The complex Saturn S-2 welding processes and procedures required considerable development and refinement to establish a production capability that could consistently produce aluminum alloy welds within specified requirements. The special processes and techniques are defined that were established for the welding of gore-to-gore and manhole- or closeout-to-gore.

  8. Robotic nephroureterectomy: a simplified approach requiring no patient repositioning or robot redocking.

    PubMed

    Zargar, Homayoun; Krishnan, Jayram; Autorino, Riccardo; Akca, Oktay; Brandao, Luis Felipe; Laydner, Humberto; Samarasekera, Dinesh; Ko, Oliver; Haber, Georges-Pascal; Kaouk, Jihad H; Stein, Robert J

    2014-10-01

    Robotic technology is increasingly adopted in urologic surgery and a variety of techniques has been described for minimally invasive treatment of upper tract urothelial cancer (UTUC). To describe a simplified surgical technique of robot-assisted nephroureterectomy (RANU) and to report our single-center surgical outcomes. Patients with history of UTUC treated with this modality between April 2010 and August 2013 were included in the analysis. Institutional review board approval was obtained. Informed consent was signed by all patients. A simplified single-step RANU not requiring repositioning or robot redocking. Lymph node dissection was performed selectively. Descriptive analysis of patients' characteristics, perioperative outcomes, histopathology, and short-term follow-up data was performed. The analysis included 31 patients (mean age: 72.4±10.6 yr; mean body mass index: 26.6±5.1kg/m(2)). Twenty-six of 30 tumors (86%) were high grade. Mean tumor size was 3.1±1.8cm. Of the 31 patients, 13 (42%) had pT3 stage disease. One periureteric positive margin was noted in a patient with bulky T3 disease. The mean number of lymph nodes removed was 9.4 (standard deviation: 5.6; range: 3-21). Two of 14 patients (14%) had positive lymph nodes on final histology. No patients required a blood transfusion. Six patients experienced complications postoperatively, with only one being a high grade (Clavien 3b) complication. Median hospital stay was 5 d. Within the follow-up period, seven patients experienced bladder recurrences and four patients developed metastatic disease. Our RANU technique eliminates the need for patient repositioning or robot redocking. This technique can be safely reproduced, with surgical outcomes comparable to other established techniques. We describe a surgical technique using the da Vinci robot for a minimally invasive treatment of patients presenting with upper tract urothelial cancer. This technique can be safely implemented with good surgical outcomes. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  9. AMS Radiocarbon Dating Individual Taxa and Individual Specimens: Implications for Small Mammal Paleoecology.

    NASA Astrophysics Data System (ADS)

    Graham, Russell; Stafford, Thomas, Jr.; Semken, Holmes, Jr.

    2010-05-01

    Advances in AMS physics and organic geochemistry have revolutionized our ability to establish absolute chronologies on vertebrate fossils. Highly purified collagen, which provides extremely accurate 14C ages, can be extracted from single bones and teeth as small as 50 mg. Combined with measurement precisions of ±15 to 25 years for ages of < 20,000 yr, the direct AMS 14C technique enables fossil deposits to be chronologically dissected at the level of single animals. Analysis of data from a variety of sites in the United States indicates that most excavation levels (analysis units) as small as 10 cm can be time averaged by several thousand years at a minimum, even with the greatest care in excavation and processing of sediments. Time averaging of this magnitude has important implications for fine-scale paleoecological analysis of faunas, especially when compared to high-resolution climate records like those derived from speleothems, ice cores, or marine cores. To this end, we propose saturation dating of indicative taxa and plotting dates of individual specimens against high-resolution climate records rather than analysis of complete faunas or faunules. This technique provides even higher resolution of paleoenvironments than pollen spectra.

  10. Recombinant drugs-on-a-chip: The usage of capillary electrophoresis and trends in miniaturized systems - A review.

    PubMed

    Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Aquino, Adriano; Cervantes, Cesar; Carrilho, Emanuel

    2016-09-07

    We present here a critical review covering conventional analytical tools of recombinant drug analysis and discuss their evolution towards miniaturized systems foreseeing a possible unique recombinant drug-on-a-chip device. Recombinant protein drugs and/or pro-drug analysis require sensitive and reproducible analytical techniques for quality control to ensure safety and efficacy of drugs according to regulatory agencies. The versatility of miniaturized systems combined with their low-cost could become a major trend in recombinant drugs and bioprocess analysis. Miniaturized systems are capable of performing conventional analytical and proteomic tasks, allowing for interfaces with other powerful techniques, such as mass spectrometry. Microdevices can be applied during the different stages of recombinant drug processing, such as gene isolation, DNA amplification, cell culture, protein expression, protein separation, and analysis. In addition, organs-on-chips have appeared as a viable alternative to testing biodrug pharmacokinetics and pharmacodynamics, demonstrating the capabilities of the miniaturized systems. The integration of individual established microfluidic operations and analytical tools in a single device is a challenge to be overcome to achieve a unique recombinant drug-on-a-chip device. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. New approaches for metabolomics by mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vertes, Akos

    Small molecules constitute a large part of the world around us, including fossil and some renewable energy sources. Solar energy harvested by plants and bacteria is converted into energy-rich small molecules on a massive scale. Some of the worst contaminants of the environment and compounds of interest for national security also fall in the category of small molecules. The development of large-scale metabolomic analysis methods lags behind the state of the art established for genomics and proteomics. This is commonly attributed to the diversity of molecular classes included in a metabolome. Unlike nucleic acids and proteins, metabolites do not have standard building blocks, and, as a result, their molecular properties exhibit a wide spectrum. This impedes the development of dedicated separation and spectroscopic methods. Mass spectrometry (MS) is a strong contender in the quest for a quantitative analytical tool with extensive metabolite coverage. Although various MS-based techniques are emerging for metabolomics, many of these approaches include extensive sample preparation that makes large-scale studies resource intensive and slow. New ionization methods are redefining the range of analytical problems that can be solved using MS. This project developed new approaches for the direct analysis of small molecules in unprocessed samples, and pushed the limits of ultratrace analysis in volume-limited complex samples. The project resulted in techniques that enabled metabolomics investigations with enhanced molecular coverage, as well as the study of cellular response to stimuli on a single-cell level. Effectively, individual cells became reaction vessels in which we followed the response of a complex biological system to external perturbation. We established two new analytical platforms for the direct study of metabolic changes in cells and tissues following external perturbation. For this purpose we developed a novel technique, laser ablation electrospray ionization (LAESI), for metabolite profiling of functioning cells and tissues. The technique is based on microscopic sampling of biological specimens by mid-infrared laser ablation followed by electrospray ionization of the plume and MS analysis. The two main shortcomings of this technique had been limited specificity, due to the lack of a separation step, and limited molecular coverage, especially for nonpolar chemical species. To improve specificity and coverage of the metabolome, we implemented the LAESI ion source on a mass spectrometer with ion mobility separation (IMS). In this system, the gas-phase ions produced by the LAESI source were first sorted according to their collisional cross sections in a mobility cell; these separated ion packets were then subjected to MS analysis. By combining atmospheric pressure ionization with IMS, we improved the metabolite coverage. Further enhancement of the non-polar metabolite coverage resulted from the combination of laser ablation with vacuum UV irradiation of the ablation plume. Our results indicated that this new ionization modality provided improved detection of neutral and non-polar compounds. Based on rapid progress in photonics, we introduced another novel ion source that utilizes the interaction of a laser pulse with silicon nanopost arrays (NAPA). In these nanophotonic ion sources, the structural features are commensurate with the wavelength of the laser light, and the enhanced interaction results in high ion yields.
    This ultrasensitive analytical platform enabled the MS analysis of single yeast cells. We extended these NAPA studies from yeast to other microorganisms, including green algae (Chlamydomonas reinhardtii), which capture energy from sunlight on a massive scale. Combining cellular perturbations, e.g., through environmental changes, with the newly developed single-cell analysis methods enabled us to follow dynamic changes induced in the cells. In effect, we were able to use individual cells as a "laboratory," approaching the long-standing goal of establishing a "lab-in-a-cell." Model systems for these studies included cells of cyanobacteria (Anabaena), yeast (Saccharomyces cerevisiae), green algae (C. reinhardtii), and Arabidopsis thaliana.

  12. Quantitative analysis on collagen of dermatofibrosarcoma protuberans skin by second harmonic generation microscopy.

    PubMed

    Wu, Shulian; Huang, Yudian; Li, Hui; Wang, Yunxia; Zhang, Xiaoman

    2015-01-01

    Dermatofibrosarcoma protuberans (DFSP) is a skin cancer often mistaken for other benign tumors, and incomplete resection of DFSP results in tumor recurrence. Quantitative characterization of collagen alteration in the skin tumor is essential for developing a diagnostic technique. In this study, second harmonic generation (SHG) microscopy was performed to obtain images of human DFSP skin and normal skin. Subsequently, structure and texture analysis methods were applied to determine the differences in skin texture characteristics between the two skin types, and the link between collagen alteration and tumor was established. Results suggest that combining SHG microscopy with texture analysis methods is a feasible and effective way to describe the characteristics of a skin tumor like DFSP. © Wiley Periodicals, Inc.
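
    One common way to quantify such texture differences is gray-level co-occurrence analysis; the paper's exact texture features are not specified, so the sketch below is an assumed, generic realisation applied to a synthetic SHG image patch.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def texture_features(image_8bit):
        """Gray-level co-occurrence texture features often used to compare
        collagen organisation between tissue types."""
        glcm = graycomatrix(image_8bit, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        return {prop: graycoprops(glcm, prop).mean()
                for prop in ("contrast", "homogeneity", "correlation", "energy")}

    # Hypothetical 8-bit SHG image patch of dermal collagen.
    rng = np.random.default_rng(2)
    patch = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
    print(texture_features(patch))
    ```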

  13. Modelling and analysis of gene regulatory network using feedback control theory

    NASA Astrophysics Data System (ADS)

    El-Samad, H.; Khammash, M.

    2010-01-01

    Molecular pathways are a part of a remarkable hierarchy of regulatory networks that operate at all levels of organisation. These regulatory networks are responsible for much of the biological complexity within the cell. The dynamic character of these pathways and the prevalence of feedback regulation strategies in their operation make them amenable to systematic mathematical analysis using the same tools that have been used with success in analysing and designing engineering control systems. In this article, we aim at establishing this strong connection through various examples where the behaviour exhibited by gene networks is explained in terms of their underlying control strategies. We complement our analysis by a survey of mathematical techniques commonly used to model gene regulatory networks and analyse their dynamic behaviour.

  14. Experimental evaluation criteria for constitutive models of time dependent cyclic plasticity

    NASA Technical Reports Server (NTRS)

    Martin, J. F.

    1986-01-01

    Notched members were tested at temperatures well above those previously investigated. Simulation of the notch root stress response was accomplished to establish notch stress-strain behavior. Cyclic stress-strain profiles across the net-section were recorded, and on-line direct notch strain control was accomplished. Data are compared to three analysis techniques with good results. The objective of the study is to generate experimental data that can be used to evaluate the accuracy of constitutive models of time-dependent cyclic plasticity.

  15. Development of the Accelerator Mass Spectrometry technology at the Comenius University in Bratislava

    NASA Astrophysics Data System (ADS)

    Povinec, Pavel P.; Masarik, Jozef; Ješkovský, Miroslav; Kaizer, Jakub; Šivo, Alexander; Breier, Robert; Pánik, Ján; Staníček, Jaroslav; Richtáriková, Marta; Zahoran, Miroslav; Zeman, Jakub

    2015-10-01

    An Accelerator Mass Spectrometry (AMS) laboratory has been established at the Centre for Nuclear and Accelerator Technologies (CENTA) at the Comenius University in Bratislava, comprising a MC-SNICS ion source, a 3 MV Pelletron tandem accelerator, and an analyzer of accelerated ions. The preparation of targets for 14C and 129I AMS measurements is described in detail. The development of AMS techniques for potassium, uranium, and thorium analysis in radiopure materials, required for ultra-low background underground experiments, is briefly mentioned.

  16. [The characteristics of population health in social ecological conditions of the Primorskiĭ Kraĭ].

    PubMed

    Iarygina, M V; Kiku, P F; Gorborukova, T V; Iudin, S S

    2013-01-01

    The article presents the results of a sociological survey of residents of urbanized and rural territories in different bioclimatic zones of the Primorsky Krai, in municipalities with different demographic, ecological, and social characteristics. Survey data were analyzed using P.V. Terentiyev's correlation pleiades technique. Relationships were established between population health and factors such as the social and industrial environment, the ecological condition of the territory of residence, lifestyle, and climate.

  17. Temporal intracavity detection of parasitic infrared absorption in Ti:Sapphire lasers

    NASA Astrophysics Data System (ADS)

    Deleva, A. D.; Peshev, Z. Y.; Aneva, Z. I.

    1993-12-01

    An intracavity technique with temporal sensitivity to optical losses is used to detect parasitic infrared absorption (PIRA) in Ti:sapphire crystals with high active-center concentrations. By means of comparative analysis, it is established that part of the parasitically absorbed energy is re-emitted back into the laser action channel. A method is proposed for approximate quantitative determination of the fraction of re-emitting PIRA centers relative to their total number; for the highly doped crystal described, it is estimated at about 11%.

  18. Root interaction between Bromus tectorum and Poa pratensis: a three-dimensional analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bookman, P.A.; Mack, R.N.

    1982-06-01

    The spatial distribution of roots of two alien grasses, Bromus tectorum and Poa pratensis, grown singly and in a mixture, was examined using a double-labelling radioisotope technique. Interactions between the root systems of these plants led to a restricted B. tectorum rooting volume in P. pratensis neighborhoods ≥ 30 d old. The roots of B. tectorum failed to develop laterally. The altered B. tectorum root systems may contribute to its inability to persist in established P. pratensis swards.

  19. SS/RCS surface tension propellant acquisition/expulsion tankage technology

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The analysis, design, fabrication, and testing of a propellant tank that satisfies the requirements of the space shuttle is presented. This mission presents very stringent and sometimes conflicting requirements. A compartmented-tank device was developed and various ground and drop tower test techniques were employed to verify the design using both subscale and full-scale hardware. Performance was established with scale models and further substantiation was obtained with the full-scale tankage. Fabrication, acceptance, fill and drain, inspection, and other ground handling procedures were developed.

  20. Sensitivity analysis of consumption cycles

    NASA Astrophysics Data System (ADS)

    Jungeilges, Jochen; Ryazanova, Tatyana; Mitrofanova, Anastasia; Popova, Irina

    2018-05-01

    We study the special case of a nonlinear stochastic consumption model taking the form of a 2-dimensional, non-invertible map with an additive stochastic component. Applying the concept of the stochastic sensitivity function and the related technique of confidence domains, we establish the conditions under which the system's complex consumption attractor is likely to become observable. It is shown that the level of noise intensities beyond which the complex consumption attractor is likely to be observed depends on the weight given to past consumption in an individual's preference adjustment.

  1. Morphology of zirconia particles exposed to D.C. arc plasma jet

    NASA Technical Reports Server (NTRS)

    Zaplatynsky, Isidor

    1987-01-01

    Zirconia particles were sprayed into water with an arc plasma gun in order to determine the effect of various gun operating parameters on their morphology. The collected particles were examined by XRD and SEM techniques. A correlation was established between the content of spherical (molten) particles and the operating parameters by visual inspection and regression analysis. It was determined that the composition of the arc gas and the power input were the predominant parameters that affected the melting of zirconia particles.

  2. Elevation-relief ratio, hypsometric integral, and geomorphic area-altitude analysis.

    NASA Technical Reports Server (NTRS)

    Pike, R. J.; Wilson, S. E.

    1971-01-01

    Mathematical proof establishes identity of hypsometric integral and elevation-relief ratio, two quantitative topographic descriptors developed independently of one another for entirely different purposes. Operationally, values of both measures are in excellent agreement for arbitrarily bounded topographic samples, as well as for low-order fluvial watersheds. By using a point-sampling technique rather than planimetry, elevation-relief ratio (defined as mean elevation minus minimum elevation divided by relief) is calculated manually in about a third of the time required for the hypsometric integral.
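
    The measure is simple enough to state directly in code; a minimal sketch with synthetic point-sampled elevations follows.

    ```python
    import numpy as np

    def elevation_relief_ratio(elevations):
        """Elevation-relief ratio E = (mean - min) / (max - min); per the
        proof summarised above, E equals the hypsometric integral."""
        z = np.asarray(elevations, dtype=float)
        return (z.mean() - z.min()) / (z.max() - z.min())

    # Point-sampled elevations (metres) over an arbitrarily bounded area.
    rng = np.random.default_rng(3)
    z = rng.uniform(200.0, 800.0, size=500)
    print(elevation_relief_ratio(z))  # ~0.5 for uniformly distributed relief
    ```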

  3. A statistical forecast model using the time-scale decomposition technique to predict rainfall during flood period over the middle and lower reaches of the Yangtze River Valley

    NASA Astrophysics Data System (ADS)

    Hu, Yijia; Zhong, Zhong; Zhu, Yimin; Ha, Yao

    2018-04-01

    In this paper, a statistical forecast model using a time-scale decomposition method is established for seasonal prediction of the rainfall during the flood period (FPR) over the middle and lower reaches of the Yangtze River Valley (MLYRV). The method decomposes the rainfall over the MLYRV into three time-scale components: the interannual component with periods of less than 8 years, the interdecadal component with periods of 8 to 30 years, and the long-term component with periods of more than 30 years. Predictors are then selected for each of the three time-scale components of FPR through correlation analysis. Finally, a statistical forecast model is established using the multiple linear regression technique to predict each of the three time-scale components of the FPR. The results show that this forecast model can capture the interannual and interdecadal variation of FPR. A hindcast of FPR over the 14 years from 2001 to 2014 shows that the FPR is predicted successfully in 11 of the 14 years, and the model performs better than a model using the traditional scheme without time-scale decomposition. Therefore, the statistical forecast model using the time-scale decomposition technique has good skill and application value in the operational prediction of FPR over the MLYRV.
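
    A rough sketch of the two stages, time-scale decomposition followed by per-component multiple linear regression, is given below; the zero-phase Butterworth band-splitting and the synthetic rainfall and predictor series are assumptions for illustration, not the paper's exact scheme.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.linear_model import LinearRegression

    def decompose(series, fs=1.0):
        """Split an annual series into <8 yr, 8-30 yr, and >30 yr components
        using zero-phase Butterworth low-pass filters."""
        nyq = 0.5 * fs
        b30, a30 = butter(4, (1 / 30) / nyq, btype="low")
        b8, a8 = butter(4, (1 / 8) / nyq, btype="low")
        slow = filtfilt(b30, a30, series)           # > 30 yr
        inter = filtfilt(b8, a8, series) - slow     # 8-30 yr
        fast = series - slow - inter                # < 8 yr
        return fast, inter, slow

    # Hypothetical flood-period rainfall record and two predictor indices;
    # one multiple linear regression is fitted per time-scale component.
    rng = np.random.default_rng(4)
    years = 64
    rain = rng.normal(size=years).cumsum() * 0.2 + rng.normal(size=years)
    fast, inter, slow = decompose(rain)
    predictors = rng.normal(size=(years, 2))
    models = [LinearRegression().fit(predictors, c) for c in (fast, inter, slow)]
    # The final forecast recombines the three component predictions.
    print(sum(m.predict(predictors[-1:]) for m in models))
    ```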

  4. Implementation of a Collision Probability Prediction Technique for Constellation Maneuver Planning

    NASA Technical Reports Server (NTRS)

    Concha, Marco a.

    2007-01-01

    On March 22, 2006, the Space Technology 5 (ST5) constellation spacecraft were successfully delivered to orbit by a Pegasus XL launch vehicle. Unexpected relative motion experienced by the constellation after orbit insertion created a problem: soon after launch, the observed relative position of the inert rocket body was between the leading and the middle spacecraft of the constellation. Successful planning and execution of an orbit maneuver that would create a fly-by of the rocket body was required to establish the formation. This maneuver would create a close approach that needed to conform to predefined collision probability requirements. On April 21, 2006, the ST5 "155" spacecraft performed a large orbit maneuver and successfully passed the inert Pegasus 3rd stage rocket body on April 30, 2006 15:20 UTC at a distance of 2.55 km with a probability of collision of less than 1.0E-06. This paper outlines the technique that was implemented to ensure safe planning and execution of the fly-by maneuver. The method makes use of Gaussian distribution models of state covariance to determine the underlying probabilities of collision that arise in low-velocity encounters. Specific numerical examples used for this analysis are discussed in detail. The mechanics of the technique are explained to foster deeper understanding of the concepts presented and to improve existing processes for use in future constellation maneuver planning.
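
    The short-encounter formulation referred to above reduces to integrating a 2-D Gaussian position density over the combined hard-body circle in the encounter plane. The sketch below evaluates that integral numerically; the miss distance, covariance, and hard-body radius are hypothetical, not the mission's actual values.

    ```python
    import numpy as np
    from scipy.integrate import trapezoid

    def collision_probability(miss_xy, cov, hard_body_radius, n=400):
        """Probability that the relative position (Gaussian with mean miss_xy
        and covariance cov in the 2-D encounter plane) falls inside the
        combined hard-body circle centred on the origin."""
        inv = np.linalg.inv(cov)
        norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov)))
        r = np.linspace(0.0, hard_body_radius, n)
        th = np.linspace(0.0, 2.0 * np.pi, n)
        rr, tt = np.meshgrid(r, th)
        dxy = np.stack([rr * np.cos(tt) - miss_xy[0],
                        rr * np.sin(tt) - miss_xy[1]], axis=-1)
        dens = norm * np.exp(-0.5 * np.einsum("...i,ij,...j->...", dxy, inv, dxy))
        integrand = dens * rr                       # polar area element r dr dth
        return trapezoid(trapezoid(integrand, r, axis=1), th)

    # Hypothetical encounter: 2550 m miss distance, 10 m combined radius,
    # 200 m x 100 m position sigmas in the encounter plane.
    cov = np.diag([200.0**2, 100.0**2])
    print(collision_probability([2550.0, 0.0], cov, 10.0))  # vanishingly small
    ```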

  5. Partial least squares analysis and mixture design for the study of the influence of composition variables on lipidic nanoparticle characteristics.

    PubMed

    Malzert-Fréon, A; Hennequin, D; Rault, S

    2010-11-01

    Lipidic nanoparticles (NP), formulated from a phase inversion temperature process, have been studied with chemometric techniques to emphasize the influence of the four major components (Solutol®, Labrasol®, Labrafac®, water) on their average diameter and size distribution. Typically, these NP present a monodisperse size below 200 nm, as determined by dynamic light scattering measurements. From the application of the partial least squares (PLS) regression technique to the experimental data collected during definition of the feasibility zone, it was established that the NP present a core-shell structure in which Labrasol® is well encapsulated and contributes to the structuring of the NP. Even though this solubility enhancer is regarded as a pure surfactant in the literature, it appears that the oil moieties of this macrogolglyceride mixture significantly influence its properties. Furthermore, the results have shown that the PLS technique can also be used to predict sizes for given relative proportions of components, and it was established that, from a mixture design, the quantitative mixture composition needed to reach a targeted size and a targeted polydispersity index (PDI) can be easily predicted. Hence, statistical models can be a useful tool to control and optimize the size characteristics of NP. © 2010 Wiley-Liss, Inc. and the American Pharmacists Association
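
    A minimal sketch of the predictive use of PLS described above, using scikit-learn's PLSRegression on synthetic composition/size data; the toy data-generating coefficients are invented, not the paper's measurements.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # Hypothetical mixture compositions (fractions of the four components)
    # versus measured size (nm) and polydispersity index (PDI).
    rng = np.random.default_rng(5)
    X = rng.dirichlet(np.ones(4), size=30)          # 30 formulations
    size = 150 + 80 * X[:, 0] - 60 * X[:, 3] + rng.normal(scale=5, size=30)
    pdi = 0.10 + 0.05 * X[:, 1] + rng.normal(scale=0.01, size=30)
    Y = np.column_stack([size, pdi])

    pls = PLSRegression(n_components=2).fit(X, Y)
    # Predict size and PDI for a candidate composition; inverting this map
    # (choosing X to hit a target size/PDI) is the design use described above.
    print(pls.predict([[0.40, 0.30, 0.10, 0.20]]))
    ```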

  6. Discriminating movements of liquid and gas in the rabbit colon with impedance manometry.

    PubMed

    Mohd Rosli, R; Leibbrandt, R E; Wiklendt, L; Costa, M; Wattchow, D A; Spencer, N J; Brookes, S J; Omari, T I; Dinning, P G

    2018-05-01

    High-resolution impedance manometry is a technique well established in esophageal motility studies for relating motor patterns to bolus flow; its use in the colon has not been established. In isolated segments of rabbit proximal colon, we recorded motor patterns and the movement of liquid or gas boluses with a high-resolution impedance manometry catheter. The detected movements were compared to video-recorded changes in gut diameter. Using the characteristic shapes of the admittance (inverse of impedance) and pressure signals associated with gas or liquid flow, we developed a computational algorithm for the automated detection of these events. Propagating contractions detected by video were also recorded by manometry and impedance. Neither pressure nor admittance signals alone could distinguish between liquid and gas transit; however, the precise relationship between admittance and pressure signals during bolus flow could. Training our computational algorithm on these characteristic shapes yielded a detection accuracy of 87.7% when compared to gas or liquid bolus events detected by manual analysis. Characterizing the relationship between admittance and pressure recorded with high-resolution impedance manometry not only helps detect luminal transit in real time but also distinguishes between liquid and gaseous content. This technique holds promise for determining the propulsive nature of human colonic motor patterns. © 2017 John Wiley & Sons Ltd.

  7. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    PubMed

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. Integrated Quality Risk Management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, their frequency, and their detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HSC manipulation process at our blood center. The analysis showed that the hazards with the highest RPN values, and thus the greatest impact on the process, are loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while the other critical parameters can be mitigated with continuous control systems. The blood center management software was completed by a labeling system with forms designed to comply with the standards in force and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
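
    The RPN bookkeeping itself is straightforward; the sketch below ranks hazards by RPN = severity x occurrence x detectability. The hazard names echo those above, but the scores are invented for illustration.

    ```python
    # Hypothetical FMECA worksheet: each hazard scored 1-10 for severity (S),
    # occurrence (O), and detectability (D); RPN = S * O * D ranks the hazards.
    hazards = {
        "loss of dose":              (9, 4, 6),
        "loss of tracking":          (8, 4, 7),
        "manual data transcription": (6, 6, 7),
        "operator technical skill":  (7, 5, 6),
    }
    rpn = {name: s * o * d for name, (s, o, d) in hazards.items()}
    for name, value in sorted(rpn.items(), key=lambda kv: -kv[1]):
        print(f"{name:28s} RPN = {value}")
    ```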

  8. Quantitative methods for compensation of matrix effects and self-absorption in Laser Induced Breakdown Spectroscopy signals of solids

    NASA Astrophysics Data System (ADS)

    Takahashi, Tomoko; Thornton, Blair

    2017-12-01

    This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of the composition of solids measured using Laser Induced Breakdown Spectroscopy (LIBS), and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced, and the conditions under which calibration curves are applicable to quantification of solid samples, together with their limitations, are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and the Saha equation, has been applied in a number of studies, several requirements must be satisfied for the calculated chemical compositions to be valid; in addition, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) and partial least squares (PLS) regression, which can extract composition-related information from the full spectrum, are well-established methods and have been applied to various fields, including in-situ applications in air and planetary exploration. Artificial neural networks (ANNs), which can model non-linear effects, have also been investigated as a quantitative method, and their applications are introduced. The ability to make quantitative estimates from LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. To accelerate this process, it is recommended that accuracy be described using common figures of merit that express the overall normalised accuracy, such as the normalised root mean square error (NRMSE), when comparing the accuracy obtained from different setups and analytical methods.
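
    Since the NRMSE is recommended above as a common figure of merit, a minimal implementation follows; normalisation by the range of the reference values is one common convention, assumed here.

    ```python
    import numpy as np

    def nrmse(reference, predicted):
        """Normalised root mean square error for comparing LIBS calibration
        models; normalised by the range of the reference values."""
        reference = np.asarray(reference, dtype=float)
        predicted = np.asarray(predicted, dtype=float)
        rmse = np.sqrt(np.mean((predicted - reference) ** 2))
        return rmse / (reference.max() - reference.min())

    # Hypothetical certified vs. predicted concentrations (wt%).
    print(nrmse([1.2, 3.4, 5.6, 7.8], [1.0, 3.6, 5.9, 7.5]))
    ```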

  9. Fast and Accurate Simulation Technique for Large Irregular Arrays

    NASA Astrophysics Data System (ADS)

    Bui-Van, Ha; Abraham, Jens; Arts, Michel; Gueuning, Quentin; Raucy, Christopher; Gonzalez-Ovejero, David; de Lera Acedo, Eloy; Craeye, Christophe

    2018-04-01

    A fast full-wave simulation technique is presented for the analysis of large irregular planar arrays of identical 3-D metallic antennas. The solution method relies on the Macro Basis Functions (MBF) approach and an interpolatory technique to compute the interactions between MBFs. The Harmonic-polynomial (HARP) model is established for the near-field interactions in a modified system of coordinates. For extremely large arrays made of complex antennas, two approaches assuming a limited radius of influence for mutual coupling are considered: one is based on a sparse-matrix LU decomposition and the other one on a tessellation of the array in the form of overlapping sub-arrays. The computation of all embedded element patterns is sped up with the help of the non-uniform FFT algorithm. Extensive validations are shown for arrays of log-periodic antennas envisaged for the low-frequency SKA (Square Kilometer Array) radio-telescope. The analysis of SKA stations with such a large number of elements has not been treated yet in the literature. Validations include comparison with results obtained with commercial software and with experiments. The proposed method is particularly well suited to array synthesis, in which several orders of magnitude can be saved in terms of computation time.

  10. Visualization of Heart Sounds and Motion Using Multichannel Sensor

    NASA Astrophysics Data System (ADS)

    Nogata, Fumio; Yokota, Yasunari; Kawamura, Yoko

    2010-06-01

    As there are various difficulties associated with auscultation techniques, we have devised a technique for visualizing heart motion in order to assist both doctors and patients in understanding the heartbeat. Auscultatory sounds were first visualized using FFT and wavelet analysis. Next, to show global and simultaneous heart motions, a new visualization technique was established. The visualization system consists of a 64-channel sensor unit (63 acceleration sensors and one ECG sensor) and a signal/image analysis unit. The acceleration sensors were arranged in a square array (8×8) with a 20-mm pitch interval and adhered to the chest surface. The heart motion of one cycle was visualized at a sampling frequency of 3 kHz and a quantization of 12 bits. The visualized results showed the typical waveform motion of the strong pressure shock due to the closing of the tricuspid and mitral valves at the cardiac apex (first sound), followed by the closing of the aortic and pulmonic valves (second sound). To overcome difficulties in auscultation, the system can be applied to the detection of heart disease and to the digital database management of auscultation examinations in medical settings.
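
    For the first step, time-frequency visualization of auscultatory sounds, a minimal short-time Fourier sketch follows; the synthetic phonocardiogram below merely stands in for real recordings.

    ```python
    import numpy as np
    from scipy.signal import spectrogram

    # Hypothetical phonocardiogram: two short bursts per beat standing in for
    # the first (S1) and second (S2) heart sounds, sampled at 3 kHz as above.
    fs = 3000
    t = np.arange(0, 2.0, 1 / fs)

    def burst(t0, f):
        return np.exp(-((t - t0) / 0.02) ** 2) * np.sin(2 * np.pi * f * t)

    pcg = sum(burst(t0, 60) + 0.6 * burst(t0 + 0.3, 90) for t0 in (0.2, 1.0, 1.8))

    # Short-time Fourier analysis; a wavelet transform gives a similar
    # time-frequency picture with better low-frequency resolution.
    f, tt, Sxx = spectrogram(pcg, fs=fs, nperseg=256, noverlap=192)
    print(Sxx.shape)  # frequency bins x time frames, ready for plotting
    ```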

  11. Design optimisation of powers-of-two FIR filter using self-organising random immigrants GA

    NASA Astrophysics Data System (ADS)

    Chandra, Abhijit; Chattopadhyay, Sudipta

    2015-01-01

    In this communication, we propose a novel design strategy for multiplier-less low-pass finite impulse response (FIR) filters with the aid of a recent evolutionary optimisation technique known as the self-organising random immigrants genetic algorithm. Individual impulse response coefficients of the proposed filter are encoded as sums of signed powers of two. In formulating the cost function for the optimisation algorithm, both the frequency response characteristic and the hardware cost of the discrete-coefficient FIR filter have been considered. The role of the crossover probability of the optimisation technique has been evaluated on the overall performance of the proposed strategy; for this purpose, the convergence characteristic of the optimisation technique has been included in the simulation results. In our analysis, two design examples of different specifications have been taken into account. To substantiate the efficiency of our proposed structure, a number of state-of-the-art design strategies for multiplier-less FIR filters are also included in this article for comparison. Critical analysis of the results unambiguously establishes the usefulness of our proposed approach for the hardware-efficient design of digital filters.
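
    To make the coefficient encoding concrete, the sketch below greedily approximates a filter tap as a sum of signed powers of two; this greedy rounding is an illustrative stand-in for the GA search described above, not the paper's algorithm.

    ```python
    import numpy as np

    def to_signed_powers_of_two(coefficient, n_terms=3, max_shift=8):
        """Greedily approximate a coefficient (|c| < 1) as a sum of signed
        powers of two; each term costs only a shift-and-add in hardware."""
        terms, residual = [], float(coefficient)
        for _ in range(n_terms):
            if residual == 0.0:
                break
            shift = int(np.clip(np.round(np.log2(abs(residual))), -max_shift, 0))
            term = float(np.sign(residual)) * 2.0 ** shift
            terms.append(term)
            residual -= term
        return terms

    # A hypothetical impulse-response tap and its shift-add approximation.
    c = 0.2863
    terms = to_signed_powers_of_two(c)     # [0.25, 0.03125, 0.00390625]
    print(terms, sum(terms), abs(sum(terms) - c))
    ```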

  12. Remote Sensing the Thermal and Humidity Structure of the Earth's Atmosphere Using the GPS Radio Occultation Technique: Applications in Climate Studies

    NASA Astrophysics Data System (ADS)

    Vergados, P.; Mannucci, A. J.; Ao, C. O.; Verkhoglyadova, O. P.; Iijima, B.

    2017-12-01

    This presentation introduces the fundamentals of the Global Positioning System radio occultation (GPS RO) remote sensing technique in retrieving atmospheric temperature and humidity information and presents the use of these observations in climate research. Our objective is to demonstrate and establish the GPS RO remote sensing technique as a complementary data set to existing state-of-the-art space-based platforms for climate studies. We show how GPS RO measurements at 1.2-1.6 GHz frequency band can be used to infer the upper tropospheric water vapor and temperature feedbacks and we present a decade-long specific humidity (SH) record from January 2007 until December 2015. We cross-compare the GPS RO-estimated climate feedbacks and the SH long-record with independent data sets from the Modern-Era Retrospective Analysis for Research and Applications (MERRA), the European Center for Medium-range Weather Forecasts Re-Analysis Interim (ERA-Interim), and the Atmospheric Infrared Sounder (AIRS) instrument. These cross-comparisons serve as a performance guide for the GPS-RO observations with respect to other data sets by providing an independent measure of climate feedbacks and humidity short-term trends.

  13. Logic programming to predict cell fate patterns and retrodict genotypes in organogenesis.

    PubMed

    Hall, Benjamin A; Jackson, Ethan; Hajnal, Alex; Fisher, Jasmin

    2014-09-06

    Caenorhabditis elegans vulval development is a paradigm system for understanding cell differentiation in the process of organogenesis. Through temporal and spatial controls, the fate pattern of six cells is determined by the competition of the LET-23 and the Notch signalling pathways. Modelling cell fate determination in vulval development using state-based models, coupled with formal analysis techniques, has been established as a powerful approach in predicting the outcome of combinations of mutations. However, computing the outcomes of complex and highly concurrent models can become prohibitive. Here, we show how logic programs derived from state machines describing the differentiation of C. elegans vulval precursor cells can increase the speed of prediction by four orders of magnitude relative to previous approaches. Moreover, this increase in speed allows us to infer, or 'retrodict', compatible genomes from cell fate patterns. We exploit this technique to predict highly variable cell fate patterns resulting from dig-1 reduced-function mutations and let-23 mosaics. In addition to the new insights offered, we propose our technique as a platform for aiding the design and analysis of experimental data. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  14. Low level radioactivity measurements with phoswich detectors using coincident techniques and digital pulse processing analysis.

    PubMed

    de la Fuente, R; de Celis, B; del Canto, V; Lumbreras, J M; de Celis Alonso, B; Martín-Martín, A; Gutierrez-Villanueva, J L

    2008-10-01

    A new system has been developed for the detection of low radioactivity levels of fission products and actinides using coincidence techniques. The device combines a phoswich detector for alpha/beta/gamma-ray recognition with a fast digital card for electronic pulse analysis. The phoswich can be used in a coincident mode by identifying the composite signal produced by the simultaneous detection of alpha/beta particles and X-rays/gamma particles. The technique of coincidences with phoswich detectors was proposed recently to verify the Nuclear Test Ban Treaty (NTBT), which established the necessity of monitoring low levels of gaseous fission products produced by underground nuclear explosions. With the device proposed here it is possible to identify coincidence events and determine the energy and type of the coincident particles. The sensitivity of the system has been improved by employing liquid scintillators and a high-resolution low-energy germanium detector. In this case it is possible to identify transuranic nuclides present in environmental samples simultaneously by alpha/gamma coincidence, without the necessity of performing radiochemical separation. The minimum detectable activity was estimated to be 0.01 Bq kg⁻¹ for 0.1 kg of soil and 1000 min counting.
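
    For context, minimum detectable activity figures like the one quoted are conventionally computed with Currie's expression; the sketch below (invented efficiency and background values, not the authors' calibration) shows the arithmetic:

        # A minimal sketch of Currie's MDA at ~95% confidence; all numbers
        # are placeholders, not the instrument's measured parameters.
        import math

        def currie_mda(background_counts, efficiency, live_time_s, mass_kg):
            """Minimum detectable activity in Bq/kg."""
            l_d = 2.71 + 4.65 * math.sqrt(background_counts)  # detection limit, counts
            return l_d / (efficiency * live_time_s * mass_kg)

        # Illustrative: a 1000-minute count of 0.1 kg of soil
        print(currie_mda(background_counts=50, efficiency=0.2,
                         live_time_s=1000 * 60, mass_kg=0.1))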

  15. Using detailed inter-network simulation and model abstraction to investigate and evaluate joint battlespace infosphere (JBI) support technologies

    NASA Astrophysics Data System (ADS)

    Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.

    2004-08-01

    The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty-first-century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. The individual and combinational effects arising from the application of technologies within a framework are presently far too complex to evaluate quantitatively at more than a cursory depth. To facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to help address JBI analysis challenges. The DIEMS team has been tasked with utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high-level view of the DIEMS project. Once this approach has been established, a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets is presented.

  16. 13C-based metabolic flux analysis: fundamentals and practice.

    PubMed

    Yang, Tae Hoon

    2013-01-01

    Isotope-based metabolic flux analysis is one of the emerging technologies applied to system-level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, ¹³C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement ¹³C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance, along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing ¹³C-labeling experiments and properly acquiring the key data sets essential for in vivo flux analysis. In this regard, the modeling fundamentals of ¹³C-labeling systems and analytical data processing are the main topics we deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedure aiming at ¹³C-based metabolic flux analysis in vivo.

  17. Using data mining techniques to predict the severity of bicycle crashes.

    PubMed

    Prati, Gabriele; Pietrantoni, Luca; Fraboni, Federico

    2017-04-01

    To investigate the factors predicting the severity of bicycle crashes in Italy, we conducted an observational study of official statistics, applying two of the most widely used data mining techniques: the CHAID decision tree technique and Bayesian network analysis. We used data provided by the Italian National Institute of Statistics on road crashes that occurred on the Italian road network during the period from 2011 to 2013. From the original database comprising a total of 575,093 road accidents, we extracted the 49,621 accidents in which at least one cyclist was injured or killed. The CHAID decision tree technique was employed to establish the relationship between the severity of bicycle crashes and factors related to crash characteristics (type of collision and opponent vehicle), infrastructure characteristics (type of carriageway, road type, road signage, pavement type, and type of road segment), cyclists (gender and age), and environmental factors (time of the day, day of the week, month, pavement condition, and weather). CHAID analysis revealed that the most important predictors were, in decreasing order of importance, road type (0.30), crash type (0.24), age of cyclist (0.19), road signage (0.08), gender of cyclist (0.07), type of opponent vehicle (0.05), month (0.04), and type of road segment (0.02). These eight most important predictors of the severity of bicycle crashes were included as predictors of the target (i.e., severity of bicycle crashes) in the Bayesian network analysis. Bayesian network analysis identified crash type (0.31), road type (0.19), and type of opponent vehicle (0.18) as the most important predictors of the severity of bicycle crashes. Copyright © 2017 Elsevier Ltd. All rights reserved.
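
    Scikit-learn does not ship CHAID, so the sketch below substitutes a CART decision tree to illustrate the same workflow (categorical predictors, a severity target, and ranked feature importances); both the library choice and the toy records are stand-ins for the study's data:

        # A minimal stand-in sketch: CART instead of CHAID, invented records
        # instead of the ~49,621 real crashes.
        import pandas as pd
        from sklearn.tree import DecisionTreeClassifier

        df = pd.DataFrame({
            "road_type":  ["urban", "rural", "urban", "rural", "urban", "rural"],
            "crash_type": ["head-on", "side", "rear", "head-on", "side", "rear"],
            "severe":     [0, 1, 0, 1, 0, 1],
        })
        X = pd.get_dummies(df[["road_type", "crash_type"]])
        clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, df["severe"])
        print(dict(zip(X.columns, clf.feature_importances_.round(2))))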

  18. Reoperation for rhegmatogenous retinal detachment as quality indicator for disease management: a register study.

    PubMed

    Hajari, Javad N; Christensen, Ulrik; Kiilgaard, Jens F; Bek, Toke; la Cour, Morten

    2015-09-01

    To establish a quality indicator that can be used in optimizing treatment for rhegmatogenous retinal detachment (RRD). The Danish National Patient Registry was used to identify surgery conducted in Denmark for RRD in the period 01 January 2001-31 December 2009. Cases were identified by diagnosis and surgical codes. A total of 6522 cases were operated on for a primary RRD in the study period, and 22% (1434 patients) were reoperated for a redetachment. A Cox regression analysis showed that, for techniques not using silicone oil, the risk of redetachment fell to or below the risk of detachment in the fellow eye 1 year after primary surgery. The same was true 1.5 years after surgery for techniques using silicone oil. Based on this, we established a quality indicator defining failure as the need for an operation for redetachment within 1 year of initial surgery for techniques without oil, and within 1.5 years for techniques using oil. The lack of oil removal within 1 year of initial surgery should also be noted as an operational failure. We applied the quality indicators to the cohort of 6522 RRDs and found that in Denmark the need for redetachment surgery has decreased over time, and that high-volume departments have better outcomes than smaller ones. The risk of reoperation for redetachment after initial surgery fulfils the criteria for a good quality indicator and can be used in RRD surgery. This indicator could aid in optimizing the management of RRD patients to minimize morbidity. © 2015 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
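
    The survival comparison described above is the standard Cox proportional-hazards setup; a minimal sketch with the lifelines package (hypothetical follow-up data, not registry records) looks like this:

        # A minimal sketch, assuming invented follow-up times and a single
        # technique indicator; the registry analysis used far richer data.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "months_to_event": [3, 14, 7, 24, 11, 18, 9, 30],  # follow-up time
            "redetached":      [1, 0, 1, 0, 1, 0, 1, 0],       # 1 = reoperated
            "silicone_oil":    [0, 0, 1, 1, 0, 1, 1, 0],       # technique used
        })
        cph = CoxPHFitter().fit(df, duration_col="months_to_event",
                                event_col="redetached")
        cph.print_summary()  # hazard ratio for the silicone-oil indicator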

  19. Experimentally induced innovations lead to persistent culture via conformity in wild birds

    PubMed Central

    Aplin, L.M.; Farine, D.R.; Morand-Ferron, J.; Cockburn, A.; Thornton, A.; Sheldon, B.C.

    2014-01-01

    In human societies, cultural norms arise when behaviours are transmitted with high-fidelity social learning through social networks [1]. However, a paucity of experimental studies has meant that there is no comparable understanding of the process by which socially transmitted behaviours may spread and persist in animal populations [2,3]. Here, we introduce alternative novel foraging techniques into replicated wild sub-populations of great tits (Parus major) and employ automated tracking to map the diffusion, establishment and long-term persistence of the seeded behaviours. We further use social network analysis to examine the social factors influencing diffusion dynamics. From just two trained birds in each sub-population, information spread rapidly through social network ties to reach an average of 75% of individuals, with 508 knowledgeable individuals performing 58,975 solutions. Sub-populations were heavily biased towards the technique originally introduced, resulting in established local arbitrary traditions that were stable over two generations, despite high population turnover. Finally, we demonstrate a strong effect of social conformity, with individuals disproportionately adopting the most frequent local variant when first learning, but then also continuing to favour social over personal information by matching their technique to the majority variant. Cultural conformity is thought to be a key factor in the evolution of complex culture in humans [4-7]. In providing the first experimental demonstration of conformity in a wild non-primate, and of cultural norms in foraging techniques in any wild animal, our results suggest a much wider evolutionary occurrence of such apparently complex cultural behaviour. PMID:25470065

  20. SU-F-T-248: FMEA Risk Analysis Implementation (AAPM TG-100) in Total Skin Electron Irradiation Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibanez-Rosello, B; Bautista-Ballesteros, J; Bonaque, J

    2016-06-15

    Purpose: Total Skin Electron Irradiation (TSEI) is a radiotherapy treatment that involves irradiating the entire body surface as homogeneously as possible. It is an extensive multi-step technique in which quality management requires a high consumption of resources and fluid communication between the staff involved, necessary to improve the safety of the treatment. TG-100 proposes a new perspective on quality management in radiotherapy, presenting a systematic method of risk analysis across the global flow of the stages through which the patient passes. The purpose of this work has been to apply the TG-100 approach to the TSEI procedure in our institution. Methods: A multidisciplinary team specifically targeting the TSEI procedure was formed, which met regularly and jointly developed the process map (PM), following the TG-100 guidelines of the AAPM. This PM is a visual representation of the temporal flow of steps through which the patient passes from the start until the end of the stay in the radiotherapy service. Results: This is the first stage of the full risk analysis being carried out in the center. The PM provides an overview of the process and facilitates the understanding of the team members who will participate in the subsequent analysis. Currently, the team is implementing the failure modes and effects analysis (FMEA). The failure modes of each of the steps have been identified, and assessors are individually assigning values for severity (S), frequency of occurrence (O) and lack of detectability (D). To our knowledge, this is the first PM made for TSEI. The developed PM can be useful for centers that intend to implement the TSEI technique. Conclusion: The PM of the TSEI technique has been established as the first stage of a full risk analysis performed in a reference center for this treatment.
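
    Once S, O and D scores are assigned, FMEA ranks failure modes by the risk priority number RPN = S x O x D; a minimal sketch (invented failure modes and scores, not the institution's actual FMEA table) is:

        # A minimal sketch of TG-100-style risk ranking; modes and scores
        # are placeholders for the team's assessed values.
        failure_modes = {
            "wrong patient position":  (8, 3, 4),   # (S, O, D)
            "incorrect monitor units": (9, 2, 3),
            "missed field segment":    (7, 4, 5),
        }
        ranked = sorted(failure_modes.items(),
                        key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2],
                        reverse=True)
        for mode, (s, o, d) in ranked:
            print(f"{mode}: RPN = {s * o * d}")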

  1. [Chronic Inflammatory Demyelinating Polyneuropathy].

    PubMed

    Balke, M; Wunderlich, G; Brunn, A; Fink, G R; Lehmann, H C

    2016-12-01

    Chronic inflammatory demyelinating polyneuropathy (CIDP) is a chronic progressive or relapsing autoimmune neuropathy with a heterogeneous clinical presentation. Symptoms typically include symmetrical, proximal and/or distal paresis and sensory loss. Atypical CIDP variants are increasingly recognized, including subtypes with rapid onset as well as variants with purely sensory, focal or markedly asymmetrical deficits. The diagnosis is established by compatible symptoms, characteristic electrophysiological features and cerebrospinal fluid analysis. In equivocal cases, inflammatory infiltrates in a sural nerve biopsy support the diagnosis. Recent studies suggest that diagnostic imaging techniques such as MRI and nerve ultrasound may become useful tools for establishing the diagnosis. First-line therapies include immunoglobulins, steroids, and plasmapheresis. Immunosuppressant agents and monoclonal antibodies are used in therapy-refractory cases or as corticosteroid-sparing agents. © Georg Thieme Verlag KG Stuttgart · New York.

  2. Design and fabrication of prototype system for early warning of impending bearing failure

    NASA Technical Reports Server (NTRS)

    Broderick, J. J.; Burchill, R. F.; Clark, H. L.

    1972-01-01

    Ball bearing performance tests run on several identical ball bearings under a variety of load, speed, temperature, and lubrication conditions are reported. Bearing temperature, torque, vibration, noise, strain, cage speed, etc., were monitored to establish the measurements most suitable as indicators of ball bearing health. Tape records were made under steady-state conditions at a variety of speeds and loads. Sample sections were selected for narrowband spectral analysis with a real-time analyzer. An artificial flaw was created across the inner race surface of one bearing using an acid etch technique to produce the scratch. Tape records obtained before and after established a characteristic frequency response that identifies the presence of the flaw. The signals found most useful as indicators of performance degradation were ultrasonic outputs.
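
    The characteristic frequencies sought in such narrowband spectra follow from bearing geometry; the sketch below computes the textbook ball-pass frequency for an inner-race flaw (geometry values invented, not the test bearing's):

        # A minimal sketch of the standard BPFI formula; the shaft speed and
        # geometry are illustrative placeholders.
        import math

        def bpfi(shaft_hz, n_balls, ball_d, pitch_d, contact_deg=0.0):
            """Ball pass frequency, inner race."""
            ratio = (ball_d / pitch_d) * math.cos(math.radians(contact_deg))
            return shaft_hz * n_balls / 2.0 * (1.0 + ratio)

        print(f"BPFI ~ {bpfi(60.0, 9, 7.94, 38.5):.1f} Hz")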

  3. The theory of an auto-resonant field emission cathode relativistic electron accelerator for high efficiency microwave to direct current power conversion

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1990-01-01

    A novel method of microwave power conversion to direct current is discussed that relies on a modification of well-known resonant linear relativistic electron accelerator techniques. An analysis is presented that shows how, by establishing a 'slow' electromagnetic field in a waveguide, electrons liberated from an array of field emission cathodes are resonantly accelerated to several times their rest energy, thus establishing an electric current over a large potential difference. Such an approach is not limited to the relatively low frequencies that characterize the operation of rectennas and can, with appropriate waveguide and slow-wave structure design, be employed in the 300 to 600 GHz range, where much smaller transmitting and receiving antennas are needed.

  4. Quantitative sensory testing response patterns to capsaicin- and ultraviolet-B-induced local skin hypersensitization in healthy subjects: a machine-learned analysis.

    PubMed

    Lötsch, Jörn; Geisslinger, Gerd; Heinemann, Sarah; Lerch, Florian; Oertel, Bruno G; Ultsch, Alfred

    2017-08-16

    The comprehensive assessment of pain-related human phenotypes requires combinations of nociceptive measures that produce complex high-dimensional data, posing challenges to bioinformatic analysis. In this study, we assessed established experimental models of heat hyperalgesia of the skin, consisting of local ultraviolet-B (UV-B) irradiation or capsaicin application, in 82 healthy subjects using a variety of noxious stimuli. We extended the original heat stimulation by applying cold and mechanical stimuli and assessing the hypersensitization effects with a clinically established quantitative sensory testing (QST) battery (German Research Network on Neuropathic Pain). This study provided a 246 × 10-sized data matrix (82 subjects assessed at baseline, following UV-B application, and following capsaicin application) with respect to 10 QST parameters, which we analyzed using machine-learning techniques. We observed statistically significant effects of the hypersensitization treatments in 9 different QST parameters. Supervised machine-learned analysis implemented as random forests followed by ABC analysis pointed to heat pain thresholds as the most relevantly affected QST parameter. However, decision tree analysis indicated that UV-B additionally modulated sensitivity to cold. Unsupervised machine-learning techniques, implemented as emergent self-organizing maps, hinted at subgroups responding to topical application of capsaicin. The distinction among subgroups was based on sensitivity to pressure pain, which could be attributed to sex differences, with women being more sensitive than men. Thus, while UV-B and capsaicin share a major component of heat pain sensitization, they differ in their effects on QST parameter patterns in healthy subjects, suggesting a lack of redundancy between these models.
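
    A minimal sketch of the supervised step described above, with random placeholder data standing in for the 246 x 10 QST matrix, would train a random forest and rank the parameters by importance:

        # Placeholder data only; labels mimic the three assessment conditions.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(246, 10))      # 246 assessments x 10 QST parameters
        y = rng.integers(0, 3, size=246)    # baseline / UV-B / capsaicin
        rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        order = np.argsort(rf.feature_importances_)[::-1]
        print("QST parameters ranked by importance:", order)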

  5. [Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].

    PubMed

    Golder, W

    1999-09-01

    To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basic and advanced statistics. Sample size, significance level and power were recorded. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basic, and 0.5% advanced statistics. The corresponding figures for 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and in articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors increasingly make use of statistical experts' opinions and programs. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.

  6. ASTM clustering for improving coal analysis by near-infrared spectroscopy.

    PubMed

    Andrés, J M; Bona, M T

    2006-11-15

    Multivariate analysis techniques have been applied to near-infrared (NIR) spectra of coals to investigate the relationship between nine coal properties (moisture (%), ash (%), volatile matter (%), fixed carbon (%), heating value (kcal/kg), carbon (%), hydrogen (%), nitrogen (%) and sulphur (%)) and the corresponding predictor variables. In this work, the whole set of coal samples was grouped into six more homogeneous clusters following the ASTM reference method for classification, prior to the application of calibration methods to each coal set. The results obtained showed a considerable improvement in the determination error compared with the calibration for the whole sample set. For some groups, the established calibrations approached the quality required by the ASTM/ISO norms for laboratory analysis. To predict property values for a new coal sample, it is necessary to assign that sample to its respective group. Thus, the ability to discriminate and classify coal samples by Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS) in the NIR range was also studied by applying the Soft Independent Modelling of Class Analogy (SIMCA) and Linear Discriminant Analysis (LDA) techniques. Modelling of the groups by SIMCA led to overlapping models that cannot discriminate for unique classification. On the other hand, the application of Linear Discriminant Analysis improved the classification of the samples, but not enough to be satisfactory for every group considered.
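
    A minimal sketch of the LDA step (random placeholder spectra standing in for the DRIFTS measurements) shows the workflow of fitting the discriminant model to grouped coals:

        # Placeholder spectra and group labels; not the study's coal data.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(1)
        spectra = rng.normal(size=(60, 120))   # 60 coals x 120 NIR wavelengths
        groups = rng.integers(0, 6, size=60)   # six ASTM-based clusters
        lda = LinearDiscriminantAnalysis().fit(spectra, groups)
        print("training accuracy:", lda.score(spectra, groups))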

  7. REX2000 Version 2.5: Improved DATA Handling and Enhanced User-Interface

    NASA Astrophysics Data System (ADS)

    Taguchi, Takeyoshi

    2007-02-01

    XAFS analysis can be applied to various fields such as materials science, environmental studies and biological science, and is widely used for characterization in those fields. In the early days of the XAFS technique, scientists wrote their own code for XAFS data analysis. As the technique became popular and the XAFS community grew, several analysis codes and packages were developed and released for general use. REX2000 is one of these XAFS analysis packages and is commercially available. Counting from its predecessor "REX", REX2000 has been in use for more than 15 years in the XAFS community. Following the previous revision in 2003, a major update was made in 2006. For dynamical studies of advanced materials, many XAFS data sets are measured (quick XAFS and in situ XAFS), and hundreds of data sets need to be processed. REX2000's data handling has been improved to cope with such large volumes at once and to report fitting results as CSV files. The well-established user interface has been enhanced so that users can customize initial values for data analysis and specify options through the graphical interface. Many other small changes have been made and are described in this paper.

  8. Temporal variability and climatology of hydrodynamic, water property and water quality parameters in the West Johor Strait of Singapore.

    PubMed

    Behera, Manasa Ranjan; Chun, Cui; Palani, Sundarambal; Tkalich, Pavel

    2013-12-15

    The study presents a baseline variability and climatology study of measured hydrodynamic, water property and water quality parameters of the West Johor Strait, Singapore, at hourly-to-seasonal scales, to uncover their dependency on and correlation with one or more drivers. The parameters considered include, but are not limited to, sea surface elevation, current magnitude and direction, solar radiation and air temperature, water temperature, salinity, chlorophyll-a and turbidity. FFT (Fast Fourier Transform) analysis is carried out for the parameters to delineate the relative effects of tidal and weather drivers. Group and individual correlations between the parameters are obtained by principal component analysis (PCA) and the cross-correlation (CC) technique, respectively. The CC technique also identifies the dependency and time lag between driving natural forces and the dependent water property and water quality parameters. The temporal variability and climatology of the driving forces and the dependent parameters are established at the hourly, daily, fortnightly and seasonal scales. Copyright © 2013 Elsevier Ltd. All rights reserved.
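
    The lag identification by cross-correlation amounts to locating the peak of the normalized correlation sequence; a minimal sketch (synthetic driver and response signals, not the Strait measurements) is:

        # Synthetic signals: the response trails the driver by a known 6 samples.
        import numpy as np

        n, true_lag = 500, 6
        t = np.arange(n)
        rng = np.random.default_rng(2)
        driver = np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(size=n)
        response = np.roll(driver, true_lag)

        d = (driver - driver.mean()) / driver.std()
        r = (response - response.mean()) / response.std()
        cc = np.correlate(r, d, mode="full") / n
        lags = np.arange(-n + 1, n)
        print("estimated lag:", lags[np.argmax(cc)])  # ~6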

  9. Assessment of water quality monitoring for the optimal sensor placement in lake Yahuarcocha using pattern recognition techniques and geographical information systems.

    PubMed

    Jácome, Gabriel; Valarezo, Carla; Yoo, Changkyoo

    2018-03-30

    Pollution and the eutrophication process are increasing in lake Yahuarcocha and constant water quality monitoring is essential for a better understanding of the patterns occurring in this ecosystem. In this study, key sensor locations were determined using spatial and temporal analyses combined with geographical information systems (GIS) to assess the influence of weather features, anthropogenic activities, and other non-point pollution sources. A water quality monitoring network was established to obtain data on 14 physicochemical and microbiological parameters at each of seven sample sites over a period of 13 months. A spatial and temporal statistical approach using pattern recognition techniques, such as cluster analysis (CA) and discriminant analysis (DA), was employed to classify and identify the most important water quality parameters in the lake. The original monitoring network was reduced to four optimal sensor locations based on a fuzzy overlay of the interpolations of concentration variations of the most important parameters.

  10. Low-loss Ca₅₋ₓSrₓA₂TiO₁₂ [A=Nb,Ta] ceramics: Microwave dielectric properties and vibrational spectroscopic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bijumon, Pazhoor Varghese; Sebastian, Mailadil Thomas; Dias, Anderson

    2005-05-15

    Complex perovskite-type Ca₅₋ₓSrₓA₂TiO₁₂ [A=Nb,Ta] (0 ≤ x ≤ 5) ceramics were prepared by the conventional solid-state ceramic route. The crystal structure, microwave dielectric properties, and vibrational spectroscopic characteristics of these materials are reported. The structure and microstructure were investigated by x-ray diffraction and scanning electron microscopy techniques. The microwave dielectric properties were measured in the 3-5-GHz frequency range by the resonance method. Structural evolutions from orthorhombic to an averaged pseudocubic phase, with associated changes in dielectric properties, were observed as a function of composition. The structure-property relationships in these ceramics were established using Raman and Fourier transform infrared spectroscopic techniques. Raman analysis showed characteristic bands of ordered perovskite materials, with variation in both intensity and frequency as a function of composition.

  11. Application of fuzzy fault tree analysis based on modified fuzzy AHP and fuzzy TOPSIS for fire and explosion in the process industry.

    PubMed

    Yazdi, Mohammad; Korhan, Orhan; Daneshvar, Sahand

    2018-05-09

    This study aimed at establishing fault tree analysis (FTA) using expert opinion to compute the probability of an event. To find the probability of the top event (TE), the probabilities of all the basic events (BEs) should be available when the FTA is drawn; where failure data are lacking, expert judgment can be employed as an alternative. The fuzzy analytical hierarchy process is used as a standard technique to give a specific weight to each expert, and fuzzy set theory is used to aggregate the expert opinions. In this way, the probabilities of the BEs are computed and, consequently, the probability of the TE is obtained using Boolean algebra. Additionally, to reduce the probability of the TE in terms of three parameters (safety consequences, cost and benefit), the importance measurement technique and a modified TOPSIS were employed. The effectiveness of the proposed approach is demonstrated with a real-life case study.
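
    After the basic-event probabilities are fixed, propagation to the top event is plain Boolean algebra over independent events; a minimal sketch (an invented two-gate tree, not the case study's) is:

        # Illustrative gate structure and probabilities only.
        from functools import reduce

        def or_gate(probs):
            """P(at least one event) for independent inputs."""
            return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

        def and_gate(probs):
            """P(all events) for independent inputs."""
            return reduce(lambda acc, p: acc * p, probs, 1.0)

        # TE = (BE1 AND BE2) OR BE3
        p_te = or_gate([and_gate([0.02, 0.10]), 0.005])
        print(f"P(top event) = {p_te:.4f}")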

  12. A single cell high content assay detects mitochondrial dysfunction in iPSC-derived neurons with mutations in SNCA.

    PubMed

    Little, Daniel; Luft, Christin; Mosaku, Olukunbi; Lorvellec, Maëlle; Yao, Zhi; Paillusson, Sébastien; Kriston-Vizi, Janos; Gandhi, Sonia; Abramov, Andrey Y; Ketteler, Robin; Devine, Michael J; Gissen, Paul

    2018-06-13

    Mitochondrial dysfunction is implicated in many neurodegenerative diseases including Parkinson's disease (PD). Induced pluripotent stem cells (iPSCs) provide a unique cell model for studying neurological diseases. We have established a high-content assay that can simultaneously measure mitochondrial function, morphology and cell viability in iPSC-derived dopaminergic neurons. iPSCs from PD patients with mutations in SNCA and unaffected controls were differentiated into dopaminergic neurons, seeded in 384-well plates and stained with the mitochondrial membrane potential dependent dye TMRM, alongside Hoechst-33342 and Calcein-AM. Images were acquired using an automated confocal screening microscope and single cells were analysed using automated image analysis software. PD neurons displayed reduced mitochondrial membrane potential and altered mitochondrial morphology compared to control neurons. This assay demonstrates that high content screening techniques can be applied to the analysis of mitochondria in iPSC-derived neurons. This technique could form part of a drug discovery platform to test potential new therapeutics for PD and other neurodegenerative diseases.

  13. Fluorescence Imaging of Posterior Spiracles from Second and Third Instars of Forensically-important Chrysomya rufifacies (Diptera: Calliphoridae)

    PubMed Central

    Flores, Danielle; Miller, Amy L.; Showman, Angelique; Tobita, Caitlyn; Shimoda, Lori M.N.; Sung, Carl; Stokes, Alexander J.; Tomberlin, Jeffrey K.; Carter, David O.; Turner, Helen

    2016-01-01

    Entomological protocols for aging blow fly (Diptera: Calliphoridae) larvae to estimate the time of colonization (TOC) are commonly used to assist in death investigations. While the methodologies for analysing fly larvae differ, most rely on light microscopy, genetic analysis or, more rarely, electron microscopy. This pilot study sought to improve resolution of larval stage in the forensically-important blow fly Chrysomya rufifacies using high-content fluorescence microscopy and biochemical measures of developmental marker proteins. We established fixation and mounting protocols, defined a set of measurable morphometric criteria and captured developmental transitions of 2nd instar to 3rd instar using both fluorescence microscopy and anti-ecdysone receptor Western blot analysis. The data show that these instars can be distinguished on the basis of robust, non-bleaching, autofluorescence of larval posterior spiracles. High content imaging techniques using confocal microscopy, combined with morphometric and biochemical techniques, may therefore aid forensic entomologists in estimating TOC. PMID:27706817

  14. Vapor phase diamond growth technology

    NASA Technical Reports Server (NTRS)

    Angus, J. C.

    1981-01-01

    Ion beam deposition chambers used for carbon film generation were designed and constructed. Features of the developed equipment include: (1) carbon ion energies down to approx. 50 eV; (2) in situ surface monitoring with HEED; (3) provision for flooding the surface with ultraviolet radiation; (4) infrared laser heating of the substrate; (5) residual gas monitoring; (6) provision for several source gases, including diborane for doping studies; and (7) growth from either hydrocarbon source gases or from carbon/argon arc sources. The analytical techniques used to characterize the ion-deposited carbon films and to establish the nature of their chemical bonding and crystallographic structure are discussed. These include: H2SO4/HNO3 etch; resistance measurements; hardness tests; Fourier transform infrared spectroscopy; scanning Auger microscopy; electron spectroscopy for chemical analysis; electron diffraction and energy-dispersive X-ray analysis; electron energy loss spectroscopy; density measurements; secondary ion mass spectroscopy; high-energy electron diffraction; and electron spin resonance. Results of the tests are summarized.

  15. Analysis and experimental evaluation of shunt active power filter for power quality improvement based on predictive direct power control.

    PubMed

    Aissa, Oualid; Moulahoum, Samir; Colak, Ilhami; Babes, Badreddine; Kabache, Nadir

    2017-10-12

    This paper discusses the use of classical and predictive direct power control (DPC) concepts for the shunt active power filter function. These strategies are used to improve active power filter performance through compensation of reactive power and elimination of the harmonic currents drawn by non-linear loads. A theoretical analysis, followed by a simulation of the studied techniques using MATLAB/Simulink software, has been carried out. Moreover, two test benches based on the dSPACE 1104 card have been built for the classical and predictive DPC controls, to evaluate the studied methods in real time. The results obtained are presented and compared to confirm the superiority of the predictive technique. To overcome the pollution problems caused by the consumption of fossil fuels, renewable energies are the recommended alternatives for ensuring green energy. In the same context, the tested predictive filter can readily be supplied by a renewable energy source, further enhancing power quality.
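
    The filtering function itself can be pictured as subtracting the fundamental from the load current, so that the injected compensation current carries the harmonics; the sketch below (a synthetic load current, not the paper's predictive DPC controller) illustrates that reference computation:

        # A minimal sketch of harmonic reference extraction via FFT; the
        # predictive DPC of the paper replaces this with a model-based law.
        import numpy as np

        f0, fs, n = 50, 10_000, 2000
        t = np.arange(n) / fs
        i_load = np.sin(2 * np.pi * f0 * t) + 0.3 * np.sin(2 * np.pi * 5 * f0 * t)

        spectrum = np.fft.rfft(i_load)
        k0 = int(round(f0 * n / fs))            # bin of the fundamental
        fundamental = np.zeros_like(spectrum)
        fundamental[k0] = spectrum[k0]
        i_ref = i_load - np.fft.irfft(fundamental, n)  # current the APF injects
        print("harmonic RMS to compensate:", round(float(np.std(i_ref)), 3))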

  16. Long-term analysis of Zostera noltei: A retrospective approach for understanding seagrasses' dynamics.

    PubMed

    Calleja, Felipe; Galván, Cristina; Silió-Calzada, Ana; Juanes, José A; Ondiviela, Bárbara

    2017-09-01

    Long-term studies are necessary to establish trends and to understand seagrasses' spatial and temporal dynamic. Nevertheless, this type of research is scarce, as the required databases are often unavailable. The objectives of this study are to create a method for mapping the seagrass Zostera noltei using remote sensing techniques, and to apply it to the characterization of the meadows' extension trend and the potential drivers of change. A time series was created using a novel method based on remote sensing techniques that proved to be adequate for mapping the seagrass in the emerged intertidal. The meadows seem to have a decreasing trend between 1984 and the early 2000s, followed by an increasing tendency that represents a recovery in the extension area of the species. This 30-year analysis demonstrated the Z. noltei's recovery in the study site, similar to that in other estuaries nearby and contrary to the worldwide decreasing behavior of seagrasses. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Technical and Economic Assessment of Span-Loaded Cargo Aircraft Concepts

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The benefits of span-distributed loading concepts as applied to future commercial air cargo operations are assessed. A two-phase program is used to perform this assessment. The first phase consists of selected parametric studies to define significant configuration, performance, and economic trends. The second phase consists of more detailed engineering design, analysis, and economic evaluations to define the technical and economic feasibility of a selected spanloader design. A conventional all-cargo aircraft of comparable technology and size is used as a comparator system. The technical feasibility of the spanloader concept is demonstrated, with no major new technology efforts required to implement the system. However, certain high-payoff technologies such as winglets, airfoil design, and advanced structural materials and manufacturing techniques need refinement and definition prior to application. In addition, further structural design analysis could establish the techniques and criteria necessary to fully capitalize upon the high degree of structural commonality and simplicity inherent in the spanloader concept.

  18. A Data Analysis Expert System For Large Established Distributed Databases

    NASA Astrophysics Data System (ADS)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-05-01

    The purpose of this work is to analyze the applicability of artificial intelligence techniques for developing a user-friendly, parallel interface to large isolated, incompatible NASA databases for the purpose of assisting the management decision process. To carry out this work, a survey was conducted to establish the data access requirements of several key NASA user groups. In addition, current NASA database access methods were evaluated. The results of this work are presented in the form of a design for a natural language database interface system, called the Deductively Augmented NASA Management Decision Support System (DANMDS). This design is feasible principally because of recently announced commercial hardware and software product developments which allow cross-vendor compatibility. The goal of the DANMDS system is commensurate with the central dilemma confronting most large companies and institutions in America, the retrieval of information from large, established, incompatible database systems. The DANMDS system implementation would represent a significant first step toward this problem's resolution.

  19. Is it possible to re-establish pre-operative patellar kinematics using a ligament-balanced technique in total knee arthroplasty? A cadaveric investigation.

    PubMed

    Keshmiri, Armin; Springorum, Hans; Baier, Clemens; Zeman, Florian; Grifka, Joachim; Maderbacher, Günther

    2015-03-01

    Several authors emphasise that the appearance of patellar maltracking after total knee arthroplasty (TKA) is caused by rotational malalignment of the femoral and tibial components. Ligament-balanced femoral component rotation was not found to be associated with abnormal postoperative patellar position. We hypothesised that a ligament-balanced technique in TKA has the ability to best re-establish patellar kinematics. In ten cadaveric knees TKA was performed assessing femoral rotation in ligament-balanced and different femoral and tibial component rotation alignments. Patellar kinematics after different component rotations were analysed using a commercial computer navigation system. Ligament-balanced femoral rotation showed the best re-establishment of patellar kinematics after TKA compared to the healthy pre-operative knee. In contrast to tibial component rotation, femoral component rotation had a major impact on patellofemoral kinematics. This investigation suggests that a ligament-balanced technique in TKA is most likely to re-establish natural patellofemoral kinematics. Tibial component rotation did not influence patellar kinematics.

  20. A study to define an in-flight dynamics measurement and data applications program for space shuttle payloads

    NASA Technical Reports Server (NTRS)

    Rader, W. P.; Barrett, S.; Payne, K. R.

    1975-01-01

    Data measurement and interpretation techniques were defined for application to the first few space shuttle flights, so that the dynamic environment could be sufficiently well established to be used to reduce the cost of future payloads through more efficient design and environmental test techniques. It was concluded that: (1) initial payloads must be given comprehensive instrumentation coverage to obtain detailed definition of acoustics, vibration, and interface loads, (2) analytical models of selected initial payloads must be developed and verified by modal surveys and flight measurements, (3) acoustic tests should be performed on initial payloads to establish realistic test criteria for components and experiments in order to minimize unrealistic failures and retest requirements, (4) permanent data banks should be set up to establish statistical confidence in the data to be used, (5) a more unified design/test specification philosophy is needed, (6) additional work is needed to establish a practical testing technique for simulation of vehicle transients.

  1. Implementation of numerical simulation techniques in analysis of the accidents in complex technological systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klishin, G.S.; Seleznev, V.E.; Aleoshin, V.V.

    1997-12-31

    Gas industry enterprises such as main pipelines, compressor gas transfer stations, and gas extraction complexes belong to the energy-intensive industry. Accidents there can result in catastrophes and great social, environmental and economic losses. Annually, according to official data, several dozen large accidents take place at pipelines in the USA and Russia. That is why the prevention of accidents, the analysis of the mechanisms of their development and the prediction of their possible consequences are acute and important tasks nowadays. The causes of accidents are usually complicated and can be represented as a complex combination of natural, technical and human factors. Mathematical and computer simulations are safe, rather effective and comparatively inexpensive methods of accident analysis. They make it possible to analyze different mechanisms of failure occurrence and development, to assess the consequences and to give recommendations for prevention. Besides the investigation of failure cases, numerical simulation techniques play an important role in the treatment of diagnostic results and in the construction of mathematical prognostic simulations of the object's behavior in the period between two inspections. In solving diagnostics tasks and in the analysis of failure cases, the techniques of theoretical mechanics, the qualitative theory of differential equations, continuum mechanics, chemical macro-kinetics and optimization are implemented in the Conversion Design Bureau #5 (DB#5). Both universal and special numerical techniques and software (SW) are being developed in DB#5 for the solution of such tasks. Almost all of them are calibrated against calculations of simulated and full-scale experiments performed at the VNIIEF and MINATOM testing sites. It is worth noting that over long years of work a fruitful and effective collaboration of the theoreticians, mathematicians and experimentalists of the institute has been established to solve such tasks.

  2. Application of Petri net based analysis techniques to signal transduction pathways.

    PubMed

    Sackmann, Andrea; Heiner, Monika; Koch, Ina

    2006-11-02

    Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODEs based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the systems behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods concern mainly metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context special properties have to be considered and new dedicated techniques have to be designed. We apply Petri net theory to model and analyse signal transduction pathways first qualitatively before continuing with quantitative analyses. This paper demonstrates how to build systematically a discrete model, which reflects provably the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as case study. We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets being active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into smallest biologically meaningful functional units. The paper demonstrates how Petri net analysis techniques can promote a deeper understanding of signal transduction pathways. The new concepts of feasible t-invariants and MCT-sets have been proven to be useful for model validation and the interpretation of the biological system behaviour. Whereas MCT-sets provide a decomposition of the net into disjunctive subnets, feasible t-invariants describe subnets, which generally overlap. This work contributes to qualitative modelling and to the analysis of large biological networks by their fully automatic decomposition into biologically meaningful modules.
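
    The t-invariant machinery described above reduces to linear algebra over the net's incidence matrix C: a t-invariant is a non-negative integer vector x with C x = 0. A minimal sketch on a toy three-transition cycle (not the pheromone pathway model) is:

        # Toy net: t1 -> p1 -> t2 -> p2 -> t3; the single minimal t-invariant
        # [1, 1, 1] says firing each transition once reproduces the marking.
        import sympy as sp

        C = sp.Matrix([
            [1, -1,  0],   # place p1: produced by t1, consumed by t2
            [0,  1, -1],   # place p2: produced by t2, consumed by t3
        ])
        for v in C.nullspace():
            print("t-invariant:", list(v))   # -> [1, 1, 1]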

  3. Application of Petri net based analysis techniques to signal transduction pathways

    PubMed Central

    Sackmann, Andrea; Heiner, Monika; Koch, Ina

    2006-01-01

    Background Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODEs based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the systems behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods concern mainly metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context special properties have to be considered and new dedicated techniques have to be designed. Methods We apply Petri net theory to model and analyse signal transduction pathways first qualitatively before continuing with quantitative analyses. This paper demonstrates how to build systematically a discrete model, which reflects provably the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as case study. Results We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets being active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into smallest biologically meaningful functional units. Conclusion The paper demonstrates how Petri net analysis techniques can promote a deeper understanding of signal transduction pathways. The new concepts of feasible t-invariants and MCT-sets have been proven to be useful for model validation and the interpretation of the biological system behaviour. Whereas MCT-sets provide a decomposition of the net into disjunctive subnets, feasible t-invariants describe subnets, which generally overlap. This work contributes to qualitative modelling and to the analysis of large biological networks by their fully automatic decomposition into biologically meaningful modules. PMID:17081284

  4. Airborne chemistry: acoustic levitation in chemical analysis.

    PubMed

    Santesson, Sabina; Nilsson, Staffan

    2004-04-01

    This review with 60 references describes a unique path to miniaturisation, that is, the use of acoustic levitation in analytical and bioanalytical chemistry applications. Levitation of small volumes of sample by means of a levitation technique can be used as a way to avoid solid walls around the sample, thus circumventing the main problem of miniaturisation, the unfavourable surface-to-volume ratio. Different techniques for sample levitation have been developed and improved. Of the levitation techniques described, acoustic or ultrasonic levitation fulfils all requirements for analytical chemistry applications. This technique has previously been used to study properties of molten materials and the equilibrium shape and stability of liquid drops. Temperature and mass transfer in levitated drops have also been described, as have crystallisation and microgravity applications. The airborne analytical system described here is equipped with different and exchangeable remote detection systems. The levitated drops are normally in the 100 nL-2 microL volume range and additions to the levitated drop can be made in the pL-volume range. The use of levitated drops in analytical and bioanalytical chemistry offers several benefits. Several remote detection systems are compatible with acoustic levitation, including fluorescence imaging detection, right angle light scattering, Raman spectroscopy, and X-ray diffraction. Applications include liquid/liquid extractions, solvent exchange, analyte enrichment, single-cell analysis, cell-cell communication studies, precipitation screening of proteins to establish nucleation conditions, and crystallisation of proteins and pharmaceuticals.

  5. Computational method for analysis of polyethylene biodegradation

    NASA Astrophysics Data System (ADS)

    Watanabe, Masaji; Kawai, Fusako; Shibata, Masaru; Yokoyama, Shigeo; Sudate, Yasuhiro

    2003-12-01

    In a previous study concerning the biodegradation of polyethylene, we proposed a mathematical model based on two primary factors: the direct consumption or absorption of small molecules and the successive weight loss of large molecules due to β-oxidation. Our model is an initial value problem consisting of a differential equation whose independent variable is time. Its unknown variable represents the total weight of all the polyethylene molecules that belong to a molecular-weight class specified by a parameter. In this paper, we describe a numerical technique to introduce experimental results into analysis of our model. We first establish its mathematical foundation in order to guarantee its validity, by showing that the initial value problem associated with the differential equation has a unique solution. Our computational technique is based on a linear system of differential equations derived from the original problem. We introduce some numerical results to illustrate our technique as a practical application of the linear approximation. In particular, we show how to solve the inverse problem to determine the consumption rate and the β-oxidation rate numerically, and illustrate our numerical technique by analyzing the GPC patterns of polyethylene wax obtained before and after 5 weeks cultivation of a fungus, Aspergillus sp. AK-3. A numerical simulation based on these degradation rates confirms that the primary factors of the polyethylene biodegradation posed in modeling are indeed appropriate.
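
    A minimal sketch of this model class (invented rate constants, not the fitted ones) treats the weight in each molecular-weight class as a linear ODE system in which beta-oxidation transfers weight from a class to the next smaller one:

        # Placeholder rates; the paper determines these by solving an
        # inverse problem against GPC data.
        import numpy as np
        from scipy.integrate import solve_ivp

        consumption = np.array([0.05, 0.02, 0.01])  # direct uptake, per week
        oxidation   = np.array([0.00, 0.03, 0.04])  # beta-oxidation, per week

        def rhs(t, w):
            dw = -(consumption + oxidation) * w
            dw[:-1] += oxidation[1:] * w[1:]        # gain from the larger class
            return dw

        sol = solve_ivp(rhs, (0.0, 5.0), y0=[1.0, 1.0, 1.0], t_eval=[0.0, 5.0])
        print(sol.y[:, -1])   # weight distribution after 5 weeks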

  6. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

    Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistic techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within the member's own Westinghouse department as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.
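
    A minimal sketch of the most basic SPM tool, a Shewhart-style control check (invented measurements; real charts would use subgrouped data and standard control-chart constants), is:

        # Estimate limits from an in-control baseline, then screen new points.
        import numpy as np

        baseline = np.array([10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.95])
        center, sigma = baseline.mean(), baseline.std(ddof=1)
        ucl, lcl = center + 3 * sigma, center - 3 * sigma

        new_points = [10.05, 12.7, 9.9]
        flags = [x for x in new_points if not (lcl <= x <= ucl)]
        print(f"limits: [{lcl:.2f}, {ucl:.2f}], out-of-control: {flags}")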

  7. Gap Analysis and Conservation Network for Freshwater Wetlands in Central Yangtze Ecoregion

    PubMed Central

    Xiaowen, Li; Haijin, Zhuge; Li, Mengdi

    2013-01-01

    The Central Yangtze Ecoregion contains a large area of internationally important freshwater wetlands and supports a huge number of endangered waterbirds; however, these unique wetlands and the biodiversity they support are under the constant threats of human development pressures, and the prevailing conservation strategies generated based on the local scale cannot adequately be used as guidelines for ecoregion-based conservation initiatives for Central Yangtze at the broad scale. This paper aims at establishing and optimizing an ecological network for freshwater wetland conservation in the Central Yangtze Ecoregion based on large-scale gap analysis. A group of focal species and GIS-based extrapolation technique were employed to identify the potential habitats and conservation gaps, and the optimized conservation network was then established by combining existing protective system and identified conservation gaps. Our results show that only 23.49% of the potential habitats of the focal species have been included in the existing nature reserves in the Central Yangtze Ecoregion. To effectively conserve over 80% of the potential habitats for the focal species by optimizing the existing conservation network for the freshwater wetlands in Central Yangtze Ecoregion, it is necessary to establish new wetland nature reserves in 22 county units across Hubei, Anhui, and Jiangxi provinces. PMID:24062632

  8. ¹H NMR spectroscopic studies establish that heparanase is a retaining glycosidase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Jennifer C., E-mail: jennifer.wilson@griffith.edu.au; Laloo, Andrew Elohim; Singh, Sanjesh

    2014-01-03

    Highlights: • ¹H and ¹³C NMR chemical shifts of fondaparinux were fully assigned by 1D and 2D NMR techniques. • Hydrolysis of fondaparinux by heparanase was monitored by ¹H NMR spectroscopy. • Heparanase is established to be a retaining glycosidase. -- Abstract: Heparanase is an endo-β-glucuronidase that cleaves heparan sulfate side chains of proteoglycans in basement membranes and the extracellular matrix (ECM). Heparanase is implicated in several diverse pathological processes associated with ECM degradation such as metastasis, inflammation and angiogenesis and is thus an important target for anti-cancer and anti-inflammatory drug discovery. Heparanase has been classed as belonging to the clan A glycoside hydrolase family 79 based on sequence analysis, secondary structure predictions and mutagenic analysis, and thus it has been inferred that it is a retaining glycosidase. However, there has been no direct experimental evidence to support this conclusion. Herein we describe ¹H NMR spectroscopic studies of the hydrolysis of the pentasaccharide substrate fondaparinux by heparanase, and provide conclusive evidence that heparanase hydrolyses its substrate with retention of configuration and is thus established as a retaining glycosidase. Knowledge of the mechanism of hydrolysis may have implications for the future design of inhibitors for this important drug target.

  9. Relationship between the UPLC-Q-TOF-MS fingerprinted constituents from Daphne genkwa and their anti-inflammatory, anti-oxidant activities.

    PubMed

    Du, Wen-Juan; Ji, Jun; Wang, Ling; Lan, Xin-Yi; Li, Jia; Lei, Jun-Qiu; He, Xin; Zhang, Chun-Feng; Huang, Wen-Zhe; Wang, Zhen-Zhong; Xiao, Wei; Wang, Chong-Zhi; Yuan, Chun-Su

    2017-12-01

    Daphne genkwa Sieb. et Zucc. is a well-known medicinal plant. This study was designed to apply an ultra-high performance liquid chromatography system to establish a quality control method for D. genkwa. Data revealed that there were 15 common peaks in 10 batches of D. genkwa Sieb. et Zucc. (Thymelaeaceae) from different provinces of China. On this basis, the fingerprint chromatogram was established to provide references for quality control. Afterwards, the chemical constituents of these common peaks were analyzed using the UPLC-Q-TOF-MS system and nine of them were identified. In addition, LPS-stimulated RAW264.7 murine macrophages and the DPPH assay were used to study the anti-inflammatory and anti-oxidant effects of D. genkwa. Then the fingerprint-efficacy relationships between UPLC fingerprints and pharmacodynamic data were studied with canonical correlation analysis. Analysis results indicated that the anti-inflammatory and anti-oxidant effects differed among the 10 D. genkwa samples owing to inherent differences in their chemical compositions. Taken together, this research established a fingerprint-efficacy relationship model of D. genkwa by combining the UPLC analytic technique with pharmacological research, which provides a reference for relating the principal components of traditional Chinese medicines to their bioactivity. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Considerations for the establishment of a machinery monitoring analysis program for surface ships of the US Navy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strunk, W.D.

    1987-01-01

    Personnel at the Oak Ridge National Laboratory were tasked by the US Navy to assist in establishing a maintenance monitoring program for machinery aboard surface ships. Given the number of surface ships, the variety of locations in which they operate, the different types of equipment (rotating and reciprocating, as well as instrumentation), and the different procedures which control the operation and maintenance of a ship, it can be seen, apart from the logistics of organizing such a monitoring program, that the technical issues are as varied and numerous as the ships themselves. Unique methods and procedures have been developed to perform the tasks required on a large scale. Among the specific tasks and technical issues addressed were the development and installation of a data collection and communication instrumentation system for each port, the qualification of measurement methodologies and techniques, the establishment of computer data bases, the evaluation of the instrumentation used, training of civilian and military personnel, development of machinery condition assessment aids using machine design and modal analysis information, and development of computer displays. After these tasks were completed and the appropriate resolution integrated into the program, the final task was the development of a method to continually evaluate the effectiveness of the program, using actual maintenance records.

  11. Gap analysis and conservation network for freshwater wetlands in Central Yangtze Ecoregion.

    PubMed

    Xiaowen, Li; Haijin, Zhuge; Li, Mengdi

    2013-01-01

    The Central Yangtze Ecoregion contains a large area of internationally important freshwater wetlands and supports a huge number of endangered waterbirds; however, these unique wetlands and the biodiversity they support are under constant threat from human development pressures, and the prevailing conservation strategies, generated at the local scale, cannot adequately serve as guidelines for ecoregion-based conservation initiatives for the Central Yangtze at the broad scale. This paper aims at establishing and optimizing an ecological network for freshwater wetland conservation in the Central Yangtze Ecoregion based on large-scale gap analysis. A group of focal species and a GIS-based extrapolation technique were employed to identify potential habitats and conservation gaps, and the optimized conservation network was then established by combining the existing protective system with the identified conservation gaps. Our results show that only 23.49% of the potential habitats of the focal species have been included in the existing nature reserves in the Central Yangtze Ecoregion. To effectively conserve over 80% of the potential habitats of the focal species by optimizing the existing conservation network for the freshwater wetlands in the Central Yangtze Ecoregion, it is necessary to establish new wetland nature reserves in 22 county units across Hubei, Anhui, and Jiangxi provinces.

  12. Measuring bio-oil upgrade intermediates and corrosive species with polarity-matched analytical approaches

    DOE PAGES

    Connatser, Raynella M.; Lewis, Sr., Samuel Arthur; Keiser, James R.; ...

    2014-10-03

    Integrating biofuels with conventional petroleum products requires improvements in processing to increase blendability with existing fuels. This work demonstrates analysis techniques for more hydrophilic bio-oil liquids that give improved quantitative and qualitative description of the total acid content and organic acid profiles. To protect infrastructure from damage and reduce the cost associated with upgrading, accurate determination of acid content and representative chemical compound analysis are central imperatives to assessing both the corrosivity and the progress toward removing oxygen and acidity in processed biomass liquids. Established techniques form an ample basis for bio-liquids evaluation. However, early in the upgrading process, the unique physical phases and varied hydrophilicity of many pyrolysis liquids can render analytical methods originally designed for use in petroleum-derived oils inadequate. In this work, the water solubility of the organic acids present in bio-oils is exploited in a novel extraction and titration technique followed by analysis on the water-based capillary electrophoresis (CE) platform. The modification of ASTM D664, the standard for Total Acid Number (TAN), to include aqueous carrier solvents improves the utility of that approach for quantifying acid content in hydrophilic bio-oils. Termed AMTAN (modified Total Acid Number), this technique offers 1.2% relative standard deviation and dynamic range comparable to the conventional ASTM method. Furthermore, the results of corrosion product evaluations using several different sources of real bio-oil are discussed in the context of the unique AMTAN and CE analytical approaches developed to facilitate those measurements.
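
    A minimal sketch of the acid-number arithmetic underlying TAN-style titrations (ASTM D664 and, by extension, the AMTAN variant described above): milligrams of KOH needed to neutralize one gram of sample. The aqueous-extraction step of AMTAN changes the carrier solvent, not this calculation; the volumes and normality below are hypothetical.

    def acid_number(vol_titrant_ml, blank_ml, normality, sample_g):
        MW_KOH = 56.1  # g/mol, molar mass of KOH
        return (vol_titrant_ml - blank_ml) * normality * MW_KOH / sample_g

    print(acid_number(vol_titrant_ml=2.40, blank_ml=0.05,
                      normality=0.1, sample_g=1.0))  # ~13.2 mg KOH/g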

  13. Pancreatic thickness as a predictive factor for postoperative pancreatic fistula after distal pancreatectomy using an endopath stapler.

    PubMed

    Okano, Keiichi; Oshima, Minoru; Kakinoki, Keitaro; Yamamoto, Naoki; Akamoto, Shintaro; Yachida, Shinichi; Hagiike, Masanobu; Kamada, Hideki; Masaki, Tsutomu; Suzuki, Yasuyuki

    2013-02-01

    No consistent risk factor has yet been established for the development of pancreatic fistula (PF) after distal pancreatectomy (DP) with a stapler. A total of 31 consecutive patients underwent DP with an endopath stapler between June 2006 and December 2010 using a slow parenchymal flattening technique. The risk factors for PF after DP with an endopath stapler were identified based on univariate and multivariate analyses. Clinical PF developed in 7 of 31 (22 %) patients who underwent DP with a stapler. The pancreata were significantly thicker at the transection line in patients with PF (19.4 ± 1.47 mm) in comparison to patients without PF (12.6 ± 0.79 mm; p = 0.0003). A 16-mm cut-off for pancreatic thickness was established based on the receiver operating characteristic (ROC) curve; the area under the ROC curve was 0.875 (p = 0.0215). Pancreatic thickness (p = 0.0006) and blood transfusion (p = 0.028) were associated with postoperative PF in a univariate analysis. Pancreatic thickness was the only significant independent factor (odds ratio 9.99; p = 0.036) according to a multivariate analysis, with a specificity of 72 % and a sensitivity of 85 %. Pancreatic thickness is a significant independent risk factor for PF development after DP with an endopath stapler. The stapler technique is thus considered to be an appropriate modality in patients with a pancreatic thickness of <16 mm.
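
    A hedged sketch of how a cut-off like the reported 16 mm is typically derived from an ROC curve, using Youden's J to pick the threshold. The thickness/fistula data below are hypothetical toy values chosen to be cleanly separable, not the study's measurements.

    import numpy as np
    from sklearn.metrics import roc_curve, auc

    thickness = np.array([10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21])  # mm
    fistula = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1])  # 1 = PF developed

    fpr, tpr, thresholds = roc_curve(fistula, thickness)
    best = np.argmax(tpr - fpr)  # Youden's J = sensitivity + specificity - 1
    print("AUC:", auc(fpr, tpr), "cut-off:", thresholds[best], "mm")  # -> 16 mm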

  14. Observing hydrological processes: recent advancements in surface flow monitoring through image analysis

    NASA Astrophysics Data System (ADS)

    Tauro, Flavia; Grimaldi, Salvatore

    2017-04-01

    Recently, several efforts have been devoted to the design and development of innovative, and often unintended, approaches for the acquisition of hydrological data. Among such pioneering techniques, this presentation reports recent advancements towards the establishment of a novel noninvasive and potentially continuous methodology based on the acquisition and analysis of images for spatially distributed observations of the kinematics of surface waters. The approach aims at enabling rapid, affordable, and accurate surface flow monitoring of natural streams. Flow monitoring is an integral part of hydrological sciences and is essential for disaster risk reduction and the comprehension of natural phenomena. However, water processes are inherently complex to observe: they are characterized by multiscale and highly heterogeneous phenomena which have traditionally demanded sophisticated and costly measurement techniques. Challenges in the implementation of such techniques have also resulted in a lack of hydrological data during extreme events, in difficult-to-access environments, and at high temporal resolution. By combining low-cost yet high-resolution images and several velocimetry algorithms, noninvasive flow monitoring has been successfully conducted at highly heterogeneous scales, spanning from rills to highly turbulent streams and medium-scale rivers, with minimal supervision by external users. Noninvasive image data acquisition has also afforded observations in high flow conditions. The latest developments towards continuous flow monitoring at the catchment scale have entailed the development of a remote gauge-cam station on the Tiber River and the integration of flow monitoring through image analysis with unmanned aerial systems (UASs) technology. The gauge-cam station and the UAS platform both afford noninvasive image acquisition and calibration through an innovative laser-based setup. Compared to traditional point-based instrumentation, images allow for generating surface flow velocity maps which fully describe the kinematics of the velocity field in natural streams. Also, continuous observations provide a close picture of the evolving dynamics of natural water bodies. Despite such promising achievements, dealing with images also involves coping with adverse illumination, massive data handling and storage, and data-intensive computing. Most importantly, establishing a novel observational technique requires estimation of the uncertainty associated with measurements and thorough comparison to existing benchmark approaches. In this presentation, we provide answers to some of these issues and perspectives for future research.
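
    A minimal sketch of the operation at the heart of image-based surface velocimetry: estimating the displacement of a tracer pattern between two frames by cross-correlation. Real pipelines tile this over interrogation windows and convert pixels to metres per second via calibration; the frames here are synthetic, with a known shift.

    import numpy as np
    from scipy.signal import fftconvolve

    rng = np.random.default_rng(0)
    frame1 = rng.random((64, 64))
    frame2 = np.roll(frame1, shift=(3, 5), axis=(0, 1))  # known 3 px, 5 px shift

    corr = fftconvolve(frame2, frame1[::-1, ::-1], mode="same")  # cross-correlation
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    dy -= frame1.shape[0] // 2  # re-centre so zero lag maps to (0, 0)
    dx -= frame1.shape[1] // 2
    print(dy, dx)  # recovered displacement; divide by the frame interval for velocity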

  15. Enrichment and single-cell analysis of circulating tumor cells

    PubMed Central

    Song, Yanling; Tian, Tian; Shi, Yuanzhi; Liu, Wenli; Zou, Yuan; Khajvand, Tahereh; Wang, Sili; Zhu, Zhi

    2017-01-01

    Up to 90% of cancer-related deaths are caused by metastatic cancer. Circulating tumor cells (CTCs), a type of cancer cell that spreads through the blood after detaching from a solid tumor, are essential for the establishment of distant metastasis for a given cancer. As a new type of liquid biopsy, analysis of CTCs offers the possibility to avoid invasive tissue biopsy procedures with practical implications for diagnostics. The fundamental challenges of analyzing and profiling CTCs are the extremely low abundances of CTCs in the blood and the intrinsic heterogeneity of CTCs. Various technologies have been proposed for the enrichment and single-cell analysis of CTCs. This review aims to provide in-depth insights into CTC analysis, including various techniques for isolation of CTCs with capture methods based on physical and biochemical principles, and single-cell analysis of CTCs at the genomic, proteomic and phenotypic level, as well as current developmental trends and promising research directions. PMID:28451298

  16. Design and Analysis of a Stiffened Composite Structure Repair Concept

    NASA Technical Reports Server (NTRS)

    Przekop, Adam

    2011-01-01

    A design and analysis of a repair concept applicable to a stiffened thin-skin composite panel based on the Pultruded Rod Stitched Efficient Unitized Structure is presented. Since the repair concept is a bolted repair using metal components, it can easily be applied in the operational environment. Initial analyses are aimed at validating the finite element modeling approach by comparing with available test data. Once confidence in the analysis approach is established, several repair configurations are explored and the most efficient one is presented. Repairs involving damage to the top of the stiffener alone are considered in addition to repairs involving a damaged stiffener, flange and underlying skin. High-fidelity finite element modeling techniques such as mesh-independent definition of compliant fasteners, elastic-plastic metallic material properties and geometrically nonlinear analysis are utilized in the effort. The results of the analysis are presented and factors influencing the design are assessed and discussed.

  17. Identifying 1st instar larvae for three forensically important blowfly species using "fingerprint" cuticular hydrocarbon analysis.

    PubMed

    Moore, Hannah E; Adam, Craig D; Drijfhout, Falko P

    2014-07-01

    Calliphoridae are known to be the most forensically important insects when it comes to establishing the minimum post mortem interval (PMImin) in criminal investigations. The first step in calculating the PMImin is to identify the larvae present to species level. Accurate identification, which is conventionally carried out by morphological analysis, is crucial because different insects have different life stage timings. Rapid identification in the immature larval stages would drastically cut time in criminal investigations as it would eliminate the need to rear larvae to adult flies to determine the species. Cuticular hydrocarbon analysis of 1st instar larvae has been applied to three forensically important blowflies: Lucilia sericata, Calliphora vicina and Calliphora vomitoria, using gas chromatography-mass spectrometry (GC-MS) and principal component analysis (PCA). The results show that each species holds a distinct "fingerprint" hydrocarbon profile, allowing for accurate identification to be established in 1-day-old larvae, when it can be challenging to apply morphological criteria. Consequently, this GC-MS based technique could accelerate and strengthen the identification process, not only for forensically important species, but also for other entomological samples which are hard to identify using morphological features. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
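
    A hedged sketch of the chemometric step named above: PCA on normalized hydrocarbon peak areas so that species form clusters in score space. The peak table is synthetic; in the study the profiles come from GC-MS chromatograms of 1st instar larvae.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(6)
    base = rng.random((3, 20))  # three hypothetical species-specific mean profiles
    # 5 larvae per species x 20 hydrocarbon peaks, with within-species noise
    profiles = np.vstack([rng.normal(b, 0.02, (5, 20)) for b in base])
    profiles /= profiles.sum(axis=1, keepdims=True)  # relative peak abundances

    scores = PCA(n_components=2).fit_transform(profiles)
    print(scores)  # larvae of the same species cluster together on PC1/PC2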

  18. Inter-rater reliability of motor unit number estimates and quantitative motor unit analysis in the tibialis anterior muscle.

    PubMed

    Boe, S G; Dalton, B H; Harwood, B; Doherty, T J; Rice, C L

    2009-05-01

    To establish the inter-rater reliability of decomposition-based quantitative electromyography (DQEMG)-derived motor unit number estimates (MUNEs) and quantitative motor unit (MU) analysis. Using DQEMG, two examiners independently obtained a sample of needle- and surface-detected motor unit potentials (MUPs) from the tibialis anterior muscle of 10 subjects. Coupled with a maximal M wave, surface-detected MUPs were used to derive a MUNE for each subject and each examiner. Additionally, size-related parameters of the individual MUs were obtained following quantitative MUP analysis. Test-retest MUNE values were similar, with high reliability observed between examiners (ICC=0.87). Additionally, MUNE variability from test-retest as quantified by a 95% confidence interval was relatively low (+/-28 MUs). Lastly, quantitative data pertaining to MU size, complexity and firing rate were similar between examiners. MUNEs and quantitative MU data can be obtained with high reliability by two independent examiners using DQEMG. Establishing the inter-rater reliability of MUNEs and quantitative MU analysis using DQEMG is central to the clinical applicability of the technique. In addition to assessing response to treatments over time, multiple clinicians may be involved in the longitudinal assessment of the MU pool of individuals with disorders of the central or peripheral nervous system.
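
    A minimal sketch of the arithmetic behind a surface-MUP-based MUNE: the maximal M wave divided by the mean surface-detected MUP size. The amplitudes below are hypothetical; DQEMG derives these quantities from the recorded signals.

    import numpy as np

    m_wave_amplitude_uV = 6000.0  # maximal M wave (whole motor unit pool)
    smup_amplitudes_uV = np.array([38.0, 52.0, 41.0, 60.0, 47.0])  # sampled SMUPs

    mune = m_wave_amplitude_uV / smup_amplitudes_uV.mean()
    print(round(mune))  # estimated number of motor units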

  19. Multiple Criteria Decision Analysis for Health Care Decision Making--An Introduction: Report 1 of the ISPOR MCDA Emerging Good Practices Task Force.

    PubMed

    Thokala, Praveen; Devlin, Nancy; Marsh, Kevin; Baltussen, Rob; Boysen, Meindert; Kalo, Zoltan; Longrenn, Thomas; Mussen, Filip; Peacock, Stuart; Watkins, John; Ijzerman, Maarten

    2016-01-01

    Health care decisions are complex and involve confronting trade-offs between multiple, often conflicting, objectives. Using structured, explicit approaches to decisions involving multiple criteria can improve the quality of decision making and a set of techniques, known under the collective heading multiple criteria decision analysis (MCDA), are useful for this purpose. MCDA methods are widely used in other sectors, and recently there has been an increase in health care applications. In 2014, ISPOR established an MCDA Emerging Good Practices Task Force. It was charged with establishing a common definition for MCDA in health care decision making and developing good practice guidelines for conducting MCDA to aid health care decision making. This initial ISPOR MCDA task force report provides an introduction to MCDA - it defines MCDA; provides examples of its use in different kinds of decision making in health care (including benefit risk analysis, health technology assessment, resource allocation, portfolio decision analysis, shared patient clinician decision making and prioritizing patients' access to services); provides an overview of the principal methods of MCDA; and describes the key steps involved. Upon reviewing this report, readers should have a solid overview of MCDA methods and their potential for supporting health care decision making. Copyright © 2016. Published by Elsevier Inc.
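
    A hedged sketch of the simplest MCDA aggregation, a weighted-sum value model, comparing two hypothetical treatments on three criteria scored 0-100; real MCDA practice adds structured weight elicitation and sensitivity analysis on top of this arithmetic.

    import numpy as np

    weights = np.array([0.5, 0.3, 0.2])  # efficacy, safety, cost (stakeholder-elicited)
    scores = {"treatment_A": np.array([80, 60, 40]),
              "treatment_B": np.array([65, 85, 70])}

    for name, s in scores.items():
        print(name, float(weights @ s))  # higher overall value = preferred option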

  20. Rapid In Situ Profiling of Lipid C═C Location Isomers in Tissue Using Ambient Mass Spectrometry with Photochemical Reactions.

    PubMed

    Tang, Fei; Guo, Chengan; Ma, Xiaoxiao; Zhang, Jian; Su, Yuan; Tian, Ran; Shi, Riyi; Xia, Yu; Wang, Xiaohao; Ouyang, Zheng

    2018-05-01

    Rapid and in situ profiling of lipids using ambient mass spectrometry (AMS) techniques has great potential for clinical diagnosis, biological studies, and biomarker discovery. In this study, the online photochemical reaction involving carbon-carbon double bonds was coupled with a surface sampling technique to develop a direct tissue-analysis method with specificity to lipid C═C isomers. This method enabled the in situ analysis of lipids from the surface of various tissues or tissue sections, which allowed the structural characterization of lipid isomers within 2 min. Under optimized reaction conditions, we have established a method for the relative quantitation of lipid C═C location isomers by comparing the abundances of the diagnostic ions arising from each isomer, which has been proven effective through the established linear relationship (R² = 0.999) between molar ratio and diagnostic ion ratio of the FA 18:1 C═C location isomers. This method was then used for the rapid profiling of unsaturated lipid C═C isomers in the sections of rat brain, lung, liver, spleen, and kidney, as well as in normal and diseased rat tissues. Quantitative information on FA 18:1 and PC 16:0-18:1 C═C isomers was obtained, and significant differences were observed between different samples. To the best of our knowledge, this is the first study to report the direct analysis of lipid C═C isomers in tissues using AMS. Our results demonstrated that this method can serve as a rapid analytical approach for the profiling of unsaturated lipid C═C isomers in biological tissues and should contribute to functional lipidomics and clinical diagnosis.
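
    A hedged sketch of the relative-quantitation step described above: fitting the linear relationship between the molar ratio of two C═C location isomers and the ratio of their diagnostic ions. The numbers are illustrative, not the study's data.

    import numpy as np

    molar_ratio = np.array([0.25, 0.5, 1.0, 2.0, 4.0])         # isomer A / isomer B
    diag_ion_ratio = np.array([0.27, 0.53, 1.05, 2.08, 4.15])  # from MS/MS spectra

    slope, intercept = np.polyfit(molar_ratio, diag_ion_ratio, 1)
    r2 = np.corrcoef(molar_ratio, diag_ion_ratio)[0, 1] ** 2
    print(slope, intercept, r2)  # an unknown's ion ratio then maps back to a molar ratio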

  1. Quantifying Residual Stresses by Means of Thermoelastic Stress Analysis

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, Andrew L.; Baaklini, George Y.

    2001-01-01

    This study focused on the application of the Thermoelastic Stress Analysis (TSA) technique as a tool for assessing the residual stress state of structures. TSA is based on the fact that materials experience small temperature changes when compressed or expanded. When a structure is cyclically loaded, a surface temperature profile results which correlates to the surface stresses. The cyclic surface temperature is measured with an infrared camera. Traditionally, the amplitude of a TSA signal was theoretically defined to be linearly dependent on the cyclic stress amplitude. Recent studies have established that the temperature response is also dependent on the cyclic mean stress (i.e., the static stress state of the structure). In a previous study by the authors, it was shown that mean stresses significantly influenced the TSA results for titanium- and nickel-based alloys. This study continued the effort of accurate direct measurements of the mean stress effect by implementing various experimental modifications. In addition, a more in-depth analysis was conducted which involved analyzing the second harmonic of the temperature response. By obtaining the amplitudes of the first and second harmonics, the stress amplitude and the mean stress at a given point on a structure subjected to a cyclic load can be simultaneously obtained. The experimental results showed good agreement with the theoretical predictions for both the first and second harmonics of the temperature response. As a result, confidence was achieved concerning the ability to simultaneously obtain values for the static stress state as well as the cyclic stress amplitude of structures subjected to cyclic loads using the TSA technique. With continued research, it is now feasible to establish a protocol that would enable the monitoring of residual stresses in structures utilizing TSA.
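
    A minimal sketch of recovering the first and second harmonics of a cyclic temperature response with an FFT, in the spirit of the two-harmonic TSA analysis described above: the first harmonic tracks the stress amplitude and the second carries the mean-stress information. The signal here is synthetic.

    import numpy as np

    fs, f0 = 1000.0, 10.0  # sampling rate (Hz) and cyclic loading frequency (Hz)
    t = np.arange(0, 1.0, 1 / fs)
    temp = 0.20 * np.sin(2 * np.pi * f0 * t) + 0.03 * np.sin(4 * np.pi * f0 * t)

    spectrum = np.abs(np.fft.rfft(temp)) / len(temp) * 2  # single-sided amplitudes
    freqs = np.fft.rfftfreq(len(temp), 1 / fs)
    for harmonic in (f0, 2 * f0):
        k = np.argmin(np.abs(freqs - harmonic))
        print(f"{harmonic:.0f} Hz amplitude: {spectrum[k]:.3f}")  # 0.200 and 0.030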

  2. Three novel approaches to structural identifiability analysis in mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2016-05-06

    Structural identifiability is a concept that considers whether the structure of a model together with a set of input-output relations uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or otherwise. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare and contrast the application of the three methods, they are applied to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have been presented in this paper. As method development of structural identifiability techniques for mixed-effects models has been given very little attention, despite mixed-effects models being widely used, the methods presented in this paper provide a way of handling structural identifiability in mixed-effects models that was previously not possible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
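
    A hedged sketch, for intuition only, of the transfer-function reasoning behind structural identifiability in the simplest fixed-effects setting: for a one-compartment model dx/dt = -a*x + b*u with output y = c*x, the transfer function is b*c/(s + a), so two parameter sets are indistinguishable whenever they share a and the product b*c. The sympy code below just makes that degeneracy explicit; the paper's contribution is extending such analyses to mixed-effects models.

    import sympy as sp

    a, b, c, a2, b2, c2 = sp.symbols("a b c a2 b2 c2", positive=True)
    # (a, b, c) and (a2, b2, c2) give the same transfer function b*c/(s + a)
    # exactly when the coefficient equations below hold:
    sols = sp.solve([sp.Eq(a2, a), sp.Eq(b2 * c2, b * c)], [a2, b2], dict=True)
    print(sols)  # [{a2: a, b2: b*c/c2}] -- b and c identifiable only as a product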

  3. Developmental validation of a Nextera XT mitogenome Illumina MiSeq sequencing method for high-quality samples.

    PubMed

    Peck, Michelle A; Sturk-Andreaggi, Kimberly; Thomas, Jacqueline T; Oliver, Robert S; Barritt-Ross, Suzanne; Marshall, Charla

    2018-05-01

    Generating mitochondrial genome (mitogenome) data from reference samples in a rapid and efficient manner is critical to harnessing the greater power of discrimination of the entire mitochondrial DNA (mtDNA) marker. The method of long-range target enrichment, Nextera XT library preparation, and Illumina sequencing on the MiSeq is a well-established technique for generating mitogenome data from high-quality samples. To this end, a validation was conducted for this mitogenome method processing up to 24 samples simultaneously along with analysis in the CLC Genomics Workbench and utilizing the AQME (AFDIL-QIAGEN mtDNA Expert) tool to generate forensic profiles. This validation followed the Federal Bureau of Investigation's Quality Assurance Standards (QAS) for forensic DNA testing laboratories and the Scientific Working Group on DNA Analysis Methods (SWGDAM) validation guidelines. The evaluation of control DNA, non-probative samples, blank controls, mixtures, and nonhuman samples demonstrated the validity of this method. Specifically, the sensitivity was established at ≥25 pg of nuclear DNA input for accurate mitogenome profile generation. Unreproducible low-level variants were observed in samples with low amplicon yields. Further, variant quality was shown to be a useful metric for identifying sequencing error and crosstalk. Success of this method was demonstrated with a variety of reference sample substrates and extract types. These studies further demonstrate the advantages of using NGS techniques by highlighting the quantitative nature of heteroplasmy detection. The results presented herein from more than 175 samples processed in ten sequencing runs, show this mitogenome sequencing method and analysis strategy to be valid for the generation of reference data. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Multiple-wavelength neutron holography with pulsed neutrons

    PubMed Central

    Hayashi, Kouichi; Ohoyama, Kenji; Happo, Naohisa; Matsushita, Tomohiro; Hosokawa, Shinya; Harada, Masahide; Inamura, Yasuhiro; Nitani, Hiroaki; Shishido, Toetsu; Yubuta, Kunio

    2017-01-01

    Local structures around impurities in solids provide important information for understanding the mechanisms of material functions, because most of them are controlled by dopants. For this purpose, the x-ray absorption fine structure method, which provides radial distribution functions around specific elements, is most widely used. However, a similar method using neutron techniques has not yet been developed. If one can establish a method of local structural analysis with neutrons, then a new frontier of materials science can be explored owing to the specific nature of neutron scattering—that is, its high sensitivity to light elements and magnetic moments. Multiple-wavelength neutron holography using the time-of-flight technique with pulsed neutrons has great potential to realize this. We demonstrated multiple-wavelength neutron holography using a Eu-doped CaF2 single crystal and obtained a clear three-dimensional atomic image around trivalent Eu substituted for divalent Ca, revealing an interesting feature of the local structure that allows it to maintain charge neutrality. The new holography technique is expected to provide new information on local structures using the neutron technique. PMID:28835917

  5. Time series modeling in traffic safety research.

    PubMed

    Lavrenz, Steven M; Vlahogianni, Eleni I; Gkritza, Konstantina; Ke, Yue

    2018-08-01

    The use of statistical models for analyzing traffic safety (crash) data has been well-established. However, time series techniques have traditionally been underrepresented in the corresponding literature, due to challenges in data collection, along with a limited knowledge of proper methodology. In recent years, new types of high-resolution traffic safety data, especially in measuring driver behavior, have made time series modeling techniques an increasingly salient topic of study. Yet there remains a dearth of information to guide analysts in their use. This paper provides an overview of the state of the art in using time series models in traffic safety research, and discusses some of the fundamental techniques and considerations in classic time series modeling. It also presents ongoing and future opportunities for expanding the use of time series models, and explores newer modeling techniques, including computational intelligence models, which hold promise in effectively handling ever-larger data sets. The information contained herein is meant to guide safety researchers in understanding this broad area of transportation data analysis, and provide a framework for understanding safety trends that can influence policy-making. Copyright © 2017 Elsevier Ltd. All rights reserved.
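
    A hedged sketch of a classic time-series workflow for monthly crash counts: fit a seasonal ARIMA model and forecast a year ahead. The series is synthetic, and real safety data would first need stationarity checks and residual diagnostics.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(1)
    months = pd.date_range("2015-01", periods=72, freq="MS")
    crashes = pd.Series(100 + 10 * np.sin(2 * np.pi * months.month / 12)
                        + rng.normal(0, 5, 72), index=months)  # seasonal counts

    model = SARIMAX(crashes, order=(1, 0, 0), seasonal_order=(1, 0, 0, 12)).fit(disp=False)
    print(model.forecast(steps=12))  # expected monthly crash counts for the next year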

  6. Multiple-wavelength neutron holography with pulsed neutrons.

    PubMed

    Hayashi, Kouichi; Ohoyama, Kenji; Happo, Naohisa; Matsushita, Tomohiro; Hosokawa, Shinya; Harada, Masahide; Inamura, Yasuhiro; Nitani, Hiroaki; Shishido, Toetsu; Yubuta, Kunio

    2017-08-01

    Local structures around impurities in solids provide important information for understanding the mechanisms of material functions, because most of them are controlled by dopants. For this purpose, the x-ray absorption fine structure method, which provides radial distribution functions around specific elements, is most widely used. However, a similar method using neutron techniques has not yet been developed. If one can establish a method of local structural analysis with neutrons, then a new frontier of materials science can be explored owing to the specific nature of neutron scattering, that is, its high sensitivity to light elements and magnetic moments. Multiple-wavelength neutron holography using the time-of-flight technique with pulsed neutrons has great potential to realize this. We demonstrated multiple-wavelength neutron holography using a Eu-doped CaF2 single crystal and obtained a clear three-dimensional atomic image around trivalent Eu substituted for divalent Ca, revealing an interesting feature of the local structure that allows it to maintain charge neutrality. The new holography technique is expected to provide new information on local structures using the neutron technique.

  7. Trace elemental analysis of Indian natural moonstone gems by PIXE and XRD techniques.

    PubMed

    Venkateswara Rao, R; Venkateswarulu, P; Kasipathi, C; Sivajyothi, S

    2013-12-01

    A selected number of Indian Eastern Ghats natural moonstone gems were studied with the powerful, non-destructive nuclear analytical technique of Proton Induced X-ray Emission (PIXE). Thirteen elements, including V, Co, Ni, Zn, Ga, Ba and Pb, were identified in these moonstones and may be useful in interpreting the various geochemical conditions and the probable cause of their inceptions in the moonstone gemstone matrix. Furthermore, preliminary XRD studies of different moonstone patterns were performed. The PIXE technique is a powerful method for quickly determining the elemental concentration of a substance. A 3 MeV proton beam was employed to excite the samples. The chemical constituents of moonstones from parts of the Eastern Ghats geological formations of Andhra Pradesh, India were determined, and gemological studies were performed on those gems. The crystal structure and the lattice parameters of the moonstones were estimated using X-ray diffraction studies, trace and minor elements were determined using the PIXE technique, and major compositional elements were confirmed by XRD. In the present work, the usefulness and versatility of the PIXE technique for research in geo-scientific methodology are established. © 2013 Elsevier Ltd. All rights reserved.

  8. [THE MODES OF EVALUATION OF TYPE OF DEHYDRATION IN CHILDREN HOSPITALIZED BECAUSE OF ACUTE INTESTINAL INFECTION].

    PubMed

    Krieger, E A; Samodova, O V; Gulakova, N N; Aruiev, A B; Krylova, L A; Titova, L V

    2015-11-01

    Every year about 800,000 cases of intestinal infection end in a lethal outcome due to dehydration. Different types of dehydration require different approaches to correction. Routine determination of blood plasma osmolarity in children with exsiccosis is not universally applied because instrumental measurement is often unavailable. Techniques are therefore needed that make it possible to indirectly determine the type of dehydration in children hospitalized because of acute intestinal infection, with the purpose of applying rational therapy for water-electrolyte disorders. A sample of 32 patients with intestinal infections accompanied by signs of exsiccosis of degree I-III was examined. Blood osmolarity was determined instrumentally using an ABL 800 Flex gas analyzer (Radiometer, Denmark) and by five estimation techniques based on the results of biochemical blood analysis. The differences in precision between the instrumental and estimation techniques for plasma osmolarity were compared using the Bland-Altman graphical technique. It was established that the formula 2 × [Na+]plasma + [glucose]plasma (mmol/l) is the most precise; its application provided results comparable with the values determined by the instrumental method.
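
    A minimal sketch of the estimation formula the study found most precise, as reconstructed from the abstract: estimated plasma osmolarity = 2 × [Na+] + [glucose], with both concentrations in mmol/l.

    def estimated_osmolarity(sodium_mmol_l, glucose_mmol_l):
        # 2 * [Na+] + [glucose], all concentrations in mmol/l
        return 2 * sodium_mmol_l + glucose_mmol_l

    print(estimated_osmolarity(140, 5.5))  # ~285.5 mOsm/l for typical plasma values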

  9. Autoclave decomposition method for metals in soils and sediments.

    PubMed

    Navarrete-López, M; Jonathan, M P; Rodríguez-Espinosa, P F; Salgado-Galeana, J A

    2012-04-01

    Leaching of partially leachable metals (Fe, Mn, Cd, Co, Cu, Ni, Pb, and Zn) was performed using an autoclave technique modified from the EPA 3051A digestion technique. The autoclave method was developed as an alternative to the regular digestion procedure; it passed the safety norms for partial extraction of metals in polytetrafluoroethylene (PFA) vessels at a low constant temperature (119.5 ± 1.5 °C), and the recovery of elements was also precise. The autoclave method was validated using two Standard Reference Materials (SRMs: Loam Soil B and Loam Soil D), and the recoveries were as good as those of the traditionally established digestion methods. The autoclave method was then applied to samples from different natural environments (beach, mangrove, river, and city soil) to test the reproducibility of element recovery during subsequent analysis.
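
    A minimal sketch of the SRM recovery check used to validate a digestion method: measured concentrations are compared against certified values. The numbers below are hypothetical, not the SRM certificates.

    certified_mg_kg = {"Pb": 24.0, "Zn": 112.0, "Cu": 31.0}  # hypothetical certificates
    measured_mg_kg = {"Pb": 23.1, "Zn": 108.5, "Cu": 30.2}   # autoclave digests

    for element, certified in certified_mg_kg.items():
        recovery = 100 * measured_mg_kg[element] / certified
        print(f"{element}: {recovery:.1f}% recovery")  # values near 100% support the method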

  10. Enhancing Security by System-Level Virtualization in Cloud Computing Environments

    NASA Astrophysics Data System (ADS)

    Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei

    Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient of almost all cloud computing systems. Through virtual environments, a cloud provider is able to run the varieties of operating systems needed by each cloud user. Virtualization can improve the reliability, security, and availability of applications by using consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, the definition of cloud computing is given, and then the service and deployment models are introduced. An analysis of security issues and challenges in the implementation of cloud computing is presented. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.

  11. Photoluminescence kinetics in CdS nanoclusters formed by the Langmuir-Blodgett technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zarubanov, A. A., E-mail: alexsundr@mail.ru; Zhuravlev, K. S.

    2015-03-15

    The photoluminescence kinetics in CdS nanocrystals produced by the Langmuir-Blodgett technique is studied at a temperature of 5 K. The photoluminescence kinetics is described by the sum of two exponential functions, with characteristic times of about 30 and 160 ns. It is found that the fast and slow decay times become longer as the nanocrystal size increases. Analysis of the data shows that the fast decay time is controlled by trion recombination in nanocrystals with defects, whereas the slow decay time is controlled by the annihilation of optically inactive excitons in nanocrystals without defects. It is established that, as the nanocrystal size is decreased, the fraction of imperfect nanocrystals is reduced because of an increase in the energy of defect formation.
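
    A minimal sketch of the two-exponential decay analysis described above, fitted with scipy; the synthetic decay uses components near the reported ~30 ns and ~160 ns times.

    import numpy as np
    from scipy.optimize import curve_fit

    def biexp(t, a1, tau1, a2, tau2):
        return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

    t = np.linspace(0, 800, 400)  # time axis in ns
    rng = np.random.default_rng(5)
    signal = biexp(t, 1.0, 30.0, 0.4, 160.0) + rng.normal(0, 0.01, t.size)

    popt, _ = curve_fit(biexp, t, signal, p0=(1, 20, 0.5, 100))
    print(popt)  # recovered amplitudes and fast/slow decay times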

  12. Land use/land cover mapping (1:25000) of Taiwan, Republic of China by automated multispectral interpretation of LANDSAT imagery

    NASA Technical Reports Server (NTRS)

    Sung, Q. C.; Miller, L. D.

    1977-01-01

    Three methods were tested for collecting the training sets needed to establish the spectral signatures of the land uses/land covers sought, given the difficulties of retrospectively collecting representative ground control data. Computer preprocessing techniques applied to the digital images to improve the final classification results were geometric corrections, spectral band or image ratioing, and statistical cleaning of the representative training sets. A minimal level of statistical verification was made based upon comparisons between the airphoto estimates and the classification results. The verifications provided further support for the selection of MSS bands 5 and 7. They also indicated that the maximum likelihood ratioing technique can achieve classification results in closer agreement with the airphoto estimates than the stepwise discriminant analysis.
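
    A hedged sketch of two techniques named above: a band ratio as an extra feature and a Gaussian maximum-likelihood classifier (here scikit-learn's QDA, which implements the per-class-Gaussian ML rule). The two-band training pixels are synthetic stand-ins, not LANDSAT data.

    import numpy as np
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

    rng = np.random.default_rng(2)
    water = rng.normal([20.0, 8.0], 2.0, (50, 2))    # low band-7 response
    forest = rng.normal([35.0, 40.0], 3.0, (50, 2))
    X = np.vstack([water, forest])
    y = np.array([0] * 50 + [1] * 50)                # 0 = water, 1 = forest

    ratio = (X[:, 1] / X[:, 0]).reshape(-1, 1)       # band ratio feature
    clf = QuadraticDiscriminantAnalysis().fit(np.hstack([X, ratio]), y)
    print(clf.predict([[22.0, 9.0, 9.0 / 22.0]]))    # -> [0], the water class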

  13. Correlation between macro- and nano-scopic measurements of carbon nanostructured paper elastic modulus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Omar, Yamila M.; Al Ghaferi, Amal, E-mail: aalghaferi@masdar.ac.ae, E-mail: mchiesa@masdar.ac.ae; Chiesa, Matteo, E-mail: aalghaferi@masdar.ac.ae, E-mail: mchiesa@masdar.ac.ae

    2015-07-20

    Extensive work has been done in order to determine the bulk elastic modulus of isotropic samples from force curves acquired with atomic force microscopy. However, new challenges are encountered given the development of new materials constructed of one-dimensional anisotropic building blocks, such as carbon nanostructured paper. In the present work, we establish a reliable framework to correlate the elastic modulus values obtained by amplitude modulation atomic force microscope force curves, a nanoscopic technique, with those determined by traditional macroscopic tensile testing. In order to do so, several techniques involving image processing, statistical analysis, and simulations are used to find the appropriate path to understand how macroscopic properties arise from anisotropic nanoscale components, and ultimately to calculate the value of the bulk elastic modulus.

  14. Evaluation of a new eastern blotting technique for the analysis of ginsenoside Re in American ginseng berry pulp extracts.

    PubMed

    Morinaga, Osamu; Uto, Takuhiro; Yuan, Chun-Su; Tanaka, Hiroyuki; Shoyama, Yukihiro

    2010-06-01

    A new eastern blotting technique has been established for ginsenoside Re (G-Re) contained in American ginseng berry pulp extracts. G-Re in American ginseng berry pulp was extracted using 100% methanol, 100% ethanol, 50% aqueous methanol, and 50% aqueous ethanol. The combined crude extracts were applied onto a polyethersulfone membrane and developed using the methanol-water-acetic acid solvent system (45:55:1 v/v). Separated components were immunostained using anti-G-Re monoclonal antibody. G-Re was first specifically detected and then quantitatively analyzed using NIH Imaging software. We also confirmed that the most suitable solvent was 50% aqueous methanol for extracting G-Re from American ginseng berry pulp. (c) 2009 Elsevier B.V. All rights reserved.

  15. A rheumatoid arthritis study by Fourier transform infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Carvalho, Carolina S.; Silva, Ana Carla A.; Santos, Tatiano J. P. S.; Martin, Airton A.; dos Santos Fernandes, Ana Célia; Andrade, Luís E.; Raniero, Leandro

    2012-01-01

    Rheumatoid arthritis is a systemic inflammatory disease of unknown cause, and new methods to identify it in its early stages are needed. The main purpose of this work is the biochemical differentiation of sera between normal and RA patients through the establishment of a statistical method that can be appropriately used for serological analysis. Human sera from 39 healthy donors and 39 rheumatic donors were collected and analyzed by Fourier Transform Infrared Spectroscopy. The results show significant spectral variations, with p<0.05, in regions corresponding to proteins, lipids and immunoglobulins. The indirect agglutination technique using latex particles coated with human IgG and monoclonal anti-CRP (the FR and CRP tests) was performed to confirm possible false-negative results within the groups, facilitating the statistical interpretation and validation of the technique.
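
    A hedged sketch of the group-comparison step: testing whether mean absorbance in a given spectral band differs between healthy and RA sera at p<0.05. The band values are synthetic placeholders for the study's 39-versus-39 comparison.

    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(3)
    healthy_band = rng.normal(0.50, 0.05, 39)  # mean absorbance in one band
    ra_band = rng.normal(0.56, 0.05, 39)

    t, p = ttest_ind(healthy_band, ra_band)
    print(f"t = {t:.2f}, p = {p:.4f}")  # p < 0.05 flags a significant band difference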

  16. Analysis of medieval limestone sculpture from southwestern France and the Paris Basin by NAA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holmes, L.; Harbottle, G.

    1994-12-31

    Compositional characterization of limestone from sources known to medieval craftsmen and from the monuments they built can be used in conjunction with stylistic and iconographic criteria to infer the geographic origin of sculptures that have lost their histories. Limestone from 47 quarrying locations in France and from numerous medieval monuments has been subjected to neutron activation analysis (NAA) to form the nucleus of the Brookhaven Limestone Database. Even though the method and techniques of NAA are well established, this paper briefly summarizes the parameters and experimental conditions useful for determining those concentration variables for which limestone from different sources exhibits significant and reproducible differences.

  17. LANDSAT technology transfer to the private and public sectors through community colleges and other locally available institutions

    NASA Technical Reports Server (NTRS)

    Rogers, R. H. (Principal Investigator)

    1980-01-01

    Major first-year accomplishments are summarized and plans are provided for the next 12-month period for a program established by NASA with the Environmental Research Institute of Michigan to investigate methods of making LANDSAT technology readily available to a broader set of private sector firms through local community colleges. The program applies a network whose major participants are NASA, universities or research institutes, and community colleges. Participants obtain hands-on training in LANDSAT data analysis techniques using a desk-top, interactive remote analysis station which communicates with a central computing facility via telephone line and provides for generation of land cover maps and data products via remote command.

  18. Stochastic analysis of a novel nonautonomous periodic SIRI epidemic system with random disturbances

    NASA Astrophysics Data System (ADS)

    Zhang, Weiwei; Meng, Xinzhu

    2018-02-01

    In this paper, a new stochastic nonautonomous SIRI epidemic model is formulated. Given that the incidence rates of diseases may change with the environment, we propose a novel type of transmission function. The main aim of this paper is to obtain the thresholds of the stochastic SIRI epidemic model. To this end, we investigate the dynamics of the stochastic system and establish the conditions for extinction and persistence in mean of the disease by constructing suitable Lyapunov functions and using stochastic analysis techniques. Furthermore, we show that the stochastic system has at least one nontrivial positive periodic solution. Finally, numerical simulations are introduced to illustrate our results.
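
    A hedged sketch of how such a stochastic epidemic system is typically explored numerically: an Euler-Maruyama step for an SIRI-type model with multiplicative noise on transmission. The drift terms below are a generic SIRI skeleton (with a relapse flow from R back to I), not the paper's exact nonautonomous system.

    import numpy as np

    def euler_maruyama_siri(T=100.0, dt=0.01, beta=0.4, mu=0.02, gamma=0.1,
                            relapse=0.05, sigma=0.05, seed=0):
        rng = np.random.default_rng(seed)
        n = int(T / dt)
        S, I, R = 0.9, 0.1, 0.0
        path = np.empty((n, 3))
        for k in range(n):
            dW = rng.normal(0.0, np.sqrt(dt))                 # Brownian increment
            new_inf = beta * S * I * dt + sigma * S * I * dW  # noisy incidence
            S += mu * dt - new_inf - mu * S * dt              # recruitment, infection, death
            I += new_inf + relapse * R * dt - (gamma + mu) * I * dt
            R += gamma * I * dt - (relapse + mu) * R * dt
            path[k] = S, I, R
        return path

    print(euler_maruyama_siri()[-1])  # (S, I, R) near time T for one sample path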

  19. Development of new materials for turbopump bearings

    NASA Technical Reports Server (NTRS)

    Maurer, R. E.; Pallini, R. A.

    1985-01-01

    The life requirement for the angular contact ball bearings in the Space Shuttle Main Engine (SSME) high pressure oxygen turbopump (HPOTP) is 7.5 hours. In actual operation, significantly shorter service life was experienced. The objective is to identify bearing materials and/or materials processing techniques offering significant potential for extending HPOTP bearing performance life. Interactive thermomechanical analysis of the HPOTP bearing-shaft system was performed with the SHABERTH computer program. Bearing fatigue life, ball-race contact stress, heat generation rate, bulk ring temperatures and circumferential stress in the inner rings were quantified as functions of radial load, thrust load and ball-race contact friction. Criteria established from the output of this analysis are being used for material candidate selection.

  20. [Application of near-infrared spectroscopy to agriculture and food analysis].

    PubMed

    Wang, Duo-jia; Zhou, Xiang-yang; Jin, Tong-ming; Hu, Xiang-na; Zhong, Jiao-e; Wu, Qi-tang

    2004-04-01

    Near-Infrared Spectroscopy (NIRS) was the most rapidly developing and most noticeable spectrographic technique of the 1990s. Its principle and characteristics are explained in this paper, and the development of NIRS instrumentation, the methodology of spectrum pre-processing, and chemometrics are also introduced. The authors mainly summarize the applications to agriculture and food, especially in-line analysis methods, which have been used in production processes via fiber optics. The authors analyze the status of NIRS applications in China and propose, for the first time, establishing an information-sharing mode between a central database and end-users by using network technology and concentrating valuable resources.
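
    A hedged sketch of a routine NIRS calibration of the kind surveyed above: standard normal variate (SNV) pre-processing followed by PLS regression against a reference property. The spectra and reference values are synthetic placeholders, and the 5-component choice is arbitrary.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(4)
    spectra = rng.random((30, 200))      # 30 samples x 200 wavelengths
    protein = rng.normal(12.0, 1.5, 30)  # e.g. % protein from wet chemistry

    snv = ((spectra - spectra.mean(axis=1, keepdims=True))
           / spectra.std(axis=1, keepdims=True))  # per-spectrum SNV scatter correction
    pls = PLSRegression(n_components=5).fit(snv, protein)
    print(pls.predict(snv[:1]))          # prediction for the first sample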
