Science.gov

Sample records for activation analysis applied

  1. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2002-01-01

    This viewgraph report presents an overview of activities and accomplishments of NASA's Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group. Expertise in this group focuses on high-fidelity fluids design and analysis with application to space shuttle propulsion and next generation launch technologies. Topics covered include: computational fluid dynamics research and goals, turbomachinery research and activities, nozzle research and activities, combustion devices, engine systems, MDA development and CFD process improvements.

  2. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Griffin, Lisa; Williams, Robert

    2004-01-01

    This paper presents viewgraphs on NASA Marshall Space Flight Center's Applied Fluid Dynamics Analysis Group Activities. The topics include: 1) Status of programs at MSFC; 2) Fluid Mechanics at MSFC; 3) Relevant Fluid Dynamics Activities at MSFC; and 4) Shuttle Return to Flight.

  3. Overview of MSFC's Applied Fluid Dynamics Analysis Group Activities

    NASA Technical Reports Server (NTRS)

    Garcia, Roberto; Wang, Tee-See; Griffin, Lisa; Turner, James E. (Technical Monitor)

    2001-01-01

    This document is a presentation graphic which reviews the activities of the Applied Fluid Dynamics Analysis Group at Marshall Space Flight Center (i.e., Code TD64). The group's work focuses on supporting space transportation programs through Computational Fluid Dynamics tool development driven by hardware design needs. The major applications for the design and analysis tools are: turbines, pumps, propulsion-to-airframe integration, and combustion devices.

  4. Activity anorexia: An interplay between basic and applied behavior analysis

    PubMed Central

    Pierce, W. David; Epling, W. Frank; Dews, Peter B.; Estes, William K.; Morse, William H.; Van Orman, Willard; Herrnstein, Richard J.

    1994-01-01

    The relationship between basic research with nonhumans and applied behavior analysis is illustrated by our work on activity anorexia. When rats are fed one meal a day and allowed to run on an activity wheel, they run excessively, stop eating, and die of starvation. Convergent evidence, from several different research areas, indicates that the behavior of these animals and humans who self-starve is functionally similar. A biobehavioral theory of activity anorexia is presented that details the cultural contingencies, behavioral processes, and physiology of anorexia. Diagnostic criteria and a three-stage treatment program for activity-based anorexia are outlined. The animal model permits basic research on anorexia that for practical and ethical reasons cannot be conducted with humans. Thus, basic research can have applied importance. PMID:22478169

  5. Applying observations of work activity in designing prototype data analysis tools

    SciTech Connect

    Springmeyer, R.R.

    1993-07-06

    Designers, implementers, and marketers of data analysis tools typically have different perspectives than users. Consequently, data analysts often find themselves using tools focused on graphics and programming concepts rather than concepts which reflect their own domain and the context of their work. Some user studies focus on usability tests late in development; others observe work activity, but fail to show how to apply that knowledge in design. This paper describes a methodology for applying observations of data analysis work activity in prototype tool design. The approach can be used both in designing improved data analysis tools and in customizing visualization environments to specific applications. We present an example of user-centered design for a prototype tool to cull large data sets. We revisit the typical graphical approach of animating a large data set from the point of view of an analyst who is culling data. Field evaluations using the prototype tool not only revealed valuable usability information, but initiated in-depth discussions about users' work, tools, technology, and requirements.

  6. A laser-induced repetitive fast neutron source applied for gold activation analysis

    SciTech Connect

    Lee, Sungman; Park, Sangsoon; Lee, Kitae; Cha, Hyungki

    2012-12-15

    A laser-induced repetitively operated fast neutron source was developed for applications in laser-driven nuclear physics research. The developed neutron source, which has a neutron yield of approximately 4 × 10⁵ n/pulse and can be operated up to a pulse repetition rate of 10 Hz, was applied for a gold activation analysis. Relatively strong delayed gamma spectra of the activated gold were measured at 333 keV and 355 keV, and proved the possibility of the neutron source for activation analyses. In addition, the nuclear reactions responsible for the measured gamma spectra of gold were elucidated by the 14 MeV fast neutrons resulting from the D(t,n)⁴He nuclear reaction, for which the required tritium originated from the primary fusion reaction, D(d,p)T.

  7. A laser-induced repetitive fast neutron source applied for gold activation analysis.

    PubMed

    Lee, Sungman; Park, Sangsoon; Lee, Kitae; Cha, Hyungki

    2012-12-01

    A laser-induced repetitively operated fast neutron source was developed for applications in laser-driven nuclear physics research. The developed neutron source, which has a neutron yield of approximately 4 × 10⁵ n/pulse and can be operated up to a pulse repetition rate of 10 Hz, was applied for a gold activation analysis. Relatively strong delayed gamma spectra of the activated gold were measured at 333 keV and 355 keV, and proved the possibility of the neutron source for activation analyses. In addition, the nuclear reactions responsible for the measured gamma spectra of gold were elucidated by the 14 MeV fast neutrons resulting from the D(t,n)⁴He nuclear reaction, for which the required tritium originated from the primary fusion reaction, D(d,p)T. PMID:23277984
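
    Delayed gamma activities of the kind reported here are governed, to first order, by the standard activation and decay relation used in neutron activation analysis, A = φ·σ·N·(1 − e^(−λ·t_irr))·e^(−λ·t_cool). A minimal Python sketch of that relation follows; the flux, cross section, atom count and half-life are illustrative placeholders, not values taken from the paper.

      # Illustrative sketch of the standard neutron-activation relation:
      # A = phi * sigma * N * (1 - exp(-lambda*t_irr)) * exp(-lambda*t_cool).
      # All numerical values below are placeholders, not data from the paper.
      import math

      def induced_activity(flux, sigma_cm2, n_atoms, half_life_s, t_irr_s, t_cool_s):
          """Activity (Bq) of a product nuclide after irradiation and cooling."""
          lam = math.log(2) / half_life_s                # decay constant (1/s)
          saturation = 1.0 - math.exp(-lam * t_irr_s)    # build-up during irradiation
          decay = math.exp(-lam * t_cool_s)              # decay after irradiation ends
          return flux * sigma_cm2 * n_atoms * saturation * decay

      # Hypothetical numbers for a thin gold foil in a pulsed 14 MeV neutron field
      flux = 1.0e4                  # n / (cm^2 s), time-averaged (placeholder)
      sigma = 2.0e-24               # cm^2, assumed (n,2n) cross section (placeholder)
      n_atoms = 3.0e20              # number of gold atoms in the foil (placeholder)
      half_life = 6.0 * 24 * 3600   # s, illustrative half-life of order a few days

      print(induced_activity(flux, sigma, n_atoms, half_life, t_irr_s=3600, t_cool_s=600))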

  8. Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  9. The Fourier analysis applied to the relationship between ⁷Be activity in the Serbian atmosphere and meteorological parameters.

    PubMed

    Rajačić, M M; Todorović, D J; Krneta Nikolić, J D; Janković, M M; Djurdjević, V S

    2016-09-01

    Air sample monitoring in Serbia, Belgrade started in the 1960s, while ⁷Be activity in air and total (dry and wet) deposition has been monitored for the last 22 years by the Environment and Radiation Protection Department of the Institute for Nuclear Sciences, Vinca. Using this data collection, the changes of the ⁷Be activity in the air and the total (wet and dry) deposition samples, as well as their correlation with meteorological parameters (temperature, pressure, cloudiness, sunshine duration, precipitation and humidity) that affect ⁷Be concentration in the atmosphere, were mathematically described using Fourier analysis. Fourier analysis confirmed the expected: the frequency with the largest intensity in the harmonic spectra of the ⁷Be activity corresponds to a period of 1 year, the same as the largest-intensity frequency in the Fourier series of the meteorological parameters. To assess the quality of the results produced by the Fourier analysis, we compared the measured values of the parameters with the values calculated according to the Fourier series. Absolute deviations between measured and predicted mean monthly values range from 0.02 mBq/m³ to 0.7 mBq/m³ for ⁷Be activity in air, and from 0.01 Bq/m² to 0.6 Bq/m² for ⁷Be activity in deposition samples. The relatively good agreement of measured and predicted results offers the possibility of predicting the ⁷Be activity. PMID:27396670
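
    As an illustration of the kind of Fourier-series fit described above, the following minimal Python sketch fits a mean plus a one-year harmonic to a synthetic monthly activity series and reports the absolute deviation between measured and predicted values; the synthetic data stand in for the monitoring record and are not the authors' data.

      # Minimal sketch (not the authors' code): fit an annual Fourier harmonic to a
      # monthly activity series and compare fitted with measured values.
      # The synthetic data below are placeholders for illustration only.
      import numpy as np

      months = np.arange(120)                              # 10 years of monthly samples
      rng = np.random.default_rng(0)
      measured = 3.0 + 1.2 * np.cos(2 * np.pi * months / 12 - 0.6) + rng.normal(0, 0.3, months.size)

      # Design matrix for mean + 1-year harmonic: a0 + a1*cos(w*t) + b1*sin(w*t)
      w = 2 * np.pi / 12.0
      X = np.column_stack([np.ones(months.size), np.cos(w * months), np.sin(w * months)])
      coef, *_ = np.linalg.lstsq(X, measured, rcond=None)
      predicted = X @ coef

      abs_dev = np.abs(measured - predicted)
      print("mean absolute deviation:", abs_dev.mean())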

  10. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  11. Activities report in applied physics

    NASA Astrophysics Data System (ADS)

    Research concerning acoustics, heat, architecture, materials research, and (optical) instrumentation is presented; active noise control and acoustic path identification were investigated. Energy conservation, solar energy, and building physics activities were carried out. Ultraviolet absorbing glasses, glass fibers, sheet glass, and aluminium and silicon oxynitrides were studied. Glass fiber based sensor and laser applications, and optical space instrumentation are discussed. Signal processing, sensors, and integrated electronics applications were developed. Scale model experiments for flow-induced noise and vibrations, caused by engines, ventilators, wind turbines, and propellers, were executed. A multispectral charge coupled device airborne scanner, with four modules (one for forward observations), is described. A ground radar, based on seismic exploration signal processing and used for the location of pipes, sewers and cables, was developed.

  12. Prolonged applied potential to anode facilitate selective enrichment of bio-electrochemically active Proteobacteria for mediating electron transfer: microbial dynamics and bio-catalytic analysis.

    PubMed

    Kannaiah Goud, R; Mohan, S Venkata

    2013-06-01

    Prolonged application of poised potential to the anode was evaluated to understand the influence of applied potentials [500 mV (E500); 1000 mV (E1000); 2000 mV (E2000)] on the bio-electrogenic activity of a microbial fuel cell (MFC) and the resulting dynamics in the microbial community in comparison to a control operation. The E1000 system documented higher electrogenic activity (309 mW/m²) followed by E500 (143 mW/m²), E2000 (112 mW/m²) and control (65 mW/m²) operations. The improved power output at the optimum applied potential (1000 mV) might be attributed to the enrichment of electrochemically active bacteria belonging mainly to the phylum Proteobacteria, with a lesser extent of Firmicutes, which helped in effective (mediated) electron transfer through the release of exogenous shuttlers. Improved bio-electrogenic activity due to enrichment at the 1000 mV applied potential also correlated well with the observed cytochrome-c peaks on the voltammogram, lower ionic ohmic losses and bio-electrokinetic analysis. Electric shock at the higher applied potential (E2000) resulted in the survival of fewer microbial species, leading to lower electrogenesis.

  13. Conversation Analysis and Applied Linguistics.

    ERIC Educational Resources Information Center

    Schegloff, Emanuel A.; Koshik, Irene; Jacoby, Sally; Olsher, David

    2002-01-01

    Offers bibliographical guidance on several major areas of conversation-analytic work--turn-taking, repair, and word selection--and indicates past or potential points of contact with applied linguistics. Also discusses areas of applied linguistic work. (Author/VWL)

  14. Applied PhD Research in a Work-Based Environment: An Activity Theory-Based Analysis

    ERIC Educational Resources Information Center

    Granata, S. N.; Dochy, F.

    2016-01-01

    Activity theory is used to compare PhD undertaken at university, that is, academic PhD, with PhD performed in collaboration with industry, that is, semi-industrial PhD. The research is divided into a literature review and a case study. Semi-industrial and academic PhD are modelled as activity systems, and differences are highlighted in terms of…

  15. Activation analysis

    SciTech Connect

    Alfassi, Z.B. (Dept. of Nuclear Engineering)

    1990-01-01

    This volume contains 16 chapters on the application of activation analysis in the fields of life sciences, biological materials, coal and its effluents, environmental samples, archaeology, material science, and forensics. Each chapter is processed separately for the data base.

  16. Increasing Active Student Responding in a University Applied Behavior Analysis Course: The Effect of Daily Assessment and Response Cards on End of Week Quiz Scores

    ERIC Educational Resources Information Center

    Malanga, Paul R.; Sweeney, William J.

    2008-01-01

    The study compared the effects of daily assessment and response cards on average weekly quiz scores in an introduction to applied behavior analysis course. An alternating treatments design (Kazdin 1982, "Single-case research designs." New York: Oxford University Press; Cooper et al. 2007, "Applied behavior analysis." Upper Saddle River:…

  17. On differentiation in applied behavior analysis

    PubMed Central

    Fawcett, Stephen B.

    1985-01-01

    Distinct types of activity in the field of applied behavior analysis are noted and discussed. Four metaphorical types of activity are considered: prospecting, farming, building, and guiding. Prospecting consists of time-limited exploration of a variety of behaviors, populations, or settings. Farming consists of producing new behaviors in the same setting using independent variables provided by the researchers or normally available in the setting. Building consists of combining procedural elements to create new programs or systems or to rehabilitate aspects of existing programs. Guiding involves pointing out connections between the principles of human behavior and the problems, populations, settings, and procedures with which researchers are (or could be) working. Advantages of each sphere are noted, and benefits of this division of labor to the field as a whole are discussed. PMID:22478631

  18. Active magnetic bearings applied to industrial compressors

    NASA Technical Reports Server (NTRS)

    Kirk, R. G.; Hustak, J. F.; Schoeneck, K. A.

    1993-01-01

    The design and shop test results are given for a high-speed eight-stage centrifugal compressor supported by active magnetic bearings. A brief summary of the basic operation of active magnetic bearings and the required rotor dynamics analysis are presented with specific attention given to design considerations for optimum rotor stability. The concerns for retrofits of magnetic bearings in existing machinery are discussed with supporting analysis of a four-stage centrifugal compressor. The current status of industrial machinery in North America using this new support system is presented and recommendations are given on design and analysis requirements for successful machinery operation of either retrofit or new design turbomachinery.

  19. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  20. Concept analysis of culture applied to nursing.

    PubMed

    Marzilli, Colleen

    2014-01-01

    Culture is an important concept, especially when applied to nursing. A concept analysis of culture is essential to understanding the meaning of the word. This article applies Rodgers' (2000) concept analysis template and provides a definition of the word culture as it applies to nursing practice. This article supplies examples of the concept of culture to aid the reader in understanding its application to nursing and includes a case study demonstrating components of culture that must be respected and included when providing health care.

  1. Artificial intelligence technologies applied to terrain analysis

    SciTech Connect

    Wright, J.C.; Powell, D.R.

    1990-01-01

    The US Army Training and Doctrine Command is currently developing, in cooperation with Los Alamos National Laboratory, a Corps-level combat simulation to support military analytical studies. This model emphasizes high-resolution modeling of the command and control processes, with particular attention to architectural considerations that enable extension of the model. A planned future extension is the inclusion of a computer-based planning capability for command echelons that can be dynamically invoked during the execution of the model. Command and control is the process through which the activities of military forces are directed, coordinated, and controlled to achieve the stated mission. To perform command and control the commander must understand the mission, perform terrain analysis, and understand his own situation and capabilities as well as the enemy situation and his probable actions. To support computer-based planning, data structures must be available to support the computer's ability to "understand" the mission, terrain, own capabilities, and enemy situation. The availability of digitized terrain makes it feasible to apply artificial intelligence technologies to emulate the terrain analysis process, producing data structures for use in planning. The work done thus far to support the understanding of terrain is the topic of this paper. 13 refs., 5 figs., 6 tabs.
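
    As a hedged illustration of the kind of terrain data structure such planning might consume (not the model described in the paper), the following Python sketch derives a simple trafficability grid from a digital elevation model by thresholding local slope; the cell size and slope limit are assumptions.

      # Hedged sketch of one way terrain data structures for planning can be derived
      # from digitized elevation data: classify grid cells as trafficable or not by a
      # slope threshold. Cell size and threshold are illustrative assumptions.
      import numpy as np

      def traversability(dem, cell_size_m=100.0, max_slope_deg=15.0):
          """Return a boolean grid: True where local slope is below the threshold."""
          dz_dy, dz_dx = np.gradient(dem, cell_size_m)     # finite-difference slopes
          slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
          return slope_deg <= max_slope_deg

      dem = np.random.default_rng(1).normal(500.0, 20.0, (50, 50))  # placeholder elevations (m)
      print(traversability(dem).mean(), "fraction of cells trafficable")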

  2. Defining applied behavior analysis: An historical analogy

    PubMed Central

    Deitz, Samuel M.

    1982-01-01

    This article examines two criteria for a definition of applied behavior analysis. The criteria are derived from a 19th century attempt to establish medicine as a scientific field. The first criterion, experimental determinism, specifies the methodological boundaries of an experimental science. The second criterion, philosophic doubt, clarifies the tentative nature of facts and theories derived from those facts. Practices which will advance the science of behavior are commented upon within each criterion. To conclude, the problems of a 19th century form of empiricism in medicine are related to current practices in applied behavior analysis. PMID:22478557

  3. Applied mathematics analysis of the multibody systems

    NASA Astrophysics Data System (ADS)

    Sahin, H.; Kar, A. K.; Tacgin, E.

    2012-08-01

    A methodology is developed for the analysis of multibody systems and applied to a vehicle as a case study. A previous study emphasizes the derivation of the multibody dynamics equations of motion for a bogie [2]. In this work, we have developed a guideline for the analysis of the dynamical behavior of multibody systems, mainly for the validation and verification of the realistic mathematical model and partly for the design of alternative optimum vehicle parameters.

  4. Applied behavior analysis and statistical process control?

    PubMed Central

    Hopkins, B L

    1995-01-01

    This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Examples of these kinds of results and decisions are drawn from the cases and data Pfadt and Wheeler present. This paper also describes and clarifies many common misconceptions about SPC, including W. Edwards Deming's involvement in its development, its relationship to total quality management, and its confusion with various other methods designed to detect sources of unwanted variability. PMID:7592156

  5. Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  6. Caldwell University's Department of Applied Behavior Analysis.

    PubMed

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis. PMID:27606194

  7. Epithermal neutron activation, radiometric, correlation and principal component analysis applied to the distribution of major and trace elements in some igneous and metamorphic rocks from Romania.

    PubMed

    Cristache, C I; Duliu, O G; Culicov, O A; Frontasyeva, M V; Ricman, C; Toma, M

    2009-05-01

    Six major (Na, Al, K, Ca, Ti, Fe) and 28 trace (Sc, Cr, V, Mn, Co, Zn, Cu, As, Br, Sr, Rb, Zr, Mo, Sn, Sb, Ba, Cs, La, Ce, Nd, Eu, Sm, Tb, Hf, Ta, W, Th and U) elements were determined by epithermal neutron activation analysis (ENAA) in nine samples of igneous and metamorphic rocks from the Meridional Carpathians and Macin Mountains. Correlation and principal factor analysis were used to interpret the data, while natural radionuclide radiometry shows a good correlation with the ENAA results.
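
    The correlation and principal-component steps named above can be sketched in a few lines of Python; the random matrix below merely stands in for the (samples × elements) concentration table and is not the authors' data.

      # Minimal sketch (not the authors' workflow): correlation matrix and principal
      # component analysis of element concentrations across rock samples.
      import numpy as np

      rng = np.random.default_rng(2)
      conc = rng.lognormal(mean=1.0, sigma=0.5, size=(9, 34))   # 9 samples, 34 elements (placeholder)

      corr = np.corrcoef(conc, rowvar=False)                    # element-element correlations

      # PCA via SVD of the standardized data
      z = (conc - conc.mean(axis=0)) / conc.std(axis=0, ddof=1)
      u, s, vt = np.linalg.svd(z, full_matrices=False)
      explained = s**2 / np.sum(s**2)
      print("variance explained by first two components:", explained[:2])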

  8. What happened to analysis in applied behavior analysis?

    PubMed

    Pierce, W D; Epling, W F

    1980-01-01

    This paper addresses the current help-oriented focus of researchers in applied behavior analysis. Evidence from a recent volume of JABA suggests that analytic behavior is at low levels in applied analysis while cure-help behavior is at high strength. This low proportion of scientific behavior is apparently related to cure-help contingencies set by institutions and agencies of help and the editorial policies of JABA itself. These contingencies have favored the flight to real people and a concern with client gains, evaluation and outcome strategies rather than the analysis of contingencies of reinforcement controlling human behavior. In this regard, the paper documents the current separation of applied behavior analysis from the experimental analysis of behavior. There is limited use of basic principles in applied analysis today and almost no reference to the current research in the experimental analysis of behavior involving concurrent operants and adjunctive behavior. This divorce of applied behavior research and the experimental analysis of behavior will militate against progress toward a powerful technology of behavior. In order to encourage a return to analysis in applied research, there is a need to consider the objectives of applied behavior analysis. The original purpose of behavioral technology is examined and a re-definition of the concept of "social importance" is presented which can direct applied researchers toward an analytic focus. At the same time, a change in the publication policies of applied journals such as JABA toward analytic research and the design of new educational contingencies for students will ensure the survival of analysis in applied behavior analysis. PMID:22478471

  9. Applied Pharmaceutical Analysis India 2014 conference report.

    PubMed

    Kole, Prashant; Barot, Deepak; Kotecha, Jignesh; Raina, Vijay; Rao, Mukkavilli; Yadav, Manish

    2014-01-01

    Applied Pharmaceutical Analysis (APA) India, 23-26 February 2014, Ahmedabad, India. The fifth Applied Pharmaceutical Analysis (APA) India meeting was held in February 2014 at the Hyatt Ahmedabad, India. With the theme of 'The Science of Measurement: Current status and Future trends in Bioanalysis, Biotransformation and Drug Discovery Platforms', the conference was attended by over 160 delegates. The agenda comprised advanced and relevant research topics in the key areas of bioanalysis and drug metabolism. APA India 2014 provided a unique platform for networking and professional linking to participants, innovators and policy-makers. As part of the global research community, APA India continues to grow and receive considerable attention from the drug discovery and development community of India.

  11. Tropospheric Delay Raytracing Applied in VLBI Analysis

    NASA Astrophysics Data System (ADS)

    MacMillan, D. S.; Eriksson, D.; Gipson, J. M.

    2013-12-01

    Tropospheric delay modeling error continues to be one of the largest sources of error in VLBI analysis. For standard operational solutions, we use the VMF1 elevation-dependent mapping functions derived from ECMWF data. These mapping functions assume that tropospheric delay at a site is azimuthally symmetric. As this assumption does not reflect reality, we have determined the raytrace delay along the signal path through the troposphere for each VLBI quasar observation. We determined the troposphere refractivity fields from the pressure, temperature, specific humidity and geopotential height fields of the NASA GSFC GEOS-5 numerical weather model. We discuss results from analysis of the CONT11 R&D and the weekly operational R1+R4 experiment sessions. When applied in VLBI analysis, baseline length repeatabilities were better for 66-72% of baselines with raytraced delays than with VMF1 mapping functions. Vertical repeatabilities were better for 65% of sites.
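
    For orientation, the following minimal Python sketch integrates an assumed exponential refractivity profile to a zenith delay and scales it with a crude 1/sin(elevation) factor; actual raytracing, as used in this record, traces the bent signal path through the full three-dimensional refractivity field rather than applying such a mapping.

      # Rough, hedged sketch: integrate a refractivity profile to a zenith delay and
      # scale it with a simple 1/sin(elevation) factor. The exponential profile and
      # its parameters are illustrative placeholders only.
      import numpy as np

      heights = np.linspace(0.0, 80e3, 8001)          # m, integration grid
      N0, H = 320.0, 8000.0                           # surface refractivity and scale height (assumed)
      N = N0 * np.exp(-heights / H)                   # refractivity N = (n - 1) * 1e6

      zenith_delay = 1e-6 * np.trapz(N, heights)      # metres of excess path at zenith
      elevation = np.radians(10.0)
      slant_delay = zenith_delay / np.sin(elevation)  # crude flat-Earth mapping, no bending
      print(zenith_delay, slant_delay)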

  12. Research on Mobile Learning Activities Applying Tablets

    ERIC Educational Resources Information Center

    Kurilovas, Eugenijus; Juskeviciene, Anita; Bireniene, Virginija

    2015-01-01

    The paper aims to present current research on mobile learning activities in Lithuania while implementing the flagship EU-funded CCL project on the application of tablet computers in education. In the paper, the quality of modern mobile learning activities based on learning personalisation, problem solving, collaboration, and flipped class methods is…

  13. Positive Behavior Support and Applied Behavior Analysis

    PubMed Central

    Johnston, J.M; Foxx, Richard M; Jacobson, John W; Green, Gina; Mulick, James A

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We also consider the features of PBS that have facilitated its broad dissemination and how ABA might benefit from emulating certain practices of the PBS movement. PMID:22478452

  14. Wavelet analysis applied to the IRAS cirrus

    NASA Technical Reports Server (NTRS)

    Langer, William D.; Wilson, Robert W.; Anderson, Charles H.

    1994-01-01

    The structure of infrared cirrus clouds is analyzed with Laplacian pyramid transforms, a form of non-orthogonal wavelets. Pyramid and wavelet transforms provide a means to decompose images into their spatial frequency components such that all spatial scales are treated in an equivalent manner. The multiscale transform analysis is applied to IRAS 100 micrometer maps of cirrus emission in the north Galactic pole region to extract features on different scales. In the maps we identify filaments, fragments and clumps by separating all connected regions. These structures are analyzed with respect to their Hausdorff dimension for evidence of the scaling relationships in the cirrus clouds.

  15. Sneak analysis applied to process systems

    NASA Astrophysics Data System (ADS)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid nineteen-seventies as a means of identifying such conditions in electric circuits; in which area, it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  16. Progressive-Ratio Schedules and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  17. Ants cushion applied stress by active rearrangements

    NASA Astrophysics Data System (ADS)

    Liu, Zhongyang; Hyatt, John; Mlot, Nathan; Gerov, Michael; Fernandez-Nieves, Alberto; Hu, David

    2013-11-01

    Fire ants, Solenopsis invicta, link their bodies together to form waterproof rafts, which in turn drip, spread, and coagulate, demonstrating properties of an active material that can change state from a liquid to a solid. This soft-matter phase transition is important when the raft interacts with environmental forces such as raindrops and crashing waves. We study this active behavior through plate-on-plate rheology on the ants, extracting the active components by comparison with the rheological behavior of a collection of dead ants. In controlled shear tests, both live and dead ants show properties of a non-Newtonian fluid, specifically, shear-thinning behavior. In oscillatory tests, live ants exhibit a rare behavior in which their storage modulus (G') and loss modulus (G'') have approximately the same value over three orders of magnitude of frequency and two orders of magnitude of strain, indicating the ants are neither fluid nor solid. In comparison, dead ants are more solid-like, with a storage modulus twice as large as their loss modulus. This striking active behavior arises from rearrangement of the ants' bodies and from storage and dissipation of energy by their muscles.
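
    For readers unfamiliar with the moduli mentioned above, the following Python sketch shows how G' and G'' are recovered from an oscillatory strain input and the measured stress response via the phase lag; the amplitudes and phase are invented, not data from the ant experiments.

      # Illustrative sketch (synthetic data, not the ants experiment): estimate the
      # storage modulus G' and loss modulus G'' from oscillatory strain and stress,
      #   G'  = (sigma0/gamma0) * cos(delta),   G'' = (sigma0/gamma0) * sin(delta).
      import numpy as np

      omega = 2 * np.pi * 1.0                       # rad/s, oscillation frequency
      t = np.linspace(0, 5.0, 5000)                 # five full periods
      gamma0, sigma0, delta = 0.01, 2.0, np.pi / 4  # placeholder amplitude, stress, phase lag
      strain = gamma0 * np.sin(omega * t)
      stress = sigma0 * np.sin(omega * t + delta)

      # Recover in-phase and out-of-phase components by projecting onto sin/cos
      a = 2 * np.mean(stress * np.sin(omega * t))   # in-phase component, sigma0*cos(delta)
      b = 2 * np.mean(stress * np.cos(omega * t))   # out-of-phase component, sigma0*sin(delta)
      G_storage = a / gamma0
      G_loss = b / gamma0
      print(G_storage, G_loss)                      # approximately equal when delta = 45 degrees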

  18. Active Learning in Applied Ethics Instruction.

    ERIC Educational Resources Information Center

    Brisin, Tom

    1995-01-01

    Describes an active learning strategy used in a college ethics course to direct students into exploring multiple perspectives in case studies. The technique expands on collaborative learning methods by adding role-playing and reflective writing. The technique has been successful in a journalism ethics course and is adaptable to any field in which…

  19. Augmented multivariate image analysis applied to quantitative structure-activity relationship modeling of the phytotoxicities of benzoxazinone herbicides and related compounds on problematic weeds.

    PubMed

    Freitas, Mirlaine R; Matias, Stella V B G; Macedo, Renato L G; Freitas, Matheus P; Venturin, Nelson

    2013-09-11

    Two of the major weeds affecting cereal crops worldwide are Avena fatua L. (wild oat) and Lolium rigidum Gaud. (rigid ryegrass). Thus, development of new herbicides against these weeds is required; in line with this, benzoxazinones, their degradation products, and analogues have been shown to be important allelochemicals and natural herbicides. Despite earlier structure-activity studies demonstrating that hydrophobicity (log P) of aminophenoxazines correlates with phytotoxicity, our findings for a series of benzoxazinone derivatives do not show any relationship between phytotoxicity and log P, nor with two other usual molecular descriptors. On the other hand, a quantitative structure-activity relationship (QSAR) analysis based on molecular graphs representing structural shape, atomic sizes, and colors to encode other atomic properties performed very accurately for the prediction of phytotoxicities of these compounds against wild oat and rigid ryegrass. Therefore, these QSAR models can be used to estimate the phytotoxicity of new congeners of benzoxazinone herbicides toward A. fatua L. and L. rigidum Gaud.

  20. Applied spectrophotometry: analysis of a biochemical mixture.

    PubMed

    Trumbo, Toni A; Schultz, Emeric; Borland, Michael G; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is, however, a significant difference in determining the concentration of a given species (RNA, DNA, protein) in isolation (a contrived circumstance) as opposed to determining that concentration in the presence of other species (a more realistic situation). To present the student with a more realistic laboratory experience and also to fill a hole that we believe exists in student experience prior to reaching a biochemistry course, we have devised a three-week laboratory experience designed so that students learn to: connect laboratory practice with theory, apply the Beer-Lambert-Bouguer Law to biochemical analyses, demonstrate the utility and limitations of example quantitative colorimetric assays, demonstrate the utility and limitations of UV analyses for biomolecules, develop strategies for analysis of a solution of unknown biomolecular composition, use digital micropipettors to make accurate and precise measurements, and apply graphing software.
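
    A minimal worked example of the Beer-Lambert-Bouguer Law applied to a mixture, in the spirit of the exercise described above: with absorbances at two wavelengths and known molar absorptivities, the two concentrations follow from a small linear system. The absorptivities and absorbances below are invented for illustration.

      # Minimal sketch of Beer-Lambert-Bouguer analysis of a two-component mixture:
      #   A(lambda) = sum_i epsilon_i(lambda) * l * c_i
      # The epsilon values and absorbances are hypothetical.
      import numpy as np

      path_length_cm = 1.0
      # rows: wavelengths, columns: components (hypothetical molar absorptivities, L/(mol*cm))
      eps = np.array([[15000.0, 2000.0],
                      [ 3000.0, 9000.0]])
      absorbance = np.array([0.75, 0.42])            # measured A at the two wavelengths (invented)

      conc = np.linalg.solve(eps * path_length_cm, absorbance)
      print("concentrations (mol/L):", conc)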

  2. Graphical analysis of reversible radioligand binding from time-activity measurements applied to (N-¹¹C-methyl)-(-)-cocaine PET studies in human subjects

    SciTech Connect

    Logan, J.; Fowler, J.S.; Volkow, N.D.; Wolf, A.P.; Dewey, S.L.; Schlyer, D.J.; MacGregor, R.R.; Hitzemann, R.; Bendriem, B.; Gatley, S.J. )

    1990-09-01

    A graphical method of analysis applicable to ligands that bind reversibly to receptors or enzymes, requiring the simultaneous measurement of plasma and tissue radioactivities for multiple times after the injection of a radiolabeled tracer, is presented. It is shown that there is a time t after which a plot of ∫₀ᵗ ROI(t')dt'/ROI(t) versus ∫₀ᵗ Cp(t')dt'/ROI(t) (where ROI and Cp are functions of time describing the variation of tissue radioactivity and plasma radioactivity, respectively) is linear with a slope that corresponds to the steady-state space of the ligand plus the plasma volume, Vp. For a two-compartment model, the slope is given by λ + Vp, where λ is the partition coefficient, and the intercept is -1/[k2(1 + Vp/λ)]. For a three-compartment model, the slope is λ(1 + Bmax/Kd) + Vp and the intercept is -[(1 + Bmax/Kd)/k2 + 1/(koff(1 + Kd/Bmax))] / [1 + Vp/(λ(1 + Bmax/Kd))] (where Bmax represents the concentration of ligand binding sites, Kd the equilibrium dissociation constant of the ligand-binding site complex, koff (k4) the ligand-binding site dissociation constant, and k2 the transfer constant from tissue to plasma). This graphical method provides the ratio Bmax/Kd from the slope for comparison with in vitro measures of the same parameter. It also provides an easy, rapid method for comparison of the reproducibility of repeated measures in a single subject, for longitudinal or drug intervention protocols, or for comparing experimental results between subjects. Although the linearity of this plot holds when ROI/Cp is constant, it can be shown that, for many systems, linearity is effectively reached some time before this. This analysis has been applied to data from (N-methyl-¹¹C)-(-)-cocaine studies in normal human volunteers and the results are compared to the standard nonlinear least-squares analysis.
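
    A hedged Python sketch of this graphical analysis is given below: it forms the transformed variables from synthetic tissue and plasma time-activity curves and fits the late, linear portion to obtain the slope; the curves and the choice of t* = 20 min are placeholders, not values from the study.

      # Hedged sketch of the graphical (Logan-type) analysis described above: build
      # the transformed variables from tissue (ROI) and plasma (Cp) time-activity
      # curves and fit the late, linear portion. The synthetic curves are placeholders.
      import numpy as np

      t = np.linspace(1.0, 60.0, 60)                               # minutes
      Cp = 100.0 * np.exp(-0.1 * t) + 5.0                          # plasma input (invented)
      ROI = 80.0 * (1.0 - np.exp(-0.15 * t)) * np.exp(-0.01 * t)   # tissue curve (invented)

      # Cumulative integrals from time zero, divided by the instantaneous ROI value
      dt = t[1] - t[0]
      y = np.cumsum(ROI) * dt / ROI
      x = np.cumsum(Cp) * dt / ROI

      late = t > 20.0                                              # assume linearity after t* = 20 min
      slope, intercept = np.polyfit(x[late], y[late], 1)
      print("distribution volume estimate (slope):", slope)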

  3. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese-oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that, based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe₃O₄) or maghemite (γ-Fe₂O₃).

  4. Analysis of the interaction between experimental and applied behavior analysis.

    PubMed

    Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Cox, Alison D; Pear, Joseph J

    2014-01-01

    To study the influences between basic and applied research in behavior analysis, we analyzed the coauthorship interactions of authors who published in JABA and JEAB from 1980 to 2010. We paid particular attention to authors who published in both JABA and JEAB (dual authors) as potential agents of cross-field interactions. We present a comprehensive analysis of dual authors' coauthorship interactions using social networks methodology and key word analysis. The number of dual authors more than doubled (26 to 67) and their productivity tripled (7% to 26% of JABA and JEAB articles) between 1980 and 2010. Dual authors stood out in terms of number of collaborators, number of publications, and ability to interact with multiple groups within the field. The steady increase in JEAB and JABA interactions through coauthors and the increasing range of topics covered by dual authors provide a basis for optimism regarding the progressive integration of basic and applied behavior analysis.
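
    The dual-author and coauthorship-network computation described above can be sketched with a small toy example; the article records and author names below are invented, and networkx degree centrality stands in for the fuller set of network measures the authors report.

      # Minimal sketch (invented toy data) of the kind of coauthorship analysis
      # described above: identify "dual authors" who published in both journals and
      # inspect their position in the coauthorship graph.
      import networkx as nx

      # (journal, list of authors) per article -- placeholder records
      articles = [
          ("JABA", ["Smith", "Jones"]),
          ("JEAB", ["Smith", "Lee"]),
          ("JABA", ["Lee", "Kim", "Park"]),
          ("JEAB", ["Park", "Cho"]),
      ]

      by_journal = {"JABA": set(), "JEAB": set()}
      G = nx.Graph()
      for journal, authors in articles:
          by_journal[journal].update(authors)
          for i, a in enumerate(authors):          # connect every coauthor pair
              for b in authors[i + 1:]:
                  G.add_edge(a, b)

      dual_authors = by_journal["JABA"] & by_journal["JEAB"]
      centrality = nx.degree_centrality(G)
      print(dual_authors, {a: round(centrality[a], 2) for a in dual_authors})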

  5. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  6. Introduction: Conversation Analysis in Applied Linguistics

    ERIC Educational Resources Information Center

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  7. [Analysis of the possibility of applying new preparations for the serological diagnosis of the cholera agent in the work of specialized anti-epidemic brigades].

    PubMed

    Mazrukho, A B; Tatarenko, O A; Alekseyeva, L P; Agafonova, V V; Shaly, O A; Pisanov, R V; Aiydinov, G V; Stupina, N A

    2013-05-01

    Diagnostic preparations based on monoclonal antibodies developed at the institute were evaluated during a tactical specialized exercise in which units were deployed on the basis of the mobile complex of specialized anti-epidemic brigades. It was established that the diagnostic agglutinating and fluorescent monoclonal immunoglobulins are equal in sensitivity to polyclonal commercial preparations and can be used at the stages of laboratory diagnosis of cholera both in a stationary laboratory and in the mobile complex of specialized anti-epidemic brigades. The dot immunoassay method based on monoclonal antibodies can, on a par with such common methods as immunofluorescence, slide agglutination and the polymerase chain reaction, be applied within the complex of methods for rapid diagnosis of cholera.

  8. Statistical Uncertainty Analysis Applied to Criticality Calculation

    SciTech Connect

    Hartini, Entin; Andiwijayakusuma, Dinan; Susmikanti, Mike; Nursinta, A. W.

    2010-06-22

    In this paper, we present an uncertainty methodology based on a statistical approach for assessing uncertainties in criticality prediction using the Monte Carlo method due to uncertainties in the isotopic composition of the fuel. The methodology has been applied to criticality calculations with MCNP5 with additional stochastic input of the isotopic fuel composition. The stochastic inputs were generated using the Latin hypercube sampling method based on the probability density function of each nuclide composition. The automatic passing of the stochastic input to MCNP and the repeated criticality calculation is made possible by using a Python script to link MCNP and our Latin hypercube sampling code.
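
    A minimal sketch of Latin hypercube sampling of isotopic fractions, the stochastic-input step described above, is given below; it samples uniformly within assumed ranges rather than from each nuclide's probability density function, and the nominal fractions are placeholders.

      # Minimal Latin hypercube sampler over isotopic fractions (uniform intervals;
      # the paper draws from each nuclide's probability density function).
      import numpy as np

      def latin_hypercube(n_samples, bounds, rng=None):
          """bounds: list of (low, high) per variable; returns (n_samples, n_vars) array."""
          rng = rng or np.random.default_rng()
          samples = np.empty((n_samples, len(bounds)))
          for j, (low, high) in enumerate(bounds):
              # one random point inside each of n_samples equal strata, then shuffle
              strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
              rng.shuffle(strata)
              samples[:, j] = low + strata * (high - low)
          return samples

      # Hypothetical ranges around nominal isotopic weight fractions (placeholders)
      inputs = latin_hypercube(100, [(0.0445, 0.0455), (0.9545, 0.9555)])
      print(inputs.shape, inputs[:3])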

  9. Applying texture analysis to materials engineering problems

    NASA Astrophysics Data System (ADS)

    Knorr, D. B.; Weiland, H.; Szpunar, J. A.

    1994-09-01

    Textures in materials have been studied extensively since the 1930s following the pioneering work of Wassermann [1,2]. The modern era of texture measurement started in 1949 with the development of the x-ray pole figure technique for texture measurement by Schultz [3]. Finally, modern texture analysis was initiated with the publication by Bunge [4] and Roe [5] of a mathematical method of pole figure inversion, which is now used to calculate the orientation distribution function (ODF). This article cannot summarize such an extensive body of work, but it does endeavor to provide the background necessary to understand texture analysis; it also illustrates several applications of texture.

  10. Cost Utility Analysis Applied to Library Budgeting.

    ERIC Educational Resources Information Center

    Stitleman, Leonard

    Cost Utility Analysis (CUA) is, basically, an administrative tool to be used in situations where making a choice among meaningful programs is necessary. It does not replace the administrator, but can provide a significant source of data for the decision maker. CUA can be a guide to the selection of an optimal program in terms of available funds,…

  11. Applied Spectrophotometry: Analysis of a Biochemical Mixture

    ERIC Educational Resources Information Center

    Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the…

  12. Science, Skepticism, and Applied Behavior Analysis

    PubMed Central

    Normand, Matthew P

    2008-01-01

    Pseudoscientific claims concerning medical and psychological treatments of all varieties are commonplace. As behavior analysts, a sound skeptical approach to our science and practice is essential. The present paper offers an overview of science and skepticism and discusses the relationship of skepticism to behavior analysis, with an emphasis on the types of issues concerning behavior analysts in practice. PMID:22477687

  13. Thermal analysis applied to irradiated propolis

    NASA Astrophysics Data System (ADS)

    Matsuda, Andrea Harumi; Machado, Luci Brocardo; del Mastro, Nélida Lucia

    2002-03-01

    Propolis is a resinous hive product collected by bees. Raw propolis requires a decontamination procedure, and irradiation appears to be a promising technique for this purpose. The valuable properties of propolis for the food and pharmaceutical industries have led to increasing interest in its technological behavior. Thermal analysis is a chemical analysis that gives information about changes on heating, of great importance for technological applications. Ground propolis samples were ⁶⁰Co gamma-irradiated with 0 and 10 kGy. Thermogravimetry curves showed a similar multi-stage decomposition pattern for both irradiated and unirradiated samples up to 600°C. Similarly, through differential scanning calorimetry, a coincidence of the melting point of irradiated and unirradiated samples was found. The results suggest that the irradiation process does not interfere with the thermal properties of propolis when irradiated up to 10 kGy.

  14. Reliability analysis applied to structural tests

    NASA Technical Reports Server (NTRS)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.

  15. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  16. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

    The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutia and to investigate replication of detail in human skin. Results indicated that being a visco-elastic substrate, skin effectively reduces the resolution of measurement of dental detail. Conclusions indicate caution in individualization statements.

  17. Multivariate analysis applied to tomato hybrid production.

    PubMed

    Balasch, S; Nuez, F; Palomares, G; Cuartero, J

    1984-11-01

    Twenty characters were measured on 60 tomato varieties cultivated in the open air and in a polyethylene plastic-house. Data were analyzed by means of principal components, factorial discriminant methods, Mahalanobis D² distances and principal coordinate techniques. The factorial discriminant and Mahalanobis D² distance methods, both of which require collecting data plant by plant, lead to similar conclusions as the principal components method, which only requires taking data by plots. Characters that make up the principal components in both environments studied are the same, although the relative importance of each one of them varies within the principal components. By combining the information supplied by multivariate analysis with the inheritance mode of the characters, crossings among cultivars can be designed that will produce heterotic hybrids showing characters within previously established limits.
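
    Two of the techniques named above, principal components and the Mahalanobis D² distance between the open-air and plastic-house groups, can be sketched as follows; the trait matrix is synthetic and the pooled-covariance form of D² is an assumption of this illustration.

      # Minimal sketch (synthetic data): principal components of the trait matrix and
      # the Mahalanobis D^2 distance between two environment groups.
      import numpy as np

      rng = np.random.default_rng(3)
      open_air = rng.normal(0.0, 1.0, (60, 20))          # 60 varieties x 20 characters (placeholder)
      plastic_house = open_air + rng.normal(0.3, 0.5, (60, 20))

      # Principal components of the combined, centred data
      data = np.vstack([open_air, plastic_house])
      z = data - data.mean(axis=0)
      _, s, _ = np.linalg.svd(z, full_matrices=False)
      print("variance explained by PC1:", (s**2 / np.sum(s**2))[0])

      # Mahalanobis D^2 between group mean vectors, using a pooled covariance
      diff = open_air.mean(axis=0) - plastic_house.mean(axis=0)
      pooled_cov = 0.5 * (np.cov(open_air, rowvar=False) + np.cov(plastic_house, rowvar=False))
      D2 = diff @ np.linalg.solve(pooled_cov, diff)
      print("Mahalanobis D^2 between environments:", D2)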

  18. Toward applied behavior analysis of life aloft

    NASA Technical Reports Server (NTRS)

    Brady, J. V.

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most…

  19. Toward applied behavior analysis of life aloft.

    PubMed

    Brady, J V

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most…

  1. Applying Genomic Analysis to Newborn Screening

    PubMed Central

    Solomon, B.D.; Pineda-Alvarez, D.E.; Bear, K.A.; Mullikin, J.C.; Evans, J.P.

    2012-01-01

    Large-scale genomic analysis such as whole-exome and whole-genome sequencing is becoming increasingly prevalent in the research arena. Clinically, many potential uses of this technology have been proposed. One such application is the extension or augmentation of newborn screening. In order to explore this application, we examined data from 3 children with normal newborn screens who underwent whole-exome sequencing as part of research participation. We analyzed sequence information for 151 selected genes associated with conditions ascertained by newborn screening. We compared findings with publicly available databases and results from over 500 individuals who underwent whole-exome sequencing at the same facility. Novel variants were confirmed through bidirectional dideoxynucleotide sequencing. High-density microarrays (Illumina Omni1-Quad) were also performed to detect potential copy number variations affecting these genes. We detected an average of 87 genetic variants per individual. After excluding artifacts, 96% of the variants were found to be reported in public databases and have no evidence of pathogenicity. No variants were identified that would predict disease in the tested individuals, which is in accordance with their normal newborn screens. However, we identified 6 previously reported variants and 2 novel variants that, according to published literature, could result in affected offspring if the reproductive partner were also a mutation carrier; other specific molecular findings highlight additional means by which genomic testing could augment newborn screening. PMID:23112750
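
    To make the variant-triage step described above concrete, here is a minimal sketch (Python, with hypothetical record fields, gene panel, and thresholds; not the study's actual pipeline) of filtering exome variants down to rare, potentially damaging calls in genes tied to newborn-screening conditions:

```python
# Hypothetical annotated variant records (field names invented for illustration).
variants = [
    {"gene": "PAH",   "pop_freq": 0.0001, "impact": "missense",   "known_pathogenic": True},
    {"gene": "ACADM", "pop_freq": 0.2500, "impact": "synonymous", "known_pathogenic": False},
    {"gene": "BRCA1", "pop_freq": 0.0002, "impact": "frameshift", "known_pathogenic": False},
]

# Hypothetical panel of genes associated with newborn-screening conditions.
screening_panel = {"PAH", "ACADM", "GALT", "CFTR"}

def candidate_variants(variants, panel, max_freq=0.01):
    """Keep rare, potentially damaging variants that fall in panel genes."""
    keep = []
    for v in variants:
        if v["gene"] not in panel:
            continue                  # not relevant to the screened conditions
        if v["pop_freq"] > max_freq:
            continue                  # too common to cause a rare screened disorder
        if v["impact"] == "synonymous" and not v["known_pathogenic"]:
            continue                  # unlikely to be damaging
        keep.append(v)
    return keep

for v in candidate_variants(variants, screening_panel):
    print(v["gene"], v["impact"], v["pop_freq"])
```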

  2. Digital photoelastic analysis applied to implant dentistry

    NASA Astrophysics Data System (ADS)

    Ramesh, K.; Hariprasad, M. P.; Bhuvanewari, S.

    2016-12-01

    Development of improved designs of implant systems in dentistry has necessitated the study of stress fields in the implant regions of the mandible/maxilla for a better understanding of the biomechanics involved. Photoelasticity has been used for various studies related to dental implants in view of whole-field visualization of maximum shear stress in the form of isochromatic contours. The potential of digital photoelasticity has not been fully exploited in the field of implant dentistry. In this paper, the fringe field in the vicinity of the connected implants (All-On-Four® concept) is analyzed using recent advances in digital photoelasticity. Initially, a novel 3-D photoelastic model-making procedure that closely mimics all the anatomical features of the human mandible is proposed. By choosing an appropriate orientation of the model with respect to the light path, the essential regions of interest were analysed while keeping the model under live loading conditions. The need for a sophisticated software module to carefully identify the model domain is brought out. For data extraction, a five-step method is used and isochromatics are evaluated by twelve-fringe photoelasticity. In addition to the isochromatic fringe field, whole-field isoclinic data is also obtained for the first time in implant dentistry, which could yield important information for improving the structural stability of the implant systems. Analysis is carried out for the implant in the molar as well as the incisor region. In addition, the interaction effects of a loaded molar implant on the incisor area are also studied.
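
    For reference, the isochromatic fringe order recovered by digital photoelasticity is tied to the principal stress difference through the standard stress-optic law (notation assumed here, not taken from the paper):

```latex
% N        : isochromatic fringe order
% F_\sigma : material stress fringe value
% h        : model thickness
\[
  \sigma_1 - \sigma_2 \;=\; \frac{N \, F_{\sigma}}{h}
\]
```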

  3. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    NASA Technical Reports Server (NTRS)

    Motley, Albert E., III

    2000-01-01

    One of the key elements to successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  4. Modeling Protein Folding and Applying It to a Relevant Activity

    ERIC Educational Resources Information Center

    Nelson, Allan; Goetze, Jim

    2004-01-01

    The different levels of protein structure can be easily understood by creating a model that simulates protein folding, which can then be evaluated by applying it to a relevant activity. The materials required and the procedure for constructing a protein folding model are described.

  5. Negative reinforcement in applied behavior analysis: an emerging technology.

    PubMed

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these areas suggests the emergence of an applied technology on negative reinforcement.

  6. Negative Reinforcement in Applied Behavior Analysis: An Emerging Technology.

    ERIC Educational Resources Information Center

    Iwata, Brian A.

    1987-01-01

    The article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. Current research suggests the emergence of an applied technology on negative reinforcement.…

  7. Animal Research in the "Journal of Applied Behavior Analysis"

    ERIC Educational Resources Information Center

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  8. B. F. Skinner's contributions to applied behavior analysis

    PubMed Central

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew from his science for application, his descriptions of possible applications, and his own applications to nonhuman and human behavior. Second, we found that he explicitly or implicitly addressed all seven dimensions of applied behavior analysis. These contributions and the dimensions notwithstanding, he neither incorporated the field's scientific (e.g., analytic) and social dimensions (e.g., applied) into any program of published research such that he was its originator, nor did he systematically integrate, advance, and promote the dimensions so as to have been its founder. As the founder of behavior analysis, however, he was the father of applied behavior analysis. PMID:22478444

  9. [Study on dewatering of activated sludge under applied electric field].

    PubMed

    Ji, Xue-Yuan; Wang, Yi-Li; Feng, Jing

    2012-12-01

    For an electro-dewatering process of activated sludge (AS), the effects of the pH and conductivity of AS, flocculation conditioning, and the operating factors of a horizontal electric field (voltage magnitude, method of applying the electric field, and distance between plates) were investigated, and the corresponding optimum electro-dewatering conditions were obtained. The results showed that the best electro-dewatering effect was achieved for AS without changing its pH value (6.93) or conductivity (1.46 mS x cm(-1)). CPAM conditioning led to an increase of 30%-40% in the dewatering rate and accelerated the dewatering process, but brought only a slight increase in the electro-dewatering rate. The electro-dewatering rate for conditioned AS reached 83.12% during an applied-field period of 60 minutes, while this rate for the original AS was only 75.31% even when the applied-field period was extended to 120 minutes. Delaying the application of the electric field inhibited the AS electro-dewatering rate. The optimum conditions for AS electro-dewatering were as follows: CPAM dose of 9 g x kg(-1), electric field strength of 600 V x m(-1), distance between the two plates of 40 mm, and dehydration time of 60 minutes. Under these optimum conditions the AS electro-dewatering rate approached 85.33% and the moisture content in AS decreased from 99.30% to 95.15%.

  10. Swimming and other activities: applied aspects of fish swimming performance

    USGS Publications Warehouse

    Castro-Santos, Theodore R.; Farrell, A.P.

    2011-01-01

    Human activities such as hydropower development, water withdrawals, and commercial fisheries often put fish species at risk. Engineered solutions designed to protect species or their life stages are frequently based on assumptions about swimming performance and behaviors. In many cases, however, the appropriate data to support these designs are either unavailable or misapplied. This article provides an overview of the state of knowledge of fish swimming performance – where the data come from and how they are applied – identifying both gaps in knowledge and common errors in application, with guidance on how to avoid repeating mistakes, as well as suggestions for further study.

  11. The Significance of Regional Analysis in Applied Geography.

    ERIC Educational Resources Information Center

    Sommers, Lawrence M.

    Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…

  12. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604
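
    As a rough illustration of the kind of network indicators mentioned above, the sketch below builds a toy discipline co-occurrence graph and reports simple hub measures; the edge list and weights are invented, and this is not the paper's actual corpus or indicator set:

```python
import networkx as nx

# Hypothetical co-occurrence network: an edge weight counts papers
# classified under both disciplines in some corpus.
edges = [
    ("Applied Mathematics", "Biophysics", 42),
    ("Applied Mathematics", "Genetics", 35),
    ("Applied Mathematics", "Economics", 28),
    ("Applied Mathematics", "Statistics", 51),
    ("Biophysics", "Genetics", 17),
    ("Economics", "Statistics", 22),
]

G = nx.Graph()
G.add_weighted_edges_from(edges)

# Two simple hub indicators: weighted degree ("strength") and
# unweighted betweenness centrality.
strength = dict(G.degree(weight="weight"))
betweenness = nx.betweenness_centrality(G)

for node in sorted(G, key=lambda n: -strength[n]):
    print(f"{node:20s} strength={strength[node]:3d}  betweenness={betweenness[node]:.2f}")
```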

  13. Negative reinforcement in applied behavior analysis: an emerging technology.

    PubMed Central

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these areas suggests the emergence of an applied technology on negative reinforcement. PMID:3323157

  14. To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits

    PubMed Central

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads. PMID:25738742
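
    The claim about sustainable funding rates can be illustrated with a back-of-envelope calculation (this is not the authors' model; the submission rate and the independence assumption are mine): the chance of going several years with no award at a per-proposal success rate p is (1 - p) raised to the number of proposals submitted.

```python
# Probability of an investigator remaining unfunded, assuming independent
# proposals at a fixed per-proposal success rate (illustrative only).
def p_all_unfunded(p, proposals_per_year, years):
    return (1.0 - p) ** (proposals_per_year * years)

for p in (0.10, 0.20, 0.30):   # hypothetical per-proposal funding rates
    print(f"p = {p:.2f}: P(no award in 3 years at 2 proposals/year) = "
          f"{p_all_unfunded(p, 2, 3):.2f}")
```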

  15. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    PubMed

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.

  16. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    PubMed

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads. PMID:25738742

  17. Applied behavior analysis: New directions from the laboratory

    PubMed Central

    Epling, W. Frank; Pierce, W. David

    1983-01-01

    Applied behavior analysis began when laboratory-based principles were extended to humans in order to change socially significant behavior. Recent laboratory findings may have applied relevance; however, the majority of basic researchers have not clearly communicated the practical implications of their work. The present paper samples some of the new findings and attempts to demonstrate their applied importance. Schedule-induced behavior which occurs as a by-product of contingencies of reinforcement is discussed. Possible difficulties in treatment and management of induced behaviors are considered. Next, the correlation-based law of effect and the implications of relative reinforcement are explored in terms of applied examples. Relative rate of reinforcement is then extended to the literature dealing with concurrent operants. Concurrent operant models may describe human behavior of applied importance, and several techniques for modification of problem behavior are suggested. As a final concern, the paper discusses several new paradigms. While the practical importance of these models is not clear at the moment, it may be that new practical advantages will soon arise. Thus, it is argued that basic research continues to be of theoretical and practical importance to applied behavior analysis. PMID:22478574

  18. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  19. Context, Cognition, and Biology in Applied Behavior Analysis.

    ERIC Educational Resources Information Center

    Morris, Edward K.

    Behavior analysts are having their professional identities challenged by the roles that cognition and biology are said to play in the conduct and outcome of applied behavior analysis and behavior therapy. For cogniphiliacs, cognition and biology are central to their interventions because cognition and biology are said to reflect various processes,…

  20. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  1. Characterization of anomalies by applying methods of fractal analysis

    SciTech Connect

    Sakuma, M.; Kozma, R.; Kitamura, M.

    1996-01-01

    Fractal analysis is applied in a variety of research fields to characterize nonstationary data. Here, fractal analysis is used as a tool of characterization in time series. The fractal dimension is calculated by Higuchi's method, and the effect of small data size on accuracy is studied in detail. Three types of fractal-based anomaly indicators are adopted: (a) the fractal dimension, (b) the error of the fractal dimension, and (c) the chi-square value of the linear fitting of the fractal curve in the wave number domain. Fractal features of time series can be characterized by introducing these three measures. The proposed method is applied to various simulated fractal time series with ramp, random, and periodic noise anomalies and also to neutron detector signals acquired in a nuclear reactor. Fractal characterization can successfully supplement conventional signal analysis methods especially if nonstationary and non-Gaussian features of the signal become important.
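
    A minimal sketch of Higuchi's method as summarized above (my own implementation, not the authors' code) estimates the fractal dimension as the slope of log L(k) versus log(1/k):

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Estimate the Higuchi fractal dimension of a 1-D time series."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    ks = np.arange(1, k_max + 1)
    lk = []
    for k in ks:
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)              # sub-series x[m], x[m+k], ...
            if idx.size < 2:
                continue
            # curve length with Higuchi's normalisation factor
            length = np.abs(np.diff(x[idx])).sum() * (n - 1) / ((idx.size - 1) * k)
            lengths.append(length / k)
        lk.append(np.mean(lengths))
    # L(k) ~ k**(-D), so the slope of log L(k) versus log(1/k) estimates D
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lk), 1)
    return slope

rng = np.random.default_rng(0)
print(higuchi_fd(np.cumsum(rng.standard_normal(2000))))  # random walk, expect ~1.5
print(higuchi_fd(rng.standard_normal(2000)))             # white noise, expect ~2.0
```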

  2. Recent reinforcement-schedule research and applied behavior analysis

    PubMed Central

    Lattal, Kennon A.; Neef, Nancy A.

    1996-01-01

    Reinforcement schedules are considered in relation to applied behavior analysis by examining several recent laboratory experiments with humans and other animals. The experiments are drawn from three areas of contemporary schedule research: behavioral history effects on schedule performance, the role of instructions in schedule performance of humans, and dynamic schedules of reinforcement. All of the experiments are discussed in relation to the role of behavioral history in current schedule performance. The paper concludes by extracting from the experiments some more general issues concerning reinforcement schedules in applied research and practice. PMID:16795888

  3. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    PubMed

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015. The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists. PMID:26853375

  4. Conference report: summary of the 2010 Applied Pharmaceutical Analysis Conference.

    PubMed

    Unger, Steve E

    2011-01-01

    This year, the Applied Pharmaceutical Analysis meeting changed its venue to the Grand Tremont Hotel in Baltimore, MD, USA. Proximity to Washington presented the opportunity to have four speakers from the US FDA. The purpose of the 4-day conference is to provide a forum in which pharmaceutical and CRO scientists can discuss and develop best practices for scientific challenges in bioanalysis and drug metabolism. This year's theme was 'Bioanalytical and Biotransformation Challenges in Meeting Global Regulatory Expectations & New Technologies for Drug Discovery Challenges'. Applied Pharmaceutical Analysis continued its tradition of highlighting new technologies and its impact on drug discovery, drug metabolism and small molecule-regulated bioanalysis. This year, the meeting included an integrated focus on metabolism in drug discovery and development. Middle and large molecule (biotherapeutics) drug development, immunoassay, immunogenicity and biomarkers were also integrated into the forum. Applied Pharmaceutical Analysis offered an enhanced diversity of topics this year while continuing to share experiences of discovering and developing new medicines. PMID:21175361

  5. Applying activity-based costing to the nuclear medicine unit.

    PubMed

    Suthummanon, Sakesun; Omachonu, Vincent K; Akcin, Mehmet

    2005-08-01

    Previous studies have shown the feasibility of using activity-based costing (ABC) in hospital environments. However, many of these studies discuss the general applications of ABC in health-care organizations. This research explores the potential application of ABC to the nuclear medicine unit (NMU) at a teaching hospital. The finding indicates that the current cost averages 236.11 US dollars for all procedures, which is quite different from the costs computed by using ABC. The difference is most significant with positron emission tomography scan, 463 US dollars (an increase of 96%), as well as bone scan and thyroid scan, 114 US dollars (a decrease of 52%). The result of ABC analysis demonstrates that the operational time (machine time and direct labour time) and the cost of drugs have the most influence on cost per procedure. Clearly, to reduce the cost per procedure for the NMU, the reduction in operational time and cost of drugs should be analysed. The result also indicates that ABC can be used to improve resource allocation and management. It can be an important aid in making management decisions, particularly for improving pricing practices by making costing more accurate. It also facilitates the identification of underutilized resources and related costs, leading to cost reduction. The ABC system will also help hospitals control costs, improve the quality and efficiency of the care they provide, and manage their resources better. PMID:16102243
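
    The cost build-up that ABC performs can be sketched as follows; every resource pool, driver rate, and volume below is hypothetical, and the study's actual activity structure is certainly more detailed:

```python
# Annual resource cost pools (hypothetical figures, in USD).
resource_pools = {"staff_labour": 300_000.0, "camera_time": 250_000.0, "radiopharmaceuticals": 150_000.0}

# Driver rates: the fraction of each annual pool consumed by one procedure,
# plus hypothetical annual volumes.
procedures = {
    "PET scan":     {"staff_labour": 0.0010, "camera_time": 0.0020, "radiopharmaceuticals": 0.0030, "volume": 400},
    "bone scan":    {"staff_labour": 0.0004, "camera_time": 0.0006, "radiopharmaceuticals": 0.0005, "volume": 1500},
    "thyroid scan": {"staff_labour": 0.0003, "camera_time": 0.0004, "radiopharmaceuticals": 0.0004, "volume": 600},
}

for name, p in procedures.items():
    # Trace each resource pool to the procedure via its driver rate.
    unit_cost = sum(pool * p[res] for res, pool in resource_pools.items())
    print(f"{name:12s} cost per procedure: {unit_cost:8.2f} USD "
          f"(annual: {unit_cost * p['volume']:11.2f} USD)")
```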

  6. Applying activity-based costing to the nuclear medicine unit.

    PubMed

    Suthummanon, Sakesun; Omachonu, Vincent K; Akcin, Mehmet

    2005-08-01

    Previous studies have shown the feasibility of using activity-based costing (ABC) in hospital environments. However, many of these studies discuss the general applications of ABC in health-care organizations. This research explores the potential application of ABC to the nuclear medicine unit (NMU) at a teaching hospital. The finding indicates that the current cost averages 236.11 US dollars for all procedures, which is quite different from the costs computed by using ABC. The difference is most significant with positron emission tomography scan, 463 US dollars (an increase of 96%), as well as bone scan and thyroid scan, 114 US dollars (a decrease of 52%). The result of ABC analysis demonstrates that the operational time (machine time and direct labour time) and the cost of drugs have the most influence on cost per procedure. Clearly, to reduce the cost per procedure for the NMU, the reduction in operational time and cost of drugs should be analysed. The result also indicates that ABC can be used to improve resource allocation and management. It can be an important aid in making management decisions, particularly for improving pricing practices by making costing more accurate. It also facilitates the identification of underutilized resources and related costs, leading to cost reduction. The ABC system will also help hospitals control costs, improve the quality and efficiency of the care they provide, and manage their resources better.

  7. [Research activities in applied mathematics, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.

  8. Cladistic analysis applied to the classification of volcanoes

    NASA Astrophysics Data System (ADS)

    Hone, D. W. E.; Mahony, S. H.; Sparks, R. S. J.; Martin, K. T.

    2007-11-01

    Cladistics is a systematic method of classification that groups entities on the basis of sharing similar characteristics in the most parsimonious manner. Here cladistics is applied to the classification of volcanoes using a dataset of 59 Quaternary volcanoes and 129 volcanic edifices of the Tohoku region, Northeast Japan. Volcano and edifice characteristics recorded in the database include attributes of volcano size, chemical composition, dominant eruptive products, volcano morphology, dominant landforms, volcano age and eruptive history. Without characteristics related to time the volcanic edifices divide into two groups, with characters related to volcano size, dominant composition and edifice morphology being the most diagnostic. Analysis including time based characteristics yields four groups with a good correlation between these groups and the two groups from the analysis without time for 108 out of 129 volcanic edifices. Thus when characters are slightly changed the volcanoes still form similar groupings. Analysis of the volcanoes both with and without time yields three groups based on compositional, eruptive products and morphological characters. Spatial clusters of volcanic centres have been recognised in the Tohoku region by Tamura et al. ( Earth Planet Sci Lett 197:105 106, 2002). The groups identified by cladistic analysis are distributed unevenly between the clusters, indicating a tendency for individual clusters to form similar kinds of volcanoes with distinctive but coherent styles of volcanism. Uneven distribution of volcano types between clusters can be explained by variations in dominant magma compositions through time, which are reflected in eruption products and volcanic landforms. Cladistic analysis can be a useful tool for elucidating dynamic igneous processes that could be applied to other regions and globally. Our exploratory study indicates that cladistics has promise as a method for classifying volcanoes and potentially elucidating dynamic

  9. Neutron activation analysis system

    DOEpatents

    Taylor, M.C.; Rhodes, J.R.

    1973-12-25

    A neutron activation analysis system for monitoring a generally fluid media, such as slurries, solutions, and fluidized powders, including two separate conduit loops for circulating fluid samples within the range of radiation sources and detectors is described. Associated with the first loop is a neutron source that emits a high flux of slow and thermal neutrons. The second loop employs a fast neutron source, the flux from which is substantially free of thermal neutrons. Adjacent to both loops are gamma counters for spectrographic determination of the fluid constituents. Other gamma sources and detectors are arranged across a portion of each loop for determining the fluid density. (Official Gazette)

  10. Compatibility of person-centered planning and applied behavior analysis

    PubMed Central

    Holburn, Steve

    2001-01-01

    In response to Osborne (1999), the aims and practices of person-centered planning (PCP) are compared to the basic principles of applied behavior analysis set forth by Baer, Wolf, and Risley (1968, 1987). The principal goal of PCP is social integration of people with disabilities; it qualifies as a socially important behavior, and its problems have been displayed sufficiently. However, social integration is a complex social problem whose solution requires access to system contingencies that influence lifestyles. Nearly all of the component goals of PCP proposed by O'Brien (1987b) have been reliably quantified, although concurrent measurement of outcomes such as friendship, autonomy, and respect presents a formidable challenge. Behavioral principles such as contingency and contextual control are operative within PCP, but problems in achieving reliable implementation appear to impede an experimental analysis. PMID:22478371

  11. Empirical modal decomposition applied to cardiac signals analysis

    NASA Astrophysics Data System (ADS)

    Beya, O.; Jalil, B.; Fauvet, E.; Laligant, O.

    2010-01-01

    In this article, we present the method of empirical modal decomposition (EMD) applied to the analysis and denoising of electrocardiogram and phonocardiogram signals. The objective of this work is to automatically detect cardiac anomalies in a patient. Because these anomalies are localized in time, the localization of all events should be preserved precisely. Methods based on the Fourier transform lose this localization property [13]; the wavelet transform (WT) overcomes the localization problem, but interpretation remains difficult and it is still hard to characterize the signal precisely. In this work we propose to apply EMD, which has very useful properties for pseudo-periodic signals. The second section describes the EMD algorithm. In the third part we present the results obtained on phonocardiogram (PCG) and electrocardiogram (ECG) test signals; the analysis and interpretation of these signals are given in the same section. Finally, we introduce an adaptation of the EMD algorithm which seems to be very efficient for denoising.
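
    A minimal sketch of this kind of EMD-based denoising is shown below. It assumes the third-party PyEMD package (distributed on PyPI as "EMD-signal"); the synthetic signal merely stands in for an ECG/PCG trace, and dropping a single IMF is only a crude example, not the adaptation proposed in the paper.

```python
import numpy as np
from PyEMD import EMD   # assumption: the PyEMD ("EMD-signal") package is installed

t = np.linspace(0.0, 1.0, 1000)
# Synthetic pseudo-periodic signal standing in for a cardiac recording.
s = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t) \
    + 0.1 * np.random.default_rng(0).standard_normal(t.size)

emd = EMD()
imfs = emd(s)            # intrinsic mode functions, highest frequency first
print(imfs.shape)        # (number_of_IMFs, number_of_samples)

# Crude denoising: drop the highest-frequency IMF and rebuild the signal.
denoised = imfs[1:].sum(axis=0)
```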

  12. Automated SEM Modal Analysis Applied to the Diogenites

    NASA Technical Reports Server (NTRS)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.
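
    Once each pixel of an SEM/EDS map has been assigned to a phase, the modal proportions reduce to counting labels (areal fractions taken as a proxy for volume). A minimal sketch with an invented phase map:

```python
import numpy as np

# Hypothetical phase map: each pixel already classified from its EDS chemistry.
# 0 = epoxy/void, 1 = orthopyroxene, 2 = plagioclase, 3 = chromite/spinel
rng = np.random.default_rng(0)
phase_map = rng.choice([0, 1, 2, 3], size=(512, 512), p=[0.05, 0.85, 0.08, 0.02])

labels, counts = np.unique(phase_map, return_counts=True)
keep = labels != 0                     # exclude mounting medium / voids
fractions = counts[keep] / counts[keep].sum()

names = {1: "orthopyroxene", 2: "plagioclase", 3: "chromite/spinel"}
for lab, frac in zip(labels[keep], fractions):
    print(f"{names[lab]:16s} {100 * frac:5.1f} area% (~vol%)")
```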

  13. Applying activity-based costing in long-term care.

    PubMed

    Wodchis, W P

    1998-01-01

    As greater numbers of the elderly use health services, and as health care costs climb, effective financial tracking is essential. Cost management in health care can benefit if costs are linked to the care activities where they are incurred. Activity-based costing (ABC) provides a useful approach. The framework aligns costs (inputs), through activities (process), to outputs and outcomes. It allocates costs based on client care needs rather than management structure. The ABC framework was tested in a residential care facility and in supportive housing apartments. The results demonstrate the feasibility and advantages of ABC for long term care agencies, including community-based care. PMID:10339203

  14. Applying activity-based costing in long-term care.

    PubMed

    Wodchis, W P

    1998-01-01

    As greater numbers of the elderly use health services, and as health care costs climb, effective financial tracking is essential. Cost management in health care can benefit if costs are linked to the care activities where they are incurred. Activity-based costing (ABC) provides a useful approach. The framework aligns costs (inputs), through activities (process), to outputs and outcomes. It allocates costs based on client care needs rather than management structure. The ABC framework was tested in a residential care facility and in supportive housing apartments. The results demonstrate the feasibility and advantages of ABC for long term care agencies, including community-based care.

  15. Physical Activity and Wellness: Applied Learning through Collaboration

    ERIC Educational Resources Information Center

    Long, Lynn Hunt; Franzidis, Alexia

    2015-01-01

    This article describes how two university professors teamed up to initiate a university-sponsored physical activity and wellness expo in an effort to promote an authentic and transformative learning experience for preservice students.

  16. Applying activity-based costing to healthcare settings.

    PubMed

    Canby, J B

    1995-02-01

    Activity-based costing (ABC) focuses on processes that drive cost. By tracing healthcare activities back to events that generate cost, a more accurate measurement of financial performance is possible. This article uses ABC principles and techniques to determine costs associated with the x-ray process in a midsized outpatient clinic. The article also provides several tips for initiating an ABC cost system for an entire healthcare organization. PMID:10146178

  17. Applying Transtheoretical Model to Promote Physical Activities Among Women

    PubMed Central

    Pirzadeh, Asiyeh; Mostafavi, Firoozeh; Ghofranipour, Fazllolah; Feizi, Awat

    2015-01-01

    Background: Physical activity is one of the most important indicators of health in communities, but studies conducted in the provinces of Iran have shown that inactivity is prevalent, especially among women. Objectives: Inadequate regular physical activity among women, the importance of education in promoting physical activity, and the lack of studies on women using the transtheoretical model persuaded us to conduct this study, with the aim of applying the transtheoretical model to promote physical activity among women of Isfahan. Materials and Methods: This research was a quasi-experimental study conducted on 141 women residing in Isfahan, Iran, who were randomly divided into case and control groups. In addition to demographic information, their physical activity and the constructs of the transtheoretical model (stages of change, processes of change, decisional balance, and self-efficacy) were measured at 3 time points: preintervention, and 3 and 6 months after the intervention. The data were analyzed with t tests and repeated-measures ANOVA using SPSS version 16. Results: Education based on the transtheoretical model significantly increased physical activity in the case group over time, in two respects: intensive physical activity and walking. A high percentage of participants also progressed through the stages of change, and the means of the processes-of-change constructs, as well as of the pros and cons, changed over time; on the whole, a significant difference was observed over time in the case group (P < 0.01). Conclusions: This study showed that interventions based on the transtheoretical model can promote physical activity behavior among women. PMID:26834796

  18. Applying activity-based costing to healthcare settings.

    PubMed

    Canby, J B

    1995-02-01

    Activity-based costing (ABC) focuses on processes that drive cost. By tracing healthcare activities back to events that generate cost, a more accurate measurement of financial performance is possible. This article uses ABC principles and techniques to determine costs associated with the x-ray process in a midsized outpatient clinic. The article also provides several tips for initiating an ABC cost system for an entire healthcare organization.

  19. Exploring the potential of applying proteomics for tracking bisphenol A and nonylphenol degradation in activated sludge.

    PubMed

    Collado, Neus; Buttiglieri, Gianluigi; Kolvenbach, Boris A; Comas, Joaquim; Corvini, Philippe F-X; Rodríguez-Roda, Ignasi

    2013-02-01

    A significant percentage of bisphenol A and nonylphenol removal in municipal wastewater treatment plants relies on biodegradation. Nonetheless, incomplete information is available concerning their degradation pathways performed by microbial communities in activated sludge systems. Hydroquinone dioxygenase (HQDO) is a specific degradation marker enzyme, involved in bisphenol A and nonylphenol biodegradation, and it can be produced by axenic cultures of the bacterium Sphingomonas sp. strain TTNP3. Proteomics, a technique based on the analysis of microbial community proteins, was applied to this strain. The bacterium proteome map was obtained and a HQDO subunit was successfully identified. Additionally, the reliability of the applied proteomics protocol was evaluated in activated sludge samples. Proteins belonging to Sphingomonas were searched at decreasing biomass ratios, i.e. serially diluting the bacterium in activated sludge. The protein patterns were compared and Sphingomonas proteins were discriminated against the ones from sludge itself on 2D-gels. The detection limit of the applied protocol was defined as 10(-3) g TTNP3 g(-1) total suspended solids (TSSs). The results proved that proteomics can be a promising methodology to assess the presence of specific enzymes in activated sludge samples, however improvements of its sensitivity are still needed.

  20. Applying cluster analysis to physics education research data

    NASA Astrophysics Data System (ADS)

    Springuel, R. Padraic

    One major thrust of Physics Education Research (PER) is the identification of student ideas about specific physics concepts, both correct ideas and those that differ from the expert consensus. Typically the research process of eliciting the spectrum of student ideas involves the administration of specially designed questions to students. One major analysis task in PER is the sorting of these student responses into thematically coherent groups. This process is one which has previously been done by eye in PER. This thesis explores the possibility of using cluster analysis to perform the task in a more rigorous and less time-intensive fashion while making fewer assumptions about what the students are doing. Since this technique has not previously been used in PER, a summary of the various kinds of cluster analysis is included as well as a discussion of which might be appropriate for the task of sorting student responses into groups. Two example data sets (one based on the Force and Motion Conceptual Evaluation (DICE), the other looking at acceleration in two dimensions (A2D)) are examined in depth to demonstrate how cluster analysis can be applied to PER data and the various considerations which must be taken into account when doing so. In both cases, the techniques described in this thesis found 5 groups which contained about 90% of the students in the data set. The results of this application are compared to previous research on the topics covered by the two examples to demonstrate that cluster analysis can effectively uncover the same patterns in student responses that have already been identified.
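
    As an illustration of the general approach (not the specific algorithm chosen in the thesis), coded student responses can be grouped with agglomerative clustering on a simple matching distance:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical coded responses: rows = students, columns = survey items,
# entries = categorical answer codes (invented data for illustration).
rng = np.random.default_rng(0)
responses = rng.integers(0, 4, size=(120, 10))

# Hamming distance = fraction of items on which two students disagree.
dist = pdist(responses, metric="hamming")
Z = linkage(dist, method="average")

groups = fcluster(Z, t=5, criterion="maxclust")   # cut the tree into 5 groups
for g in np.unique(groups):
    print(f"group {g}: {np.sum(groups == g)} students")
```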

  1. 5 CFR 875.206 - As a new active workforce member, when may I apply?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false As a new active workforce member, when....206 As a new active workforce member, when may I apply? (a) As a new, newly eligible, or returning active workforce member, you may apply as follows: (1) If you are a new active workforce member...

  2. 5 CFR 875.206 - As a new active workforce member, when may I apply?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false As a new active workforce member, when....206 As a new active workforce member, when may I apply? (a) As a new, newly eligible, or returning active workforce member, you may apply as follows: (1) If you are a new active workforce member...

  3. 5 CFR 875.206 - As a new active workforce member, when may I apply?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 2 2012-01-01 2012-01-01 false As a new active workforce member, when....206 As a new active workforce member, when may I apply? (a) As a new, newly eligible, or returning active workforce member, you may apply as follows: (1) If you are a new active workforce member...

  4. 5 CFR 875.206 - As a new active workforce member, when may I apply?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 2 2014-01-01 2014-01-01 false As a new active workforce member, when....206 As a new active workforce member, when may I apply? (a) As a new, newly eligible, or returning active workforce member, you may apply as follows: (1) If you are a new active workforce member...

  5. "Dear Ann ..." An Activity for Synthesizing and Applying Interpersonal Concepts

    ERIC Educational Resources Information Center

    Brule, Nancy J.

    2007-01-01

    All students struggle with interpersonal based problems, be it a troublesome roommate, problems with a partner, conflict with a significant other, or relational issues with parents or children. The interpersonal communication classroom can be enhanced by discussing these problems and experiences. This article presents an activity that aims to (1)…

  6. Applied Behavior Analysis is a Science and, Therefore, Progressive.

    PubMed

    Leaf, Justin B; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K; Smith, Tristram; Weiss, Mary Jane

    2016-02-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful progress for individuals diagnosed with autism spectrum disorder (ASD). We describe this approach as progressive. In a progressive approach to ABA, the therapist employs a structured yet flexible process, which is contingent upon and responsive to child progress. We will describe progressive ABA, contrast it to reductionist ABA, and provide rationales for both the substance and intent of ABA as a progressive scientific method for improving conditions of social relevance for individuals with ASD.

  7. Applied Behavior Analysis is a Science and, Therefore, Progressive.

    PubMed

    Leaf, Justin B; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K; Smith, Tristram; Weiss, Mary Jane

    2016-02-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful progress for individuals diagnosed with autism spectrum disorder (ASD). We describe this approach as progressive. In a progressive approach to ABA, the therapist employs a structured yet flexible process, which is contingent upon and responsive to child progress. We will describe progressive ABA, contrast it to reductionist ABA, and provide rationales for both the substance and intent of ABA as a progressive scientific method for improving conditions of social relevance for individuals with ASD. PMID:26373767

  8. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W.

    1992-01-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.
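
    The sketch below is not KBANN itself; it only illustrates the simpler supervised end of the same task, training an off-the-shelf classifier on one-hot encoded windows to recognize a motif (sequences, motif, and offsets are invented):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

BASES = "ACGT"

def one_hot(seq):
    """One-hot encode a fixed-length DNA window into a flat feature vector."""
    return np.array([[1.0 if b == base else 0.0 for base in BASES] for b in seq]).ravel()

# Tiny invented training set: positives carry the motif "TATAAT" at offset 2.
pos = ["GCTATAATCG", "ATTATAATGC", "CGTATAATTA"]
neg = ["GCAGTCCGCG", "ATTCGGATGC", "CGGCATCTTA"]

X = np.array([one_hot(s) for s in pos + neg])
y = np.array([1] * len(pos) + [0] * len(neg))

clf = LogisticRegression().fit(X, y)
print(clf.predict([one_hot("AATATAATCC")]))   # should report class 1 (motif present)
```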

  9. Finite Element Analysis Applied to Dentoalveolar Trauma: Methodology Description

    PubMed Central

    da Silva, B. R.; Moreira Neto, J. J. S.; da Silva, F. I.; de Aguiar, A. S. W.

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated. PMID:21991463

  10. Robust regression applied to fractal/multifractal analysis.

    NASA Astrophysics Data System (ADS)

    Portilla, F.; Valencia, J. L.; Tarquis, A. M.; Saa-Requejo, A.

    2012-04-01

    Fractal and multifractal concepts have grown increasingly popular in recent years in soil analysis, along with the development of fractal models. One of the common steps is to calculate the slope of a linear fit, usually by the least squares method. This should not be a problem; however, in many situations involving experimental data the researcher has to select the range of scales over which to work, neglecting the remaining points, in order to achieve the linearity that this type of analysis requires. Robust regression is a form of regression analysis designed to circumvent some limitations of traditional parametric and non-parametric methods; with robust methods one does not have to assume that an outlying point is simply an extreme observation drawn from the tail of a normal distribution, and such points do not compromise the validity of the regression results. In this work we have evaluated the capacity of robust regression to select which points of the experimental data to use, trying to avoid subjective choices. Based on this analysis we have developed a new working methodology with two basic steps: • evaluation of the improvement of the linear fit when consecutive points are eliminated, based on the R p-value, so as to consider the implications of reducing the number of points; • evaluation of the significance of the difference in slope between the fit using the two extreme points and the fit using all available points. We compare the results of applying this methodology with the commonly used least squares approach. The data selected for these comparisons come from experimental soil roughness transects and from simulations based on the midpoint displacement method with added trends and noise. The results are discussed, indicating the advantages and disadvantages of each methodology. Acknowledgements: Funding provided by CEIGRAM (Research Centre for the Management of Agricultural and Environmental Risks) and by the Spanish Ministerio de Ciencia e Innovación (MICINN) through project no
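
    One concrete way to reduce the influence of ill-behaved scales on the fitted slope (offered here only as an illustration; the authors' robust estimator and point-selection rules may differ) is to compare ordinary least squares with a Theil-Sen fit on the log-log data:

```python
import numpy as np
from scipy import stats

# Synthetic log-log scaling data with one outlying point at the largest scale.
rng = np.random.default_rng(0)
log_k = np.log(np.arange(1, 21, dtype=float))
log_L = -1.4 * log_k + 0.3 + 0.02 * rng.standard_normal(log_k.size)
log_L[-1] += 1.0                     # a "bad" scale that would bias the slope

ols_slope, _ = np.polyfit(log_k, log_L, 1)
ts_slope, ts_intercept, lo, hi = stats.theilslopes(log_L, log_k)

print(f"OLS slope:       {ols_slope:.3f}")
print(f"Theil-Sen slope: {ts_slope:.3f}  (95% CI {lo:.3f} to {hi:.3f})")
```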

  11. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic Atmospheric Agency (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  12. Applying Activity Based Costing (ABC) Method to Calculate Cost Price in Hospital and Remedy Services

    PubMed Central

    Rajabi, A; Dabiri, A

    2012-01-01

    Background: Activity Based Costing (ABC) is one of the new methods that began appearing as a costing methodology in the 1990s. It calculates cost price by determining the usage of resources. In this study, the ABC method was used for calculating the cost price of remedial services in hospitals. Methods: To apply the ABC method, Shahid Faghihi Hospital was selected. First, hospital units were divided into three main departments: administrative, diagnostic, and hospitalized. Second, activity centers were defined by the activity analysis method. Third, costs of administrative activity centers were allocated to the diagnostic and operational departments based on cost drivers. Finally, based on the usage of the activity centers' services by the cost objects, the cost price of medical services was calculated. Results: The cost price obtained from the ABC method differs significantly from that of the tariff method. In addition, the high proportion of indirect costs in the hospital indicates that resource capacities are not used properly. Conclusion: The cost price of remedial services is not calculated properly with the tariff method when compared with the ABC method. ABC calculates cost price by applying suitable mechanisms, whereas the tariff method is based on fixed prices. In addition, ABC provides useful information about the amount and composition of the cost price of services. PMID:23113171
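
    As a rough illustration of the allocation steps summarized above, the sketch below (all cost figures, drivers, and volumes are hypothetical, not taken from the study) allocates administrative costs to operational activity centers in proportion to a cost driver and then computes a unit cost price per service.

```python
# Hypothetical ABC allocation: administrative costs are spread over
# operational activity centers in proportion to a cost driver (here,
# staff headcount), then unit cost price = total cost / service volume.
admin_cost = 500_000.0

centers = {
    # name: (direct cost, headcount driver, number of services delivered)
    "radiology":  (800_000.0, 20, 12_000),
    "laboratory": (600_000.0, 15, 30_000),
    "surgery":    (1_500_000.0, 25, 4_000),
}

total_driver = sum(head for _, head, _ in centers.values())

for name, (direct, head, volume) in centers.items():
    allocated = admin_cost * head / total_driver   # driver-based allocation
    unit_cost = (direct + allocated) / volume      # cost price per service
    print(f"{name:10s} unit cost price: {unit_cost:8.2f}")
```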

  13. Applying precision medicine to the active surveillance of prostate cancer

    PubMed Central

    Reichard, Chad A.; Stephenson, Andrew J.

    2015-01-01

    The recent introduction of a variety of molecular tests will potentially reshape the care of patients with prostate cancer. These tests may make more accurate management decisions possible for those patients who have been “overdiagnosed” with biologically indolent disease, which represents an exceptionally small mortality risk. There is a wide range of possible applications of these tests to different clinical scenarios in patient populations managed with active surveillance. Cancer 2015;121:3435–43. © 2015 American Cancer Society. PMID:26149066

  14. Estimating and Applying Uncertainties in Probabilistic Tsunami Hazard Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Thio, H. K.

    2013-12-01

    An integral part of a probabilistic analysis is the formal inclusion of uncertainties, both those due to a limited understanding of the physical processes (epistemic) and those due to natural variability (aleatory). Because of the strong non-linearity of the tsunami inundation process, it is important not only to understand the extent of the uncertainties, but also how and where to apply them. We can divide the uncertainties into several stages: the source, ocean propagation, and nearshore propagation/inundation. On the source side, many of the uncertainties are identical to those used in probabilistic seismic hazard analysis (PSHA). However, the details of slip distributions are very significant in tsunami excitation, especially for near-field tsunamis. We will show several ways of including slip variability, both stochastic and non-stochastic, by developing a probabilistic set of source scenarios. The uncertainties in ocean propagation are less significant since modern algorithms are very successful in modeling open-ocean tsunami propagation. However, in the nearshore regime and the inundation, the situation is much more complex. Here, errors in the local elevation models, variability in bottom friction, and the omission of the built environment can lead to significant errors. Details of the implementation of the tsunami algorithms can also yield different results. We will discuss the most significant sources of uncertainty and alternative ways to implement them, using examples from the probabilistic tsunami hazard mapping that we are currently carrying out for the state of California and other regions.
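
    To make the treatment of uncertainty concrete, the following schematic sketch (the scenario rates, amplitudes, and lognormal variability are invented placeholders, not the authors' models) combines a small set of source scenarios and stochastic slip realisations into an annual exceedance-rate curve.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical source scenarios: (annual rate, mean coastal amplitude in m).
scenarios = [(0.01, 2.0), (0.002, 5.0), (0.0005, 9.0)]

n_slip = 200                              # stochastic slip realisations per scenario
sigma_ln = 0.5                            # aleatory variability (lognormal sigma)
levels = np.array([1.0, 2.0, 4.0, 8.0])   # wave heights of interest (m)

exceed_rate = np.zeros_like(levels)
for rate, mean_amp in scenarios:
    # Each slip realisation perturbs the coastal amplitude lognormally.
    amps = mean_amp * rng.lognormal(0.0, sigma_ln, n_slip)
    for i, h in enumerate(levels):
        exceed_rate[i] += rate * np.mean(amps > h)

for h, r in zip(levels, exceed_rate):
    print(f"height > {h:4.1f} m : annual exceedance rate {r:.5f}")
```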

  15. A Hygrothermal Risk Analysis Applied to Residential Unvented Attics

    SciTech Connect

    Pallin, Simon B; Kehrer, Manfred

    2013-01-01

    A residential building constructed with an unvented attic is a common roof assembly in the United States. The expected hygrothermal performance and service life of the roof are difficult to estimate due to a number of varying parameters. Typical parameters expected to vary are the climate and the direction and slope of the roof, as well as the radiation properties of the surface material. Further influential parameters are indoor moisture excess, air leakage through the attic floor, and leakage from the air-handling unit and ventilation ducts. In addition, the type of building materials, such as the insulation material and closed- or open-cell spray polyurethane foam, will influence the future performance of the roof. Development of a simulation model of the roof assembly enables a risk and sensitivity analysis, in which the most important varying parameters for the hygrothermal performance can be determined. The model is designed to perform probabilistic simulations using mathematical and hygrothermal calculation tools. The varying input parameters can be chosen from existing measurements, simulations, or standards. An analysis is applied to determine the risk of consequences such as mold growth, rot, or an increased energy demand of the HVAC unit. Furthermore, the future performance of the roof can be simulated in different climates to facilitate the design of an efficient and reliable roof construction with the most suitable technical solution and to determine the most appropriate building materials for a given climate.
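
    A probabilistic assessment of this kind can be outlined as below; the "mold index" response is a stand-in surrogate rather than a hygrothermal model, and all parameter ranges are assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000                                     # Monte Carlo samples

# Assumed input distributions for a few influential parameters.
indoor_moisture = rng.normal(4.0, 1.0, n)      # moisture excess, g/m3
attic_leakage   = rng.uniform(0.0, 2.0, n)     # air leakage, arbitrary units
solar_absorpt   = rng.uniform(0.6, 0.95, n)    # roof surface absorptance

# Placeholder response: a surrogate "mold index" increasing with moisture
# and leakage and decreasing with a hotter (more absorptive) roof surface.
mold_index = 0.8 * indoor_moisture + 1.2 * attic_leakage - 3.0 * solar_absorpt

threshold = 4.0                                # assumed critical mold index
risk = np.mean(mold_index > threshold)
print(f"Estimated probability of exceeding the mold threshold: {risk:.2%}")
```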

  16. Inducing in situ, nonlinear soil response applying an active source

    USGS Publications Warehouse

    Johnson, P.A.; Bodin, P.; Gomberg, J.; Pearce, F.; Lawrence, Z.; Menq, F.-Y.

    2009-01-01

    It is well known that soil sites have a profound effect on ground motion during large earthquakes. The complex structure of soil deposits and the highly nonlinear constitutive behavior of soils largely control nonlinear site response at soil sites. Measurements of nonlinear soil response under natural conditions are critical to advancing our understanding of soil behavior during earthquakes. Many factors limit the use of earthquake observations to estimate nonlinear site response, such that quantitative characterization of nonlinear behavior relies almost exclusively on laboratory experiments and modeling of wave propagation. Here we introduce a new method for in situ characterization of the nonlinear behavior of a natural soil formation using measurements obtained immediately adjacent to a large vibrator source. To our knowledge, we are the first group to propose and test such an approach. Employing a large, surface vibrator as a source, we measure the nonlinear behavior of the soil by incrementally increasing the source amplitude over a range of frequencies and monitoring changes in the output spectra. We apply a homodyne algorithm for measuring spectral amplitudes, which provides robust signal-to-noise ratios at the frequencies of interest. Spectral ratios are computed between the receivers and the source as well as receiver pairs located in an array adjacent to the source, providing the means to separate source and near-source nonlinearity from pervasive nonlinearity in the soil column. We find clear evidence of nonlinearity in significant decreases in the frequency of peak spectral ratios, corresponding to material softening with amplitude, observed across the array as the source amplitude is increased. The observed peak shifts are consistent with laboratory measurements of soil nonlinearity. Our results provide constraints for future numerical modeling studies of strong ground motion during earthquakes.
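
    The processing chain sketched in the abstract (receiver spectra referenced to the source and the downward shift of the peak of the spectral ratio as source amplitude grows) can be mimicked with standard tools; the synthetic traces below merely stand in for the vibrator and receiver records.

```python
import numpy as np
from scipy.signal import welch

fs = 500.0                      # sampling rate (Hz), assumed
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(3)

def receiver_record(source_amp, resonance_hz):
    """Synthetic receiver trace: the soil 'softens' (resonance drops) with amplitude."""
    f0 = resonance_hz * (1.0 - 0.02 * source_amp)        # amplitude-dependent shift
    return source_amp * np.sin(2 * np.pi * f0 * t) + 0.1 * rng.normal(size=t.size)

source_trace = np.sin(2 * np.pi * 10.0 * t)              # placeholder source signal

for amp in (1.0, 2.0, 4.0):
    f, p_src = welch(source_trace, fs=fs, nperseg=2048)
    _, p_rec = welch(receiver_record(amp, 10.0), fs=fs, nperseg=2048)
    ratio = p_rec / (p_src + 1e-3 * p_src.max())         # small floor avoids blow-ups
    print(f"source amplitude {amp:3.1f}: peak of spectral ratio at "
          f"{f[np.argmax(ratio)]:.2f} Hz")
```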

  17. 43 CFR 17.302 - To what programs or activities do these regulations apply?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... regulations apply to each DOI recipient and to each program or activity operated by the recipient which receives Federal financial assistance provided by DOI. (b) The Act and these regulations do not apply...

  18. Active Learning Techniques Applied to an Interdisciplinary Mineral Resources Course.

    NASA Astrophysics Data System (ADS)

    Aird, H. M.

    2015-12-01

    An interdisciplinary active learning course was introduced at the University of Puget Sound entitled 'Mineral Resources and the Environment'. Various formative assessment and active learning techniques that have been effective in other courses were adapted and implemented to improve student learning, increase retention and broaden knowledge and understanding of course material. This was an elective course targeted towards upper-level undergraduate geology and environmental majors. The course provided an introduction to the mineral resources industry, discussing geological, environmental, societal and economic aspects, legislation and the processes involved in exploration, extraction, processing, reclamation/remediation and recycling of products. Lectures and associated weekly labs were linked in subject matter; relevant readings from the recent scientific literature were assigned and discussed in the second lecture of the week. Peer-based learning was facilitated through weekly reading assignments with peer-led discussions and through group research projects, in addition to in-class exercises such as debates. Writing and research skills were developed through student groups designing, carrying out and reporting on their own semester-long research projects around the lasting effects of the historical Ruston Smelter on the biology and water systems of Tacoma. The writing of their mini grant proposals and final project reports was carried out in stages to allow for feedback before the deadline. Speakers from industry were invited to share their specialist knowledge as guest lecturers, and students were encouraged to interact with them, with a view to employment opportunities. Formative assessment techniques included jigsaw exercises, gallery walks, placemat surveys, think pair share and take-home point summaries. Summative assessment included discussion leadership, exams, homeworks, group projects, in-class exercises, field trips, and pre-discussion reading exercises

  19. First Attempt of Applying Factor Analysis in Moving Base Gravimetry

    NASA Astrophysics Data System (ADS)

    Li, X.; Roman, D. R.

    2014-12-01

    For gravimetric observation systems on mobile platforms (land/sea/airborne), the low signal-to-noise ratio (SNR) is the main barrier to achieving an accurate, high-resolution gravity signal. Normally, low-pass filters (Childers et al. 1999, Forsberg et al. 2000, Kwon and Jekeli 2000, Hwang et al. 2006) are applied to smooth or remove the high-frequency "noise", even though some of the high-frequency component is not necessarily noise. This is especially true for aerogravity surveys such as those from the Gravity for the Redefinition of the American Vertical Datum (GRAV-D) project. These gravity survey flights have a spatial resolution of 10 km between tracks but higher resolution along track. The along-track resolution is improved due to the lower flight height (6.1 km), equipment sensitivity, and improved modeling of potential errors. Additionally, these surveys suffer from a loss of signal power due to the increased flight elevation. Hence, application of a low-pass filter removes possible signal sensed in the along-track direction that might otherwise prove useful for various geophysical and geodetic applications. Some cutting-edge developments in wavelets and artificial neural networks have been successfully applied to obtain improved results (Li 2008 and 2011, Liang and Liu 2013). However, a clearer and more fundamental understanding of the error characteristics will further improve the quality of the gravity estimates from these gravimetric systems. Here, instead of using any predefined basis function or a priori model, the idea of factor analysis is employed for the first time to try to extract the underlying factors of the noise in these systems. Real data sets collected by both land vehicle and aircraft will be processed as examples.
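
    The idea of extracting common noise factors without a predefined basis can be sketched with a standard factor-analysis routine; the multichannel series below is synthetic and only stands in for gravimeter and platform-motion channels.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(4)
n_samples, n_channels = 2000, 6

# Two hidden noise sources (e.g. a platform vibration and a slow drift),
# mixed into several observed channels with channel-specific noise.
t = np.linspace(0, 100, n_samples)
latent = np.column_stack([np.sin(2 * np.pi * 0.2 * t), 0.01 * t**1.5])
mixing = rng.normal(size=(2, n_channels))
observed = latent @ mixing + 0.3 * rng.normal(size=(n_samples, n_channels))

fa = FactorAnalysis(n_components=2, random_state=0)
factors = fa.fit_transform(observed)      # estimated common factors over time
print("loadings (channels x factors):")
print(np.round(fa.components_.T, 2))
```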

  20. Multitaper Spectral Analysis and Wavelet Denoising Applied to Helioseismic Data

    NASA Technical Reports Server (NTRS)

    Komm, R. W.; Gu, Y.; Hill, F.; Stark, P. B.; Fodor, I. K.

    1999-01-01

    Estimates of solar normal-mode frequencies from helioseismic observations can be improved by using Multitaper Spectral Analysis (MTSA) to estimate spectra from the time series, followed by wavelet denoising of the log spectra. MTSA leads to a power spectrum estimate with reduced variance and better leakage properties than the conventional periodogram. Under the assumption of stationarity and mild regularity conditions, the log multitaper spectrum has a statistical distribution that is approximately Gaussian, so wavelet denoising is asymptotically an optimal method to reduce the noise in the estimated spectra. We find that a single m-ν spectrum benefits greatly from MTSA followed by wavelet denoising, and that wavelet denoising by itself can be used to improve m-averaged spectra. We compare estimates using two different 5-taper estimates (Slepian and sine tapers) and the periodogram estimate, for GONG time series at selected angular degrees l. We compare those three spectra with and without wavelet denoising, both visually and in terms of the mode parameters estimated from the pre-processed spectra using the GONG peak-fitting algorithm. The two multitaper estimates give equivalent results. The number of modes fitted well by the GONG algorithm is 20% to 60% larger (depending on l and the temporal frequency) when applied to the multitaper estimates than when applied to the periodogram. The estimated mode parameters (frequency, amplitude and width) are comparable for the three power spectrum estimates, except for modes with very small mode widths (a few frequency bins), where the multitaper spectra broadened the modes compared with the periodogram. We tested the influence of the number of tapers used and found that narrow modes at low n values are broadened to the extent that they can no longer be fit if the number of tapers is too large. For helioseismic time series of this length and temporal resolution, the optimal number of tapers is less than 10.
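
    A minimal version of the pre-processing chain (multitaper spectrum followed by wavelet denoising of the log spectrum) might look like the sketch below; it assumes SciPy's DPSS tapers and the PyWavelets package and uses a synthetic oscillation rather than GONG data.

```python
import numpy as np
from scipy.signal.windows import dpss
import pywt

rng = np.random.default_rng(5)
n, dt = 4096, 60.0                        # samples and cadence (s), assumed
t = np.arange(n) * dt
x = np.sin(2 * np.pi * 3e-3 * t) + rng.normal(0, 1.0, n)   # 3 mHz test mode

# Multitaper estimate: average periodograms over K Slepian (DPSS) tapers.
nw, k = 4, 7
tapers = dpss(n, nw, Kmax=k)              # shape (k, n)
spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
mt_spectrum = spectra.mean(axis=0)

# Wavelet denoising of the log spectrum (soft thresholding, universal rule).
log_s = np.log(mt_spectrum + 1e-12)
coeffs = pywt.wavedec(log_s, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745
thr = sigma * np.sqrt(2 * np.log(log_s.size))
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
log_s_denoised = pywt.waverec(coeffs, "db4")[: log_s.size]

freqs = np.fft.rfftfreq(n, dt)
print(f"peak of denoised log spectrum at {freqs[np.argmax(log_s_denoised)]*1e3:.2f} mHz")
```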

  1. Sensitivity analysis for texture models applied to rust steel classification

    NASA Astrophysics Data System (ADS)

    Trujillo, Maite; Sadki, Mustapha

    2004-05-01

    The exposure of metallic structures to rust degradation during their operational life is a known problem, and it affects storage tanks, steel bridges, ships, etc. In order to prevent this degradation and the potential related catastrophes, the surfaces have to be assessed and the appropriate surface treatment and coating need to be applied according to the corrosion time of the steel. We previously investigated the potential of image processing techniques to tackle this problem. Several mathematical methods were analyzed and evaluated on a database of 500 images. In this paper, we extend our previous research and provide a further analysis of textural mathematical methods for automatic detection of the corrosion time of steel. Statistical descriptors are provided to evaluate the sensitivity of the results as well as the advantages and limitations of the different methods. Finally, a selector of the classification algorithms is introduced, and the ratio between sensitivity of the results and time response (execution time) is analyzed to reach a compromise between good classification results (high sensitivity) and acceptable response time for the automation of the system.

  2. Factor Analysis Applied the VFY-218 RCS Data

    NASA Technical Reports Server (NTRS)

    Woo, Alex; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    Presents a statistical factor analysis of computer simulations and measurement data for the VFY-218 configuration. Factor analysis attempts to quantify the statistical grouping of measurements and simulations.

  3. 49 CFR Appendix A to Part 21 - Activities to Which This Part Applies

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 1 2013-10-01 2013-10-01 false Activities to Which This Part Applies A Appendix A... 1964 Pt. 21, App. A Appendix A to Part 21—Activities to Which This Part Applies 1. Use of grants made.... 141(a)). 6. Use of Coast Guard personnel for duty in connection with maritime instruction and...

  4. Ion Beam Analysis applied to laser-generated plasmas

    NASA Astrophysics Data System (ADS)

    Cutroneo, M.; Macková, A.; Havranek, V.; Malinsky, P.; Torrisi, L.; Kormunda, M.; Barchuk, M.; Ullschmied, J.; Dudzak, R.

    2016-04-01

    This paper presents the research activity on Ion Beam Analysis methods performed at the Tandetron Laboratory (LT) of the Institute of Nuclear Physics AS CR, Rez, Czech Republic. Recently, many groups have been paying attention to implantation by laser-generated plasma. This process makes it possible to insert a controllable amount of energetic ions into the surface layers of different materials, modifying the physical and chemical properties of the surface. Different substrates are implanted by ions accelerated from plasma generated by the terawatt iodine laser, at a nominal intensity of 10^15 W/cm^2, at the PALS Research Infrastructure AS CR in the Czech Republic. This regime of laser-matter interaction generates multi-MeV proton beams and multi-charged ions that are tightly confined in time (hundreds of ps) and space (source radius of a few microns). These ion beams have a much lower transverse temperature, a much shorter duration and a much higher current than those obtainable from conventional accelerators. The acceleration of protons and ions driven by ultra-short, high-intensity lasers is achieved by adopting suitable irradiation conditions as well as tailored targets. An overview of implanted targets and their morphological and structural characterizations is presented and discussed.

  5. Phase plane analysis: applying chaos theory in health care.

    PubMed

    Priesmeyer, H R; Sharp, L F

    1995-01-01

    This article applies the new science of nonlinearity to administrative issues and accounts receivable management in health care, and it provides a new perspective on common operating and quality control measures. PMID:10151628

  6. Multi-Criteria Analysis for Biomass Utilization Applying Data Envelopment Analysis

    NASA Astrophysics Data System (ADS)

    Morimoto, Hidetsugu; Hoshino, Satoshi; Kuki, Yasuaki

    This paper considers material recycling, prevention of global warming, and economic efficiency in 195 existing and planned Biomass Towns by applying DEA (Data Envelopment Analysis), which can evaluate the operational efficiency of entities such as private companies or projects. The results show that although the Biomass Towns can recycle material efficiently, prevention of global warming and business profitability were largely neglected in the Biomass Town designs. Moreover, from the point of view of operational efficiency, we suggest improving the scale of the Biomass Towns to enhance their efficiency, again applying DEA. We found that applying DEA captures more potential improvements and indicators than cost-benefit analysis and cost-effectiveness analysis.
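
    For readers unfamiliar with DEA, the input-oriented CCR efficiency of each unit can be obtained from a small linear program; the sketch below uses SciPy's linprog on made-up input/output data for three hypothetical towns, not the 195 Biomass Towns of the study.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: rows = decision-making units (Biomass Towns),
# columns = inputs (cost, labour) and outputs (recycled biomass, CO2 avoided).
X = np.array([[100.0, 20.0], [120.0, 25.0], [90.0, 30.0]])   # inputs
Y = np.array([[80.0, 10.0], [100.0, 12.0], [70.0, 15.0]])    # outputs
n = X.shape[0]

for o in range(n):
    # Variables: [theta, lambda_1..lambda_n]; minimise theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    # Input constraints:  X^T lambda - theta * x_o <= 0
    A_in = np.hstack([-X[o][:, None], X.T])
    b_in = np.zeros(X.shape[1])
    # Output constraints: -Y^T lambda <= -y_o  (i.e. Y^T lambda >= y_o)
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.hstack([b_in, b_out]),
                  bounds=[(0, None)] * (n + 1), method="highs")
    print(f"unit {o}: CCR efficiency = {res.x[0]:.3f}")
```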

  7. Applying Regression Analysis to Problems in Institutional Research.

    ERIC Educational Resources Information Center

    Bohannon, Tom R.

    1988-01-01

    Regression analysis is one of the most frequently used statistical techniques in institutional research. Principles of least squares, model building, residual analysis, influence statistics, and multi-collinearity are described and illustrated. (Author/MSE)

  8. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
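
    As a toy illustration of the pre-test/post-test idea, the following sketch propagates assumed instrument uncertainties through a simple PV power calculation by Monte Carlo; the quantities and uncertainty magnitudes are invented, not taken from the presentation.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000

# Measured values with assumed standard uncertainties (hypothetical).
voltage = rng.normal(30.0, 0.05, n)        # V
current = rng.normal(8.0, 0.02, n)         # A
irradiance = rng.normal(1000.0, 10.0, n)   # W/m2, reference is 1000 W/m2

# Power corrected to the reference irradiance.
power_stc = voltage * current * (1000.0 / irradiance)

mean = power_stc.mean()
u = power_stc.std(ddof=1)
print(f"corrected power: {mean:.1f} W, standard uncertainty {u:.2f} W")
print(f"95% interval: [{np.percentile(power_stc, 2.5):.1f}, "
      f"{np.percentile(power_stc, 97.5):.1f}] W")
```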

  9. Duration Analysis Applied to the Adoption of Knowledge.

    ERIC Educational Resources Information Center

    Vega-Cervera, Juan A.; Gordillo, Isabel Cuadrado

    2001-01-01

    Analyzes knowledge acquisition in a sample of 264 pupils in 9 Spanish elementary schools, using time as a dependent variable. Introduces psycho-pedagogical, pedagogical, and social variables into a hazard model applied to the reading process. Auditory discrimination (not intelligence or visual perception) most significantly influences learning to…

  10. Some Applied Research Concerns Using Multiple Linear Regression Analysis.

    ERIC Educational Resources Information Center

    Newman, Isadore; Fraas, John W.

    The intention of this paper is to provide an overall reference on how a researcher can apply multiple linear regression in order to utilize the advantages that it has to offer. The advantages and some concerns expressed about the technique are examined. A number of practical ways by which researchers can deal with such concerns as…

  11. Maximum likelihood estimation applied to multiepoch MEG/EEG analysis

    NASA Astrophysics Data System (ADS)

    Baryshnikov, Boris V.

    A maximum likelihood based algorithm for reducing the effects of spatially colored noise in evoked response MEG and EEG experiments is presented. The signal of interest is modeled as the low rank mean, while the noise is modeled as a Kronecker product of spatial and temporal covariance matrices. The temporal covariance is assumed known, while the spatial covariance is estimated as part of the algorithm. In contrast to prestimulus based whitening followed by principal component analysis, our algorithm does not require signal-free data for noise whitening and thus is more effective with non-stationary noise and produces better quality whitening for a given data record length. The efficacy of this approach is demonstrated using simulated and real MEG data. Next, a study in which we characterize MEG cortical response to coherent vs. incoherent motion is presented. It was found that coherent motion of the object induces not only an early sensory response around 180 ms relative to the stimulus onset but also a late field in the 250--500 ms range that has not been observed previously in similar random dot kinematogram experiments. The late field could not be resolved without signal processing using the maximum likelihood algorithm. The late activity localized to parietal areas. This is what would be expected. We believe that the late field corresponds to higher order processing related to the recognition of the moving object against the background. Finally, a maximum likelihood based dipole fitting algorithm is presented. It is suitable for dipole fitting of evoked response MEG data in the presence of spatially colored noise. The method exploits the temporal multiepoch structure of the evoked response data to estimate the spatial noise covariance matrix from the section of data being fit, eliminating the stationarity assumption implicit in prestimulus based whitening approaches. The preliminary results of the application of this algorithm to the simulated data show its

  12. How Has Applied Behavior Analysis and Behavior Therapy Changed?: An Historical Analysis of Journals

    ERIC Educational Resources Information Center

    O'Donohue, William; Fryling, Mitch

    2007-01-01

    Applied behavior analysis and behavior therapy are now nearly a half century old. It is interesting to ask if and how these disciplines have changed over time, particularly regarding some of their key internal controversies (e.g., role of cognitions). We examined the first five years and the 2000-2004 five year period of the "Journal of Applied…

  13. Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation

    ERIC Educational Resources Information Center

    Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom

    2014-01-01

    Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…

  14. 26 CFR 601.527 - Other provisions applied to representation in alcohol, tobacco, and firearms activities.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... alcohol, tobacco, and firearms activities. 601.527 Section 601.527 Internal Revenue INTERNAL REVENUE... Conference and Practice Requirements Requirements for Alcohol, Tobacco, and Firearms Activities § 601.527 Other provisions applied to representation in alcohol, tobacco, and firearms activities. The...

  15. 26 CFR 601.527 - Other provisions applied to representation in alcohol, tobacco, and firearms activities.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... alcohol, tobacco, and firearms activities. 601.527 Section 601.527 Internal Revenue INTERNAL REVENUE... Conference and Practice Requirements Requirements for Alcohol, Tobacco, and Firearms Activities § 601.527 Other provisions applied to representation in alcohol, tobacco, and firearms activities. The...

  16. 26 CFR 601.527 - Other provisions applied to representation in alcohol, tobacco, and firearms activities.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... alcohol, tobacco, and firearms activities. 601.527 Section 601.527 Internal Revenue INTERNAL REVENUE... Conference and Practice Requirements Requirements for Alcohol, Tobacco, and Firearms Activities § 601.527 Other provisions applied to representation in alcohol, tobacco, and firearms activities. The...

  17. 26 CFR 601.527 - Other provisions applied to representation in alcohol, tobacco, and firearms activities.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... alcohol, tobacco, and firearms activities. 601.527 Section 601.527 Internal Revenue INTERNAL REVENUE... Conference and Practice Requirements Requirements for Alcohol, Tobacco, and Firearms Activities § 601.527 Other provisions applied to representation in alcohol, tobacco, and firearms activities. The...

  18. 26 CFR 601.527 - Other provisions applied to representation in alcohol, tobacco, and firearms activities.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... alcohol, tobacco, and firearms activities. 601.527 Section 601.527 Internal Revenue INTERNAL REVENUE... Conference and Practice Requirements Requirements for Alcohol, Tobacco, and Firearms Activities § 601.527 Other provisions applied to representation in alcohol, tobacco, and firearms activities. The...

  19. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.

  20. Radioactivity analysis in niobium activation foils

    SciTech Connect

    Mueller, G.E.

    1995-06-01

    The motivation for this study was to measure and analyze the activity of six (6) niobium (Nb) foils (the x-rays from an internal transition in Nb-93m) and to combine this information with previously obtained activation foil data. The niobium data were used to determine the epithermal-to-MeV range of the neutron spectrum and fluence. The foil activation data were re-evaluated in a spectrum analysis code (STAY'SL) to provide new estimates of the exposure at the Los Alamos Spallation Radiation Effects Facility (LASREF). The activity of the niobium foils was measured and analyzed at the University of Missouri-Columbia (UMC) under the direction of Professor William Miller. The spectrum analysis was performed at the University of Missouri-Rolla (UMR) by Professor Gary Mueller.

  1. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  2. 6 CFR Appendix A to Part 21 - Activities to Which This Part Applies

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...—Activities to Which This Part Applies Note: Failure to list a type of Federal assistance in appendix A shall... Fund. 8. U.S. Coast Guard Bridge Alteration. 9. Use of Customs personnel and facilities by any...

  3. Factorial kriging analysis applied to geological data from petroleum exploration

    SciTech Connect

    Jaquet, O.

    1989-10-01

    A regionalized variable, thickness of the reservoir layer, from a gas field is decomposed by factorial kriging analysis. Maps of the obtained components may be associated with depositional environments that are favorable for petroleum exploration.

  4. Spectrophotometric multicomponent analysis applied to trace metal determinations

    SciTech Connect

    Otto, M.; Wegscheider, W.

    1985-01-01

    Quantitative spectrometric analysis of mixture components is featured for systems with low spectral selectivity, namely in the ultraviolet, visible, and infrared spectral range. Limitations imposed by data reduction schemes based on ordinary multiple regression are shown to be overcome by means of partial least-squares analysis in latent variables. The influence of variables such as noise, band separation, band intensity ratios, number of wavelengths, number of components, number of calibration mixtures, time drift, or deviations from Beer's law on the analytical result has been evaluated under a wide range of conditions, providing a basis to search for new systems applicable to spectrophotometric multicomponent analysis. The practical utility of the method is demonstrated for the simultaneous analysis of copper, nickel, cobalt, iron, and palladium down to 2 x 10^-6 M concentrations by use of their diethyldithiocarbamate chelate complexes, with relative errors of less than 6%. 26 references, 4 figures, 6 tables.

  5. 13 CFR 101.402 - What procedures apply to the selection of SBA programs and activities?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What procedures apply to the selection of SBA programs and activities? 101.402 Section 101.402 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION ADMINISTRATION Intergovernmental Partnership § 101.402 What procedures apply to...

  6. Activation analysis using Cornell TRIGA

    SciTech Connect

    Hossain, Tim Z.

    1994-07-01

    A major use of the Cornell TRIGA is for activation analysis. Over the years, many varieties of samples have been analyzed from a number of fields of interest, ranging from geology and archaeology to textiles. More recently the analysis has been extended to high-technology materials for applications in optical and semiconductor devices. Trace analysis in high-purity materials like Si wafers has been the focus in many instances, while in others the analysis of major/minor components was the goal. These analyses have been done using the delayed mode. Results from recent measurements in semiconductors and other materials will be presented. In addition, the near-future capability of prompt gamma activation analysis using the Cornell cold neutron beam will be discussed. (author)

  7. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    SciTech Connect

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. To save time and manpower and to minimize error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can easily be extended to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  8. The colour analysis method applied to homogeneous rocks

    NASA Astrophysics Data System (ADS)

    Halász, Amadé; Halmai, Ákos

    2015-12-01

    Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation which is the most suitable in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here could be used to differentiate similar colours and to identify gradual transitions between these; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.

  9. Orbit Response Matrix Analysis Applied at PEP-II

    SciTech Connect

    Steier, C.; Wolski, A.; Ecklund, S.; Safranek, J.A.; Tenenbaum, P.; Terebilo, A.; Turner, J.L.; Yocky, G.; /SLAC

    2005-05-17

    The analysis of orbit response matrices has been used very successfully to measure and correct the gradient and skew gradient distribution in many accelerators. It allows determination of an accurately calibrated model of the coupled machine lattice, which then can be used to calculate the corrections necessary to improve coupling, dynamic aperture and ultimately luminosity. At PEP-II, the Matlab version of LOCO has been used to analyze coupled response matrices for both the LER and the HER. The large number of elements in PEP-II and the very complicated interaction region present unique challenges to the data analysis. All necessary tools to make the analysis method useable at PEP-II have been implemented and LOCO can now be used as a routine tool for lattice diagnostic.

  10. Joint regression analysis and AMMI model applied to oat improvement

    NASA Astrophysics Data System (ADS)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

    In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board, a sample of 22 different genotypes grown during the years 2002, 2003 and 2004 at six locations. In Ferreira et al. (2006) the authors state the relevance of the regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model to study and to estimate phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in the R software for the AMMI model analysis.

  11. On the relation between applied behavior analysis and positive behavioral support

    PubMed Central

    Carr, James E.; Sidener, Tina M.

    2002-01-01

    Anderson and Freeman (2000) recently defined positive behavioral support (PBS) as a systematic approach to the delivery of clinical and educational services that is rooted in behavior analysis. However, the recent literature contains varied definitions of PBS as well as discrepant notions regarding the relation between applied behavior analysis and PBS. After summarizing common definitional characteristics of PBS from the literature, we conclude that PBS is comprised almost exclusively of techniques and values originating in applied behavior analysis. We then discuss the relations between applied behavior analysis and PBS that have been proposed in the literature. Finally, we discuss possible implications of considering PBS a field separate from applied behavior analysis. PMID:22478389

  12. Systems design analysis applied to launch vehicle configuration

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Verderaime, V.

    1993-01-01

    As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost-improvement techniques from design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  13. Applying Motivational Analysis in a Web-Based Course

    ERIC Educational Resources Information Center

    ChanLin, Lih-Juan

    2009-01-01

    An important facet of effective Web-based instructional design is the consideration of learning activities to stimulate students' learning motivation. In order to create a motivating interaction environment, the design of motivational strategies to foster student interest in learning is essential. The study employed Keller's ARCS Motivational…

  14. Soil Studies: Applying Acid-Base Chemistry to Environmental Analysis.

    ERIC Educational Resources Information Center

    West, Donna M.; Sterling, Donna R.

    2001-01-01

    Laboratory activities for chemistry students focus attention on the use of acid-base chemistry to examine environmental conditions. After using standard laboratory procedures to analyze soil and rainwater samples, students use web-based resources to interpret their findings. Uses CBL probes and graphing calculators to gather and analyze data and…

  15. Applying Skinner's Analysis of Verbal Behavior to Persons with Dementia

    ERIC Educational Resources Information Center

    Dixon, Mark; Baker, Jonathan C.; Sadowski, Katherine Ann

    2011-01-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility to teach language to children with autism and other various disorders. However, the learning of language can be forgotten, as is the case for many elderly suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may…

  16. Action, Content and Identity in Applied Genre Analysis for ESP

    ERIC Educational Resources Information Center

    Flowerdew, John

    2011-01-01

    Genres are staged, structured, communicative events, motivated by various communicative purposes, and performed by members of specific discourse communities (Swales 1990; Bhatia 1993, 2004; Berkenkotter & Huckin 1995). Since its inception, with the two seminal works on the topic by Swales (1990) and Bhatia (1993), genre analysis has taken pride of…

  17. Applying Adult Learning Theory through a Character Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to analyze the behavior of a character, Celie, in the movie "The Color Purple," through the lens of two adult learning theorists to determine the relationships the character has with each theory. The development and portrayal of characters in movies can be explained and understood by the analysis of adult learning…

  18. Applying Score Analysis to a Rehearsal Pedagogy of Expressive Performance

    ERIC Educational Resources Information Center

    Byo, James L.

    2014-01-01

    The discoveries of score analysis (e.g., minor seventh chord, ostinato, phrase elision, melodic fragment, half cadence) are more than just compositional techniques or music vocabulary. They are sounds--fascinating, storytelling, dynamic modes of expression--that when approached as such enrich the rehearsal experience. This article presents a…

  19. A value analysis model applied to the management of amblyopia.

    PubMed Central

    Beauchamp, G R; Bane, M C; Stager, D R; Berry, P M; Wright, W W

    1999-01-01

    PURPOSE: To assess the value of amblyopia-related services by utilizing a health value model (HVM). Cost and quality criteria are evaluated in accordance with the interests of patients, physicians, and purchasers. METHODS: We applied an HVM to a hypothetical statistical ("median") child with amblyopia whose visual acuity is 20/80 and to a group of children with amblyopia who are managed by our practice. We applied the model to calculate the value of these services by evaluating the responses of patients and physicians and relating these responses to clinical outcomes. RESULTS: The consensus value of care for the hypothetical median child was calculated to be 0.406 (of 1.000). For those children managed in our practice, the calculated value is 0.682. Clinically, 79% achieved 20/40 or better visual acuity, and the mean final visual acuity was 0.2 logMAR (20/32). Value appraisals revealed significant concerns about the financial aspects of amblyopia-related services, particularly among physicians. Patients rated services more positively than did physicians. CONCLUSIONS: Amblyopia care is difficult, sustained, and important work that requires substantial sensitivity to and support of children and families. Compliance and early detection are essential to success. The value of amblyopia services is rated significantly higher by patients than by physicians. Relative to the measured value, amblyopia care is undercompensated. The HVM is useful to appraise clinical service delivery and its variation. The costs of failure and the benefits of success are high; high-value amblyopia care yields substantial dividends and should be commensurately compensated in the marketplace. PMID:10703133

  20. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  1. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    SciTech Connect

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  2. Analysis of Facial Aesthetics as Applied to Injectables.

    PubMed

    Lam, Samuel M; Glasgold, Robert; Glasgold, Mark

    2015-11-01

    Understanding the role of volume loss in the aging face has resulted in a paradigm shift in facial rejuvenation techniques. Injectable materials for volume restoration are among the most widespread cosmetic procedures performed. A new approach to the aesthetics of facial aging is necessary to allow the greatest improvement from volumetric techniques while maintaining natural appearing results. Examining the face in terms of facial frames and facial shadows provides the fundamental basis for our injectable analysis.

  3. Applying hydraulic transient analysis: The Grizzly Hydro Project

    SciTech Connect

    Logan, T.H.; Stutsman, R.D. )

    1992-04-01

    No matter the size of the hydro plant, if it has a long waterway and will operate in peaking mode, the project designer needs to address the issue of hydraulic transients (known as water hammer) early in the design. This article describes the application of transient analysis to the design of a 20-MW hydro plant in California. In this case, a Howell-Bunger valve was used as a pressure-regulating valve to control transient pressures and speed rise.

  4. 14 CFR 1252.102 - To what programs or activities do these regulations apply?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 5 2012-01-01 2012-01-01 false To what programs or activities do these regulations apply? 1252.102 Section 1252.102 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION NONDISCRIMINATION ON THE BASIS OF AGE IN PROGRAMS OR ACTIVITIES RECEIVING FEDERAL...

  5. 14 CFR § 1252.102 - To what programs or activities do these regulations apply?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 5 2014-01-01 2014-01-01 false To what programs or activities do these regulations apply? § 1252.102 Section § 1252.102 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION NONDISCRIMINATION ON THE BASIS OF AGE IN PROGRAMS OR ACTIVITIES RECEIVING FEDERAL...

  6. 14 CFR 1252.102 - To what programs or activities do these regulations apply?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false To what programs or activities do these regulations apply? 1252.102 Section 1252.102 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION NONDISCRIMINATION ON THE BASIS OF AGE IN PROGRAMS OR ACTIVITIES RECEIVING FEDERAL...

  7. Applying temporal network analysis to the venture capital market

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene

    2015-10-01

    Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.
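
    A bare-bones version of the snapshot approach to temporal networks might look like the sketch below; the time-stamped edge list is fabricated and the networkx library is assumed to be available.

```python
import networkx as nx

# Hypothetical investment events: (venture capital firm, portfolio company, year).
events = [
    ("VC_A", "Startup_1", 2010), ("VC_B", "Startup_1", 2010),
    ("VC_A", "Startup_2", 2011), ("VC_C", "Startup_2", 2012),
    ("VC_B", "Startup_3", 2012), ("VC_C", "Startup_3", 2012),
]

for year in (2010, 2011, 2012):
    # Build one bipartite snapshot per year (cumulative up to that year).
    g = nx.Graph()
    g.add_edges_from((vc, co) for vc, co, y in events if y <= year)
    centrality = nx.degree_centrality(g)
    top = max(centrality, key=centrality.get)
    print(f"{year}: {g.number_of_nodes()} nodes, {g.number_of_edges()} edges, "
          f"most central node: {top}")
```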

  8. Applied analysis/computational mathematics. Final report 1993

    SciTech Connect

    Lax, P.; Berger, M.

    1993-12-01

    This is the final report for the Courant Mathematics and Computing Laboratory (CMCL) research program for the years 1991--1993. Our research efforts encompass the formulation of physical problems in terms of mathematical models (both old and new), the mathematical analysis of such models, and their numerical resolution. This last step involves the development and implementation of efficient methods for large scale computation. Our analytic and numerical work often go hand in hand; new theoretical approaches often have numerical counterparts, while numerical experimentation often suggests avenues for analytical investigation.

  9. Fundamental and Applied Investigations in Atomic Spectrometric Analysis

    NASA Astrophysics Data System (ADS)

    Wu, Min

    Simultaneous laser-excited fluorescence and absorption measurements were performed, and the results have revealed that any interference caused by easily ionized elements does not originate from variations in analyte emission (quantum) efficiency. In a closely related area, the roles of wet and dry aerosols in the matrix interference are clarified through spatially resolved imaging of the plasma with a charge-coupled device camera. To eliminate matrix interference effects in practice, various methods have been developed based on the above studies. The use of column pre-concentration with flow injection analysis has been found to provide a simple solution for reducing interference effects and increasing the sensitivity of elemental analysis. A novel mini spray chamber was invented. The new vertical rotary spray chamber combines gravitational, centrifugal, turbulent, and impact droplet-segregation mechanisms to achieve a higher efficiency of small-droplet formation in a nebulized sample spray. As a result, it also offers higher sample-transport efficiency, lower memory effects, and improved analytical figures of merit over existing devices. This new device was employed with flow injection analysis to simulate an interface for coupling high performance liquid chromatography (HPLC) to a microwave plasma for chromatographic detection. The detection limits for common metallic elements are in the range of 5-50 µg/mL, and are degraded only twofold when the elements are present in an organic solvent such as ethanol or methanol. Other sample-introduction schemes have also been investigated to improve sample-introduction technology. The direct coupling of hydride-generation techniques to the helium microwave plasma torch was evaluated for the determination of arsenic, antimony and tin by atomic emission spectrometry. A manually controlled peristaltic pump was modified for computer control, and continuous flow injection was evaluated for standard calibration and trace elemental

  10. Operational modal analysis applied to the concert harp

    NASA Astrophysics Data System (ADS)

    Chomette, B.; Le Carrou, J.-L.

    2015-05-01

    Operational modal analysis (OMA) methods are useful to extract modal parameters of operating systems. These methods seem to be particularly interesting to investigate the modal basis of string instruments during operation to avoid certain disadvantages due to conventional methods. However, the excitation in the case of string instruments is not optimal for OMA due to the presence of damped harmonic components and low noise in the disturbance signal. Therefore, the present study investigates the least-square complex exponential (LSCE) and the modified least-square complex exponential methods in the case of a string instrument to identify modal parameters of the instrument when it is played. The efficiency of the approach is experimentally demonstrated on a concert harp excited by some of its strings and the two methods are compared to a conventional modal analysis. The results show that OMA allows us to identify modes particularly present in the instrument's response with a good estimation especially if they are close to the excitation frequency with the modified LSCE method.
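
    The core of the LSCE family of methods is a Prony-type fit of decaying complex exponentials to free-decay (or correlation) data; the sketch below applies such a fit to a synthetic two-mode decay rather than to harp measurements, with the model order chosen by hand.

```python
import numpy as np

dt = 1e-3                                      # sampling interval (s), assumed
t = np.arange(0, 1.0, dt)
rng = np.random.default_rng(7)

# Synthetic free decay with two modes: (frequency in Hz, damping ratio, amplitude).
modes = [(110.0, 0.01, 1.0), (230.0, 0.02, 0.6)]
h = sum(a * np.exp(-z * 2 * np.pi * f * t) * np.cos(2 * np.pi * f * t)
        for f, z, a in modes) + 0.01 * rng.normal(size=t.size)

p = 4                                          # model order = 2 x number of modes
# Prony / LSCE step: least-squares AR coefficients from a Hankel-structured system.
A = np.column_stack([h[i:len(h) - p + i] for i in range(p)])
b = -h[p:]
a = np.linalg.lstsq(A, b, rcond=None)[0]
roots = np.roots(np.concatenate(([1.0], a[::-1])))
s = np.log(roots) / dt                         # continuous-time poles

freqs = np.abs(s.imag) / (2 * np.pi)
damp = -s.real / np.abs(s)
for f, z in sorted(set(zip(np.round(freqs, 1), np.round(damp, 3)))):
    if f > 0:
        print(f"identified mode: {f:6.1f} Hz, damping ratio {z:.3f}")
```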

  11. Unsupervised feature relevance analysis applied to improve ECG heartbeat clustering.

    PubMed

    Rodríguez-Sotelo, J L; Peluffo-Ordoñez, D; Cuesta-Frau, D; Castellanos-Domínguez, G

    2012-10-01

    The computer-assisted analysis of biomedical records has become an essential tool in clinical settings. However, current devices provide a growing amount of data that often exceeds the processing capacity of normal computers. As this amount of information rises, new demands for more efficient data extracting methods appear. This paper addresses the task of data mining in physiological records using a feature selection scheme. An unsupervised method based on relevance analysis is described. This scheme uses a least-squares optimization of the input feature matrix in a single iteration. The output of the algorithm is a feature weighting vector. The performance of the method was assessed using a heartbeat clustering test on real ECG records. The quantitative cluster validity measures yielded a correctly classified heartbeat rate of 98.69% (specificity), 85.88% (sensitivity) and 95.04% (general clustering performance), which is even higher than the performance achieved by other similar ECG clustering studies. The number of features was reduced on average from 100 to 18, and the temporal cost was a 43% lower than in previous ECG clustering schemes. PMID:22672933
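
    The exact single-iteration least-squares weighting used by the authors is not reproduced here; as a hedged sketch of the general idea, the snippet below assigns each feature an unsupervised relevance weight from the eigenvalue-weighted loadings of the dominant principal directions and keeps the highest-ranked features before clustering. The array names and the number of retained features are illustrative assumptions.

      import numpy as np

      def relevance_weights(X, n_components=5):
          """Unsupervised feature-relevance surrogate: weight each feature by how
          strongly it loads on the dominant principal directions of X."""
          Xc = X - X.mean(axis=0)
          cov = np.cov(Xc, rowvar=False)
          evals, evecs = np.linalg.eigh(cov)
          idx = np.argsort(evals)[::-1][:n_components]
          # Eigenvalue-weighted loading magnitudes, normalised to sum to one
          w = (np.abs(evecs[:, idx]) * evals[idx]).sum(axis=1)
          return w / w.sum()

      # Hypothetical usage: rank features and keep the top 18 before clustering
      # X = np.load("heartbeat_features.npy")   # shape (n_beats, n_features)
      # keep = np.argsort(relevance_weights(X))[::-1][:18]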

  12. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mind- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
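
    A hedged sketch of how PLS and PCR calibrations can be compared in practice is given below, using scikit-learn and synthetic spectra in place of the phosphosilicate-glass data; the number of latent variables and the noise level are arbitrary.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline

      # Synthetic stand-in: 60 "spectra" with 200 wavelengths each, and a property
      # y that depends on a few spectral channels plus noise.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 200))
      y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=60)

      pls = PLSRegression(n_components=5)
      pcr = make_pipeline(PCA(n_components=5), LinearRegression())

      for name, model in [("PLS", pls), ("PCR", pcr)]:
          r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
          print(f"{name}: mean cross-validated R^2 = {r2.mean():.3f}")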

  13. Dynamical systems analysis applied to working memory data.

    PubMed

    Gasimova, Fidan; Robitzsch, Alexander; Wilhelm, Oliver; Boker, Steven M; Hu, Yueqin; Hülür, Gizem

    2014-01-01

    In the present paper we investigate weekly fluctuations in the working memory capacity (WMC) assessed over a period of 2 years. We use dynamical system analysis, specifically a second order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement at the MU task is associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time delay embedding dimensions. Results show that the number of knots in the B-spline imputation influence accuracy more than the number of embedding dimensions. PMID:25071657
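
    The published analysis uses latent differential equation modeling with time-delay embedding and B-spline imputation; as a much simpler sketch of the underlying idea, the snippet below estimates the parameters of a second-order linear differential equation x'' = eta*x + zeta*x' by regressing finite-difference derivatives of an evenly sampled, complete series. The simulated weekly series is illustrative only.

      import numpy as np

      def fit_second_order(x, dt=1.0):
          """Fit x'' = eta*x + zeta*x' by least squares on finite-difference
          derivatives (a simplified stand-in for latent differential equations)."""
          dx = np.gradient(x, dt)
          d2x = np.gradient(dx, dt)
          A = np.column_stack([x, dx])
          (eta, zeta), *_ = np.linalg.lstsq(A, d2x, rcond=None)
          # For a damped oscillator, a negative eta implies a cyclical pattern
          freq = np.sqrt(-eta) / (2 * np.pi) if eta < 0 else float("nan")
          return eta, zeta, freq

      # Illustrative weekly series with a 12-week cycle plus noise
      t = np.arange(80)
      x = np.cos(2 * np.pi * t / 12) + 0.05 * np.random.default_rng(1).normal(size=t.size)
      print(fit_second_order(x, dt=1.0))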

  14. Dynamical systems analysis applied to working memory data

    PubMed Central

    Gasimova, Fidan; Robitzsch, Alexander; Wilhelm, Oliver; Boker, Steven M.; Hu, Yueqin; Hülür, Gizem

    2014-01-01

    In the present paper we investigate weekly fluctuations in the working memory capacity (WMC) assessed over a period of 2 years. We use dynamical system analysis, specifically a second order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement at the MU task is associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time delay embedding dimensions. Results show that the number of knots in the B-spline imputation influence accuracy more than the number of embedding dimensions. PMID:25071657

  15. Unsupervised feature relevance analysis applied to improve ECG heartbeat clustering.

    PubMed

    Rodríguez-Sotelo, J L; Peluffo-Ordoñez, D; Cuesta-Frau, D; Castellanos-Domínguez, G

    2012-10-01

    The computer-assisted analysis of biomedical records has become an essential tool in clinical settings. However, current devices provide a growing amount of data that often exceeds the processing capacity of normal computers. As this amount of information rises, new demands for more efficient data extracting methods appear. This paper addresses the task of data mining in physiological records using a feature selection scheme. An unsupervised method based on relevance analysis is described. This scheme uses a least-squares optimization of the input feature matrix in a single iteration. The output of the algorithm is a feature weighting vector. The performance of the method was assessed using a heartbeat clustering test on real ECG records. The quantitative cluster validity measures yielded a correctly classified heartbeat rate of 98.69% (specificity), 85.88% (sensitivity) and 95.04% (general clustering performance), which is even higher than the performance achieved by other similar ECG clustering studies. The number of features was reduced on average from 100 to 18, and the temporal cost was a 43% lower than in previous ECG clustering schemes.

  16. Synchronisation and coupling analysis: applied cardiovascular physics in sleep medicine.

    PubMed

    Wessel, Niels; Riedl, Maik; Kramer, Jan; Muller, Andreas; Penzel, Thomas; Kurths, Jurgen

    2013-01-01

    Sleep is a physiological process with an internal program of well-defined sleep stages and intermediate wakefulness periods. The sleep stages modulate the autonomic nervous system, so each stage is accompanied by a different regulation regime for the cardiovascular and respiratory systems. These differences in regulation can be distinguished by new techniques of cardiovascular physics. The number of patients suffering from sleep disorders is increasing disproportionately with population growth and aging, leading to very high costs for public health systems. The challenge for cardiovascular physics is therefore to develop highly sophisticated methods that can, on the one hand, supplement or replace expensive medical devices and, on the other hand, improve medical diagnostics while decreasing the patient's risk. Methods of cardiovascular physics are used to analyze heart rate, blood pressure and respiration in order to detect changes of the autonomic nervous system in different diseases. Data-driven modeling, synchronization and coupling analysis, and their applications to biosignals in healthy subjects and in patients with different sleep disorders are presented. Newly derived methods of cardiovascular physics can help to find indicators for these health risks.

  17. Open-channel block by internally applied amines inhibits activation gate closure in batrachotoxin-activated sodium channels.

    PubMed Central

    Zamponi, G W; French, R J

    1994-01-01

    We have studied the action of several pore-blocking amines on voltage-dependent activation gating of batrachotoxin(BTX)-activated sodium channels, from bovine heart and rat skeletal muscle, incorporated into planar lipid bilayers. Although structurally simpler, the compounds studied show general structural features and channel-inhibiting actions that resemble those of lidocaine. When applied to the cytoplasmic end of the channel, these compounds cause a rapid, voltage-dependent, open-channel block seen as a reduction in apparent single-channel amplitude (companion paper). Internal application of phenylpropanolamine, phenylethylamine, phenylmethylamine, and diethylamine, as well as causing open-channel block, reduces the probability of channel closure, producing a shift of the steady-state activation curve toward more hyperpolarizing potentials. These gating effects were observed for both cardiac and skeletal muscle channels and were not evoked by addition of equimolar N-Methyl-D-Glucamine, suggesting a specific interaction of the blockers with the channel rather than a surface charge effect. Kinetic analysis of phenylpropanolamine action on skeletal muscle channels indicated that phenylpropanolamine reduced the closed probability via two separate mechanisms. First, mean closed durations were slightly abbreviated in its presence. Second, and more important, the frequency of the gating closures was reduced. This action was correlated with the degree, and the voltage dependence, of open-channel block, suggesting that the activation gate cannot close while the pore is occluded by the blocker. Such a mechanism might underlie the previously reported immobilization of gating charge associated with local anesthetic block of unmodified sodium channels. PMID:7811914

  18. Sensitivity and uncertainty analysis applied to the JHR reactivity prediction

    SciTech Connect

    Leray, O.; Vaglio-Gaudard, C.; Hudelot, J. P.; Santamarina, A.; Noguere, G.; Di-Salvo, J.

    2012-07-01

    The on-going AMMON program in EOLE reactor at CEA Cadarache (France) provides experimental results to qualify the HORUS-3D/N neutronics calculation scheme used for the design and safety studies of the new Material Testing Jules Horowitz Reactor (JHR). This paper presents the determination of technological and nuclear data uncertainties on the core reactivity and the propagation of the latter from the AMMON experiment to JHR. The technological uncertainty propagation was performed with a direct perturbation methodology using the 3D French stochastic code TRIPOLI4 and a statistical methodology using the 2D French deterministic code APOLLO2-MOC which leads to a value of 289 pcm (1{sigma}). The Nuclear Data uncertainty propagation relies on a sensitivity study on the main isotopes and the use of a retroactive marginalization method applied to the JEFF 3.1.1 {sup 27}Al evaluation in order to obtain a realistic multi-group covariance matrix associated with the considered evaluation. This nuclear data uncertainty propagation leads to a K{sub eff} uncertainty of 624 pcm for the JHR core and 684 pcm for the AMMON reference configuration core. Finally, transposition and reduction of the prior uncertainty were made using the Representativity method which demonstrates the similarity of the AMMON experiment with JHR (the representativity factor is 0.95). The final impact of JEFF 3.1.1 nuclear data on the Begin Of Life (BOL) JHR reactivity calculated by the HORUS-3D/N V4.0 is a bias of +216 pcm with an associated posterior uncertainty of 304 pcm (1{sigma}). (authors)
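
    The nuclear-data part of such an uncertainty propagation is usually written as the "sandwich" rule, and the representativity factor is the covariance-weighted correlation between the sensitivity vectors of the two systems. The sketch below shows both with placeholder sensitivities and a toy covariance matrix; none of the numbers are JHR or AMMON values.

      import numpy as np

      # Placeholder sensitivity vectors (dk/k per relative nuclear-data change) and
      # a toy relative covariance matrix M; these are not JHR or AMMON values.
      S_jhr   = np.array([0.30, -0.12, 0.05, 0.22])
      S_ammon = np.array([0.28, -0.10, 0.06, 0.20])
      M = np.diag([0.02, 0.03, 0.01, 0.025]) ** 2

      def propagated_uncertainty(S, M):
          """'Sandwich' rule: standard deviation of k-eff = sqrt(S^T M S)."""
          return float(np.sqrt(S @ M @ S))

      def representativity(S1, S2, M):
          """Covariance-weighted correlation of two systems through shared nuclear data."""
          return float(S1 @ M @ S2 / np.sqrt((S1 @ M @ S1) * (S2 @ M @ S2)))

      print("k-eff uncertainty (toy)  :", propagated_uncertainty(S_jhr, M))
      print("representativity factor  :", representativity(S_jhr, S_ammon, M))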

  19. PSYCHOANALYSIS AND THE ARTS: THE SLIPPERY GROUND OF APPLIED ANALYSIS.

    PubMed

    Abella, Adela

    2016-01-01

    The ways in which today's psychoanalysts approach art closely follow the avenues opened by Freud a hundred years ago. Drawing mainly on Freud's studies on Jensen's Gradiva (1907) and on Leonardo da Vinci (1910a), the author examines the main paradigms he used in discussing artistic activity, including his doubts and hesitations. Present-day approaches to art are then examined via a discussion of the advantages and pitfalls of psychobiography, of the case study, and of textual approaches. The author makes a case for the type of interdisciplinary dialogue in which the goal is to establish a cross-fertilization between psychoanalysis and other fields of knowledge while striving to avoid hypersaturation of a work of art in order to foster expansion of the mind. PMID:26784716

  20. PSYCHOANALYSIS AND THE ARTS: THE SLIPPERY GROUND OF APPLIED ANALYSIS.

    PubMed

    Abella, Adela

    2016-01-01

    The ways in which today's psychoanalysts approach art closely follow the avenues opened by Freud a hundred years ago. Drawing mainly on Freud's studies on Jensen's Gradiva (1907) and on Leonardo da Vinci (1910a), the author examines the main paradigms he used in discussing artistic activity, including his doubts and hesitations. Present-day approaches to art are then examined via a discussion of the advantages and pitfalls of psychobiography, of the case study, and of textual approaches. The author makes a case for the type of interdisciplinary dialogue in which the goal is to establish a cross-fertilization between psychoanalysis and other fields of knowledge while striving to avoid hypersaturation of a work of art in order to foster expansion of the mind.

  1. Applying Skinner's analysis of verbal behavior to persons with dementia.

    PubMed

    Dixon, Mark; Baker, Jonathan C; Sadowski, Katherine Ann

    2011-03-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility to teach language to children with autism and other various disorders. However, the learning of language can be forgotten, as is the case for many elderly suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may facilitate not only acquisition of language but also the ability to recall items or objects that may have appeared to be "forgotten." The present study examined the utility of having a series of adults in long-term care emit tacts, echoics, or intraverbals upon presentation of various visual stimuli. Compared to a no-verbal response condition, it appears that the incorporation of Skinner's verbal operants can in fact improve recall for this population. Implications for the retraining of lost language are presented. PMID:21292058

  2. Downside Risk analysis applied to the Hedge Funds universe

    NASA Astrophysics Data System (ADS)

    Perelló, Josep

    2007-09-01

    Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires an appropriate risk metric. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the simplicity of CAPM, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures using the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, a transversal reading of the existing Downside Risk measures is provided.
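
    A minimal sketch of two common Downside Risk quantities (downside deviation below a target return and the associated Sortino-type ratio) is given below; the simulated fat-tailed return series is an invented stand-in for the Credit Suisse/Tremont index data.

      import numpy as np

      def downside_risk(returns, target=0.0):
          """Downside deviation below a target return and a Sortino-type ratio."""
          r = np.asarray(returns, dtype=float)
          shortfall = np.minimum(r - target, 0.0)
          downside_dev = np.sqrt(np.mean(shortfall ** 2))
          sortino = (r.mean() - target) / downside_dev if downside_dev > 0 else np.inf
          return downside_dev, sortino

      # Simulated fat-tailed monthly returns, standing in for the index data
      rng = np.random.default_rng(7)
      monthly = 0.005 + 0.02 * rng.standard_t(df=4, size=120)
      print(downside_risk(monthly, target=0.0))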

  3. Applying Machine Learning to GlueX Data Analysis

    NASA Astrophysics Data System (ADS)

    Boettcher, Thomas

    2014-03-01

    GlueX is a high energy physics experiment with the goal of collecting data necessary for understanding confinement in quantum chromodynamics. Beginning in 2015, GlueX will collect huge amounts of data describing billions of particle collisions. In preparation for data collection, efforts are underway to develop a methodology for analyzing these large data sets. One of the primary challenges in GlueX data analysis is isolating events of interest from a proportionally large background. GlueX has recently begun approaching this selection problem using machine learning algorithms, specifically boosted decision trees. Preliminary studies indicate that these algorithms have the potential to offer vast improvements in both signal selection efficiency and purity over more traditional techniques.
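
    A hedged sketch of the boosted-decision-tree selection idea, with Gaussian toy features standing in for reconstructed event quantities, might look like the following; the score threshold and feature set are arbitrary assumptions, not GlueX analysis choices.

      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.model_selection import train_test_split

      # Gaussian toy features standing in for reconstructed event quantities
      rng = np.random.default_rng(42)
      n = 20000
      X_sig = rng.normal(loc=0.5, scale=1.0, size=(n // 2, 6))
      X_bkg = rng.normal(loc=0.0, scale=1.2, size=(n // 2, 6))
      X = np.vstack([X_sig, X_bkg])
      y = np.concatenate([np.ones(n // 2), np.zeros(n // 2)])

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
      bdt.fit(X_tr, y_tr)

      # Keep events above an (arbitrary) score threshold and inspect the purity
      scores = bdt.predict_proba(X_te)[:, 1]
      kept = scores > 0.8
      print(f"fraction kept: {kept.mean():.2f}, purity of kept events: {y_te[kept].mean():.2f}")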

  4. Naming, the formation of stimulus classes, and applied behavior analysis.

    PubMed Central

    Stromer, R; Mackay, H A; Remington, B

    1996-01-01

    The methods used in Sidman's original studies on equivalence classes provide a framework for analyzing functional verbal behavior. Sidman and others have shown how teaching receptive, name-referent matching may produce rudimentary oral reading and word comprehension skills. Eikeseth and Smith (1992) have extended these findings by showing that children with autism may acquire equivalence classes after learning to supply a common oral name to each stimulus in a potential class. A stimulus class analysis suggests ways to examine (a) the problem of programming generalization from teaching situations to other environments, (b) the expansion of the repertoires that occur in those settings, and (c) the use of naming to facilitate these forms of generalization. Such research will help to clarify and extend Horne and Lowe's recent (1996) account of the role of verbal behavior in the formation of stimulus classes. PMID:8810064

  5. Fractographic principles applied to Y-TZP mechanical behavior analysis.

    PubMed

    Ramos, Carla Müller; Cesar, Paulo Francisco; Bonfante, Estevam Augusto; Rubo, José Henrique; Wang, Linda; Borges, Ana Flávia Sanches

    2016-04-01

    The purpose of this study was to evaluate the use of fractography principles to determine the fracture toughness of Y-TZP dental ceramic in which KIc was measured fractographically using controlled-flaw beam bending techniques and to correlate the flaw distribution with the mechanical properties. The Y-TZP blocks studied were: Zirconia Zirklein (ZZ); Zirconcad (ZCA); IPS e.max ZirCad (ZMAX); and In Ceram YZ (ZYZ). Samples were prepared (16mm×4mm×2mm) according to ISO 6872 specifications and subjected to three-point bending at a crosshead speed of 0.5mm/min. Weibull probability curves (95% confidence bounds) were calculated and a contour plot with the Weibull modulus (m) versus characteristic strength (σ0) was used to examine the differences among groups. The fractured surface of each specimen was inspected in a scanning electron microscope (SEM) for qualitative and quantitative fractographic analysis. The critical defect size (c) and fracture toughness (KIc) were estimated. The fractured surfaces of the samples from all groups showed similar fractographic characteristics, except ZCA showed pores and defects. Fracture toughness and the flexural strength values were not different among the groups except for ZCA. The characteristic strength (p<0.05) of ZZ (η=920.4) was higher than the ZCA (η=651.1) and similar to the ZMAX (η=983.6) and ZYZ (η=1054.8). By means of quantitative and qualitative fractographic analysis, this study showed fracture toughness and strength that could be correlated to the observable microstructural features of the evaluated zirconia polycrystalline ceramics. PMID:26722988
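
    Two of the quantities reported above can be sketched in a few lines: the Weibull modulus and characteristic strength from a linearised fit to ranked strengths, and a fractographic fracture-toughness estimate K_Ic = Y*sigma_f*sqrt(pi*c). The strength values, flaw size, and geometry factor Y below are illustrative assumptions, not the measured Y-TZP data.

      import numpy as np

      def weibull_fit(strengths):
          """Weibull modulus m and characteristic strength sigma_0 from the usual
          linearisation ln(ln(1/(1-F))) = m*ln(sigma) - m*ln(sigma_0)."""
          s = np.sort(np.asarray(strengths, dtype=float))
          F = (np.arange(1, s.size + 1) - 0.5) / s.size   # median-rank estimator
          m, c = np.polyfit(np.log(s), np.log(np.log(1.0 / (1.0 - F))), 1)
          return m, np.exp(-c / m)

      def fracture_toughness(sigma_f, c, Y=1.3):
          """Fractographic estimate K_Ic = Y*sigma_f*sqrt(pi*c); Y depends on flaw geometry."""
          return Y * sigma_f * np.sqrt(np.pi * c)

      # Illustrative numbers only (MPa for strengths, Pa and metres for K_Ic)
      print(weibull_fit([780, 840, 910, 950, 990, 1020, 1080, 1110]))
      print(fracture_toughness(sigma_f=900e6, c=20e-6) / 1e6, "MPa*m^0.5")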

  6. Changes of the porous structure of activated carbons applied in a filter bed pilot operation.

    PubMed

    Gauden, P A; Szmechtig-Gauden, E; Rychlicki, G; Duber, S; Garbacz, J K; Buczkowski, R

    2006-03-15

    The paper investigates the changes in porosity (i.e., in the accessible adsorption capacity of carbonaceous adsorbents for pollutants during filter bed maturation) of three activated carbons applied in a filter bed pilot operation. The results of this investigation may help to reduce operating costs, increase granular activated carbon bed life, maximize the useful life of biofilters, and understand the mechanism of water purification by carbon adsorbents. The analysis of the pore structure was limited to the first year of service of the beds, since this was when the largest decrease in the available pore capacity occurred. Low-temperature nitrogen adsorption isotherms were used to evaluate the structural parameters and pore size distributions (PSDs) of carbon samples (virgin (reference) and mature adsorbents for different periods of water treatment) on the basis of the Nguyen and Do (ND) method and density functional theory (DFT). These results were compared with small-angle X-ray scattering (SAXS) investigations (PSDs calculated by Glatter's indirect transformation method (ITP)). The results show that in general, the ND and ITP methods lead to almost the same qualitative distribution curve behavior. Moreover, the enthalpy of immersion in water, mercury porosimetry, densities (true and apparent), and the analysis of ash are reported and compared to explain the decrease in adsorptive capacity of the carbons investigated. On the other hand, the efficacy of TOC (total organic carbon, i.e., a quantity describing the complex matrix of organic material present in natural waters) removal and the bacteria count were analyzed to explain the role of adsorption in the elimination of contaminants from water. Finally, a mechanism of organic matter removal was suggested on the basis of the above-mentioned experimental data and compared with mechanisms reported by other authors. PMID:16198363

  7. The impact of applied behavior analysis on diverse areas of research.

    PubMed

    Kazdin, A E

    1975-01-01

    The impact of applied behavior analysis on various disciplines and areas of research was assessed through two major analyses. First, the relationship of applied behavior analysis to the general area of "behavior modification" was evaluated by examining the citation characteristics of journal articles in JABA and three other behavior-modification journals. Second, the penetration of applied behavior analysis into diverse areas and disciplines, including behavior modification, psychiatry, clinical psychology, education, special education, retardation, speech and hearing, counselling, and law enforcement and correction was assessed. Twenty-five journals representing diverse research areas were evaluated from 1968 to 1974 to assess the extent to which operant techniques were applied for therapeutic, rehabilitative, and educative purposes and the degree to which methodological desiderata of applied behavior analysis were met. The analyses revealed diverse publication outlets for applied behavior analysis in various disciplines.

  8. Applying data mining for the analysis of breast cancer data.

    PubMed

    Liou, Der-Ming; Chang, Wei-Pin

    2015-01-01

    Data mining, also known as Knowledge Discovery in Databases (KDD), is the process of automatically searching large volumes of data for patterns. For instance, a clinical pattern might indicate that a female patient with diabetes or hypertension is more likely to suffer a stroke within the next five years. A physician can then learn valuable knowledge from the data mining process. Here, we present a study focused on the application of artificial intelligence and data mining techniques to prediction models of breast cancer. An artificial neural network, a decision tree, logistic regression, and a genetic algorithm were used for the comparative studies, and the accuracy and positive predictive value of each algorithm were used as the evaluation indicators. A total of 699 records acquired from breast cancer patients at the University of Wisconsin, nine predictor variables, and one outcome variable were incorporated for the data analysis, followed by tenfold cross-validation. The results revealed that the accuracy of the logistic regression model was 0.9434 (sensitivity 0.9716 and specificity 0.9482), the decision tree model 0.9434 (sensitivity 0.9615, specificity 0.9105), the neural network model 0.9502 (sensitivity 0.9628, specificity 0.9273), and the genetic algorithm model 0.9878 (sensitivity 1, specificity 0.9802). The accuracy of the genetic algorithm was significantly higher than the average predicted accuracy of 0.9612. The predicted outcome of the logistic regression model was higher than that of the neural network model, but no significant difference was observed. The average predicted accuracy of the decision tree model was 0.9435, which was the lowest of all four predictive models. The standard deviation of the tenfold cross-validation was rather unreliable. This study indicated that the genetic algorithm model yielded better results than other data mining models for the analysis of the data of breast cancer patients in terms of the overall accuracy of
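
    A hedged sketch of this kind of model comparison under tenfold cross-validation is shown below with scikit-learn; its bundled Wisconsin diagnostic breast-cancer set (569 records, 30 features) stands in for the 699-record data used in the study, and the genetic-algorithm model is omitted.

      import numpy as np
      from sklearn.datasets import load_breast_cancer
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.neural_network import MLPClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.tree import DecisionTreeClassifier

      X, y = load_breast_cancer(return_X_y=True)   # 569-record stand-in data set

      models = {
          "logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
          "decision tree": DecisionTreeClassifier(random_state=0),
          "neural network": make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0)),
      }
      for name, model in models.items():
          acc = cross_val_score(model, X, y, cv=10, scoring="accuracy")
          print(f"{name}: accuracy {acc.mean():.4f} +/- {acc.std():.4f}")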

  9. Applied Climate-Change Analysis: The Climate Wizard Tool

    PubMed Central

    Girvetz, Evan H.; Zganjar, Chris; Raber, George T.; Maurer, Edwin P.; Kareiva, Peter; Lawler, Joshua J.

    2009-01-01

    Background Although the message of “global climate change” is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has and is projected to change within specific geographic areas throughout the world. Methodology/Principal Findings To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951–2002 occurred in northern hemisphere countries (especially during January–April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50°N during February-March to 10°N during August-September. Precipitation decreases occurred most commonly in countries between 0–20°N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070–2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. Conclusions/Significance The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally-specific analyses of climate
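
    The quantile ensemble analysis mentioned above reduces, for each region, to taking percentiles across the model ensemble; a minimal sketch with 16 placeholder GCM projections (not Climate Wizard output) follows.

      import numpy as np

      # 16 placeholder GCM projections of temperature change (deg C) for one region
      rng = np.random.default_rng(5)
      projections = rng.normal(loc=2.5, scale=0.8, size=16)

      low, median, high = np.percentile(projections, [10, 50, 90])
      print(f"projected change: median {median:.2f} C (10th-90th percentile {low:.2f}-{high:.2f} C)")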

  10. Applied and computational harmonic analysis on graphs and networks

    NASA Astrophysics Data System (ADS)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
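
    A minimal sketch of the graph Laplacian "Fourier basis" discussed above is shown below for an unweighted path graph, where the eigenvectors really are DCT-like, matching the article's caveat; the test signal is illustrative.

      import numpy as np

      def graph_fourier_basis(W):
          """Eigenpairs of the unnormalised graph Laplacian L = D - W, often read as
          graph 'frequencies' and Fourier basis vectors."""
          L = np.diag(W.sum(axis=1)) - W
          return np.linalg.eigh(L)

      # Unweighted path graph on 8 vertices, where the basis really is DCT-like
      n = 8
      W = np.zeros((n, n))
      for i in range(n - 1):
          W[i, i + 1] = W[i + 1, i] = 1.0

      evals, U = graph_fourier_basis(W)
      signal = np.sin(np.linspace(0, np.pi, n))
      coeffs = U.T @ signal          # graph Fourier transform of the signal
      print(np.round(evals, 3))
      print(np.round(coeffs, 3))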

  11. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    NASA Astrophysics Data System (ADS)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damages worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, floods events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data allows improvement of the understanding of the spatial and temporal occurrence of the process. Since historic documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt change of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  12. Applied Drama and the Higher Education Learning Spaces: A Reflective Analysis

    ERIC Educational Resources Information Center

    Moyo, Cletus

    2015-01-01

    This paper explores Applied Drama as a teaching approach in Higher Education learning spaces. The exploration takes a reflective analysis approach by first examining the impact that Applied Drama has had on my career as a Lecturer/Educator/Teacher working in Higher Education environments. My engagement with Applied Drama practice and theory is…

  13. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    ERIC Educational Resources Information Center

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  14. Applying the Theory of Planned Behavior to Physical Activity: The Moderating Role of Mental Toughness.

    PubMed

    Hannan, Thomas E; Moffitt, Robyn L; Neumann, David L; Thomas, Patrick R

    2015-10-01

    This study explored whether mental toughness, the capacity to maintain performance under pressure, moderated the relation between physical activity intentions and subsequent behavior. Participants (N = 117) completed the Mental Toughness Index and a theory of planned behavior questionnaire. Seven days later, physical activity was assessed using the International Physical Activity Questionnaire. Attitudes, subjective norms, and perceived behavioral control explained substantial variance (63.1%) in physical activity intentions. Intentions also significantly predicted physical activity behavior. The simple slopes analyses for the moderation effect revealed a nonsignificant intention-behavior relation at low levels of mental toughness. However, intentions were significantly and positively related to physical activity when mental toughness was moderate or high, suggesting that the development of a mentally tough mindset may reduce the gap between behavior and physical activity intention. Future research is needed to confirm these findings and apply them in the design of mental toughness interventions to facilitate physical activity engagement.
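
    A hedged sketch of the moderated regression and simple-slopes analysis behind such a result is given below with simulated data; the effect sizes are invented and the variables are standardised stand-ins for the questionnaire scores.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 117
      intention = rng.normal(size=n)
      toughness = rng.normal(size=n)
      # Simulated behaviour in which the intention-behaviour slope grows with toughness
      behaviour = (0.2 * intention + 0.1 * toughness
                   + 0.3 * intention * toughness + rng.normal(scale=0.8, size=n))

      # Moderated regression: behaviour ~ 1 + intention + toughness + interaction
      X = np.column_stack([np.ones(n), intention, toughness, intention * toughness])
      b, *_ = np.linalg.lstsq(X, behaviour, rcond=None)

      # Simple slopes of intention at -1 SD, mean, and +1 SD of mental toughness
      for label, z in [("low", -1.0), ("mean", 0.0), ("high", 1.0)]:
          print(f"{label} toughness: intention slope = {b[1] + b[3] * z * toughness.std():.3f}")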

  15. Applying the Theory of Planned Behavior to Physical Activity: The Moderating Role of Mental Toughness.

    PubMed

    Hannan, Thomas E; Moffitt, Robyn L; Neumann, David L; Thomas, Patrick R

    2015-10-01

    This study explored whether mental toughness, the capacity to maintain performance under pressure, moderated the relation between physical activity intentions and subsequent behavior. Participants (N = 117) completed the Mental Toughness Index and a theory of planned behavior questionnaire. Seven days later, physical activity was assessed using the International Physical Activity Questionnaire. Attitudes, subjective norms, and perceived behavioral control explained substantial variance (63.1%) in physical activity intentions. Intentions also significantly predicted physical activity behavior. The simple slopes analyses for the moderation effect revealed a nonsignificant intention-behavior relation at low levels of mental toughness. However, intentions were significantly and positively related to physical activity when mental toughness was moderate or high, suggesting that the development of a mentally tough mindset may reduce the gap between behavior and physical activity intention. Future research is needed to confirm these findings and apply them in the design of mental toughness interventions to facilitate physical activity engagement. PMID:26524097

  16. Beyond Time Out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    ERIC Educational Resources Information Center

    Boutot, E. Amanda; Hume, Kara

    2010-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  17. The expression of functional IL-2 receptor on activated macrophages depends on the stimulus applied.

    PubMed Central

    Valitutti, S; Carbone, A; Castellino, F; Maggiano, N; Ricci, R; Larocca, L M; Musiani, P

    1989-01-01

    Human peripheral blood monocytes (Mo) synthesize prostaglandin E2 (PGE2) when activated with lipopolysaccharide (LPS). This production is strongly enhanced by the addition of supernatant from phytohaemagglutinin (PHA)-activated T cells. To evaluate the factor(s) responsible for this enhancement we studied the effect of several cytokines on the PGE2 metabolism. Recombinant interleukin-1 (IL-1) or recombinant IL-2 strongly enhanced PGE2 synthesis in LPS-stimulated Mo cultures, whereas recombinant interferon-gamma (IFN-gamma) partially inhibited its production. To see whether the effect of IL-2 on Mo was due to the presence of IL-2 receptor (IL-2R) on the cell surface, flow cytometric analysis and electron microscopy were used to investigate IL-2R expression in unstimulated and stimulated Mo. Stimulated, but not resting, Mo displayed the p55 IL-2R chain on their cellular surface and associated with the polyribosomes of the rough endoplasmic reticulum in the cytoplasm. This finding strongly suggested that the p55 chain of the IL-2R was synthesized by activated Mo. To confirm this result, 125I-labelled IL-2 was bound under high- and low-affinity conditions and cross-linked to Mo cultured in the presence of LPS, IFN-gamma or IL-1. The cross-linked 125I-IL-2/IL-2R complexes were analysed by SDS-PAGE. Mo cultured with LPS, IFN-gamma and IL-1 expressed the p55 protein detected by low-affinity cross-linking, whereas only LPS-stimulated Mo displayed a barely detectable band with an apparent MW of 70,000 under high-affinity binding conditions. In addition, stimulated Mo were found capable of producing the soluble form of IL-2R. Finally, LPS-activated Mo only responded to the addition of IL-2 by an increase in PGE2 production, suggesting that the function of IL-2R on activated Mo is linked to the stimulus applied. PMID:2661416

  18. 24 CFR 1000.242 - When does the requirement for exemption from taxation apply to affordable housing activities?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... exemption from taxation apply to affordable housing activities? 1000.242 Section 1000.242 Housing and Urban... ACTIVITIES Indian Housing Plan (IHP) § 1000.242 When does the requirement for exemption from taxation apply to affordable housing activities? The requirement for exemption from taxation applies only to...

  19. Neutron Activation Analysis: Techniques and Applications

    NASA Astrophysics Data System (ADS)

    MacLellan, Ryan

    2011-04-01

    The role of neutron activation analysis in low-energy, low-background experiments is discussed in terms of comparable methods. Radiochemical neutron activation analysis is introduced. The procedure of instrumental neutron activation analysis is detailed, especially with respect to the measurement of trace amounts of natural radioactivity. The determination of the reactor neutron spectrum parameters required for neutron activation analysis is also presented.
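
    At its core, instrumental neutron activation analysis rests on the activation-decay relation A = N*sigma*phi*(1 - exp(-lambda*t_irr))*exp(-lambda*t_decay); a minimal sketch with illustrative gold-foil numbers follows (the flux, mass, and times are assumptions, not values from the talk).

      import numpy as np

      def induced_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s, t_decay_s):
          """A = N*sigma*phi*(1 - exp(-lambda*t_irr))*exp(-lambda*t_decay), in Bq."""
          lam = np.log(2.0) / half_life_s
          return (n_atoms * sigma_cm2 * flux
                  * (1.0 - np.exp(-lam * t_irr_s)) * np.exp(-lam * t_decay_s))

      # Illustrative case: 1 mg of natural gold, thermal flux 1e13 n/cm^2/s,
      # Au-197(n,gamma)Au-198 (sigma ~ 98.7 b, T1/2 ~ 2.7 d), 1 h irradiation,
      # counted after 1 day of decay.
      N = 1e-3 / 196.97 * 6.022e23
      A = induced_activity(N, 98.7e-24, 1e13, 2.7 * 86400, t_irr_s=3600, t_decay_s=86400)
      print(f"{A:.2e} Bq")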

  20. Neutron Activation Analysis: Techniques and Applications

    SciTech Connect

    MacLellan, Ryan

    2011-04-27

    The role of neutron activation analysis in low-energy, low-background experiments is discussed in terms of comparable methods. Radiochemical neutron activation analysis is introduced. The procedure of instrumental neutron activation analysis is detailed, especially with respect to the measurement of trace amounts of natural radioactivity. The determination of the reactor neutron spectrum parameters required for neutron activation analysis is also presented.

  1. 50 CFR 22.2 - What activities does this part apply to?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 50 Wildlife and Fisheries 9 2014-10-01 2014-10-01 false What activities does this part apply to? 22.2 Section 22.2 Wildlife and Fisheries UNITED STATES FISH AND WILDLIFE SERVICE, DEPARTMENT OF THE..., import, export, purchase, sell, trade, barter, or offer for purchase, sale, trade, or barter bald...

  2. Applied Behavior Analysis Is Ideal for the Development of a Land Mine Detection Technology Using Animals

    PubMed Central

    Jones, B. M

    2011-01-01

    The detection and subsequent removal of land mines and unexploded ordnance (UXO) from many developing countries are slow, expensive, and dangerous tasks, but have the potential to improve the well-being of millions of people. Consequently, those involved with humanitarian mine and UXO clearance are actively searching for new and more efficient detection technologies. Remote explosive scent tracing (REST) using trained dogs has the potential to be one such technology. However, details regarding how best to train, test, and deploy dogs in this role have never been made publicly available. This article describes how the key characteristics of applied behavior analysis, as described by Baer, Wolf and Risley (1968, 1987), served as important objectives for the research and development of the behavioral technology component of REST while the author worked in humanitarian demining. PMID:22532731

  3. Applied behavior analysis is ideal for the development of a land mine detection technology using animals.

    PubMed

    Jones, B M

    2011-01-01

    The detection and subsequent removal of land mines and unexploded ordnance (UXO) from many developing countries are slow, expensive, and dangerous tasks, but have the potential to improve the well-being of millions of people. Consequently, those involved with humanitarian mine and UXO clearance are actively searching for new and more efficient detection technologies. Remote explosive scent tracing (REST) using trained dogs has the potential to be one such technology. However, details regarding how best to train, test, and deploy dogs in this role have never been made publicly available. This article describes how the key characteristics of applied behavior analysis, as described by Baer, Wolf and Risley (1968, 1987), served as important objectives for the research and development of the behavioral technology component of REST while the author worked in humanitarian demining. PMID:22532731

  4. Color changes in wood during heating: kinetic analysis by applying a time-temperature superposition method

    NASA Astrophysics Data System (ADS)

    Matsuo, Miyuki; Yokoyama, Misao; Umemura, Kenji; Gril, Joseph; Yano, Ken'ichiro; Kawai, Shuichi

    2010-04-01

    This paper deals with the kinetics of the color properties of hinoki (Chamaecyparis obtusa Endl.) wood. Specimens cut from the wood were heated at 90-180°C as accelerated aging treatment. The specimens completely dried and heated in the presence of oxygen allowed us to evaluate the effects of thermal oxidation on wood color change. Color properties measured by a spectrophotometer showed similar behavior irrespective of the treatment temperature with each time scale. Kinetic analysis using the time-temperature superposition principle, which uses the whole data set, was successfully applied to the color changes. The calculated values of the apparent activation energy in terms of L*, a*, b*, and ΔE*ab were 117, 95, 114, and 113 kJ/mol, respectively, which are similar to the values of the literature obtained for other properties such as the physical and mechanical properties of wood.
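
    Once the master curve has been built, the apparent activation energy follows from an Arrhenius fit to the time-temperature superposition shift factors; a minimal sketch is given below with placeholder shift factors (the measured hinoki values are not reproduced here).

      import numpy as np

      R = 8.314  # J/(mol K)

      def activation_energy(temps_K, shift_factors, T_ref_K):
          """Apparent activation energy from TTS shift factors a_T, assuming
          Arrhenius behaviour: ln(a_T) = (Ea/R) * (1/T - 1/T_ref)."""
          x = 1.0 / np.asarray(temps_K) - 1.0 / T_ref_K
          slope = np.polyfit(x, np.log(np.asarray(shift_factors)), 1)[0]
          return slope * R

      # Placeholder shift factors for treatments at 90-180 C referenced to 120 C
      temps = np.array([90.0, 120.0, 150.0, 180.0]) + 273.15
      aT = np.array([8.0, 1.0, 0.12, 0.02])
      print(f"Ea ~ {activation_energy(temps, aT, 120.0 + 273.15) / 1000:.0f} kJ/mol")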

  5. Applying independent component analysis to detect silent speech in magnetic resonance imaging signals.

    PubMed

    Abe, Kazuhiro; Takahashi, Toshimitsu; Takikawa, Yoriko; Arai, Hajime; Kitazawa, Shigeru

    2011-10-01

    Independent component analysis (ICA) can be usefully applied to functional imaging studies to evaluate the spatial extent and temporal profile of task-related brain activity. It requires no a priori assumptions about the anatomical areas that are activated or the temporal profile of the activity. We applied spatial ICA to detect a voluntary but hidden response of silent speech. To validate the method against a standard model-based approach, we used the silent speech of a tongue twister as a 'Yes' response to single questions that were delivered at given times. In the first task, we attempted to estimate one number that was chosen by a participant from 10 possibilities. In the second task, we increased the possibilities to 1000. In both tasks, spatial ICA was as effective as the model-based method for determining the number in the subject's mind (80-90% correct per digit), but spatial ICA outperformed the model-based method in terms of time, especially in the 1000-possibility task. In the model-based method, calculation time increased by 30-fold, to 15 h, because of the necessity of testing 1000 possibilities. In contrast, the calculation time for spatial ICA remained as short as 30 min. In addition, spatial ICA detected an unexpected response that occurred by mistake. This advantage was validated in a third task, with 13 500 possibilities, in which participants had the freedom to choose when to make one of four responses. We conclude that spatial ICA is effective for detecting the onset of silent speech, especially when it occurs unexpectedly.
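
    A hedged sketch of spatial ICA on a toy "fMRI" matrix is given below using scikit-learn's FastICA: voxels are treated as samples, so the estimated components are spatial maps and the mixing matrix holds their time courses, and the component whose time course best matches a hypothetical response timing is selected. All dimensions and the block design are invented.

      import numpy as np
      from sklearn.decomposition import FastICA

      # Toy "fMRI" run: rows are scans (time points), columns are voxels
      rng = np.random.default_rng(0)
      n_scans, n_voxels = 120, 500
      timing = (np.arange(n_scans) % 20 < 10).astype(float)    # hypothetical response timing
      spatial_map = np.zeros(n_voxels)
      spatial_map[100:120] = 1.0                               # "activated" voxels
      data = np.outer(timing, spatial_map) + rng.normal(scale=0.5, size=(n_scans, n_voxels))

      # Spatial ICA: voxels are the samples, so components are spatial maps and the
      # mixing-matrix columns are their time courses
      ica = FastICA(n_components=10, random_state=0)
      maps = ica.fit_transform(data.T).T      # (components, voxels)
      timecourses = ica.mixing_               # (scans, components)

      # Pick the component whose time course best matches the expected timing
      corr = [abs(np.corrcoef(timing, timecourses[:, k])[0, 1]) for k in range(10)]
      best = int(np.argmax(corr))
      print("best component:", best, "correlation:", round(max(corr), 2))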

  6. Applying Transactional Analysis and Personality Assessment to Improve Patient Counseling and Communication Skills

    PubMed Central

    Lawrence, Lesa

    2007-01-01

    Objective To teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling to improve communication. Design A lecture series for a required pharmacy communications class was developed to teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling. Students were asked to apply these techniques and to report their experiences. A personality self-assessment was also conducted. Assessment After attending the lecture series, students were able to apply the techniques and demonstrated an understanding of the psychological factors that may affect patient communication, an appreciation for the diversity created by different personality types, the ability to engage patients based on adult-to-adult interaction cues, and the ability to adapt the interactive patient counseling model to different personality traits. Conclusion Students gained a greater awareness of transactional analysis and personality assessment by applying these concepts. This understanding will help students communicate more effectively with patients. PMID:17786269

  7. Active and passive infrared thermography applied to the detection and characterization of hidden defects in structure

    NASA Astrophysics Data System (ADS)

    Dumoulin, Jean

    2013-04-01

    ... direct thermal modelling or inverse thermal modelling will be presented and discussed. Conclusions and perspectives will be proposed in connection with structure monitoring and cultural heritage applications.

  8. Research in progress in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  9. US--ITER activation analysis

    SciTech Connect

    Attaya, H.; Gohar, Y.; Smith, D.

    1990-09-01

    Activation analysis has been performed for the US ITER design. The radioactivity and the decay heat have been calculated during operation and after shutdown for the two ITER phases, the Physics Phase and the Technology Phase. The Physics Phase operates for about 24 full-power days (FPDs) at a fusion power of 1100 MW, and the Technology Phase operates at 860 MW of fusion power for about 1360 FPDs. The point-wise gamma sources have been calculated throughout the reactor at several times after shutdown of the two phases and are then used to calculate the biological dose everywhere in the reactor. Activation calculations have also been made for the ITER divertor. The results are presented for different continuous operation times and for a single pulse. The effect of pulsed operation on the radioactivity is analyzed. 6 refs., 12 figs., 1 tab.

  10. Prompt-gamma activation analysis

    SciTech Connect

    Lindstrom, R.M.

    1993-01-01

    A permanent, full-time instrument for prompt-gamma activation analysis is nearing completion as part of the Cold Neutron Research Facility (CNRF). The design of the analytical system has been optimized for high gamma detection efficiency and low background, particularly for hydrogen. Because of the purity of the neutron beam, shielding requirements are modest and the scatter-capture background is low. As a result of a compact sample-detector geometry, the sensitivity (counting rate per gram of analyte) is a factor of four better than that of the existing Maryland-NIST thermal-neutron instrument at the reactor. Hydrogen backgrounds of a few micrograms have already been achieved, which promises to be of value in numerous applications where quantitative nondestructive analysis of small quantities of hydrogen in materials is necessary.

  11. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    SciTech Connect

    Kolotilina, L.; Nikishin, A.; Yeremin, A.

    1994-12-31

    The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector length and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial class finite element analysis of structures. The construction and application of high quality preconditioners constitutes a high percentage of the total solution time. Parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are the implicit preconditioners and the explicit preconditioners. The implicit preconditioners (e.g. incomplete factorizations of several types) are generally high quality but require solution of lower and upper triangular systems of equations per iteration which are difficult to parallelize without deteriorating the convergence rate. The explicit type of preconditionings (e.g. polynomial preconditioners or Jacobi-like preconditioners) require sparse matrix-vector multiplications and can be parallelized but their preconditioning qualities are less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction without increasing the serial complexity.
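
    A full FSAI construction is beyond a short sketch, but the explicit-preconditioner idea it exemplifies (apply an approximation of the inverse purely through matrix-vector products, which parallelise well) can be illustrated with the simplest member of that family, a Jacobi preconditioner, on a badly scaled SPD model problem; the matrix and sizes below are invented.

      import numpy as np
      from scipy.sparse import diags
      from scipy.sparse.linalg import LinearOperator, cg

      # Badly scaled SPD model problem: a 1-D Laplacian with rows/columns scaled
      # over four decades, mimicking the ill-conditioning of stiffness matrices
      n = 400
      rng = np.random.default_rng(0)
      s = 10.0 ** rng.uniform(-2, 2, size=n)
      L = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
      A = (diags(s) @ L @ diags(s)).tocsr()
      b = np.ones(n)

      # Explicit (Jacobi) preconditioner applied as a mat-vec; an FSAI preconditioner
      # would instead apply a sparse approximate inverse factorization G^T G
      inv_diag = 1.0 / A.diagonal()
      M = LinearOperator((n, n), matvec=lambda x: inv_diag * x)

      def cg_iterations(M=None, maxiter=5000):
          count = {"k": 0}
          cg(A, b, M=M, maxiter=maxiter, callback=lambda xk: count.__setitem__("k", count["k"] + 1))
          return count["k"]

      print("CG iterations, no preconditioner    :", cg_iterations())
      print("CG iterations, Jacobi preconditioner:", cg_iterations(M))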

  12. System Analysis Applied to Autonomy: Application to High-Altitude Long-Endurance Remotely Operated Aircraft

    NASA Technical Reports Server (NTRS)

    Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.

    2006-01-01

    Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.

  13. A review of the technology and process on integrated circuits failure analysis applied in communications products

    NASA Astrophysics Data System (ADS)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

    The failure analysis of integrated circuits plays a very important role in improving the reliability of communications products. This paper mainly introduces the failure analysis technologies and process for integrated circuits applied in communications products. Many techniques are available for failure analysis, including optical microscopy, infrared microscopy, acoustic microscopy, liquid-crystal hot-spot detection, microanalysis, electrical measurement, microprobing, chemical etching, and ion etching. Integrated circuit failure analysis depends on accurate confirmation and analysis of the chip failure mode, the search for the root cause of failure, the summary of the failure mechanism, and the implementation of improvement measures. Through failure analysis, the reliability of integrated circuits and the yield of good products can be improved.

  14. The spectrum of enzymes involved in activation of 2-aminoanthracene varies with the metabolic system applied.

    PubMed

    Veres, Zsuzsa; Török, Géza; Tóth, Eva; Vereczkey, László; Jemnitz, Katalin

    2005-09-01

    The aim of this study was to estimate the involvement of cytochrome P450s (CYPs) in the metabolic activation of 2-aminoanthracene (2AA) by use of metabolic systems such as liver S9 or hepatocytes from untreated and beta-naphthoflavone (BNF)- or phenobarbital (PB)-treated rats. Metabolic activation was determined in the Salmonella reverse mutation assay (Ames test). Unexpectedly, both enzyme inducers, BNF and PB, significantly decreased the mutagenicity of 2AA activated by S9 fractions. 2AA mutagenicity was detected in the presence of cytochrome P450 inhibitors such as alpha-naphthoflavone (ANF), clotrimazole and N-benzylimidazole to study the contribution of CYP isoenzymes to the activation process. ANF significantly decreased the activation of 2AA by S9 from untreated rats. In contrast, ANF significantly increased the metabolic activation of 2AA by S9 from BNF- and PB-treated rats. The enhanced mutagenicity was not altered by co-incubation with clotrimazole and ANF. Pre-incubation of 2AA in the presence of N-benzylimidazole significantly increased the activation of 2AA by S9 from BNF- and PB-treated rats, which suggests that CYPs play minor role in 2AA metabolic activation by rat liver S9 fractions. In contrast with the results described above, BNF treatment of rats significantly enhanced the activation of 2AA by hepatocytes. ANF attenuated the extent of this activation suggesting that different enzymes play a major role in the activation processes in these metabolic systems. Our results indicate that identification of mutagenic hazard by use of the Ames test may depend on the metabolic system applied.

  15. Normal vector analysis from GNSS-GPS data applied to Deception volcano surface deformation

    NASA Astrophysics Data System (ADS)

    Berrocoso, M.; Prates, G.; Fernández-Ros, A.; García, A.

    2012-09-01

    Surface deformation parameters and their use in volcano monitoring have evolved from classical geodetic procedures up to those based on Global Navigation Satellite Systems (GNSS), in particular the most widely used and known Global Positioning System (GPS), profiting from the automated data processing, positioning precision and rates, as well as the large storage capacity and low power consumption of its equipment. These features have enabled the permanent GNSS-GPS data acquisition to ensure the continuous monitoring of geodetic benchmarks for the evaluation of surface deformation in active tectonic or volcanic areas. In Deception Island (Antarctica), a normal vector analysis is being used to give surface deformation based on three permanently observed GNSS-GPS benchmarks. Due to data availability, both in the past and for near real-time use, all benchmarks used are inside the monitored volcanic area, although the reference is away from thermal springs and/or fumaroles, unlike the other two. The time variation of slope distances to the reference benchmark and of the magnitude and inclination of the normal vector to the triangle defined by the reference benchmark and any other two provides the spatial deformation in the volcanic area covered. The normal vector variation in magnitude gives information on compression or expansion, here called spatial dilatometer, while the changes in inclination give information on relative uplift or subsidence, here called spatial inclinometer. In geodesy, the triangle is a basic geometric unit and the areal strain is commonly applied in tectonics and volcanism. The normal vector analysis conjugates both, benefiting from the method's precision, simplicity and possibility to model the surface using several triangles. The proposed method was applied to GNSS-GPS data collected every austral summer between 2001-2002 and 2009-2010 in Deception Island. The results show that Deception Island acts as a strain marker in the Bransfield
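
    A minimal sketch of the geometry described above, assuming local east-north-up coordinates in metres; the benchmark positions and the function name are illustrative, not taken from the paper. The normal-vector magnitude tracks areal compression/expansion ("spatial dilatometer") and its inclination tracks relative uplift/subsidence ("spatial inclinometer").

    ```python
    # Hedged sketch (not the authors' code): normal vector of the triangle formed
    # by three GNSS-GPS benchmarks, in local east-north-up coordinates (metres).
    import numpy as np

    def triangle_normal_metrics(p_ref, p_a, p_b):
        """Return (magnitude, inclination in degrees) of the triangle's normal vector."""
        n = np.cross(np.asarray(p_a) - np.asarray(p_ref),
                     np.asarray(p_b) - np.asarray(p_ref))
        magnitude = np.linalg.norm(n)                                # twice the triangle area
        inclination = np.degrees(np.arccos(abs(n[2]) / magnitude))   # tilt from vertical
        return magnitude, inclination

    # Hypothetical benchmark positions at two epochs (east, north, up).
    epoch1 = ([0.0, 0.0, 0.0], [1500.0, 200.0, 10.0], [300.0, 1800.0, -5.0])
    epoch2 = ([0.0, 0.0, 0.0], [1500.4, 200.1, 10.0], [300.1, 1800.5, -4.7])

    m1, i1 = triangle_normal_metrics(*epoch1)
    m2, i2 = triangle_normal_metrics(*epoch2)
    print(f"relative area change: {(m2 - m1) / m1:.2e}, inclination change: {i2 - i1:.4f} deg")
    ```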

  16. Analysis of Phoenix Anomalies and IV & V Findings Applied to the GRAIL Mission

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    NASA IV&V was established in 1993 to improve the safety and cost-effectiveness of mission-critical software. Since its inception, the tools and strategies employed by IV&V have evolved. This paper examines how lessons learned from the Phoenix project were developed and applied to the GRAIL project. Shortly after selection, the GRAIL project initiated a review of the issues documented by IV&V for Phoenix. The motivation was twofold: to learn as much as possible about the types of issues that arose from the flight software product line slated for use on GRAIL, and to identify opportunities for improving the effectiveness of IV&V on GRAIL. The IV&V Facility provided a database dump containing 893 issues. These were categorized into 16 bins, and then analyzed according to whether the project responded by changing the affected artifacts or using them as-is. The results of this analysis were compared to a similar assessment of post-launch anomalies documented by the project. Results of the analysis were discussed with the IV&V team assigned to GRAIL. These discussions led to changes in the way both the project and IV&V approached the IV&V task, and improved the efficiency of the activity.

  17. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    ERIC Educational Resources Information Center

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  18. Applied Behavior Analysis and the Imprisoned Adult Felon Project 1: The Cellblock Token Economy.

    ERIC Educational Resources Information Center

    Milan, Michael A.; And Others

    This report provides a technical-level analysis, discussion, and summary of five experiments in applied behavior analysis. Experiment 1 examined the token economy as a basis for motivating inmate behavior; Experiment 2 examined the relationship between magnitude of token reinforcement and level of inmate performance; Experiment 3 introduced a…

  19. Sociosexuality Education for Persons with Autism Spectrum Disorders Using Principles of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Wolfe, Pamela S.; Condo, Bethany; Hardaway, Emily

    2009-01-01

    Applied behavior analysis (ABA) has emerged as one of the most effective empirically based strategies for instructing individuals with autism spectrum disorders (ASD). Four ABA-based strategies that have been found effective are video modeling, visual strategies, social script fading, and task analysis. Individuals with ASD often struggle with…

  20. Causal Modeling--Path Analysis a New Trend in Research in Applied Linguistics

    ERIC Educational Resources Information Center

    Rastegar, Mina

    2006-01-01

    This article aims at discussing a new statistical trend in research in applied linguistics. This rather new statistical procedure is causal modeling--path analysis. The article demonstrates that causal modeling--path analysis is the best statistical option to use when the effects of a multitude of L2 learners' variables on language achievement are…

  1. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  2. Confirmation of standard error analysis techniques applied to EXAFS using simulations

    SciTech Connect

    Booth, Corwin H; Hu, Yung-Jin

    2009-12-14

    Systematic uncertainties, such as those in calculated backscattering amplitudes, crystal glitches, etc., not only limit the ultimate accuracy of the EXAFS technique, but also affect the covariance matrix representation of real parameter errors in typical fitting routines. Despite major advances in EXAFS analysis and in understanding all potential uncertainties, these methods are not routinely applied by all EXAFS users. Consequently, reported parameter errors are not reliable in many EXAFS studies in the literature. This situation has made many EXAFS practitioners leery of conventional error analysis applied to EXAFS data. However, conventional error analysis, if properly applied, can teach us more about our data, and even about the power and limitations of the EXAFS technique. Here, we describe the proper application of conventional error analysis to r-space fitting to EXAFS data. Using simulations, we demonstrate the veracity of this analysis by, for instance, showing that the number of independent data points from Stern's rule is balanced by the degrees of freedom obtained from a χ² statistical analysis. By applying such analysis to real data, we determine the quantitative effect of systematic errors. In short, this study is intended to remind the EXAFS community about the role of fundamental noise distributions in interpreting our final results.
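
    For reference, a hedged sketch of the bookkeeping the abstract alludes to, in our own notation rather than necessarily the authors' exact conventions: the Nyquist/Stern estimate of independent data points for a fit over a k-range Δk and R-range ΔR, the resulting degrees of freedom, and the reduced chi-squared statistic.

    ```latex
    % Notation ours; a sketch of standard EXAFS fit statistics, not the paper's.
    N_{\mathrm{idp}} \approx \frac{2\,\Delta k\,\Delta R}{\pi}
      \quad(\text{plus }2\text{ in Stern's formulation}),
    \qquad
    \nu = N_{\mathrm{idp}} - N_{\mathrm{par}},
    \qquad
    \chi^{2}_{\nu} = \frac{1}{\nu}\sum_{i}
      \left(\frac{\chi^{\mathrm{data}}_{i}-\chi^{\mathrm{fit}}_{i}}{\epsilon_{i}}\right)^{2}.
    ```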

  3. Spherical harmonic decomposition applied to spatial-temporal analysis of human high-density electroencephalogram

    NASA Astrophysics Data System (ADS)

    Wingeier, B. M.; Nunez, P. L.; Silberstein, R. B.

    2001-11-01

    We demonstrate an application of spherical harmonic decomposition to the analysis of the human electroencephalogram (EEG). We implement two methods and discuss issues specific to the analysis of hemispherical, irregularly sampled data. Spatial sampling requirements and performance of the methods are quantified using simulated data. The analysis is applied to experimental EEG data, confirming earlier reports of an approximate frequency-wave-number relationship in some bands.
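
    A hedged sketch of one way such a decomposition can be implemented: fitting coefficients of a real spherical-harmonic basis to irregularly sampled scalp potentials by least squares. The electrode positions and potentials below are synthetic placeholders, and the basis construction is generic rather than either of the two methods the paper implements.

    ```python
    # Hedged sketch: least-squares spherical-harmonic fit to irregularly sampled,
    # hemispherical data (synthetic stand-ins for electrode positions and potentials).
    import numpy as np
    from scipy.special import sph_harm

    rng = np.random.default_rng(0)
    n_elec, l_max = 64, 4
    theta = rng.uniform(0, 2 * np.pi, n_elec)      # azimuth of each electrode
    phi = rng.uniform(0, np.pi / 2, n_elec)        # colatitude (upper hemisphere only)
    v = rng.normal(size=n_elec)                    # stand-in for measured potentials

    # Design matrix of real-valued combinations of spherical harmonics up to degree l_max.
    cols = []
    for l in range(l_max + 1):
        for m in range(-l, l + 1):
            y = sph_harm(abs(m), l, theta, phi)
            cols.append(np.real(y) if m >= 0 else np.imag(y))
    A = np.column_stack(cols)

    coeffs, *_ = np.linalg.lstsq(A, v, rcond=None)  # fitted harmonic coefficients
    print(coeffs.shape)                             # ((l_max + 1) ** 2,)
    ```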

  4. Applied neuroanatomy elective to reinforce and promote engagement with neurosensory pathways using interactive and artistic activities.

    PubMed

    Dao, Vinh; Yeh, Pon-Hsiu; Vogel, Kristine S; Moore, Charleen M

    2015-01-01

    One in six Americans is currently affected by neurologic disease. As the United States population ages, the number of neurologic complaints is expected to increase. Thus, there is a pressing need for more neurologists as well as more neurology training in other specialties. Often interest in neurology begins during medical school, so improving education in medical neural courses is a critical step toward producing more neurologists and better neurology training in other specialties. To this end, a novel applied neuroanatomy elective was designed at the University of Texas Health Science Center at San Antonio (UTHSCSA) to complement the traditional first-year medical neuroscience course and promote engagement and deep learning of the material with a focus on neurosensory pathways. The elective covered four neurosensory modalities (proprioception/balance, vision, auditory, and taste/olfaction) over four sessions, each with a short classroom component and a much longer activity component. At each session, students reviewed the neurosensory pathways through structured presentations and then applied them to preplanned interactive activities, many of which allowed students to utilize their artistic talents. Students were required to complete subjective pre-course and post-course surveys and reflections. The survey results and positive student comments suggest that the elective was a valuable tool when used in parallel with the traditional medical neuroscience course in promoting engagement and reinforcement of the neurosensory material.

  5. Applied neuroanatomy elective to reinforce and promote engagement with neurosensory pathways using interactive and artistic activities.

    PubMed

    Dao, Vinh; Yeh, Pon-Hsiu; Vogel, Kristine S; Moore, Charleen M

    2015-01-01

    One in six Americans is currently affected by neurologic disease. As the United States population ages, the number of neurologic complaints is expected to increase. Thus, there is a pressing need for more neurologists as well as more neurology training in other specialties. Often interest in neurology begins during medical school, so improving education in medical neural courses is a critical step toward producing more neurologists and better neurology training in other specialties. To this end, a novel applied neuroanatomy elective was designed at the University of Texas Health Science Center at San Antonio (UTHSCSA) to complement the traditional first-year medical neuroscience course and promote engagement and deep learning of the material with a focus on neurosensory pathways. The elective covered four neurosensory modalities (proprioception/balance, vision, auditory, and taste/olfaction) over four sessions, each with a short classroom component and a much longer activity component. At each session, students reviewed the neurosensory pathways through structured presentations and then applied them to preplanned interactive activities, many of which allowed students to utilize their artistic talents. Students were required to complete subjective pre-course and post-course surveys and reflections. The survey results and positive student comments suggest that the elective was a valuable tool when used in parallel with the traditional medical neuroscience course in promoting engagement and reinforcement of the neurosensory material. PMID:24920370

  6. Combinatorial proteomic analysis of intercellular signaling applied to the CD28 T-cell costimulatory receptor

    PubMed Central

    Tian, Ruijun; Wang, Haopeng; Gish, Gerald D.; Petsalaki, Evangelia; Pasculescu, Adrian; Shi, Yu; Mollenauer, Marianne; Bagshaw, Richard D.; Yosef, Nir; Hunter, Tony; Gingras, Anne-Claude; Weiss, Arthur; Pawson, Tony

    2015-01-01

    Systematic characterization of intercellular signaling approximating the physiological conditions of stimulation that involve direct cell–cell contact is challenging. We describe a proteomic strategy to analyze physiological signaling mediated by the T-cell costimulatory receptor CD28. We identified signaling pathways activated by CD28 during direct cell–cell contact by global analysis of protein phosphorylation. To define immediate CD28 targets, we used phosphorylated forms of the CD28 cytoplasmic region to obtain the CD28 interactome. The interaction profiles of selected CD28-interacting proteins were further characterized in vivo for amplifying the CD28 interactome. The combination of the global phosphorylation and interactome analyses revealed broad regulation of CD28 and its interactome by phosphorylation. Among the cellular phosphoproteins influenced by CD28 signaling, CapZ-interacting protein (CapZIP), a regulator of the actin cytoskeleton, was implicated by functional studies. The combinatorial approach applied herein is widely applicable for characterizing signaling networks associated with membrane receptors with short cytoplasmic tails. PMID:25829543

  7. Combinatorial proteomic analysis of intercellular signaling applied to the CD28 T-cell costimulatory receptor.

    PubMed

    Tian, Ruijun; Wang, Haopeng; Gish, Gerald D; Petsalaki, Evangelia; Pasculescu, Adrian; Shi, Yu; Mollenauer, Marianne; Bagshaw, Richard D; Yosef, Nir; Hunter, Tony; Gingras, Anne-Claude; Weiss, Arthur; Pawson, Tony

    2015-03-31

    Systematic characterization of intercellular signaling approximating the physiological conditions of stimulation that involve direct cell-cell contact is challenging. We describe a proteomic strategy to analyze physiological signaling mediated by the T-cell costimulatory receptor CD28. We identified signaling pathways activated by CD28 during direct cell-cell contact by global analysis of protein phosphorylation. To define immediate CD28 targets, we used phosphorylated forms of the CD28 cytoplasmic region to obtain the CD28 interactome. The interaction profiles of selected CD28-interacting proteins were further characterized in vivo for amplifying the CD28 interactome. The combination of the global phosphorylation and interactome analyses revealed broad regulation of CD28 and its interactome by phosphorylation. Among the cellular phosphoproteins influenced by CD28 signaling, CapZ-interacting protein (CapZIP), a regulator of the actin cytoskeleton, was implicated by functional studies. The combinatorial approach applied herein is widely applicable for characterizing signaling networks associated with membrane receptors with short cytoplasmic tails. PMID:25829543

  8. Comparison of complexity measures using two complex system analysis methods applied to the epileptic ECoG

    NASA Astrophysics Data System (ADS)

    Janjarasjitt, Suparerk; Loparo, Kenneth A.

    2013-10-01

    A complex system analysis has been widely applied to examine the characteristics of an electroencephalogram (EEG) in health and disease, as well as the dynamics of the brain. In this study, two complexity measures, the correlation dimension and the spectral exponent, are applied to electrocorticogram (ECoG) data from subjects with epilepsy obtained during different states (seizure and non-seizure) and from different brain regions, and the complexities of ECoG data obtained during different states and from different brain regions are examined. From the computational results, the spectral exponent obtained from the wavelet-based fractal analysis is observed to provide information complementary to the correlation dimension derived from the nonlinear dynamical-systems analysis. ECoG data obtained during seizure activity have smoother temporal patterns and are less complex than data obtained during non-seizure activity. In addition, significant differences between these two ECoG complexity measures exist when applied to ECoG data obtained from different brain regions of subjects with epilepsy.
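
    As a rough illustration of one of the two measures, the sketch below estimates a spectral exponent as the slope of the log-log power spectrum of a synthetic signal. This is a generic proxy for the wavelet-based estimator used in the study, not the authors' implementation, and the frequency band is arbitrary.

    ```python
    # Hedged sketch: spectral exponent beta with P(f) ~ f**(-beta), estimated as the
    # negative slope of the log-log power spectrum (synthetic signal, not ECoG).
    import numpy as np
    from scipy.signal import welch

    rng = np.random.default_rng(1)
    fs = 256.0
    x = np.cumsum(rng.normal(size=8192))        # Brownian noise, expected beta ~ 2

    f, pxx = welch(x, fs=fs, nperseg=1024)
    band = (f > 1.0) & (f < 40.0)               # arbitrary band for the fit
    beta = -np.polyfit(np.log10(f[band]), np.log10(pxx[band]), 1)[0]
    print(f"estimated spectral exponent: {beta:.2f}")
    ```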

  9. [Disinfection of water: on the need for analysis and solution of fundamental and applied problems].

    PubMed

    Mokienko, A V

    2014-01-01

    This paper presents an analysis of the hygienic, medical, and environmental aspects of water disinfection, exemplified by chlorine and chlorine dioxide (CD). The authors propose the concept of persistent multivariate risk for aquatic pathogens, their own view of the mechanism by which bacteria develop chlorine resistance under the influence of biocides, based on a two-step process of informational and spatial interaction between receptor and substrate, and the hypothesis of a hormetic stimulating effect of residual active chlorine (in combination with other factors) on the growth of aquatic pathogens. The growing significance of halogen-containing compounds (HCC) as byproducts of water chlorination, in terms of their potential danger as toxicants and carcinogens, is substantiated. Analysis of the hygienic, medical, and environmental aspects of using chlorine dioxide for water disinfection clarifies the chemistry of its biocidal effect and the mechanisms of its bactericidal, virucidal, protozoocidal, sporicidal, and algicidal actions, biofilm removal, and the formation of disinfection byproducts. Chlorine dioxide was shown both to provide the epidemic safety of drinking water, owing to its high virucidal, bactericidal, and mycocidal action, and to be toxicologically harmless with respect to laboratory animals as well as to aquatic organisms receiving discharged disinfected wastewater. The need for a close relationship between fundamental and applied research is demonstrated: the former through in-depth study of the microbiological, molecular genetic, and epidemiological problems of water disinfection (chlorination), and the latter through the introduction of alternative, including combined, technologies for water treatment and disinfection. PMID:24749274

  10. [Disinfection of water: on the need for analysis and solution of fundamental and applied problems].

    PubMed

    Mokienko, A V

    2014-01-01

    This paper presents an analysis of the hygienic, medical, and environmental aspects of water disinfection, exemplified by chlorine and chlorine dioxide (CD). The authors propose the concept of persistent multivariate risk for aquatic pathogens, their own view of the mechanism by which bacteria develop chlorine resistance under the influence of biocides, based on a two-step process of informational and spatial interaction between receptor and substrate, and the hypothesis of a hormetic stimulating effect of residual active chlorine (in combination with other factors) on the growth of aquatic pathogens. The growing significance of halogen-containing compounds (HCC) as byproducts of water chlorination, in terms of their potential danger as toxicants and carcinogens, is substantiated. Analysis of the hygienic, medical, and environmental aspects of using chlorine dioxide for water disinfection clarifies the chemistry of its biocidal effect and the mechanisms of its bactericidal, virucidal, protozoocidal, sporicidal, and algicidal actions, biofilm removal, and the formation of disinfection byproducts. Chlorine dioxide was shown both to provide the epidemic safety of drinking water, owing to its high virucidal, bactericidal, and mycocidal action, and to be toxicologically harmless with respect to laboratory animals as well as to aquatic organisms receiving discharged disinfected wastewater. The need for a close relationship between fundamental and applied research is demonstrated: the former through in-depth study of the microbiological, molecular genetic, and epidemiological problems of water disinfection (chlorination), and the latter through the introduction of alternative, including combined, technologies for water treatment and disinfection.

  11. Activities for the Promotion of Gender Equality in Japan—Japan Society of Applied Physics

    NASA Astrophysics Data System (ADS)

    Kodate, Kashiko; Tanaka, Kazuo

    2005-10-01

    Since 1946, the Japan Society of Applied Physics (JSAP) has strived to promote research and development in applied physics for benefits beyond national boundaries. Activities of JSAP involve multidisciplinary fields, from physics and engineering to life sciences. Of its 23,000 members, 48% are from industry, 29% from academia, and about 7% from semi-autonomous national research laboratories. Its large industrial membership is one of the distinctive features of JSAP. In preparation for the First IUPAP International Conference on Women in Physics (Paris, 2002), JSAP members took the first step under the strong leadership of then-JSAP President Toshio Goto, setting up the Committee for the Promotion of Equal Participation of Men and Women in Science and Technology. Equality, rather than women's advancement alone, is highlighted in order to further development in science and technology. Attention is also paid to balancing the number of researchers from different age groups and affiliations. The committee has 22 members: 12 female and 10 male; 7 from corporations, 12 from universities, and 3 from semi-autonomous national research institutes. Its main activities are to organize symposia and meetings, conduct surveys among JSAP members, and provide child-care facilities at meetings and conferences. In 2002, the Japan Physics Society and the Chemical Society of Japan jointly created the Japan Inter-Society Liaison Association for the Promotion of Equal Participation of Men and Women in Science and Engineering. Membership has grown to 44 societies (of which 19 are observers) ranging from mathematics, information, and life sciences to civil engineering. Joint activities across sectors empower the whole. The Gender Equality Bureau in the Cabinet Office recently launched a large-scale project called "Challenge Campaign" to encourage girls to major in natural science and engineering, which JSAP is co-sponsoring.

  12. Ordinary and Activated Bone Grafts: Applied Classification and the Main Features

    PubMed Central

    Deev, R. V.; Drobyshev, A. Y.; Bozo, I. Y.; Isaev, A. A.

    2015-01-01

    Bone grafts are medical devices that are in high demand in clinical practice for substitution of bone defects and recovery of atrophic bone regions. Based on the analysis of the modern groups of bone grafts, the particularities of their composition, the mechanisms of their biological effects, and their therapeutic indications, an applied classification is proposed that separates the bone substitutes into “ordinary” and “activated.” The main differential criterion is the presence of biologically active components in the material that are standardized by qualitative and quantitative parameters: growth factors, cells, or gene constructions encoding growth factors. The pronounced osteoinductive and (or) osteogenic properties of activated osteoplastic materials suggest their efficacy in the substitution of large bone defects. PMID:26649300

  13. Design and Analysis of a Thrust Vector Mechanism Applied in a Flying Wing

    NASA Astrophysics Data System (ADS)

    Zhu, Yanhe; Gao, Liang; Wang, Hongwei; Zhao, Jie

    This paper presents the design and analysis of a thrust vector mechanism applied in a flying wing. A thrust vector mechanism driven by two servos is developed. An analysis of the dynamic differences in minimum hovering radius between a conventional flying wing and one with a thrust vector mechanism is given and validated with simulation. It is shown that thrust vectoring has clear advantages over the conventional flying wing, including a smaller hovering radius and a reduced roll angle. These benefits should improve maneuverability and agility.
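
    For orientation, the minimum radius of a conventional coordinated level turn, against which a thrust-vectored configuration would be compared, follows the standard relation below (our notation, not taken from the paper); a direct side-force contribution from thrust vectoring allows a smaller radius at a given bank angle.

    ```latex
    % Standard level coordinated-turn radius; V airspeed, g gravitational
    % acceleration, \phi bank angle (notation ours, not from the paper).
    R = \frac{V^{2}}{g\,\tan\phi}
    ```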

  14. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching.

    PubMed

    Joyce, B; Moxley, R A

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis.

  15. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching

    PubMed Central

    Joyce, Bonnie; Moxley, Roy A.

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis. PMID:22477993

  16. Using Applied Behaviour Analysis as Standard Practice in a UK Special Needs School

    ERIC Educational Resources Information Center

    Foran, Denise; Hoerger, Marguerite; Philpott, Hannah; Jones, Elin Walker; Hughes, J. Carl; Morgan, Jonathan

    2015-01-01

    This article describes how applied behaviour analysis can be implemented effectively and affordably in a maintained special needs school in the UK. Behaviour analysts collaborate with classroom teachers to provide early intensive behaviour education for young children with autism spectrum disorders (ASD), and function based behavioural…

  17. Says Who?: Students Apply Their Critical-Analysis Skills to Fight Town Hall

    ERIC Educational Resources Information Center

    Trimarchi, Ruth

    2002-01-01

    For some time the author looked for a tool to let students apply what they are learning about critical analysis in the science classroom to a relevant life experience. The opportunity occurred when a proposal to use environmentally friendly cleaning products in town buildings appeared on the local town meeting agenda. Using a copy of the proposal…

  18. Applied Behavior Analysis in Autism Spectrum Disorders: Recent Developments, Strengths, and Pitfalls

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Turygin, Nicole C.; Beighley, Jennifer; Rieske, Robert; Tureck, Kimberly; Matson, Michael L.

    2012-01-01

    Autism has become one of the most heavily researched topics in the field of mental health and education. While genetics has been the most studied of all topics, applied behavior analysis (ABA) has also received a great deal of attention, and has arguably yielded the most promising results of any research area to date. The current paper provides a…

  19. Lovaas Model of Applied Behavior Analysis. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2010

    2010-01-01

    The "Lovaas Model of Applied Behavior Analysis" is a type of behavioral therapy that initially focuses on discrete trials: brief periods of one-on-one instruction, during which a teacher cues a behavior, prompts the appropriate response, and provides reinforcement to the child. Children in the program receive an average of 35 to 40 hours of…

  20. Applying Socio-Identity Analysis to Counseling Practice and Preparation: A Review of Four Techniques.

    ERIC Educational Resources Information Center

    Johnson, Samuel D., Jr.

    1990-01-01

    Reviews four training strategies for applying socioidentity analysis to multicultural counseling; the Clarification Group (C Group); the Personal Dimensions of Difference Self-Inventory (PDD); the Multifactor Needs Assessment; and the Cultural Grid. Each highlights a slightly different aspect of the complex matrix of relationships that define the…

  1. A Case Study in the Misrepresentation of Applied Behavior Analysis in Autism: The Gernsbacher Lectures

    PubMed Central

    Morris, Edward K

    2009-01-01

    I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life. (Tolstoy, 1894) This article presents a case study in the misrepresentation of applied behavior analysis for autism based on Morton Ann Gernsbacher's presentation of a lecture titled “The Science of Autism: Beyond the Myths and Misconceptions.” Her misrepresentations involve the characterization of applied behavior analysis, descriptions of practice guidelines, reviews of the treatment literature, presentations of the clinical trials research, and conclusions about those trials (e.g., children's improvements are due to development, not applied behavior analysis). The article also reviews applied behavior analysis' professional endorsements and research support, and addresses issues in professional conduct. It ends by noting the deleterious effects that misrepresenting any research on autism (e.g., biological, developmental, behavioral) have on our understanding and treating it in a transdisciplinary context. PMID:22478522

  2. Applied Behavior Analysis in the Treatment of Severe Psychiatric Disorders: A Bibliography.

    ERIC Educational Resources Information Center

    Scotti, Joseph R.; And Others

    Clinical research in the area of severe psychiatric disorders constituted the major focus for the discipline of applied behavior analysis during the early 1960s. Recently, however, there appears to be a notable lack of a behavioral focus within many inpatient psychiatric settings and a relative dearth of published behavioral treatment studies with…

  3. Graphical and Numerical Descriptive Analysis: Exploratory Tools Applied to Vietnamese Data

    ERIC Educational Resources Information Center

    Haughton, Dominique; Phong, Nguyen

    2004-01-01

    This case study covers several exploratory data analysis ideas, the histogram and boxplot, kernel density estimates, the recently introduced bagplot--a two-dimensional extension of the boxplot--as well as the violin plot, which combines a boxplot with a density shape plot. We apply these ideas and demonstrate how to interpret the output from these…

  4. Linear and non-linear control techniques applied to actively lubricated journal bearings

    NASA Astrophysics Data System (ADS)

    Nicoletti, R.; Santos, I. F.

    2003-03-01

    The main objectives of actively lubricated bearings are the simultaneous reduction of wear and vibration between rotating and stationary machinery parts. For reducing wear and dissipating vibration energy until certain limits, one can use the conventional hydrodynamic lubrication. For further reduction of shaft vibrations one can use the active lubrication action, which is based on injecting pressurized oil into the bearing gap through orifices machined in the bearing sliding surface. The design and efficiency of some linear (PD, PI and PID) and a non-linear controller, applied to a tilting-pad journal bearing, are analysed and discussed. Important conclusions about the application of integral controllers, responsible for changing the rotor-bearing equilibrium position and consequently the "passive" oil film damping coefficients, are achieved. Numerical results show an effective vibration reduction of unbalance response of a rigid rotor, where the PD and the non-linear P controllers show better performance for the frequency range of study (0-80 Hz). The feasibility of eliminating rotor-bearing instabilities (phenomena of whirl) by using active lubrication is also investigated, illustrating clearly one of its most promising applications.
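
    A minimal discrete-time sketch of the kind of linear PID law compared in the study; the gains, sample time, and measured signal are hypothetical, and the real controllers act on a tilting-pad journal bearing model rather than this toy loop.

    ```python
    # Hedged sketch: discrete PID law of the type compared above (all numbers hypothetical).
    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_err = 0.0

        def update(self, setpoint, measurement):
            err = setpoint - measurement
            self.integral += err * self.dt
            deriv = (err - self.prev_err) / self.dt
            self.prev_err = err
            # Control signal, e.g. a servo-valve command modulating oil injection pressure.
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    pid = PID(kp=2.0e4, ki=5.0e2, kd=1.0e2, dt=1e-3)
    u = pid.update(setpoint=0.0, measurement=1.2e-5)  # drive shaft displacement toward zero
    print(u)
    ```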

  5. INDEPENDENT COMPONENT ANALYSIS (ICA) APPLIED TO LONG BUNCH BEAMS IN THE LOS ALAMOS PROTON STORAGE RING

    SciTech Connect

    Kolski, Jeffrey S.; Macek, Robert J.; McCrady, Rodney C.; Pang, Xiaoying

    2012-05-14

    Independent component analysis (ICA) is a powerful blind source separation (BSS) method. Compared to the typical BSS method, principal component analysis (PCA), which is the BSS foundation of the well known model independent analysis (MIA), ICA is more robust to noise, coupling, and nonlinearity. ICA of turn-by-turn beam position data has been used to measure the transverse betatron phase and amplitude functions, dispersion function, linear coupling, sextupole strength, and nonlinear beam dynamics. We apply ICA in a new way to slices along the bunch and discuss the source signals identified as betatron motion and longitudinal beam structure.
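
    A hedged sketch of the blind source separation step on a turn-by-turn, slice-by-slice data matrix, using FastICA from scikit-learn on synthetic mixtures; the tune values, mixing, and noise level are invented stand-ins for real PSR data.

    ```python
    # Hedged sketch: ICA of a (turns x slices) matrix built from two synthetic sources,
    # a betatron-like oscillation and a slow longitudinal structure.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(2)
    n_turns, n_slices = 2000, 40
    t = np.arange(n_turns)

    betatron = np.sin(2 * np.pi * 0.22 * t)        # betatron-like tune line
    longitudinal = np.sin(2 * np.pi * 0.004 * t)   # slow longitudinal structure
    mixing = rng.normal(size=(2, n_slices))
    data = np.outer(betatron, mixing[0]) + np.outer(longitudinal, mixing[1])
    data += 0.05 * rng.normal(size=data.shape)     # measurement noise

    ica = FastICA(n_components=2, random_state=0)
    sources = ica.fit_transform(data)              # (n_turns, 2) separated source signals
    print(sources.shape)
    ```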

  6. Modulation of motor unit activity in biceps brachii by neuromuscular electrical stimulation applied to the contralateral arm

    PubMed Central

    Mani, Diba; Almuklass, Awad; Matkowski, Boris; Gould, Jeffrey R.; Enoka, Roger M.

    2015-01-01

    The purpose of the study was to determine the influence of neuromuscular electrical stimulation (NMES) current intensity and pulse width applied to the right elbow flexors on the discharge characteristics of motor units in the left biceps brachii. Three NMES current intensities were applied for 5 s with either narrow (0.2 ms) or wide (1 ms) stimulus pulses: one at 80% of motor threshold and two that evoked contractions at either ∼10% or ∼20% of maximal voluntary contraction (MVC) force. The discharge times of 28 low-threshold (0.4–21.6% MVC force) and 16 high-threshold (31.7–56.3% MVC force) motor units in the short head of biceps brachii were determined before, during, and after NMES. NMES elicited two main effects: one involved transient deflections in the left-arm force at the onset and offset of NMES and the other consisted of nonuniform modulation of motor unit activity. The force deflections, which were influenced by NMES current intensity and pulse width, were observed only when low-threshold motor units were tracked. NMES did not significantly influence the discharge characteristics of tracked single-threshold motor units. However, a qualitative analysis indicated that there was an increase in the number of unique waveforms detected during and after NMES. The findings indicate that activity of motor units in the left elbow flexors can be modulated by NMES current and pulse width applied to right elbow flexors, but the effects are not distributed uniformly to the involved motor units. PMID:25930023

  7. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  8. The mechanics of motorised momentum exchange tethers when applied to active debris removal from LEO

    SciTech Connect

    Caldecott, Ralph; Kamarulzaman, Dayangku N. S.; Kirrane, James P.; Cartmell, Matthew P.; Ganilova, Olga A.

    2014-12-10

    The concept of momentum exchange when applied to space tethers for propulsion is well established, and a considerable body of literature now exists on the on-orbit modelling, the dynamics, and also the control of a large range of tether system applications. The authors consider here a new application for the Motorised Momentum Exchange Tether by highlighting three key stages of development leading to a conceptualisation that can subsequently be developed into a technology for Active Debris Removal. The paper starts with a study of the on-orbit mechanics of a full-sized motorised tether in which it is shown that a laden and therefore highly mass-asymmetrical tether can still be forced to spin, and certainly to librate, thereby confirming its possible usefulness for active debris removal (ADR). The second part of the paper concentrates on the modelling of the centripetal deployment of a symmetrical MMET in order to get it initialized for debris removal operations, and the third and final part of the paper provides an entry into scale modelling for low cost mission design and testing. It is shown that the motorised momentum exchange tether offers a potential solution to the removal of large pieces of orbital debris, and that dynamic methodologies can be implemented in order to optimise the emergent design.

  9. The mechanics of motorised momentum exchange tethers when applied to active debris removal from LEO

    NASA Astrophysics Data System (ADS)

    Caldecott, Ralph; Kamarulzaman, Dayangku N. S.; Kirrane, James P.; Cartmell, Matthew P.; Ganilova, Olga A.

    2014-12-01

    The concept of momentum exchange when applied to space tethers for propulsion is well established, and a considerable body of literature now exists on the on-orbit modelling, the dynamics, and also the control of a large range of tether system applications. The authors consider here a new application for the Motorised Momentum Exchange Tether by highlighting three key stages of development leading to a conceptualisation that can subsequently be developed into a technology for Active Debris Removal. The paper starts with a study of the on-orbit mechanics of a full-sized motorised tether in which it is shown that a laden and therefore highly mass-asymmetrical tether can still be forced to spin, and certainly to librate, thereby confirming its possible usefulness for active debris removal (ADR). The second part of the paper concentrates on the modelling of the centripetal deployment of a symmetrical MMET in order to get it initialized for debris removal operations, and the third and final part of the paper provides an entry into scale modelling for low cost mission design and testing. It is shown that the motorised momentum exchange tether offers a potential solution to the removal of large pieces of orbital debris, and that dynamic methodologies can be implemented in order to optimise the emergent design.

  10. Total body nitrogen analysis. [neutron activation analysis

    NASA Technical Reports Server (NTRS)

    Palmer, H. E.

    1975-01-01

    Studies of two potential in vivo neutron activation methods for determining total and partial body nitrogen in animals and humans are described. A method using the CO-11 in the expired air as a measure of nitrogen content was found to be adequate for small animals such as rats, but inadequate for human measurements due to a slow excretion rate. Studies on the method of measuring the induced N-13 in the body show that with further development, this method should be adequate for measuring muscle mass changes occurring in animals or humans during space flight.
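
    For orientation, the standard activation-decay relation that underlies such in vivo measurements, in our notation rather than the report's (N-13 has a half-life of roughly 10 minutes, which sets how quickly the induced activity must be counted).

    ```latex
    % Induced activity of the product nuclide after irradiation time t_irr,
    % counted after a delay t_d (notation ours): N target atoms, \sigma reaction
    % cross section, \phi neutron flux, T_{1/2} product half-life.
    A = N\,\sigma\,\phi\,\bigl(1 - e^{-\lambda t_{\mathrm{irr}}}\bigr)\,e^{-\lambda t_{d}},
    \qquad \lambda = \frac{\ln 2}{T_{1/2}}.
    ```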

  11. Applying behavior analysis to clinical problems: review and analysis of habit reversal.

    PubMed Central

    Miltenberger, R G; Fuqua, R W; Woods, D W

    1998-01-01

    This article provides a review and analysis of habit reversal, a multicomponent procedure developed by Azrin and Nunn (1973, 1974) for the treatment of nervous habits, tics, and stuttering. The article starts with a discussion of the behaviors treated with habit reversal, behavioral covariation among habits, and functional analysis and assessment of habits. Research on habit reversal and simplified versions of the procedure is then described. Next the article discusses the limitations of habit reversal and the evidence for its generality. The article concludes with an analysis of the behavioral processes involved in habit reversal and suggestions for future research. PMID:9757583

  12. A Study in the Founding of Applied Behavior Analysis Through Its Publications

    PubMed Central

    Morris, Edward K.; Altus, Deborah E.; Smith, Nathaniel G.

    2013-01-01

    This article reports a study of the founding of applied behavior analysis through its publications. Our methods included (a) hand searches of sources (e.g., journals, reference lists), (b) search terms (i.e., early, applied, behavioral, research, literature), (c) inclusion criteria (e.g., the field's applied dimension), and (d) challenges to their face and content validity. Our results were 36 articles published between 1959 and 1967 that we organized into 4 groups: 12 in 3 programs of research and 24 others. Our discussion addresses (a) limitations in our method (e.g., the completeness of our search), (b) challenges to the validity of our methods and results (e.g., convergent validity), and (c) priority claims about the field's founding. We conclude that the claims are irresolvable because identification of the founding publications depends significantly on methods and because the field's founding was an evolutionary process. We close with suggestions for future research. PMID:25729133

  13. GPU-BASED MONTE CARLO DUST RADIATIVE TRANSFER SCHEME APPLIED TO ACTIVE GALACTIC NUCLEI

    SciTech Connect

    Heymann, Frank; Siebenmorgen, Ralf

    2012-05-20

    A three-dimensional parallel Monte Carlo (MC) dust radiative transfer code is presented. To overcome the huge computing-time requirements of MC treatments, the computational power of vectorized hardware is used, utilizing either multi-core computer power or graphics processing units. The approach is a self-consistent way to solve the radiative transfer equation in arbitrary dust configurations. The code calculates the equilibrium temperatures of two populations of large grains and stochastically heated polycyclic aromatic hydrocarbons. Anisotropic scattering is treated applying the Henyey-Greenstein phase function. The spectral energy distribution (SED) of the object is derived at low spatial resolution by a photon counting procedure and at high spatial resolution by a vectorized ray tracer. The latter allows computation of high signal-to-noise images of the objects at any frequency and arbitrary viewing angles. We test the robustness of our approach against other radiative transfer codes. The SED and dust temperatures of one- and two-dimensional benchmarks are reproduced at high precision. The parallelization capability of various MC algorithms is analyzed and included in our treatment. We utilize the Lucy algorithm for the optically thin case where the Poisson noise is high, the iteration-free Bjorkman and Wood method to reduce the calculation time, and the Fleck and Canfield diffusion approximation for extremely optically thick cells. The code is applied to model the appearance of active galactic nuclei (AGNs) at optical and infrared wavelengths. The AGN torus is clumpy and includes fluffy composite grains of various sizes made up of silicates and carbon. The dependence of the SED on the number of clumps in the torus and the viewing angle is studied. The appearance of the 10 μm silicate features in absorption or emission is discussed. The SED of the radio-loud quasar 3C 249.1 is fit by the AGN model and a cirrus component to account for the far-infrared emission.
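
    One concrete step mentioned above, sampling the scattering-angle cosine from the Henyey-Greenstein phase function, is sketched below using the standard inverse-transform recipe; this is a generic illustration, not code from the presented radiative transfer scheme.

    ```python
    # Hedged sketch: inverse-transform sampling of cos(theta) from the
    # Henyey-Greenstein phase function with asymmetry parameter g.
    import numpy as np

    def sample_hg_cos_theta(g, xi):
        """Map uniform deviates xi in [0, 1) to scattering-angle cosines."""
        xi = np.asarray(xi)
        if abs(g) < 1e-6:                          # isotropic limit
            return 1.0 - 2.0 * xi
        s = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
        return (1.0 + g * g - s * s) / (2.0 * g)

    rng = np.random.default_rng(3)
    mu = sample_hg_cos_theta(g=0.6, xi=rng.random(100_000))
    print(mu.mean())                               # mean cos(theta) approaches g
    ```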

  14. GPU-based Monte Carlo Dust Radiative Transfer Scheme Applied to Active Galactic Nuclei

    NASA Astrophysics Data System (ADS)

    Heymann, Frank; Siebenmorgen, Ralf

    2012-05-01

    A three-dimensional parallel Monte Carlo (MC) dust radiative transfer code is presented. To overcome the huge computing-time requirements of MC treatments, the computational power of vectorized hardware is used, utilizing either multi-core computer power or graphics processing units. The approach is a self-consistent way to solve the radiative transfer equation in arbitrary dust configurations. The code calculates the equilibrium temperatures of two populations of large grains and stochastically heated polycyclic aromatic hydrocarbons. Anisotropic scattering is treated applying the Henyey-Greenstein phase function. The spectral energy distribution (SED) of the object is derived at low spatial resolution by a photon counting procedure and at high spatial resolution by a vectorized ray tracer. The latter allows computation of high signal-to-noise images of the objects at any frequency and arbitrary viewing angles. We test the robustness of our approach against other radiative transfer codes. The SED and dust temperatures of one- and two-dimensional benchmarks are reproduced at high precision. The parallelization capability of various MC algorithms is analyzed and included in our treatment. We utilize the Lucy algorithm for the optically thin case where the Poisson noise is high, the iteration-free Bjorkman & Wood method to reduce the calculation time, and the Fleck & Canfield diffusion approximation for extremely optically thick cells. The code is applied to model the appearance of active galactic nuclei (AGNs) at optical and infrared wavelengths. The AGN torus is clumpy and includes fluffy composite grains of various sizes made up of silicates and carbon. The dependence of the SED on the number of clumps in the torus and the viewing angle is studied. The appearance of the 10 μm silicate features in absorption or emission is discussed. The SED of the radio-loud quasar 3C 249.1 is fit by the AGN model and a cirrus component to account for the far-infrared emission.

  15. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis

    PubMed Central

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can improve variant analysis. Existing tools such as HotNet2 and dmGWAS provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f). PMID:27158457

  16. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support.

    PubMed

    Anderson, Cynthia M; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  17. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support

    PubMed Central

    Anderson, Cynthia M.; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  18. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    SciTech Connect

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
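
    A hedged numerical sketch of the original Beta Factor split that the new model extends: a fraction beta of each train's failure probability is treated as a common cause that fails all redundant trains together. The numbers are illustrative only, not taken from the study.

    ```python
    # Hedged sketch of the original Beta Factor model (illustrative numbers only).
    q_total = 1e-3     # per-demand failure probability of one train (hypothetical)
    beta = 0.1         # assumed common cause fraction

    q_indep = (1.0 - beta) * q_total
    q_ccf = beta * q_total

    # One-out-of-three logic, to first order: either all three trains fail
    # independently, or a single common cause event disables them all.
    q_system = q_indep ** 3 + q_ccf
    print(f"independent: {q_indep ** 3:.2e}, common cause: {q_ccf:.2e}, total: {q_system:.2e}")
    ```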

  19. Geospatial analysis applied to epidemiological studies of dengue: a systematic review.

    PubMed

    Oliveira, Maria Aparecida de; Ribeiro, Helena; Castillo-Salgado, Carlos

    2013-12-01

    A systematic review of the geospatial analysis methods used in dengue fever studies published between January 2001 and March 2011 was undertaken. In accordance with specific selection criteria, thirty-five studies were selected for inclusion in the review. The aim was to assess the types of spatial methods that have been used to analyze dengue transmission. We found twenty-one different methods that had been used in dengue fever epidemiological studies in that period, three of which were most frequently used. The results show that few articles had applied spatial analysis methods in dengue fever studies; however, whenever they were applied they contributed to a better understanding of dengue fever geospatial diffusion.

  20. 20 CFR 667.274 - What health and safety standards apply to the working conditions of participants in activities...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false What health and safety standards apply to the...' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR ADMINISTRATIVE PROVISIONS UNDER TITLE... and safety standards apply to the working conditions of participants in activities under title I...

  1. 20 CFR 667.274 - What health and safety standards apply to the working conditions of participants in activities...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false What health and safety standards apply to the...' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR ADMINISTRATIVE PROVISIONS UNDER TITLE... and safety standards apply to the working conditions of participants in activities under title I...

  2. Activation of Schwann cells in vitro by magnetic nanocomposites via applied magnetic field

    PubMed Central

    Liu, Zhongyang; Huang, Liangliang; Liu, Liang; Luo, Beier; Liang, Miaomiao; Sun, Zhen; Zhu, Shu; Quan, Xin; Yang, Yafeng; Ma, Teng; Huang, Jinghui; Luo, Zhuojing

    2015-01-01

    Schwann cells (SCs) are attractive seed cells in neural tissue engineering, but their application is limited by attenuated biological activities and impaired functions with aging. Therefore, it is important to explore an approach to enhance the viability and biological properties of SCs. In the present study, a magnetic composite made of magnetically responsive magnetic nanoparticles (MNPs) and a biodegradable chitosan–glycerophosphate polymer were prepared and characterized. It was further explored whether such magnetic nanocomposites via applied magnetic fields would regulate SC biological activities. The magnetization of the magnetic nanocomposite was measured by a vibrating sample magnetometer. The compositional characterization of the magnetic nanocomposite was examined by Fourier-transform infrared and X-ray diffraction. The tolerance of SCs to the magnetic fields was tested by flow-cytometry assay. The proliferation of cells was examined by a 5-ethynyl-2-deoxyuridine-labeling assay, a PrestoBlue assay, and a Live/Dead assay. Messenger ribonucleic acid of BDNF, GDNF, NT-3, and VEGF in SCs was assayed by quantitative real-time polymerase chain reaction. The amount of BDNF, GDNF, NT-3, and VEGF secreted from SCs was determined by enzyme-linked immunosorbent assay. It was found that magnetic nanocomposites containing 10% MNPs showed a cross-section diameter of 32.33±1.81 µm, porosity of 80.41%±0.72%, and magnetization of 5.691 emu/g at 8 kOe. The 10% MNP magnetic nanocomposites were able to support cell adhesion and spreading and further promote proliferation of SCs under magnetic field exposure. Interestingly, a magnetic field applied through the 10% MNP magnetic scaffold significantly increased the gene expression and protein secretion of BDNF, GDNF, NT-3, and VEGF. This work is the first stage in our understanding of how to precisely regulate the viability and biological properties of SCs in tissue-engineering grafts, which combined with additional

  3. Activation of Schwann cells in vitro by magnetic nanocomposites via applied magnetic field.

    PubMed

    Liu, Zhongyang; Huang, Liangliang; Liu, Liang; Luo, Beier; Liang, Miaomiao; Sun, Zhen; Zhu, Shu; Quan, Xin; Yang, Yafeng; Ma, Teng; Huang, Jinghui; Luo, Zhuojing

    2015-01-01

    Schwann cells (SCs) are attractive seed cells in neural tissue engineering, but their application is limited by attenuated biological activities and impaired functions with aging. Therefore, it is important to explore an approach to enhance the viability and biological properties of SCs. In the present study, a magnetic composite made of magnetically responsive magnetic nanoparticles (MNPs) and a biodegradable chitosan-glycerophosphate polymer were prepared and characterized. It was further explored whether such magnetic nanocomposites via applied magnetic fields would regulate SC biological activities. The magnetization of the magnetic nanocomposite was measured by a vibrating sample magnetometer. The compositional characterization of the magnetic nanocomposite was examined by Fourier-transform infrared and X-ray diffraction. The tolerance of SCs to the magnetic fields was tested by flow-cytometry assay. The proliferation of cells was examined by a 5-ethynyl-2-deoxyuridine-labeling assay, a PrestoBlue assay, and a Live/Dead assay. Messenger ribonucleic acid of BDNF, GDNF, NT-3, and VEGF in SCs was assayed by quantitative real-time polymerase chain reaction. The amount of BDNF, GDNF, NT-3, and VEGF secreted from SCs was determined by enzyme-linked immunosorbent assay. It was found that magnetic nanocomposites containing 10% MNPs showed a cross-section diameter of 32.33±1.81 µm, porosity of 80.41%±0.72%, and magnetization of 5.691 emu/g at 8 kOe. The 10% MNP magnetic nanocomposites were able to support cell adhesion and spreading and further promote proliferation of SCs under magnetic field exposure. Interestingly, a magnetic field applied through the 10% MNP magnetic scaffold significantly increased the gene expression and protein secretion of BDNF, GDNF, NT-3, and VEGF. This work is the first stage in our understanding of how to precisely regulate the viability and biological properties of SCs in tissue-engineering grafts, which combined with additional

  4. Applying the least restrictive alternative principle to treatment decisions: A legal and behavioral analysis

    PubMed Central

    Johnston, J. M.; Sherman, Robert A.

    1993-01-01

    The least restrictive alternative concept is widely used in mental health law. This paper addresses how the concept has been applied to treatment decisions. The paper offers both a legal and a behavioral analysis to some problems that have emerged in recent years concerning the selection of behavioral procedures used to change client behavior. The paper also offers ways of improving the application of the concept, which involve developing a more behaviorally functional perspective toward restrictiveness. PMID:22478138

  5. High-resolution frequency analysis as applied to the singing voice.

    PubMed

    Morsomme, D; Remacle, M; Millet, B

    1993-01-01

    We have applied high-resolution vocal frequency analysis to a population of singing voices. Two important elements have become apparent: (1) Confirmation that the singing formant originates in the resonators. This is observed especially on a low fundamental, and it is acquired through technical skill and experience. (2) Observation of the vibrato which, when isolated from the clinical study and judged only by its graphic presentation, could have been interpreted as 'abnormal'. PMID:8253452

  6. Analysis of active renin heterogeneity.

    PubMed

    Katz, S A; Malvin, R L; Lee, J; Kim, S H; Murray, R D; Opsahl, J A; Abraham, P A

    1991-09-01

    Active renin is a heterogeneous enzyme that can be separated into multiple forms with high-resolution isoelectric focusing. The isoelectric heterogeneity may result from differences in glycosylation between the different forms. In order to determine the relationship between active renin heterogeneity and differences in composition or attachment of oligosaccharides, two separate experiments were performed: (i) Tunicamycin, which interferes with normal glycosylation processing, increased the proportion of relatively basic renin forms secreted into the incubation media by rat renal cortical slices. (ii) Endoglycosidase F, which enzymatically removes carbohydrate from some classes of glycoprotein, similarly increased the proportion of relatively basic forms when incubated with active human recombinant renin. In addition, further studies with inhibitors of human renin activity revealed that the heterogeneous renin forms were similarly inhibited by two separate renin inhibitors. These results are consistent with the hypothesis that renin isoelectric heterogeneity is due in part to differences in carbohydrate moiety attachment and that the heterogeneity of renin does not influence access of direct renin inhibitors to the active site of renin.

  7. Analysis of active renin heterogeneity.

    PubMed

    Katz, S A; Malvin, R L; Lee, J; Kim, S H; Murray, R D; Opsahl, J A; Abraham, P A

    1991-09-01

    Active renin is a heterogeneous enzyme that can be separated into multiple forms with high-resolution isoelectric focusing. The isoelectric heterogeneity may result from differences in glycosylation between the different forms. In order to determine the relationship between active renin heterogeneity and differences in composition or attachment of oligosaccharides, two separate experiments were performed: (i) Tunicamycin, which interferes with normal glycosylation processing, increased the proportion of relatively basic renin forms secreted into the incubation media by rat renal cortical slices. (ii) Endoglycosidase F, which enzymatically removes carbohydrate from some classes of glycoprotein, similarly increased the proportion of relatively basic forms when incubated with active human recombinant renin. In addition, further studies with inhibitors of human renin activity revealed that the heterogeneous renin forms were similarly inhibited by two separate renin inhibitors. These results are consistent with the hypothesis that renin isoelectric heterogeneity is due in part to differences in carbohydrate moiety attachment and that the heterogeneity of renin does not influence access of direct renin inhibitors to the active site of renin. PMID:1908097

  8. Classical linear-control analysis applied to business-cycle dynamics and stability

    NASA Technical Reports Server (NTRS)

    Wingrove, R. C.

    1983-01-01

    Linear control analysis is applied as an aid in understanding the fluctuations of business cycles in the past, and to examine monetary policies that might improve stabilization. The analysis shows how different policies change the frequency and damping of the economic system dynamics, and how they modify the amplitude of the fluctuations that are caused by random disturbances. Examples are used to show how policy feedbacks and policy lags can be incorporated, and how different monetary strategies for stabilization can be analytically compared. Representative numerical results are used to illustrate the main points.
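
    As a concrete illustration of the point above, the minimal sketch below (not Wingrove's model; the second-order dynamics, the feedback form, and all numbers are assumptions chosen for illustration) shows how a proportional policy feedback gain changes the natural frequency and damping ratio of a lightly damped cycle.

```python
# Hypothetical second-order "economy" x'' + 2*zeta*wn*x' + wn^2*x = u,
# with a proportional policy feedback u = -g*x (policy lags ignored).
import numpy as np

def freq_damping(zeta, wn, g):
    """Natural frequency and damping ratio of the closed-loop system."""
    # Closed-loop characteristic polynomial: s^2 + 2*zeta*wn*s + (wn^2 + g)
    poles = np.roots([1.0, 2.0 * zeta * wn, wn**2 + g])
    wn_cl = np.abs(poles[0])             # closed-loop natural frequency
    zeta_cl = -poles[0].real / wn_cl     # closed-loop damping ratio
    return wn_cl, zeta_cl

zeta, wn = 0.15, 2 * np.pi / 4.0         # lightly damped 4-year cycle (illustrative)
for g in (0.0, 0.5, 1.0, 2.0):
    w, z = freq_damping(zeta, wn, g)
    print(f"gain={g:3.1f}  frequency={w:.3f} rad/yr  damping ratio={z:.3f}")
```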

  9. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    NASA Astrophysics Data System (ADS)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques are used to analyse an ethically challenging situation involving ICT raised by a recent media article, in order to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skill development focused on includes: being able to recognise ethical challenges and formulate coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating ethical decisions made, to name a few.

  10. Current research activities: Applied and numerical mathematics, fluid mechanics, experiments in transition and turbulence and aerodynamics, and computer science

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, fluid mechanics including fluid dynamics, acoustics, and combustion, aerodynamics, and computer science during the period 1 Apr. 1992 - 30 Sep. 1992 is summarized.

  11. Neutron activation analysis of a penny

    NASA Astrophysics Data System (ADS)

    Stevens, Richard E.

    2000-04-01

    Neutron activation analysis has been used for many years as an analysis tool and as an educational tool to teach students about nuclear properties. This article presents an exercise in the neutron activation analysis of a penny which, due to the simplicity of the resulting gamma-ray spectra, is appropriate for general physics classes. Students express a great deal of interest both in seeing the reactor in use as well as determining the composition of something that is familiar to them.

  12. [Clustering analysis applied to near-infrared spectroscopy analysis of Chinese traditional medicine].

    PubMed

    Liu, Mu-qing; Zhou, De-cheng; Xu, Xin-yuan; Sun, Yao-jie; Zhou, Xiao-li; Han, Lei

    2007-10-01

    The present article discusses clustering analysis as used in the near-infrared (NIR) spectroscopy analysis of Chinese traditional medicines, which provides a new method for the classification of Chinese traditional medicines. The samples selected in the authors' research, whose absorption spectra were measured in seconds by a multi-channel NIR spectrometer developed in the authors' lab, were safrole, eucalypt oil, laurel oil, turpentine, clove oil and three samples of costmary oil from different suppliers. The spectra in the range of 0.70-1.7 µm were measured with air as background, and the results indicated that they are quite distinct. A qualitative mathematical model was set up, and cluster analysis based on the spectra was carried out with different clustering methods for optimization, yielding a cluster correlation coefficient of 0.9742. This indicated that cluster analysis of this group of samples is practicable. It is also notable that the calculated classification of the 8 samples accorded well with their characteristics; in particular, the three samples of costmary oil fell into the closest grouping of the clustering analysis. PMID:18306778
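
    A minimal sketch of the kind of clustering workflow described above, using synthetic stand-in spectra; the correlation-distance metric, average linkage and the cophenetic coefficient used here are assumptions, not necessarily the paper's exact choices.

```python
# Illustrative sketch: hierarchical clustering of NIR absorption spectra.
# The spectra below are synthetic placeholders, not the paper's measurements.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster, cophenet
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
wavelengths = np.linspace(0.7, 1.7, 200)          # micrometres, as in the abstract

def fake_spectrum(center, width):
    return np.exp(-((wavelengths - center) / width) ** 2)

# Eight "samples": three near-identical oils plus five distinct ones
spectra = np.array(
    [fake_spectrum(1.2, 0.08) + 0.01 * rng.standard_normal(200) for _ in range(3)]
    + [fake_spectrum(c, 0.1) for c in (0.8, 0.95, 1.05, 1.4, 1.6)]
)

dist = pdist(spectra, metric="correlation")       # 1 - Pearson correlation
Z = linkage(dist, method="average")               # agglomerative clustering
coph_corr, _ = cophenet(Z, dist)                  # cophenetic correlation coefficient
labels = fcluster(Z, t=6, criterion="maxclust")
print("cophenetic correlation:", round(coph_corr, 4))
print("cluster labels:", labels)
```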

  13. System Sensitivity Analysis Applied to the Conceptual Design of a Dual-Fuel Rocket SSTO

    NASA Technical Reports Server (NTRS)

    Olds, John R.

    1994-01-01

    This paper reports the results of initial efforts to apply the System Sensitivity Analysis (SSA) optimization method to the conceptual design of a single-stage-to-orbit (SSTO) launch vehicle. SSA is an efficient, calculus-based MDO technique for generating sensitivity derivatives in a highly multidisciplinary design environment. The method has been successfully applied to conceptual aircraft design and has been proven to have advantages over traditional direct optimization methods. The method is applied to the optimization of an advanced, piloted SSTO design similar to vehicles currently being analyzed by NASA as possible replacements for the Space Shuttle. Powered by a derivative of the Russian RD-701 rocket engine, the vehicle employs a combination of hydrocarbon, hydrogen, and oxygen propellants. Three primary disciplines are included in the design - propulsion, performance, and weights & sizing. A complete, converged vehicle analysis depends on the use of three standalone conceptual analysis computer codes. Efforts to minimize vehicle dry (empty) weight are reported in this paper. The problem consists of six system-level design variables and one system-level constraint. Using SSA in a 'manual' fashion to generate gradient information, six system-level iterations were performed from each of two different starting points. The results showed a good pattern of convergence for both starting points. A discussion of the advantages and disadvantages of the method, possible areas of improvement, and future work is included.

  14. LACBWR primary shield activation analysis

    SciTech Connect

    Nelson, L.L.; Lahti, G.P.; Johnson, W.J.

    1996-11-01

    Nuclear power plants in the US are required to estimate the costs of decommissioning to ensure that adequate funds are accumulated during the useful life of the plant. A major component of the decommissioning cost is the disposal of radioactive material, including material near the reactor created by neutron activation. An accurate assessment of the residual radioactivity in the reactor's primary shield is necessary to determine this portion of the decommissioning demolition and disposal cost. This paper describes the efforts used to determine the activation levels remaining in the primary shield of the LaCrosse boiling water reactor (LACBWR), owned and operated by Dairyland Power Cooperative.

  15. Characterisation and calibration of active sampling Solid Phase Microextraction applied to sensitive determination of gaseous carbonyls.

    PubMed

    Gómez Alvarez, Elena; Moreno, Mónica Vázquez; Gligorovski, Sasho; Wortham, Henri; Cases, Miguel Valcárcel

    2012-01-15

    A characterisation of a system designed for active sampling of gaseous compounds with Solid Phase Microextraction (SPME) fibres is described. This form of sampling is useful to automate sampling while considerably reducing the sampling times. However, the efficiency of this form of sampling is prone to be affected by certain undesirable effects, such as fibre saturation and competition or displacement effects between analytes, to which particular attention should be paid, especially at high flow rates. Yet the effect of different parameters on the quantitativeness of the results had not previously been evaluated. For this reason, in this study a careful characterisation of the influence of the parameters involved in active sampling SPME has been performed. A versatile experimental set-up has been designed to test the influence of air velocities and fluid regime on the quantitativeness and reproducibility of the results. The mathematical model applied to the calculation of physical parameters at the sampling points takes into consideration the inherent characteristics of gases, distinct from those of liquids, and makes use of easily determined experimental variables as initial/boundary conditions to get the model started. The studies were carried out in the high-volume outdoor environmental chambers, EUPHORE. The sample subjected to study was a mixture of three aldehydes: pentanal, hexanal and heptanal, and the determination methodology was O-(2,3,4,5,6-pentafluorobenzyl)-hydroxylamine hydrochloride (PFBHA) on-fibre derivatisation. The present work proves that the determination procedure is quantitative and sensitive, and independent of experimental conditions: temperature, relative humidity or ozone levels. With our methodology, the influence on adsorption of three inter-related variables, i.e., air velocity, flow rate and Reynolds number, can be separated, since a change can be exerted in one of them while keeping the others constant.
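
    A small sketch of how the flow variables discussed above (air velocity, flow rate, Reynolds number) can be related from easily measured quantities; the tube diameter, flow rates and air properties are illustrative assumptions, not values from the study.

```python
# Sketch: flow regime at an SPME sampling point from easily measured quantities.
# Tube diameter, flow rates and air properties below are illustrative assumptions.
import math

def air_velocity(flow_lpm, diameter_m):
    """Mean velocity (m/s) in a circular tube for a volumetric flow in L/min."""
    q = flow_lpm / 1000.0 / 60.0                     # m^3/s
    area = math.pi * diameter_m**2 / 4.0
    return q / area

def reynolds(velocity, diameter_m, temp_c=25.0):
    """Reynolds number for air; density/viscosity near 1 atm, rough values."""
    rho = 101325.0 / (287.05 * (temp_c + 273.15))    # ideal-gas density, kg/m^3
    mu = 1.8e-5                                      # dynamic viscosity, Pa*s (approx.)
    return rho * velocity * diameter_m / mu

d = 0.004                                  # 4 mm sampling tube (assumed)
for q in (0.1, 0.5, 1.0, 5.0):             # L/min
    v = air_velocity(q, d)
    re = reynolds(v, d)
    regime = "laminar" if re < 2300 else "transitional/turbulent"
    print(f"{q:4.1f} L/min -> v = {v:6.2f} m/s, Re = {re:7.0f} ({regime})")
```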

  16. Conference on Instrumental Activation Analysis: IAA 89

    NASA Astrophysics Data System (ADS)

    Vobecky, M.; Obrusnik, I.

    1989-05-01

    The proceedings contain 40 abstracts of papers all of which have been incorporated in INIS. The papers were centred on the applications of radioanalytical methods, especially on neutron activation analysis, x ray fluorescence analysis, PIXE analysis and tracer techniques in biology, medicine and metallurgy, measuring instruments including microcomputers, and data processing methods.

  17. Physical basis for prompt-neutron activation analysis

    SciTech Connect

    Chrien, R.E.

    1982-01-01

    The technique called prompt γ-ray neutron activation analysis has been applied to rapid materials analysis. The radiation following neutron radiative capture is prompt in the sense that the nuclear decay time is on the order of 10⁻¹⁵ second; thus the technique is not strictly activation analysis, and is better termed radiative neutron capture spectroscopy or neutron capture γ-ray spectroscopy. This paper reviews the following: sources and detectors, theory of radiative capture, nonstatistical capture, the giant dipole resonance, fast neutron capture, and thermal neutron capture γ-ray spectra. 14 figures.

  18. A review of dendrogeomorphological research applied to flood risk analysis in Spain

    NASA Astrophysics Data System (ADS)

    Díez-Herrero, A.; Ballesteros, J. A.; Ruiz-Villanueva, V.; Bodoque, J. M.

    2013-08-01

    Over the last forty years, applying dendrogeomorphology to palaeoflood analysis has improved estimates of the frequency and magnitude of past floods worldwide. This paper reviews the main results obtained by applying dendrogeomorphology to flood research in several case studies in Central Spain. These dendrogeomorphological studies focused on the following topics: (1) anatomical analysis to understand the physiological response of trees to flood damage and improve sampling efficiency; (2) compiling robust flood chronologies in ungauged mountain streams, (3) determining flow depth and estimating flood discharge using two-dimensional hydraulic modelling, and comparing them with other palaeostage indicators; (4) calibrating hydraulic model parameters (i.e. Manning roughness); and (5) implementing stochastic-based, cost-benefit analysis to select optimal mitigation measures. The progress made in these areas is presented with suggestions for further research to improve the applicability of dendrogeochronology to palaeoflood studies. Further developments will include new methods for better identification of the causes of specific types of flood damage to trees (e.g. tilted trees) or stable isotope analysis of tree rings to identify the climatic conditions associated with periods of increasing flood magnitude or frequency.

  19. Preliminary Work Domain Analysis for Human Extravehicular Activity

    NASA Technical Reports Server (NTRS)

    McGuire, Kerry; Miller, Matthew; Feigh, Karen

    2015-01-01

    A work domain analysis (WDA) of human extravehicular activity (EVA) is presented in this study. A formative methodology such as Cognitive Work Analysis (CWA) offers a new perspective to the knowledge gained from the past 50 years of living and working in space for the development of future EVA support systems. EVA is a vital component of human spaceflight and provides a case study example of applying a work domain analysis (WDA) to a complex sociotechnical system. The WDA presented here illustrates how the physical characteristics of the environment, hardware, and life support systems of the domain guide the potential avenues and functional needs of future EVA decision support system development.

  20. Activation Likelihood Estimation meta-analysis revisited

    PubMed Central

    Eickhoff, Simon B.; Bzdok, Danilo; Laird, Angela R.; Kurth, Florian; Fox, Peter T.

    2011-01-01

    A widely used technique for coordinate-based meta-analysis of neuroimaging data is activation likelihood estimation (ALE), which determines the convergence of foci reported from different experiments. ALE analysis involves modelling these foci as probability distributions whose width is based on empirical estimates of the spatial uncertainty due to the between-subject and between-template variability of neuroimaging data. ALE results are assessed against a null-distribution of random spatial association between experiments, resulting in random-effects inference. In the present revision of this algorithm, we address two remaining drawbacks of the previous algorithm. First, the assessment of spatial association between experiments was based on a highly time-consuming permutation test, which nevertheless entailed the danger of underestimating the right tail of the null-distribution. In this report, we outline how this previous approach may be replaced by a faster and more precise analytical method. Second, the previously applied correction procedure, i.e. controlling the false discovery rate (FDR), is supplemented by new approaches for correcting the family-wise error rate and the cluster-level significance. The different alternatives for drawing inference on meta-analytic results are evaluated on an exemplary dataset on face perception as well as discussed with respect to their methodological limitations and advantages. In summary, we thus replaced the previous permutation algorithm with a faster and more rigorous analytical solution for the null-distribution and comprehensively address the issue of multiple-comparison corrections. The proposed revision of the ALE-algorithm should provide an improved tool for conducting coordinate-based meta-analyses on functional imaging data. PMID:21963913
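
    The core ALE computation described above can be sketched on a toy grid as follows; the grid size, foci, smoothing width and max-based within-experiment union are illustrative assumptions, and the revised analytical null distribution from the paper is not implemented here.

```python
# Toy sketch of the core ALE computation: each reported focus is modelled as a
# Gaussian probability blob, blobs are combined within an experiment by a
# voxel-wise maximum, and across experiments by the union 1 - prod(1 - MA_i).
import numpy as np
from scipy.ndimage import gaussian_filter

shape = (40, 48, 40)                           # toy grid, not MNI space
experiments = [                                # voxel coordinates of reported foci
    [(20, 24, 20), (10, 30, 15)],
    [(21, 23, 19)],
    [(19, 25, 21), (30, 10, 30)],
]
sigma = 2.0                                    # spatial uncertainty in voxels (assumed)

ma_maps = []
for foci in experiments:
    blobs = []
    for x, y, z in foci:
        img = np.zeros(shape)
        img[x, y, z] = 1.0
        blob = gaussian_filter(img, sigma)
        blobs.append(blob / blob.max())        # probability-like blob per focus
    ma_maps.append(np.maximum.reduce(blobs))   # modelled activation (MA) map

ale = 1.0 - np.prod([1.0 - ma for ma in ma_maps], axis=0)
print("peak ALE value:", float(ale.max()))
print("peak ALE location:", np.unravel_index(np.argmax(ale), shape))
```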

  1. Quantification of applied dose in irradiated citrus fruits by DNA Comet Assay together with image analysis.

    PubMed

    Cetinkaya, Nurcan; Ercin, Demet; Özvatan, Sümer; Erel, Yakup

    2016-02-01

    The experiments were conducted for quantification of the applied dose for quarantine control in irradiated citrus fruits. Citrus fruits were exposed to doses of 0.1 to 1.5 kGy and analyzed by DNA Comet Assay. The observed comets were evaluated by image analysis. The tail length, tail moment and tail DNA% of the comets were used for the interpretation of the comets. Irradiated citrus fruits showed tails separated from the head of the comet as applied doses increased from 0.1 to 1.5 kGy. The mean tail length and mean tail moment% levels of irradiated citrus fruits at all doses are significantly different (p < 0.01) from control, even for the lowest dose of 0.1 kGy. Thus, DNA Comet Assay may be a practical quarantine control method for irradiated citrus fruits, since it has been possible to estimate applied doses as low as 0.1 kGy when it is combined with image analysis. PMID:26304361
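
    A minimal sketch of how the comet descriptors named above (tail length, tail DNA% and tail moment) can be computed from a background-subtracted intensity profile; the profile, the head/tail boundary and the particular tail-moment definition are assumptions for illustration.

```python
# Sketch: comet-assay metrics from a background-subtracted 1-D intensity
# profile along the comet axis. Profile and head/tail boundary are illustrative.
import numpy as np

def comet_metrics(profile, head_end, pixel_um=1.0):
    """Tail length (um), tail DNA % and a centroid-based tail moment."""
    profile = np.asarray(profile, dtype=float)
    total = profile.sum()
    tail = profile[head_end:]
    tail_dna_pct = 100.0 * tail.sum() / total
    # Tail length: distance from end of head to the last pixel with signal
    nonzero = np.nonzero(tail > 0.02 * profile.max())[0]
    tail_len = (nonzero[-1] + 1) * pixel_um if nonzero.size else 0.0
    # Tail moment: tail DNA fraction times the distance between head and tail
    # intensity centroids (one common definition; others exist)
    x = np.arange(profile.size) * pixel_um
    head_centroid = np.average(x[:head_end], weights=profile[:head_end])
    tail_centroid = np.average(x[head_end:], weights=tail) if tail.sum() else head_centroid
    tail_moment = (tail.sum() / total) * (tail_centroid - head_centroid)
    return tail_len, tail_dna_pct, tail_moment

# Synthetic profile: a bright head followed by a decaying tail
x = np.arange(120)
profile = (100 * np.exp(-0.5 * ((x - 20) / 6.0) ** 2)
           + 15 * np.exp(-(x - 40).clip(0) / 25.0) * (x >= 40))
print(comet_metrics(profile, head_end=40, pixel_um=0.5))
```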

  2. Quantification of applied dose in irradiated citrus fruits by DNA Comet Assay together with image analysis.

    PubMed

    Cetinkaya, Nurcan; Ercin, Demet; Özvatan, Sümer; Erel, Yakup

    2016-02-01

    The experiments were conducted for quantification of the applied dose for quarantine control in irradiated citrus fruits. Citrus fruits were exposed to doses of 0.1 to 1.5 kGy and analyzed by DNA Comet Assay. The observed comets were evaluated by image analysis. The tail length, tail moment and tail DNA% of the comets were used for the interpretation of the comets. Irradiated citrus fruits showed tails separated from the head of the comet as applied doses increased from 0.1 to 1.5 kGy. The mean tail length and mean tail moment% levels of irradiated citrus fruits at all doses are significantly different (p < 0.01) from control, even for the lowest dose of 0.1 kGy. Thus, DNA Comet Assay may be a practical quarantine control method for irradiated citrus fruits, since it has been possible to estimate applied doses as low as 0.1 kGy when it is combined with image analysis.

  3. Quality analysis of the solution produced by dissection algorithms applied to the traveling salesman problem

    SciTech Connect

    Cesari, G.

    1994-12-31

    The aim of this paper is to analyze experimentally the quality of the solution obtained with dissection algorithms applied to the geometric Traveling Salesman Problem. Starting from Karp's results, we apply a divide-and-conquer strategy, first dividing the plane into subregions where we calculate optimal subtours and then merging these subtours to obtain the final tour. The analysis is restricted to problem instances where points are uniformly distributed in the unit square. For relatively small sets of cities we analyze the quality of the solution by calculating the length of the optimal tour and by comparing it with our approximate solution. When the problem instance is too large, we perform an asymptotic analysis estimating the length of the optimal tour. We apply the same dissection strategy to classical heuristics by calculating approximate subtours and by comparing the results with the average quality of the heuristic. Our main result is an estimate of the rate of convergence of the approximate solution to the optimal solution as a function of the number of dissection steps, of the criterion used for the plane division, and of the quality of the subtours. We have implemented our programs on MUSIC (MUlti Signal processor system with Intelligent Communication), a Single-Program-Multiple-Data parallel computer with distributed memory developed at the ETH Zurich.
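
    A minimal sketch of the dissection idea described above for uniform points in the unit square; it uses nearest-neighbour subtours and a serpentine merge rather than the optimal subtours and merging criteria studied in the paper.

```python
# Sketch of a dissection heuristic for the Euclidean TSP in the unit square:
# split into k x k cells, build a subtour per cell (nearest neighbour here,
# where the paper uses optimal subtours), then stitch cells in serpentine order.
import math
import random

def nearest_neighbour_order(points):
    if not points:
        return []
    tour, rest = [points[0]], points[1:]
    while rest:
        last = tour[-1]
        nxt = min(rest, key=lambda p: (p[0] - last[0]) ** 2 + (p[1] - last[1]) ** 2)
        rest.remove(nxt)
        tour.append(nxt)
    return tour

def dissection_tour(points, k):
    cells = [[[] for _ in range(k)] for _ in range(k)]
    for x, y in points:
        cells[min(int(y * k), k - 1)][min(int(x * k), k - 1)].append((x, y))
    tour = []
    for row in range(k):                       # serpentine (boustrophedon) merge
        cols = range(k) if row % 2 == 0 else range(k - 1, -1, -1)
        for col in cols:
            tour.extend(nearest_neighbour_order(cells[row][col]))
    return tour

def tour_length(tour):
    return sum(math.dist(tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour)))

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(2000)]
for k in (1, 2, 4, 8):
    print(f"k={k}: tour length = {tour_length(dissection_tour(pts, k)):.2f}")
```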

  4. 45 CFR 100.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false What procedures apply to the selection of programs and activities under these regulations? 100.6 Section 100.6 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION INTERGOVERNMENTAL REVIEW OF DEPARTMENT OF HEALTH AND HUMAN SERVICES PROGRAMS AND ACTIVITIES § 100.6...

  5. Applying quantitative structure-activity relationship (QSAR) methodology for modeling postmortem redistribution of benzodiazepines and tricyclic antidepressants.

    PubMed

    Giaginis, Constantinos; Tsantili-Kakoulidou, Anna; Theocharis, Stamatios

    2014-06-01

    Postmortem redistribution (PMR) constitutes a multifaceted process, which complicates the interpretation of drug concentrations by forensic toxicologists. The present study aimed to apply quantitative structure-activity relationship (QSAR) analysis for modeling PMR data of structurally related drugs, 10 benzodiazepines and 10 tricyclic antidepressants. For benzodiazepines, an adequate QSAR model was obtained (R² = 0.98, Q² = 0.88, RMSEE = 0.12), in which energy, ionization and molecular size exerted significant impact. For tricyclic antidepressants, an adequate QSAR model with slightly inferior statistics (R² = 0.95, Q² = 0.87, RMSEE = 0.29) was established after exclusion of maprotiline, in which energy parameters, basicity character and lipophilicity exerted significant contributions. Thus, QSAR analysis could be used as a complementary tool to provide an informative illustration of the contributing molecular, physicochemical and structural properties in the PMR process. However, the complexity and the non-static, time-dependent nature of PMR endpoints raise serious concerns as to whether QSAR methodology could predict the degree of redistribution, highlighting the need for animal-derived PMR data.
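
    A minimal sketch of how the quoted fit statistics (R², leave-one-out Q², RMSEE) are computed, using ordinary least squares on synthetic descriptors as a stand-in for the study's actual QSAR modelling.

```python
# Sketch: the fit statistics quoted above (R^2, leave-one-out Q^2, RMSEE) for a
# simple least-squares QSAR-style model on synthetic descriptor data.
import numpy as np

rng = np.random.default_rng(42)
n, p = 10, 3                                   # 10 compounds, 3 descriptors (toy)
X = rng.standard_normal((n, p))
y = X @ np.array([0.8, -0.5, 0.3]) + 0.1 * rng.standard_normal(n)

def fit(X, y):
    Xd = np.column_stack([np.ones(len(X)), X])           # add intercept
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return beta

def predict(beta, X):
    return np.column_stack([np.ones(len(X)), X]) @ beta

beta = fit(X, y)
resid = y - predict(beta, X)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - np.sum(resid**2) / ss_tot
rmsee = np.sqrt(np.sum(resid**2) / (n - p - 1))          # error of estimation

# Leave-one-out cross-validation for Q^2
press = 0.0
for i in range(n):
    mask = np.arange(n) != i
    b = fit(X[mask], y[mask])
    press += (y[i] - predict(b, X[i:i+1])[0]) ** 2
q2 = 1.0 - press / ss_tot

print(f"R2 = {r2:.3f}, Q2 = {q2:.3f}, RMSEE = {rmsee:.3f}")
```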

  6. How Can Insights from Conversation Analysis Be Directly Applied to Teaching L2 Pragmatics?

    ERIC Educational Resources Information Center

    Huth, Thorsten; Taleghani-Nikazm, Carmen

    2006-01-01

    This paper revisits the question of why pragmatics should be taught in the foreign language classroom and demonstrates how this can be achieved effectively with materials informed by conversation analysis (CA). Since findings in CA describe systematic action sequences underlying verbal activities that display cross-cultural variation, they capture…

  7. RS-34 Phoenix In-Space Propulsion System Applied to Active Debris Removal Mission

    NASA Technical Reports Server (NTRS)

    Esther, Elizabeth A.; Burnside, Christopher G.

    2014-01-01

    In-space propulsion is a high percentage of the cost when considering an Active Debris Removal mission. For this reason it is desirable to research whether existing designs, with slight modification, would meet mission requirements and thereby reduce the cost of the overall mission. Such a system, capable of rendezvous, close proximity operations, and de-orbit of Envisat-class resident space objects, has been identified in the existing RS-34 Phoenix. The RS-34 propulsion system is a remaining asset from the decommissioned United States Air Force Peacekeeper program; specifically, the pressure-fed storable bi-propellant Stage IV Post Boost Propulsion System. The National Aeronautics and Space Administration (NASA) Marshall Space Flight Center (MSFC) gained experience with the RS-34 propulsion system on the successful Ares I-X flight test program, flown in the Ares I-X roll control system (RoCS). The heritage hardware proved extremely robust and reliable and sparked interest in further utilization for other potential in-space applications. Subsequently, MSFC has obtained permission from the USAF to acquire all the remaining RS-34 stages for re-use opportunities. The MSFC Advanced Concepts Office (ACO) was commissioned to lead a study evaluating the Rocketdyne-produced RS-34 propulsion system as it applies to an active debris removal design reference mission for resident space object targets including Envisat. As originally designed, the RS-34 Phoenix provided in-space six-degrees-of-freedom operational maneuvering to deploy payloads at multiple orbital locations. The RS-34 concept study led by the ACO sought to further understand the system's application to a similar orbital debris design reference mission, providing propulsive capability for rendezvous, close proximity operations to support the capture phase of the mission, and deorbit of single or multiple large-class resident space objects. Multiple configurations varying the degree of modification were identified to trade for dry mass optimization and

  8. Multivariate least-squares methods applied to the quantitative spectral analysis of multicomponent samples

    SciTech Connect

    Haaland, D.M.; Easterling, R.G.; Vopicka, D.A.

    1985-01-01

    In an extension of earlier work, weighted multivariate least-squares methods of quantitative FT-IR analysis have been developed. A linear least-squares approximation to nonlinearities in the Beer-Lambert law is made by allowing the reference spectra to be a set of known mixtures. The incorporation of nonzero intercepts in the relation between absorbance and concentration further improves the approximation of nonlinearities while simultaneously accounting for nonzero spectral baselines. Pathlength variations are also accommodated in the analysis, and under certain conditions, unknown sample pathlengths can be determined. All spectral data are used to improve the precision and accuracy of the estimated concentrations. During the calibration phase of the analysis, pure component spectra are estimated from the standard mixture spectra. These can be compared with the measured pure component spectra to determine which vibrations experience nonlinear behavior. In the predictive phase of the analysis, the calculated spectra are used in our previous least-squares analysis to estimate sample component concentrations. These methods were applied to the analysis of the IR spectra of binary mixtures of esters. Even with severely overlapping spectral bands and nonlinearities in the Beer-Lambert law, the average relative error in the estimated concentrations was <1%.
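
    A minimal sketch of classical least-squares calibration with nonzero intercepts, in the spirit of the approach described above; the two-component synthetic spectra, baseline and noise levels are illustrative assumptions.

```python
# Sketch of classical least-squares (CLS) calibration with nonzero intercepts:
# A (mixtures x wavenumbers) = C (mixtures x components) @ K + intercept, then
# unknown concentrations are recovered from a measured spectrum by least squares.
import numpy as np

rng = np.random.default_rng(3)
wn = np.linspace(1700, 1800, 120)                       # wavenumber axis (toy)
pure = np.vstack([np.exp(-0.5 * ((wn - c) / 8.0) ** 2) for c in (1730.0, 1760.0)])
baseline = 0.02 + 0.0001 * (wn - wn[0])                 # small sloping baseline

C_cal = np.array([[0.2, 0.8], [0.5, 0.5], [0.8, 0.2], [1.0, 0.0], [0.0, 1.0]])
A_cal = C_cal @ pure + baseline + 0.002 * rng.standard_normal((5, wn.size))

# Calibration: regress absorbance on concentrations plus an intercept column
Cd = np.column_stack([C_cal, np.ones(len(C_cal))])
KB, *_ = np.linalg.lstsq(Cd, A_cal, rcond=None)         # rows: K (pure spectra), b
K_hat, b_hat = KB[:-1], KB[-1]

# Prediction: for an "unknown" mixture, solve A - b = c @ K for c
c_true = np.array([0.35, 0.65])
A_unk = c_true @ pure + baseline + 0.002 * rng.standard_normal(wn.size)
c_hat, *_ = np.linalg.lstsq(K_hat.T, A_unk - b_hat, rcond=None)
print("true:", c_true, "estimated:", np.round(c_hat, 3))
```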

  9. Applying latent semantic analysis to large-scale medical image databases.

    PubMed

    Stathopoulos, Spyridon; Kalamboukis, Theodore

    2015-01-01

    Although Latent Semantic Analysis (LSA) has been used successfully in text retrieval, applying it to CBIR raises scalability issues with large image collections. The method has so far been used with small collections due to the high cost of storage and computational time for solving the SVD problem for a large and dense feature matrix. Here we present an effective and efficient approach to applying LSA that skips the SVD solution of the feature matrix, overcoming in this way the deficiencies of the method with large-scale datasets. Early and late fusion techniques are tested and their performance is calculated. The study demonstrates that early fusion of several composite descriptors with visual words increases retrieval effectiveness. It also combines well in a late fusion for mixed (textual and visual) ad hoc retrieval and modality classification. The results reported are comparable to state-of-the-art algorithms without including additional knowledge from the medical domain. PMID:24934416

  10. An uncertainty analysis of the PVT gauging method applied to sub-critical cryogenic propellant tanks

    NASA Astrophysics Data System (ADS)

    Van Dresar, Neil T.

    2004-06-01

    The PVT (pressure, volume, temperature) method of liquid quantity gauging in low-gravity is based on gas law calculations assuming conservation of pressurant gas within the propellant tank and the pressurant supply bottle. There is interest in applying this method to cryogenic propellant tanks since the method requires minimal additional hardware or instrumentation. To use PVT with cryogenic fluids, a non-condensable pressurant gas (helium) is required. With cryogens, there will be a significant amount of propellant vapor mixed with the pressurant gas in the tank ullage. This condition, along with the high sensitivity of propellant vapor pressure to temperature, makes the PVT method susceptible to substantially greater measurement uncertainty than is the case with less volatile propellants. A conventional uncertainty analysis is applied to example cases of liquid hydrogen and liquid oxygen tanks. It appears that the PVT method may be feasible for liquid oxygen. Acceptable accuracy will be more difficult to obtain with liquid hydrogen.
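
    A minimal sketch of the PVT bookkeeping described above, using the ideal gas law and illustrative numbers; a real analysis would use real-gas properties and measured vapor-pressure data rather than the rough values assumed here.

```python
# Sketch of PVT gauging bookkeeping (ideal gas, illustrative numbers).
# Helium moles leaving the supply bottle are assumed to reside in the tank
# ullage; propellant vapor contributes its saturation pressure at tank temperature.
R = 8.314  # J/(mol K)

def moles(p_pa, v_m3, t_k):
    return p_pa * v_m3 / (R * t_k)

# Helium supply bottle before/after pressurization (assumed readings)
V_bottle = 0.05                       # m^3
n_he_used = moles(20e6, V_bottle, 290.0) - moles(14e6, V_bottle, 285.0)

# Propellant tank state (assumed readings)
V_tank = 5.0                          # m^3
P_tank = 200e3                        # Pa, total ullage pressure
T_tank = 90.0                         # K (liquid-oxygen-like temperature)
P_vap = 99.9e3                        # Pa, O2 vapor pressure near T_tank (approx.)

p_he = P_tank - P_vap                 # helium partial pressure in the ullage
V_ullage = n_he_used * R * T_tank / p_he
V_liquid = V_tank - V_ullage
print(f"helium used: {n_he_used:.1f} mol")
print(f"ullage volume: {V_ullage:.2f} m^3, liquid volume: {V_liquid:.2f} m^3 "
      f"({100 * V_liquid / V_tank:.1f}% full)")
```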

  11. Applied Behavior Analysis, Autism, and Occupational Therapy: A Search for Understanding.

    PubMed

    Welch, Christie D; Polatajko, H J

    2016-01-01

    Occupational therapists strive to be mindful, competent practitioners and continuously look for ways to improve practice. Applied behavior analysis (ABA) has strong evidence of effectiveness in helping people with autism achieve goals, yet it does not seem to be implemented in occupational therapy practice. To better understand whether ABA could be an evidence-based option to expand occupational therapy practice, the authors conducted an iterative, multiphase investigation of relevant literature. Findings suggest that occupational therapists apply developmental and sensory approaches to autism treatment. The occupational therapy literature does not reflect any use of ABA despite its strong evidence base. Occupational therapists may currently avoid using ABA principles because of a perception that ABA is not client centered. ABA principles and occupational therapy are compatible, and the two could work synergistically.

  12. Effects of Applying STR for Group Learning Activities on Learning Performance in a Synchronous Cyber Classroom

    ERIC Educational Resources Information Center

    Kuo, Tony C. T.; Shadiev, Rustam; Hwang, Wu-Yuin; Chen, Nian-Shing

    2012-01-01

    This study aimed to apply Speech to Text Recognition (STR) for individual oral presentations and group discussions of students in a synchronous cyber classroom. An experiment was conducted to analyze the effectiveness of applying STR on learning performance. Students' perceptions and behavioral intentions toward using STR were also investigated.…

  13. 45 CFR 91.3 - To what programs or activities do these regulations apply?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... regulations apply? 91.3 Section 91.3 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL... apply to: (1) An age distinction contained in that part of a Federal, State, or local statute or ordinance adopted by an elected, general purpose legislative body which: (i) Provides any benefits...

  14. 5 CFR 875.206 - As a new active workforce member, when may I apply?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... last open season for enrollment before the date of your application. (d) After the 60-day period ends... a position that conveys eligibility, you may apply for coverage within 60 days after becoming... position that did not convey eligibility, you may apply for coverage within 60 days after becoming...

  15. Analysis of possibility of applying the PVDF foil in industrial vibration sensors

    NASA Astrophysics Data System (ADS)

    Wróbel, A.

    2015-11-01

    There are many machines using piezoelectric effects. Systems with smart materials are often used because they have high application potential; for example, transducers can be applied to obtain the required characteristics of the designed system. Every engineer and designer knows how important it is to choose a proper mathematical model and method of analysis. It is also important to consider all parameters of the analysed system, for example the glue layer between elements. Geometrical and material parameters have a significant impact on the characteristics of all the system's components, because omitting the influence of any one of them results in inaccuracy in the analysis of the system. The article presents the modeling and testing of vibrating systems with piezoelectric ceramic transducers used as actuators and vibration dampers. The method of analysis of the vibrating sensor systems is presented, together with a mathematical model and characteristics, to determine the influence of the system's properties on these characteristics. The main scientific point of the project is to analyse and demonstrate the possibility of applying a new construction with PVDF foil, or any other material belonging to the group of smart materials, in industrial sensors. Currently, in vibration level sensors practically all manufacturers use piezoelectric ceramic plates to generate and detect the vibration of the fork.

  16. X-ray microfluorescence with synchrotron radiation applied in the analysis of pigments from ancient Egypt

    NASA Astrophysics Data System (ADS)

    Calza, C.; Anjos, M. J.; Mendonça de Souza, S. M. F.; Brancaglion, A., Jr.; Lopes, R. T.

    2008-01-01

    In this work, the X-ray microfluorescence with synchrotron radiation technique was applied in the analysis of pigments found in decorative paintings on the sarcophagus of an Egyptian mummy. This female mummy from the Roman Period, which was embalmed with the arms and legs swathed separately, is considered one of the most important pieces of the Egyptian Collection of the National Museum (Rio de Janeiro, Brazil). The measurements were performed at the XRF beamline D09B of the Brazilian Synchrotron Light Laboratory (LNLS), using the white beam and a Si(Li) detector with a resolution of 165 eV at 5.9 keV. The possible pigments found in the samples were: Egyptian blue, Egyptian green frit, green earth, verdigris, malachite, ochre, realgar, chalk, gypsum, bone white, ivory black and magnetite. Hierarchical cluster analysis (HCA) and principal component analysis (PCA) were applied to the results in order to verify whether the samples belong to the same period as a linen wrapping fragment whose provenance was well established.

  17. Analysis of Preconditioning and Relaxation Operators for the Discontinuous Galerkin Method Applied to Diffusion

    NASA Technical Reports Server (NTRS)

    Atkins, H. L.; Shu, Chi-Wang

    2001-01-01

    The explicit stability constraint of the discontinuous Galerkin method applied to the diffusion operator decreases dramatically as the order of the method is increased. Block Jacobi and block Gauss-Seidel preconditioner operators are examined for their effectiveness at accelerating convergence. A Fourier analysis for methods of order 2 through 6 reveals that both preconditioner operators bound the eigenvalues of the discrete spatial operator. Additionally, in one dimension, the eigenvalues are grouped into two or three regions that are invariant with order of the method. Local relaxation methods are constructed that rapidly damp high frequencies for arbitrarily large time step.

  18. System Analysis Applied to Autonomy: Application to Human-Rated Lunar/Mars Landers

    NASA Technical Reports Server (NTRS)

    Young, Larry A.

    2006-01-01

    System analysis is an essential technical discipline for the modern design of spacecraft and their associated missions. Specifically, system analysis is a powerful aid in identifying and prioritizing the required technologies needed for mission and/or vehicle development efforts. Maturation of intelligent systems technologies, and their incorporation into spacecraft systems, are dictating the development of new analysis tools, and incorporation of such tools into existing system analysis methodologies, in order to fully capture the trade-offs of autonomy on vehicle and mission success. A "system analysis of autonomy" methodology will be outlined and applied to a set of notional human-rated lunar/Mars lander missions toward answering these questions: 1. what is the optimum level of vehicle autonomy and intelligence required? and 2. what are the specific attributes of an autonomous system implementation essential for a given surface lander mission/application in order to maximize mission success? Future human-rated lunar/Mars landers, though nominally under the control of their crew, will, nonetheless, be highly automated systems. These automated systems will range from mission/flight control functions, to vehicle health monitoring and prognostication, to life-support and other "housekeeping" functions. The optimum degree of autonomy afforded to these spacecraft systems/functions has profound implications from an exploration system architecture standpoint.

  19. A Sensor Fault Detection Methodology applied to Piezoelectric Active Systems in Structural Health Monitoring Applications

    NASA Astrophysics Data System (ADS)

    Tibaduiza, D.; Anaya, M.; Forero, E.; Castro, R.; Pozo, F.

    2016-07-01

    Damage detection is the basis of the damage identification task in Structural Health Monitoring (SHM). A good damage detection process can ensure the adequate operation of an SHM system because it provides early information about the presence of damage in the structure under evaluation. However, this process rests on the premise that all sensors are well installed and working properly, which is not always true. Problems such as debonding, cuts, and the use of the sensors under different environmental and operational conditions result in changes in the vibrational response and in malfunctioning of the SHM system. As a contribution to evaluating the state of the sensors in an SHM system, this paper describes a methodology for sensor fault detection in a piezoelectric active system. The methodology involves the use of PCA for multivariate analysis and of damage indices as a pattern recognition technique, and it is tested on a blade from a wind turbine where different scenarios are evaluated, including sensor cuts and debonding.
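
    A minimal sketch of the kind of PCA-based indices such a methodology typically relies on (Hotelling T² and the Q/SPE residual statistic), computed on synthetic baseline data; the thresholds and the simulated fault below are illustrative assumptions, not the paper's damage indices.

```python
# Sketch: PCA-based fault indices (T^2 and Q/SPE) from baseline sensor data.
# Data are synthetic; thresholds here are simple empirical percentiles rather
# than the chi-square/F approximations used in practice.
import numpy as np

rng = np.random.default_rng(7)
baseline = rng.standard_normal((200, 6)) @ rng.standard_normal((6, 6))  # healthy records

mu, sigma = baseline.mean(0), baseline.std(0)
Xb = (baseline - mu) / sigma
U, s, Vt = np.linalg.svd(Xb, full_matrices=False)
k = 3                                              # retained principal components
P, lam = Vt[:k].T, (s[:k] ** 2) / (len(Xb) - 1)    # loadings and PC variances

def indices(x):
    xs = (x - mu) / sigma
    t = xs @ P                                     # scores
    t2 = np.sum(t**2 / lam)                        # Hotelling T^2
    resid = xs - t @ P.T                           # part outside the PCA subspace
    q = resid @ resid                              # Q statistic (SPE)
    return t2, q

base_scores = np.array([indices(x) for x in baseline])
t2_lim, q_lim = np.percentile(base_scores, 99, axis=0)

faulty = baseline[0].copy()
faulty[2] += 8 * sigma[2]                          # simulate a debonded/cut sensor
for name, x in (("healthy", baseline[1]), ("faulty", faulty)):
    t2, q = indices(x)
    print(f"{name}: T2={t2:7.2f} (lim {t2_lim:.2f})  Q={q:7.2f} (lim {q_lim:.2f})")
```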

  20. Bayesian Statistical Analysis Applied to NAA Data for Neutron Flux Spectrum Determination

    NASA Astrophysics Data System (ADS)

    Chiesa, D.; Previtali, E.; Sisti, M.

    2014-04-01

    In this paper, we present a statistical method, based on Bayesian statistics, to evaluate the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation analysis (NAA) experiment [A. Borio di Tigliole et al., Absolute flux measurement by NAA at the Pavia University TRIGA Mark II reactor facilities, ENC 2012 - Transactions Research Reactors, ISBN 978-92-95064-14-0, 22 (2012)] performed at the TRIGA Mark II reactor of Pavia University (Italy). In order to evaluate the neutron flux spectrum, subdivided in energy groups, we must solve a system of linear equations containing the grouped cross sections and the activation rate data. We solve this problem with Bayesian statistical analysis, including the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov Chain Monte Carlo (MCMC) simulations, is used to define the problem statistical model and solve it. The energy group fluxes and their uncertainties are then determined with great accuracy and the correlations between the groups are analyzed. Finally, the dependence of the results on the prior distribution choice and on the group cross section data is investigated to confirm the reliability of the analysis.
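
    A toy sketch of the unfolding problem described above, with two energy groups, three activation rates and a plain Metropolis sampler standing in for the hierarchical MCMC program used in the paper; all cross sections, fluxes and uncertainties are made-up numbers.

```python
# Toy sketch of Bayesian unfolding of a two-group neutron flux from activation
# rates R_i = sum_g sigma_ig * phi_g, using a plain Metropolis sampler rather
# than the hierarchical MCMC program used in the paper. All numbers are made up.
import numpy as np

rng = np.random.default_rng(11)
sigma_ig = np.array([[50.0, 0.5],          # effective group cross sections (arbitrary units)
                     [10.0, 2.0],
                     [ 1.0, 5.0]])
phi_true = np.array([1.0e8, 2.0e7])        # "true" thermal and fast group fluxes
R_err = 0.03 * (sigma_ig @ phi_true)       # 3% measurement uncertainties
R_meas = sigma_ig @ phi_true + rng.normal(0.0, R_err)

def log_post(phi):
    if np.any(phi <= 0):
        return -np.inf                     # flat prior restricted to positive fluxes
    resid = (R_meas - sigma_ig @ phi) / R_err
    return -0.5 * np.sum(resid**2)

phi = np.array([5.0e7, 5.0e7])             # starting point
step = np.array([3.0e6, 1.0e6])            # random-walk step sizes (tuned by hand)
samples, lp = [], log_post(phi)
for _ in range(20000):
    prop = phi + rng.normal(0.0, step)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        phi, lp = prop, lp_prop
    samples.append(phi)
samples = np.array(samples[5000:])         # discard burn-in
print("posterior mean:", samples.mean(axis=0), "+/-", samples.std(axis=0))
print("true flux     :", phi_true)
```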

  1. Neutron Activation Analysis of Water - A Review

    NASA Technical Reports Server (NTRS)

    Buchanan, John D.

    1971-01-01

    Recent developments in this field are emphasized. After a brief review of basic principles, topics discussed include sources of neutrons, pre-irradiation physical and chemical treatment of samples, neutron capture and gamma-ray analysis, and selected applications. Applications of neutron activation analysis of water have increased rapidly within the last few years and may be expected to increase in the future.

  2. Escalation research: Providing new frontiers for applying behavior analysis to organizational behavior

    PubMed Central

    Goltz, Sonia M.

    2000-01-01

    Decision fiascoes such as escalation of commitment, the tendency of decision makers to “throw good money after bad,” can have serious consequences for organizations and are therefore of great interest in applied research. This paper discusses the use of behavior analysis in organizational behavior research on escalation. Among the most significant aspects of behavior-analytic research on escalation is that it has indicated that both the patterns of outcomes that decision makers have experienced for past decisions and the patterns of responses that they make are critical for understanding escalation. This research has also stimulated the refinement of methods by researchers to better assess decision making and the role reinforcement plays in it. Finally, behavior-analytic escalation research has not only indicated the utility of reinforcement principles for predicting more complex human behavior but has also suggested some additional areas for future exploration of decision making using behavior analysis. PMID:22478347

  3. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.

  4. Escalation research: providing new frontiers for applying behavior analysis to organizational behavior.

    PubMed

    Goltz, S M

    2000-01-01

    Decision fiascoes such as escalation of commitment, the tendency of decision makers to "throw good money after bad," can have serious consequences for organizations and are therefore of great interest in applied research. This paper discusses the use of behavior analysis in organizational behavior research on escalation. Among the most significant aspects of behavior-analytic research on escalation is that it has indicated that both the patterns of outcomes that decision makers have experienced for past decisions and the patterns of responses that they make are critical for understanding escalation. This research has also stimulated the refinement of methods by researchers to better assess decision making and the role reinforcement plays in it. Finally, behavior-analytic escalation research has not only indicated the utility of reinforcement principles for predicting more complex human behavior but has also suggested some additional areas for future exploration of decision making using behavior analysis.

  5. Analysis of sonic well logs applied to erosion estimates in the Bighorn Basin, Wyoming

    SciTech Connect

    Heasler, H.P.; Kharitonova, N.A.

    1996-05-01

    An improved exponential model of sonic transit time data as a function of depth takes into account the physical range of rock sonic velocities. In this way, the model is more geologically realistic for predicting compaction trends when compared to linear or simple exponential functions that fail at large depth intervals. The improved model is applied to the Bighorn basin of northwestern Wyoming for calculation of erosion amounts. This basin was chosen because of extensive geomorphic research that constrains erosion models and because of the importance of quantifying erosion amounts for basin analysis and hydrocarbon maturation prediction. Thirty-six wells were analyzed using the improved exponential model. Seven of these wells, due to limited data from the Tertiary section, were excluded from the basin erosion analysis. Erosion amounts from the remaining 29 wells ranged from 0 to 5600 ft (1700 m), with an average of 2500 ft (800 m).
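
    A minimal sketch of fitting a bounded exponential compaction model of the kind described above and reading an erosion estimate from it; the surface and matrix transit times, the synthetic data and the exact erosion definition are assumptions for illustration, not the paper's procedure.

```python
# Sketch: fit a bounded exponential compaction model,
#     dt(z) = dt_m + (dt_0 - dt_m) * exp(-c * (z + erosion)),
# to synthetic sonic transit-time data and read the erosion directly as the
# upward depth shift needed to reach the uncompacted surface value dt_0.
import numpy as np
from scipy.optimize import curve_fit

DT0, DTM = 180.0, 55.0              # uncompacted and matrix transit times (us/ft, assumed)

def model(z, c, erosion):
    return DTM + (DT0 - DTM) * np.exp(-c * (z + erosion))

# Synthetic "well": known decay constant and 2500 ft of removed section
rng = np.random.default_rng(5)
z = np.linspace(500, 9000, 60)                       # present-day depths, ft
dt_obs = model(z, 2.5e-4, 2500.0) + rng.normal(0, 1.5, z.size)

(c_fit, ero_fit), _ = curve_fit(model, z, dt_obs, p0=[2e-4, 1000.0])
print(f"decay constant: {c_fit:.2e} 1/ft, estimated erosion: {ero_fit:.0f} ft")
```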

  6. Graphical Analysis of PET Data Applied to Reversible and Irreversible Tracers

    SciTech Connect

    Logan, Jean

    1999-11-18

    Graphical analysis refers to the transformation of multiple time measurements of plasma and tissue uptake data into a linear plot, the slope of which is related to the number of available tracer binding sites. This type of analysis allows easy comparisons among experiments. No particular model structure is assumed, however it is assumed that the tracer is given by bolus injection and that both tissue uptake and the plasma concentration of unchanged tracer are monitored following tracer injection. The requirement of plasma measurements can be eliminated in some cases when a reference region is available. There are two categories of graphical methods which apply to two general types of ligands--those which bind reversibly during the scanning procedure and those which are irreversible or trapped during the time of the scanning procedure.
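
    For the reversible-tracer case, the graphical transformation reduces to fitting the late linear portion of a transformed plot; the sketch below illustrates this Logan-style calculation on a synthetic one-tissue-compartment data set (the rate constants and input function are assumed, not from the report).

```python
# Sketch of the Logan graphical method for a reversible tracer: plot
#   int_0^t C_T dt' / C_T(t)  versus  int_0^t C_p dt' / C_T(t)
# and take the late-time slope as the total distribution volume V_T.
# Curves below come from a synthetic one-tissue-compartment model.
import numpy as np

t = np.linspace(0.01, 90.0, 400)                          # minutes
Cp = 10.0 * np.exp(-0.15 * t) + 1.0 * np.exp(-0.01 * t)   # plasma input (arbitrary units)

K1, k2 = 0.4, 0.1                                  # mL/min/g and 1/min -> V_T = K1/k2 = 4
dt = t[1] - t[0]
Ct = np.zeros_like(t)
for i in range(1, t.size):                         # simple Euler solution of dCt/dt
    Ct[i] = Ct[i-1] + dt * (K1 * Cp[i-1] - k2 * Ct[i-1])

int_Ct = np.cumsum(Ct) * dt                        # running integrals (rectangle rule)
int_Cp = np.cumsum(Cp) * dt
x = int_Cp[1:] / Ct[1:]
y = int_Ct[1:] / Ct[1:]

late = t[1:] > 30.0                                # fit only the linear late portion
slope, intercept = np.polyfit(x[late], y[late], 1)
print(f"Logan slope (V_T estimate): {slope:.2f}  (true K1/k2 = {K1 / k2:.2f})")
```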

  7. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    NASA Astrophysics Data System (ADS)

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.

  8. Effect of salt solutions applied during wheat conditioning on lipase activity and lipid stability of whole wheat flour.

    PubMed

    Doblado-Maldonado, Andrés F; Arndt, Elizabeth A; Rose, Devin J

    2013-09-01

    Lipolytic activity in whole wheat flour (WWF) is largely responsible for the loss in baking quality during storage. Metal ions affect the activity of seed lipases; however, no previous studies have applied this information to WWF in a way that reduces lipase activity, is practical for commercial manufacture, and uses common food ingredients. NaCl, KCl, Ca-propionate, or FeNa-ethylenediaminetetraacetic acid (FeNa-EDTA) were applied to hard red winter (HRW) and hard white spring (HWS) wheats during conditioning as aqueous solutions at concentrations that would be acceptable in baked goods. Salts affected lipase activity to different degrees depending on the type of wheat used. Inhibition was greater in HRW compared with HWS WWF, probably due to higher lipase activity in HRW wheat. In HRW WWF, 1% NaCl (flour weight) reduced hydrolytic and oxidative rancidity and resulted in higher loaf volume and lower firmness than untreated WWF after 24 weeks of storage.

  9. Determination of the acute toxicities of physicochemical pretreatment and advanced oxidation processes applied to dairy effluents on activated sludge.

    PubMed

    Sivrioğlu, Özge; Yonar, Taner

    2015-04-01

    In this study, the acute toxicities of raw, physicochemically pretreated, ozonated, and Fenton-reagent-treated samples of dairy wastewater toward activated sludge microorganisms, evaluated using the International Organization for Standardization's respiration inhibition test (ISO 8192), are presented. Five-day biological oxygen demand (BOD5) was measured to determine the biodegradability of dairy wastewater after physicochemical treatment, ozonation, Fenton oxidation, or no treatment (raw samples). Chemical pretreatment positively affected biodegradability, and the inhibition exhibited by activated sludge was removed to a considerable degree. Ozonation and the Fenton process exhibited good chemical oxygen demand removal (61%) and removal of toxins. Low sludge production was observed for the Fenton process applied to dairy effluents. We did not determine an inhibitory effect of the Fenton process on the activated sludge mixture. The pollutant-removal efficiencies of the applied processes and their associated operating costs were determined.

  10. Limit Cycle Analysis Applied to the Oscillations of Decelerating Blunt-Body Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Schoenenberger, Mark; Queen, Eric M.

    2008-01-01

    Many blunt-body entry vehicles have nonlinear dynamic stability characteristics that produce self-limiting oscillations in flight. Several different test techniques can be used to extract dynamic aerodynamic coefficients to predict this oscillatory behavior for planetary entry mission design and analysis. Most of these test techniques impose boundary conditions that alter the oscillatory behavior from that seen in flight. Three sets of test conditions, representing three commonly used test techniques, are presented to highlight these effects. Analytical solutions to the constant-coefficient planar equations-of-motion for each case are developed to show how the same blunt body behaves differently depending on the imposed test conditions. The energy equation is applied to further illustrate the governing dynamics. Then, the mean value theorem is applied to the energy rate equation to find the effective damping for an example blunt body with nonlinear, self-limiting dynamic characteristics. This approach is used to predict constant-energy oscillatory behavior and the equilibrium oscillation amplitudes for the various test conditions. These predictions are verified with planar simulations. The analysis presented provides an overview of dynamic stability test techniques and illustrates the effects of dynamic stability, static aerodynamics and test conditions on observed dynamic motions. It is proposed that these effects may be leveraged to develop new test techniques and refine test matrices in future tests to better define the nonlinear functional forms of blunt body dynamic stability curves.

  11. Lévy scaling: the diffusion entropy analysis applied to DNA sequences.

    PubMed

    Scafetta, Nicola; Latora, Vito; Grigolini, Paolo

    2002-09-01

    We address the problem of the statistical analysis of a time series generated by complex dynamics with the diffusion entropy analysis (DEA) [N. Scafetta, P. Hamilton, and P. Grigolini, Fractals 9, 193 (2001)]. This method is based on the evaluation of the Shannon entropy of the diffusion process generated by the time series imagined as a physical source of fluctuations, rather than on the measurement of the variance of this diffusion process, as done with the traditional methods. We compare the DEA to the traditional methods of scaling detection and prove that the DEA is the only method that always yields the correct scaling value, if the scaling condition applies. Furthermore, DEA detects the real scaling of a time series without requiring any form of detrending. We show that the joint use of DEA and the variance method allows one to assess whether a time series is characterized by Lévy or Gauss statistics. We apply the DEA to the study of DNA sequences and prove that their large-time scales are characterized by Lévy statistics, regardless of whether they are coding or noncoding sequences. We show that the DEA is a reliable technique and, at the same time, we use it to confirm the validity of the dynamic approach to the DNA sequences, proposed in earlier work. PMID:12366151

  12. Lévy scaling: The diffusion entropy analysis applied to DNA sequences

    NASA Astrophysics Data System (ADS)

    Scafetta, Nicola; Latora, Vito; Grigolini, Paolo

    2002-09-01

    We address the problem of the statistical analysis of a time series generated by complex dynamics with the diffusion entropy analysis (DEA) [N. Scafetta, P. Hamilton, and P. Grigolini, Fractals 9, 193 (2001)]. This method is based on the evaluation of the Shannon entropy of the diffusion process generated by the time series imagined as a physical source of fluctuations, rather than on the measurement of the variance of this diffusion process, as done with the traditional methods. We compare the DEA to the traditional methods of scaling detection and prove that the DEA is the only method that always yields the correct scaling value, if the scaling condition applies. Furthermore, DEA detects the real scaling of a time series without requiring any form of detrending. We show that the joint use of DEA and the variance method allows one to assess whether a time series is characterized by Lévy or Gauss statistics. We apply the DEA to the study of DNA sequences and prove that their large-time scales are characterized by Lévy statistics, regardless of whether they are coding or noncoding sequences. We show that the DEA is a reliable technique and, at the same time, we use it to confirm the validity of the dynamic approach to the DNA sequences, proposed in earlier work.
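
    A minimal sketch of the DEA procedure described in the two records above, applied to uncorrelated noise where the expected scaling exponent is about 0.5; the window sizes, histogram binning and entropy estimator are implementation assumptions.

```python
# Sketch of diffusion entropy analysis (DEA): the series xi is treated as a
# source of fluctuations, displacements over windows of length t are collected,
# the Shannon entropy S(t) of their histogram is computed, and the scaling
# exponent delta is read from the slope of S(t) versus ln(t).
import numpy as np

rng = np.random.default_rng(2)
xi = rng.standard_normal(20000)              # uncorrelated noise -> expect delta ~ 0.5
walk = np.cumsum(xi)

def shannon_entropy(displacements, bins=60):
    hist, edges = np.histogram(displacements, bins=bins, density=True)
    width = edges[1] - edges[0]
    p = hist[hist > 0]
    return -np.sum(p * np.log(p)) * width    # differential entropy estimate

window_sizes = np.unique(np.logspace(0.5, 3, 20).astype(int))
entropies = []
for t in window_sizes:
    disp = walk[t:] - walk[:-t]              # overlapping-window displacements
    entropies.append(shannon_entropy(disp))

delta, _ = np.polyfit(np.log(window_sizes), entropies, 1)
print(f"estimated scaling exponent delta = {delta:.3f}")
```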

  13. Clinical usefulness of the clock drawing test applying rasch analysis in predicting of cognitive impairment.

    PubMed

    Yoo, Doo Han; Lee, Jae Shin

    2016-07-01

    [Purpose] This study examined the clinical usefulness of the clock drawing test applying Rasch analysis for predicting the level of cognitive impairment. [Subjects and Methods] A total of 187 stroke patients with cognitive impairment were enrolled in this study. The 187 patients were evaluated by the clock drawing test developed through Rasch analysis along with the mini-mental state examination as a cognitive evaluation tool. An analysis of variance was performed to examine the significance of the mini-mental state examination and the clock drawing test according to the general characteristics of the subjects. Receiver operating characteristic analysis was performed to determine the cutoff point for cognitive impairment and to calculate the sensitivity and specificity values. [Results] Comparison of the clock drawing test with the mini-mental state examination showed significant differences according to gender, age, education, and affected side. A total CDT score of 10.5, selected as the cutoff point to identify cognitive impairment, showed sensitivity, specificity, Youden index, positive predictive, and negative predictive values of 86.4%, 91.5%, 0.8, 95%, and 88.2%, respectively. [Conclusion] The clock drawing test is believed to be useful in assessments and interventions based on its excellent ability to identify cognitive disorders.
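
    The cutoff, sensitivity, specificity and Youden index reported above follow directly from a standard receiver operating characteristic analysis. A hedged sketch with scikit-learn on synthetic scores (the score distributions below are invented for illustration and do not reproduce the study data):

```python
import numpy as np
from sklearn.metrics import roc_curve

# Synthetic CDT-like scores: lower scores in the impaired group (illustrative only).
rng = np.random.default_rng(1)
impaired = rng.normal(8, 2, 100)               # hypothetical impaired-group scores
intact = rng.normal(13, 2, 87)                 # hypothetical intact-group scores
scores = np.concatenate([impaired, intact])
labels = np.concatenate([np.ones(100), np.zeros(87)])   # 1 = cognitively impaired

# Low scores indicate impairment, so feed the negated score as the decision variable.
fpr, tpr, thresholds = roc_curve(labels, -scores)
youden = tpr - fpr                              # Youden index J = sensitivity + specificity - 1
best = np.argmax(youden)
cutoff = -thresholds[best]
print(f"cutoff = {cutoff:.1f}, sensitivity = {tpr[best]:.2f}, "
      f"specificity = {1 - fpr[best]:.2f}, Youden J = {youden[best]:.2f}")
```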

  14. Testing and Analysis Validation of a Metallic Repair Applied to a PRSEUS Tension Panel

    NASA Technical Reports Server (NTRS)

    Przekop, Adam; Jegley, Dawn C.

    2013-01-01

    A design and analysis of a repair concept applicable to a stiffened composite panel based on the Pultruded Rod Stitched Efficient Unitized Structure was recently completed. The damage scenario considered was a midbay-to-midbay saw-cut with a severed stiffener, flange and skin. Advanced modeling techniques such as mesh-independent definition of compliant fasteners and elastic-plastic material properties for metal parts were utilized in the finite element analysis supporting the design effort. A bolted metallic repair was selected so that it could be easily applied in the operational environment. The present work describes results obtained from a tension panel test conducted to validate both the repair concept and finite element analysis techniques used in the design effort. The test proved that the proposed repair concept is capable of sustaining load levels that are higher than those resulting from the current working stress allowables. This conclusion enables upward revision of the stress allowables that had been kept at an overly-conservative level due to concerns associated with repairability of the panels. Correlation of test data with finite element analysis results is also presented and assessed.

  15. Symmetry analysis for nonlinear time reversal methods applied to nonlinear acoustic imaging

    NASA Astrophysics Data System (ADS)

    Dos Santos, Serge; Chaline, Jennifer

    2015-10-01

    Using symmetry invariance, nonlinear Time Reversal (TR) and reciprocity properties, the classical NEWS methods are supplemented and improved by new excitations having the intrinsic property of enlarging frequency analysis bandwidth and time domain scales, with now both medical acoustics and electromagnetic applications. The analysis of invariant quantities is a well-known tool which is often used in nonlinear acoustics in order to simplify complex equations. Based on a fundamental physical principle known as symmetry analysis, this approach consists in finding judicious variables, intrinsically scale dependant, and able to describe all stages of behaviour on the same theoretical foundation. Based on previously published results within the nonlinear acoustic areas, some practical implementation will be proposed as a new way to define TR-NEWS based methods applied to NDT and medical bubble based non-destructive imaging. This paper tends to show how symmetry analysis can help us to define new methodologies and new experimental set-up involving modern signal processing tools. Some example of practical realizations will be proposed in the context of biomedical non-destructive imaging using Ultrasound Contrast Agents (ACUs) where symmetry and invariance properties allow us to define a microscopic scale-invariant experimental set-up describing intrinsic symmetries of the microscopic complex system.

  16. Multivariate Curve Resolution Applied to Hyperspectral Imaging Analysis of Chocolate Samples.

    PubMed

    Zhang, Xin; de Juan, Anna; Tauler, Romà

    2015-08-01

    This paper shows the application of Raman and infrared hyperspectral imaging combined with multivariate curve resolution (MCR) to the analysis of the constituents of commercial chocolate samples. The combination of different spectral data pretreatment methods made it possible to decrease the strong fluorescence contribution of whey to the Raman signal in the investigated chocolate samples. Using equality constraints during MCR analysis, estimates of the pure spectra of the chocolate sample constituents were improved, as well as their relative contributions and their spatial distribution in the analyzed samples. In addition, unknown constituents could also be resolved. White chocolate constituents resolved from the Raman hyperspectral images indicate that, at the macro scale, the sucrose, lactose, fat, and whey constituents were intermixed in particles. Infrared hyperspectral imaging did not suffer from fluorescence and could be applied to both white and milk chocolate. As a conclusion of this study, micro-hyperspectral imaging coupled to the MCR method is confirmed to be an appropriate tool for the direct analysis of the constituents of chocolate samples and, by extension, it is proposed for the analysis of other mixture constituents in commercial food samples.
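
    MCR resolves the unfolded image matrix D (pixels x spectral channels) into concentration maps C and pure spectra S with D approximately equal to C S, usually under non-negativity plus the equality constraints mentioned above. A bare-bones alternating least squares loop, not the full MCR-ALS implementation used in the paper, might look like this (the random initialisation and component count are placeholders):

```python
import numpy as np

def mcr_als(D, n_components, n_iter=200, seed=0):
    """Minimal non-negative alternating least squares: D (pixels x channels) ~= C @ S."""
    rng = np.random.default_rng(seed)
    S = rng.random((n_components, D.shape[1]))          # initial guess for the pure spectra
    for _ in range(n_iter):
        C = np.clip(D @ np.linalg.pinv(S), 0, None)     # concentrations given spectra
        S = np.clip(np.linalg.pinv(C) @ D, 0, None)     # spectra given concentrations
    return C, S

# Usage: unfold a hyperspectral cube (rows x cols x channels) into a pixel matrix first, e.g.
# D = cube.reshape(-1, cube.shape[-1]); C, S = mcr_als(D, n_components=4)
# then refold each column of C into an image to obtain the spatial distribution maps.
```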

  17. Clinical usefulness of the clock drawing test applying rasch analysis in predicting of cognitive impairment

    PubMed Central

    Yoo, Doo Han; Lee, Jae Shin

    2016-01-01

    [Purpose] This study examined the clinical usefulness of the clock drawing test applying Rasch analysis for predicting the level of cognitive impairment. [Subjects and Methods] A total of 187 stroke patients with cognitive impairment were enrolled in this study. The 187 patients were evaluated by the clock drawing test developed through Rasch analysis along with the mini-mental state examination as a cognitive evaluation tool. An analysis of variance was performed to examine the significance of the mini-mental state examination and the clock drawing test according to the general characteristics of the subjects. Receiver operating characteristic analysis was performed to determine the cutoff point for cognitive impairment and to calculate the sensitivity and specificity values. [Results] Comparison of the clock drawing test with the mini-mental state examination showed significant differences according to gender, age, education, and affected side. A total CDT score of 10.5, selected as the cutoff point to identify cognitive impairment, showed sensitivity, specificity, Youden index, positive predictive, and negative predictive values of 86.4%, 91.5%, 0.8, 95%, and 88.2%, respectively. [Conclusion] The clock drawing test is believed to be useful in assessments and interventions based on its excellent ability to identify cognitive disorders. PMID:27512283

  18. Common reduced spaces of representation applied to multispectral texture analysis in cosmetology

    NASA Astrophysics Data System (ADS)

    Corvo, Joris; Angulo, Jesus; Breugnot, Josselin; Borbes, Sylvie; Closs, Brigitte

    2016-03-01

    Principal Component Analysis (PCA) is a technique of multivariate data analysis widely used in various fields such as biology, ecology or economics to reduce data dimensionality while retaining the most important information. It is becoming a standard practice in multispectral/hyperspectral imaging since those multivariate data generally suffer from a high level of redundancy. Nevertheless, by definition, PCA is meant to be applied to a single multispectral/hyperspectral image at a time. When several images have to be treated, running a PCA on each image would generate image-specific reduced spaces, which is not suitable for comparison between results. Thus, we focus on two PCA-based algorithms that can define common reduced spaces of representation. The first method arises from the literature and is computed with the barycenter covariance matrix. In contrast, we designed the second algorithm with the idea of correcting standard PCA using permutations and inversions of eigenvectors. These dimensionality reduction methods are used within the context of a cosmetological study of a foundation make-up. The available data are in-vivo multispectral images of skin acquired on different volunteers in time series. The main purpose of this study is to characterize the make-up degradation, especially in terms of texture analysis. Results are validated by statistical prediction of the time elapsed since applying the product. PCA algorithms produce eigenimages that separately enhance skin components (pores, radiance, vessels...). From these eigenimages, we extract morphological texture descriptors and attempt a time prediction. The accuracy obtained with common reduced spaces outperforms that of classical PCA. In this paper, we detail how PCA is extended to the multiple-group case and explain the advantages of common reduced spaces when it comes to studying several multispectral images.
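
    The first of the two algorithms, the common space built from the barycenter covariance matrix, amounts to averaging the per-image covariance matrices, diagonalising the result, and projecting every image onto the shared eigenvectors so that the resulting eigenimages are directly comparable. A short sketch (function and variable names are ours):

```python
import numpy as np

def common_pca_basis(images):
    """images: list of arrays of shape (n_pixels, n_bands); returns the shared eigenvector basis."""
    covs = []
    for X in images:
        Xc = X - X.mean(axis=0)                    # centre each multispectral image
        covs.append(np.cov(Xc, rowvar=False))      # per-image covariance (bands x bands)
    barycenter = np.mean(covs, axis=0)             # barycenter covariance matrix
    eigval, eigvec = np.linalg.eigh(barycenter)
    return eigvec[:, np.argsort(eigval)[::-1]]     # columns sorted by decreasing variance

# Projecting every image onto the same basis yields comparable eigenimages:
# V = common_pca_basis(list_of_images)
# scores = [(X - X.mean(axis=0)) @ V for X in list_of_images]
```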

  19. Value of Information Analysis Applied to the Economic Evaluation of Interventions Aimed at Reducing Juvenile Delinquency: An Illustration

    PubMed Central

    Eeren, Hester V.; Schawo, Saskia J.; Scholte, Ron H. J.; Busschbach, Jan J. V.; Hakkaart, Leona

    2015-01-01

    Objectives To investigate whether a value of information analysis, commonly applied in health care evaluations, is feasible and meaningful in the field of crime prevention. Methods Interventions aimed at reducing juvenile delinquency are increasingly being evaluated according to their cost-effectiveness. Results of cost-effectiveness models are subject to uncertainty in their cost and effect estimates. Further research can reduce that parameter uncertainty. The value of such further research can be estimated using a value of information analysis, as illustrated in the current study. We built upon an earlier published cost-effectiveness model that demonstrated the comparison of two interventions aimed at reducing juvenile delinquency. Outcomes were presented as costs per criminal activity free year. Results At a societal willingness-to-pay of €71,700 per criminal activity free year, further research to eliminate parameter uncertainty was valued at €176 million. Therefore, in this illustrative analysis, the value of information analysis determined that society should be willing to spend a maximum of €176 million in reducing decision uncertainty in the cost-effectiveness of the two interventions. Moreover, the results suggest that reducing uncertainty in some specific model parameters might be more valuable than in others. Conclusions Using a value of information framework to assess the value of conducting further research in the field of crime prevention proved to be feasible. The results were meaningful and can be interpreted according to health care evaluation studies. This analysis can be helpful in justifying additional research funds to further inform the reimbursement decision in regard to interventions for juvenile delinquents. PMID:26146831
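
    The 176 million euro figure is an expected value of perfect information: the difference between the expected net benefit of choosing with perfect knowledge of the parameters and that of the best choice under current uncertainty, scaled up to the affected population. A hedged Monte Carlo sketch (the cost and effect distributions below are placeholders, not the published model; only the 71,700 euro willingness-to-pay is taken from the abstract):

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim = 10_000
wtp = 71_700    # societal willingness-to-pay per criminal-activity-free year (from the abstract)

# Placeholder posterior draws of incremental effects (criminal-activity-free years) and costs.
effect = rng.normal(loc=[0.8, 1.0], scale=[0.3, 0.4], size=(n_sim, 2))
cost = rng.normal(loc=[20_000, 35_000], scale=[5_000, 8_000], size=(n_sim, 2))
net_benefit = wtp * effect - cost                   # net monetary benefit per draw and option

ev_perfect_info = net_benefit.max(axis=1).mean()    # pick the best option in every draw
ev_current_info = net_benefit.mean(axis=0).max()    # pick the option that is best on average
evpi_per_decision = ev_perfect_info - ev_current_info
print(f"EVPI per decision: {evpi_per_decision:,.0f} euro")
# Multiplying by the size of the affected population gives the population EVPI that is
# compared with the cost of further research.
```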

  20. NextGen Brain Microdialysis: Applying Modern Metabolomics Technology to the Analysis of Extracellular Fluid in the Central Nervous System

    PubMed Central

    Kao, Chi-Ya; Anderzhanova, Elmira; Asara, John M.; Wotjak, Carsten T.; Turck, Christoph W.

    2015-01-01

    Microdialysis is a powerful method for in vivo neurochemical analyses. It allows fluid sampling in a dynamic manner in specific brain regions over an extended period of time. A particular focus has been the neurochemical analysis of extracellular fluids to explore central nervous system functions. Brain microdialysis recovers neurotransmitters, low-molecular-weight neuromodulators and neuropeptides of special interest when studying behavior and drug effects. Other small molecules, such as central metabolites, are typically not assessed despite their potential to yield important information related to brain metabolism and activity in selected brain regions. We have implemented a liquid chromatography online mass spectrometry metabolomics platform for an expanded analysis of mouse brain microdialysates. The method is sensitive and delivers information for a far greater number of analytes than commonly used electrochemical and fluorescent detection or biochemical assays. The metabolomics platform was applied to the analysis of microdialysates in a foot shock-induced mouse model of posttraumatic stress disorder (PTSD). The rich metabolite data information was then used to delineate affected prefrontal molecular pathways that reflect individual susceptibility for developing PTSD-like symptoms. We demonstrate that hypothesis-free metabolomics can be adapted to the analysis of microdialysates for the discovery of small molecules with functional significance. PMID:27602357

  1. NextGen Brain Microdialysis: Applying Modern Metabolomics Technology to the Analysis of Extracellular Fluid in the Central Nervous System

    PubMed Central

    Kao, Chi-Ya; Anderzhanova, Elmira; Asara, John M.; Wotjak, Carsten T.; Turck, Christoph W.

    2015-01-01

    Microdialysis is a powerful method for in vivo neurochemical analyses. It allows fluid sampling in a dynamic manner in specific brain regions over an extended period of time. A particular focus has been the neurochemical analysis of extracellular fluids to explore central nervous system functions. Brain microdialysis recovers neurotransmitters, low-molecular-weight neuromodulators and neuropeptides of special interest when studying behavior and drug effects. Other small molecules, such as central metabolites, are typically not assessed despite their potential to yield important information related to brain metabolism and activity in selected brain regions. We have implemented a liquid chromatography online mass spectrometry metabolomics platform for an expanded analysis of mouse brain microdialysates. The method is sensitive and delivers information for a far greater number of analytes than commonly used electrochemical and fluorescent detection or biochemical assays. The metabolomics platform was applied to the analysis of microdialysates in a foot shock-induced mouse model of posttraumatic stress disorder (PTSD). The rich metabolite data information was then used to delineate affected prefrontal molecular pathways that reflect individual susceptibility for developing PTSD-like symptoms. We demonstrate that hypothesis-free metabolomics can be adapted to the analysis of microdialysates for the discovery of small molecules with functional significance.

  2. DC resistivity tomography applied to monitoring active layer environments below patterned ground in Svalbard

    NASA Astrophysics Data System (ADS)

    Watanabe, Tatsuya; Juliussen, Håvard; Matsuoka, Norikazu; Christiansen, Hanne H.

    2010-05-01

    Patterned ground is one of the most characteristic features of arctic periglacial landscapes, originating from various periglacial processes. On flat tundra surfaces composed of fine-grained soils, ice-wedge polygons are dominant, but mud boils and hummocks also develop. Their distribution is constrained by local ground material, hydrology, snow cover, vegetation and freeze/thaw regimes. Although there have been a large number of studies on patterned ground phenomena, the environmental factors distinguishing the types of patterned ground are not well understood. We applied DC resistivity tomography to understand the hydrological characteristics and freeze/thaw dynamics at adjoining ice-wedge and mud-boil sites in Adventdalen, Svalbard, where comprehensive periglacial process monitoring has been undertaken. Electrode arrays consisting of 81 nails spaced at 20 cm intervals were fixed at each site early in June 2009, immediately after the snow cover disappeared. The nails were inserted within the top 5 cm to resolve the uppermost layer of the ground. Measurements were repeated at approximately two-week intervals. Spring results from both sites are characterized by an increase in resistivity near the surface due to drying. This tendency is prominent in the ice-wedge polygon centre, where standing water remains until late spring. Time-lapse analyses indicate a distinct decrease in resistivity in the seasonally frozen layer at both sites, probably due to an increase in unfrozen water content caused by downward heat transfer. Summer profiles from both sites display a distinct resistivity boundary propagating downward with time, corresponding well with the thaw depth measured by mechanical probing. These data also show near-surface high-resistivity spots indicating the location of desiccation cracks. Profiles from the mud-boil site show higher resistivity in the thaw layer than those of the ice-wedge site, implying different drainage conditions between them. After seasonal freezing

  3. Human hair neutron activation analysis: Analysis on population level, mapping

    NASA Astrophysics Data System (ADS)

    Zhuk, L. I.; Kist, A. A.

    1999-01-01

    Neutron activation analysis is an outstanding analytical method with very wide applications in various fields. The analysis of human hair over the last decades, based mostly on neutron activation analysis, is an attractive illustration of the application of nuclear analytical techniques. An interesting question is how the elemental composition differs between areas or cities. In this connection, the present paper gives average data and maps for various localities in the vicinity of the drying Aral Sea and for various industrial cities in Central Asia.

  4. Accident analysis of large-scale technological disasters applied to an anaesthetic complication.

    PubMed

    Eagle, C J; Davies, J M; Reason, J

    1992-02-01

    The occurrence of serious accidents in complex industrial systems such as at Three Mile Island and Bhopal has prompted development of new models of causation and investigation of disasters. These analytical models have potential relevance in anaesthesia. We therefore applied one of the previously described systems to the investigation of an anaesthetic accident. The model chosen describes two kinds of failures, both of which must be sought. The first group, active failures, consists of mistakes made by practitioners in the provision of care. The second group, latent failures, represents flaws in the administrative and productive system. The model emphasizes the search for latent failures and shows that prevention of active failures alone is insufficient to avoid further accidents if latent failures persist unchanged. These key features and the utility of this model are illustrated by application to a case of aspiration of gastric contents. While four active failures were recognized, an equal number of latent failures also became apparent. The identification of both types of failures permitted the formulation of recommendations to avoid further occurrences. Thus this model of accident causation can provide a useful mechanism to investigate and possibly prevent anaesthetic accidents. PMID:1544192

  5. Characteristic analysis of the lower limb muscular strength training system applied with MR dampers.

    PubMed

    Yu, Chang Ho; Piao, Young Jun; Kim, Kyung; Kwon, Tae Kyu

    2014-01-01

    A new training system that can adjust training intensity and indicate the center of pressure of a subject was proposed by applying a controlled electric current to a magneto-rheological (MR) damper. Experimental studies of muscular activity were performed in the lower extremities during maintaining and moving exercises carried out on an unstable platform with MR dampers and recorded on a monitor. The electromyography (EMG) signals of eight lower-extremity muscles were recorded and analyzed in the time and frequency domains. The muscles studied were the rectus femoris (RF), biceps femoris (BF), tensor fasciae latae (TFL), vastus lateralis (VL), vastus medialis (VM), gastrocnemius (Ga), tibialis anterior (TA), and soleus (So). Differences in muscular activity during four moving exercises were observed in the experimental results. The rate of increase of muscular activity was affected by the condition of the unstable platform with MR dampers, suggesting that different moving exercises could selectively train each muscle with varying intensities. Furthermore, these findings suggest that this training system can improve postural balance.
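
    Typical time- and frequency-domain EMG summaries of the kind used here are the RMS amplitude (a proxy for activation level) and the median frequency of the power spectrum. A generic sketch with SciPy (the sampling rate and band limits are assumptions, not values from the study):

```python
import numpy as np
from scipy.signal import welch

def emg_features(emg, fs=1000, band=(20, 450)):
    """Return RMS amplitude and median frequency of a single EMG channel."""
    emg = np.asarray(emg, dtype=float)
    rms = np.sqrt(np.mean(emg ** 2))                 # time-domain activation level
    f, pxx = welch(emg, fs=fs, nperseg=1024)         # power spectral density
    mask = (f >= band[0]) & (f <= band[1])           # keep the physiological EMG band
    cum = np.cumsum(pxx[mask])
    median_freq = f[mask][np.searchsorted(cum, cum[-1] / 2)]
    return rms, median_freq

# Example with a synthetic signal standing in for one recorded channel (e.g., RF or TA):
rng = np.random.default_rng(0)
rms, mdf = emg_features(rng.normal(size=5000), fs=1000)
print(f"RMS = {rms:.2f}, median frequency = {mdf:.0f} Hz")
```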

  6. The Challenge of Finding Faculty Time for Applied Research Activities in Ontario Colleges

    ERIC Educational Resources Information Center

    Rosenkrantz, Otte

    2013-01-01

    The purpose of this study was to explore how the role of Ontario college faculty has evolved since the advent of the Post-Secondary Education Choice and Excellence Act of 2000 and the Colleges of Applied Arts and Technology Act of 2002 in terms of whether or not the decision to create a research culture at the colleges included making time…

  7. Restricted Modal Analysis Applied to Internal Annular Combustor Autospectra and Cross-Spectra Measurements

    NASA Technical Reports Server (NTRS)

    Miles, Jeffrey Hilton

    2007-01-01

    A treatment of the modal decomposition of the pressure field in a combustor as determined by two pressure time history measurements is developed herein. It is applied to a Pratt and Whitney PW4098 engine combustor over a range of operating conditions. For modes other than the plane wave the assumption is made that there are distinct frequency bands in which the individual modes, including the plane wave mode, overlap such that if circumferential mode m and circumferential mode m-1 are present then circumferential mode m-2 is not. In the analysis used herein at frequencies above the first cutoff mode frequency, only pairs of circumferential modes are individually present at each frequency. Consequently, this is a restricted modal analysis. As part of the analysis one specifies mode cut-on frequencies. This creates a set of frequencies that each mode spans. One finding was the successful use of the same modal span frequencies over a range of operating conditions for this particular engine. This suggests that for this case the cut-on frequencies are in proximity at each operating condition. Consequently, the combustion noise spectrum related to the circumferential modes might not change much with operating condition.
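
    The decomposition starts from the auto-spectra of the two pressure time histories and their complex cross-spectrum, whose magnitude and phase carry the circumferential mode content. Estimating those quantities is standard; a sketch with SciPy follows (the sampling rate, segment length and synthetic tone are assumptions for illustration only):

```python
import numpy as np
from scipy.signal import welch, csd

def spectra(p1, p2, fs, nperseg=4096):
    f, g11 = welch(p1, fs=fs, nperseg=nperseg)       # auto-spectrum, sensor 1
    _, g22 = welch(p2, fs=fs, nperseg=nperseg)       # auto-spectrum, sensor 2
    _, g12 = csd(p1, p2, fs=fs, nperseg=nperseg)     # complex cross-spectrum
    coherence = np.abs(g12) ** 2 / (g11 * g22)       # ordinary coherence between the sensors
    phase = np.angle(g12)                            # phase difference used in the mode split
    return f, g11, g22, coherence, phase

# Two synthetic signals sharing a 1.2 kHz tone with a fixed phase offset plus noise:
fs = 50_000
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)
p1 = np.sin(2 * np.pi * 1200 * t) + 0.5 * rng.normal(size=t.size)
p2 = np.sin(2 * np.pi * 1200 * t + 0.8) + 0.5 * rng.normal(size=t.size)
f, g11, g22, coh, phase = spectra(p1, p2, fs)
```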

  8. Multilayers quantitative X-ray fluorescence analysis applied to easel paintings.

    PubMed

    de Viguerie, Laurence; Sole, V Armando; Walter, Philippe

    2009-12-01

    X-ray fluorescence spectrometry (XRF) allows a rapid and simple determination of the elemental composition of a material. As a non-destructive tool, it has been extensively used for analysis in art and archaeology since the early 1970s. Whereas it is commonly used for qualitative analysis, recent efforts have been made to develop quantitative treatment even with portable systems. However, the interpretation of the results obtained with this technique can turn out to be problematic in the case of layered structures such as easel paintings. The use of differential X-ray attenuation enables modelling of the various layers: indeed, the absorption of X-rays through the different layers results in a modification of the intensity ratios between the different characteristic lines. This work focuses on the possibility of using XRF with the fundamental parameters method to reconstruct the composition and thickness of the layers. This method was tested on several multilayer standards and gives a maximum error of 15% for thicknesses and errors of 10% for concentrations. On a painting test sample that was rather inhomogeneous, the XRF analysis provides an average value. This method was applied in situ to estimate the thickness of the layers of a painting by Marco d'Oggiono, a pupil of Leonardo da Vinci.
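
    The differential-attenuation idea can be reduced to a Beer-Lambert inversion: because two characteristic lines of an element in a buried layer are attenuated differently by the covering layer, the change in their intensity ratio encodes the cover thickness. A hedged sketch (the attenuation coefficients, density and ratios below are invented for illustration, not values from the paper):

```python
import numpy as np

def overlayer_thickness(ratio_measured, ratio_unattenuated, mu1, mu2, rho):
    """
    Thickness d [cm] of a covering layer from the measured intensity ratio of two
    characteristic lines of an element in the layer beneath it.
    Beer-Lambert: I1/I2 = (I1/I2)_0 * exp(-(mu1 - mu2) * rho * d), solved for d.
    mu1, mu2: mass attenuation coefficients of the cover layer at the two line energies [cm^2/g];
    rho: cover layer density [g/cm^3] (all inputs come from tabulated fundamental parameters).
    """
    return np.log(ratio_unattenuated / ratio_measured) / ((mu1 - mu2) * rho)

# Illustrative numbers only:
d = overlayer_thickness(ratio_measured=0.18, ratio_unattenuated=0.22, mu1=45.0, mu2=30.0, rho=3.0)
print(f"estimated cover-layer thickness: {d * 1e4:.0f} micrometres")
```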

  9. On the Formal-Logical Analysis of the Foundations of Mathematics Applied to Problems in Physics

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2016-03-01

    An analysis of the foundations of mathematics applied to problems in physics is proposed. The unity of formal logic and rational dialectics is the methodological basis of the analysis. It is shown that a critical analysis of the concept of mathematical quantity - the central concept of mathematics - leads to the following conclusions: (1) The concept of ``mathematical quantity'' is the result of the following mental operations: (a) abstraction of the ``quantitative determinacy of a physical quantity'' from the ``physical quantity,'' whereby the ``quantitative determinacy of the physical quantity'' becomes an independent object of thought; (b) abstraction of the ``amount (i.e., abstract number)'' from the ``quantitative determinacy of the physical quantity,'' whereby the ``amount (i.e., abstract number)'' becomes an independent object of thought. In this case, unnamed, abstract numbers are the only sign of the ``mathematical quantity''. This sign is not an essential sign of material objects. (2) The concept of mathematical quantity is a meaningless, erroneous, and inadmissible concept in science because it represents the following formal-logical and dialectical-materialistic error: negation of the existence of the essential sign of the concept (i.e., negation of the existence of the essence of the concept) and negation of the existence of a measure of the material object.

  10. Nonparametric functional data estimation applied to ozone data: prediction and extreme value analysis.

    PubMed

    Quintela-del-Río, Alejandro; Francisco-Fernández, Mario

    2011-02-01

    The study of extreme values and prediction of ozone data is an important topic of research when dealing with environmental problems. Classical extreme value theory is usually used in air-pollution studies. It consists in fitting a parametric generalised extreme value (GEV) distribution to a data set of extreme values, and using the estimated distribution to compute return levels and other quantities of interest. Here, we propose to estimate these values using nonparametric functional data methods. Functional data analysis is a relatively new statistical methodology that generally deals with data consisting of curves or multi-dimensional variables. In this paper, we use this technique, jointly with nonparametric curve estimation, to provide alternatives to the usual parametric statistical tools. The nonparametric estimators are applied to real samples of maximum ozone values obtained from several monitoring stations belonging to the Automatic Urban and Rural Network (AURN) in the UK. The results show that nonparametric estimators work satisfactorily, outperforming the behaviour of classical parametric estimators. Functional data analysis is also used to predict stratospheric ozone concentrations. We show an application, using the data set of mean monthly ozone concentrations in Arosa, Switzerland, and the results are compared with those obtained by classical time series (ARIMA) analysis. PMID:21144549
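
    For comparison, the classical parametric route the paper benchmarks against — fit a generalised extreme value distribution to the block maxima and read off a return level — takes only a few lines with SciPy; the nonparametric functional estimators proposed in the paper replace this parametric fit. Synthetic maxima are used below in place of the AURN station data:

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual ozone maxima standing in for a monitoring-station record.
maxima = genextreme.rvs(c=-0.1, loc=80, scale=10, size=40, random_state=3)

# Classical parametric approach: fit a GEV and compute a T-year return level.
shape, loc, scale = genextreme.fit(maxima)
T = 50
return_level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)

# A crude nonparametric counterpart (the paper develops kernel-based functional estimators).
nonparametric_level = np.quantile(maxima, 1 - 1 / T)
print(f"GEV {T}-year return level: {return_level:.1f}; empirical quantile: {nonparametric_level:.1f}")
```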

  11. 45 CFR 660.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 3 2012-10-01 2012-10-01 false What procedures apply to the selection of programs and activities under these regulations? 660.6 Section 660.6 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL...

  12. 45 CFR 660.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 3 2013-10-01 2013-10-01 false What procedures apply to the selection of programs and activities under these regulations? 660.6 Section 660.6 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL...

  13. 45 CFR 660.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 3 2011-10-01 2011-10-01 false What procedures apply to the selection of programs and activities under these regulations? 660.6 Section 660.6 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL...

  14. 45 CFR 660.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 3 2014-10-01 2014-10-01 false What procedures apply to the selection of programs and activities under these regulations? 660.6 Section 660.6 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL...

  15. 30 CFR 285.428 - What effect does applying for a renewal have on my activities and payments?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false What effect does applying for a renewal have on my activities and payments? 285.428 Section 285.428 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, REGULATION, AND ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY ALTERNATE...

  16. 30 CFR 585.428 - What effect does applying for a renewal have on my activities and payments?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 2 2012-07-01 2012-07-01 false What effect does applying for a renewal have on my activities and payments? 585.428 Section 585.428 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY AND ALTERNATE USES OF EXISTING FACILITIES...

  17. 30 CFR 585.428 - What effect does applying for a renewal have on my activities and payments?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 2 2013-07-01 2013-07-01 false What effect does applying for a renewal have on my activities and payments? 585.428 Section 585.428 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY AND ALTERNATE USES OF EXISTING FACILITIES...

  18. A Proposal of "Applied Social Activities" Module for Undergraduate Program of Turkish Language and Literature Teachers: A Qualitative Study

    ERIC Educational Resources Information Center

    Sarac, Cemal

    2011-01-01

    In this study, it was aimed to determine the views on the contributions provided for the prospective teachers with the course of Applied Social Activities suggested to be in the undergraduate program of the Turkish Language and Literature Teaching. A qualitative research method was used in order to evaluate the views of prospective teachers about…

  19. 24 CFR 1000.40 - Do lead-based paint poisoning prevention requirements apply to affordable housing activities...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Do lead-based paint poisoning... AMERICAN HOUSING ACTIVITIES General § 1000.40 Do lead-based paint poisoning prevention requirements apply..., subparts A, B, H, J, K, M and R of this title, which implement the Lead-Based Paint Poisoning...

  20. 24 CFR 1000.40 - Do lead-based paint poisoning prevention requirements apply to affordable housing activities...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Do lead-based paint poisoning... AMERICAN HOUSING ACTIVITIES General § 1000.40 Do lead-based paint poisoning prevention requirements apply..., subparts A, B, H, J, K, M and R of this title, which implement the Lead-Based Paint Poisoning...

  1. 24 CFR 1000.40 - Do lead-based paint poisoning prevention requirements apply to affordable housing activities...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 4 2012-04-01 2012-04-01 false Do lead-based paint poisoning... AMERICAN HOUSING ACTIVITIES General § 1000.40 Do lead-based paint poisoning prevention requirements apply..., subparts A, B, H, J, K, M and R of this title, which implement the Lead-Based Paint Poisoning...

  2. 24 CFR 1000.40 - Do lead-based paint poisoning prevention requirements apply to affordable housing activities...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 4 2013-04-01 2013-04-01 false Do lead-based paint poisoning... AMERICAN HOUSING ACTIVITIES General § 1000.40 Do lead-based paint poisoning prevention requirements apply..., subparts A, B, H, J, K, M and R of this title, which implement the Lead-Based Paint Poisoning...

  3. 24 CFR 1000.40 - Do lead-based paint poisoning prevention requirements apply to affordable housing activities...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 4 2014-04-01 2014-04-01 false Do lead-based paint poisoning... AMERICAN HOUSING ACTIVITIES General § 1000.40 Do lead-based paint poisoning prevention requirements apply..., subparts A, B, H, J, K, M and R of this title, which implement the Lead-Based Paint Poisoning...

  4. 45 CFR 660.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false What procedures apply to the selection of programs and activities under these regulations? 660.6 Section 660.6 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL...

  5. Soil microbial activity is affected by Roundup WeatherMax and pesticides applied to cotton (Gossypium hirsutum)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Adoption of glyphosate-based weed control systems has led to increased use of the herbicide with continued use of additional pesticides. Combinations of pesticides may affect soil microbial activity differently than pesticides applied alone. Research was conducted to evaluate the influence of glypho...

  6. 44 CFR 4.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false What procedures apply to the selection of programs and activities under these regulations? 4.6 Section 4.6 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL...

  7. 44 CFR 4.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false What procedures apply to the selection of programs and activities under these regulations? 4.6 Section 4.6 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL...

  8. 44 CFR 4.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 44 Emergency Management and Assistance 1 2012-10-01 2011-10-01 true What procedures apply to the selection of programs and activities under these regulations? 4.6 Section 4.6 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL...

  9. 44 CFR 4.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false What procedures apply to the selection of programs and activities under these regulations? 4.6 Section 4.6 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL...

  10. 20 CFR 670.965 - What procedures apply to disclosure of information about Job Corps students and program activities?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... to requests for information or records or other necessary disclosures pertaining to students. (b) DOL... Information Act or 29 CFR part 70. (d) The regulations at 29 CFR part 71 apply to a system of records covered... information about Job Corps students and program activities? 670.965 Section 670.965 Employees'...

  11. 40 CFR 29.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 1 2012-07-01 2012-07-01 false What procedures apply to the selection of programs and activities under these regulations? 29.6 Section 29.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL INTERGOVERNMENTAL REVIEW OF ENVIRONMENTAL PROTECTION AGENCY PROGRAMS...

  12. 40 CFR 29.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 1 2011-07-01 2011-07-01 false What procedures apply to the selection of programs and activities under these regulations? 29.6 Section 29.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL INTERGOVERNMENTAL REVIEW OF ENVIRONMENTAL PROTECTION AGENCY PROGRAMS...

  13. 40 CFR 29.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false What procedures apply to the selection of programs and activities under these regulations? 29.6 Section 29.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL INTERGOVERNMENTAL REVIEW OF ENVIRONMENTAL PROTECTION AGENCY PROGRAMS...

  14. 40 CFR 29.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 1 2013-07-01 2013-07-01 false What procedures apply to the selection of programs and activities under these regulations? 29.6 Section 29.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL INTERGOVERNMENTAL REVIEW OF ENVIRONMENTAL PROTECTION AGENCY PROGRAMS...

  15. 40 CFR 29.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 1 2014-07-01 2014-07-01 false What procedures apply to the selection of programs and activities under these regulations? 29.6 Section 29.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL INTERGOVERNMENTAL REVIEW OF ENVIRONMENTAL PROTECTION AGENCY PROGRAMS...

  16. 25 CFR 1000.406 - Does Indian preference apply to services, activities, programs, and functions performed under a...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 2 2011-04-01 2011-04-01 false Does Indian preference apply to services, activities, programs, and functions performed under a self-governance AFA? 1000.406 Section 1000.406 Indians OFFICE OF... functions performed under a self-governance AFA? Tribal law must govern Indian preference in...

  17. 25 CFR 1000.406 - Does Indian preference apply to services, activities, programs, and functions performed under a...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 2 2014-04-01 2014-04-01 false Does Indian preference apply to services, activities, programs, and functions performed under a self-governance AFA? 1000.406 Section 1000.406 Indians OFFICE OF... functions performed under a self-governance AFA? Tribal law must govern Indian preference in...

  18. 25 CFR 1000.406 - Does Indian preference apply to services, activities, programs, and functions performed under a...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false Does Indian preference apply to services, activities, programs, and functions performed under a self-governance AFA? 1000.406 Section 1000.406 Indians OFFICE OF... functions performed under a self-governance AFA? Tribal law must govern Indian preference in...

  19. 25 CFR 1000.406 - Does Indian preference apply to services, activities, programs, and functions performed under a...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 2 2013-04-01 2013-04-01 false Does Indian preference apply to services, activities, programs, and functions performed under a self-governance AFA? 1000.406 Section 1000.406 Indians OFFICE OF... functions performed under a self-governance AFA? Tribal law must govern Indian preference in...

  20. 25 CFR 1000.406 - Does Indian preference apply to services, activities, programs, and functions performed under a...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 2 2012-04-01 2012-04-01 false Does Indian preference apply to services, activities, programs, and functions performed under a self-governance AFA? 1000.406 Section 1000.406 Indians OFFICE OF... functions performed under a self-governance AFA? Tribal law must govern Indian preference in...

  1. User Activity Recognition in Smart Homes Using Pattern Clustering Applied to Temporal ANN Algorithm.

    PubMed

    Bourobou, Serge Thomas Mickala; Yoo, Younghwan

    2015-01-01

    This paper discusses the possibility of recognizing and predicting user activities in the IoT (Internet of Things) based smart environment. The activity recognition is usually done through two steps: activity pattern clustering and activity type decision. Although many related works have been suggested, they had limited performance because they focused on only one of the two steps. This paper tries to find the best combination of a pattern clustering method and an activity decision algorithm among various existing works. For the first step, in order to classify such varied and complex user activities, we use a relevant and efficient unsupervised learning method called the K-pattern clustering algorithm. In the second step, the smart environment is trained to recognize and predict the user's activities inside his/her personal space by utilizing an artificial neural network based on Allen's temporal relations. The experimental results show that our combined method provides higher recognition accuracy for various activities than other data mining classification algorithms. Furthermore, it is more appropriate for a dynamic environment like an IoT based smart home. PMID:26007738
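
    The two-step structure — cluster the activity patterns, then decide the activity type from the clustered features — can be sketched with off-the-shelf components. In the sketch below, KMeans stands in for the K-pattern clustering algorithm and a plain multilayer perceptron stands in for the Allen-relation temporal ANN; both substitutions and the synthetic data are ours:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# X: one feature vector of sensor events per time window, y: ground-truth activity labels.
rng = np.random.default_rng(0)
X = rng.random((1000, 12))
y = rng.integers(0, 5, size=1000)        # random labels, so accuracy will be near chance

# Step 1: unsupervised clustering of activity patterns (KMeans as a stand-in for K-pattern).
clusters = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)

# Step 2: supervised activity-type decision on the original features plus the cluster label.
X_aug = np.hstack([X, clusters.reshape(-1, 1)])
X_tr, X_te, y_tr, y_te = train_test_split(X_aug, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```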

  2. User Activity Recognition in Smart Homes Using Pattern Clustering Applied to Temporal ANN Algorithm.

    PubMed

    Bourobou, Serge Thomas Mickala; Yoo, Younghwan

    2015-01-01

    This paper discusses the possibility of recognizing and predicting user activities in the IoT (Internet of Things) based smart environment. The activity recognition is usually done through two steps: activity pattern clustering and activity type decision. Although many related works have been suggested, they had limited performance because they focused on only one of the two steps. This paper tries to find the best combination of a pattern clustering method and an activity decision algorithm among various existing works. For the first step, in order to classify such varied and complex user activities, we use a relevant and efficient unsupervised learning method called the K-pattern clustering algorithm. In the second step, the smart environment is trained to recognize and predict the user's activities inside his/her personal space by utilizing an artificial neural network based on Allen's temporal relations. The experimental results show that our combined method provides higher recognition accuracy for various activities than other data mining classification algorithms. Furthermore, it is more appropriate for a dynamic environment like an IoT based smart home.

  3. Brief Report: The Theory of Planned Behaviour Applied to Physical Activity in Young People Who Smoke

    ERIC Educational Resources Information Center

    Everson, Emma S.; Daley, Amanda J.; Ussher, Michael

    2007-01-01

    It has been hypothesised that physical activity may be useful as a smoking cessation intervention for young adults. In order to inform such interventions, this study evaluated the theory of planned behaviour (TPB) for understanding physical activity behaviour in young smokers. Regular smokers aged 16-19 years (N=124), self-reported physical…

  4. Independent comparison study of six different electronic tongues applied for pharmaceutical analysis.

    PubMed

    Pein, Miriam; Kirsanov, Dmitry; Ciosek, Patrycja; del Valle, Manel; Yaroshenko, Irina; Wesoły, Małgorzata; Zabadaj, Marcin; Gonzalez-Calabuig, Andreu; Wróblewski, Wojciech; Legin, Andrey

    2015-10-10

    Electronic tongue technology based on arrays of cross-sensitive chemical sensors and chemometric data processing has attracted a lot of researchers' attention over the last years. Several previously reported applications dealing with pharmaceutics-related tasks employed different e-tongue systems to address different objectives. In this situation, it is hard to judge the benefits and drawbacks of particular e-tongue implementations for R&D in pharmaceutics. The objective of this study was to compare the performance of six different e-tongues applied to the same set of pharmaceutical samples. For this purpose, two commercially available systems (from Insent and AlphaMOS) and four laboratory prototype systems (two potentiometric systems from Warsaw operating in flow and static modes, one potentiometric system from St. Petersburg, one voltammetric system from Barcelona) were employed. The sample set addressed in the study comprised nine different formulations based on caffeine citrate, lactose monohydrate, maltodextrin, saccharin sodium and citric acid in various combinations. To provide for a fair and unbiased comparison, samples were evaluated under blind conditions and data processing from all the systems was performed in a uniform way. Different mathematical methods were applied to judge the similarity of the e-tongue responses to the samples. These were principal component analysis (PCA), RV' matrix correlation coefficients and Tucker's congruence coefficients. PMID:26099261
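
    The RV' matrix correlation mentioned at the end measures how similarly two e-tongues arrange the same set of samples, independently of how many sensor channels each system has: it is a correlation between the sample-by-sample cross-product matrices. A sketch of the plain RV coefficient (the RV' variant used in the paper additionally zeroes the diagonals of the cross-product matrices):

```python
import numpy as np

def rv_coefficient(X, Y):
    """Plain RV coefficient between two column-centred data sets sharing the same samples (rows)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Sx = Xc @ Xc.T                       # sample-by-sample configuration of e-tongue A
    Sy = Yc @ Yc.T                       # sample-by-sample configuration of e-tongue B
    return np.trace(Sx @ Sy) / np.sqrt(np.trace(Sx @ Sx) * np.trace(Sy @ Sy))

# rv_coefficient(responses_system_A, responses_system_B) returns a value in [0, 1];
# values close to 1 mean the two systems "see" the nine formulations in the same way.
```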

  5. Analysis of the electric currents in 1D premixed flames under applied voltages

    NASA Astrophysics Data System (ADS)

    Han, Jie; Belhi, Memdouh; Bisetti, Fabrizio; Casey, Tiernan; Im, Hong G.; Chen, Jyh-Yuan

    2015-11-01

    Studying electric currents in flames has practical aspects such as the determination of the ionic structure of a flame, the analysis of the flame behavior under an electric field and the use of flame electric properties for combustion diagnostics. This study proposes a simplified model to compute the electric currents in lean-to-stoichiometric 1D premixed flames under applied voltages. The Navier-Stokes equations coupled with transport equations for neutral and charged species along with a Poisson equation for the electric potential are solved. The model qualitatively reproduces the voltage-current characteristic found experimentally. The sensitivity of the electric currents to the applied voltage, equivalence ratio, and pressure is studied and the key parameters affecting the saturation current are determined. Results show that the saturation current is controlled by the amount of charged species created by the chemi-ionization reaction. We found that the recombination rate of electrons with cations and the transport coefficients of charged species are the most important parameters affecting the voltage at which saturation occurs. Analytical formulas for the voltage-current characteristic and the saturation potential are developed and used to explain the obtained results.
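
    The central coupling in such a model is the Poisson equation for the electric potential, with the net charge density produced by chemi-ionization acting as the source term and the applied voltage entering through the boundary conditions. A minimal finite-difference sketch of that single sub-problem (the grid, charge profile and voltage are placeholders; the full model of the paper couples this to the Navier-Stokes and species transport equations):

```python
import numpy as np

def solve_poisson_1d(rho_charge, dx, v_left, v_right, eps0=8.854e-12):
    """Solve d2(phi)/dx2 = -rho/eps0 on interior nodes with Dirichlet (electrode) boundaries."""
    n = rho_charge.size
    A = np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    b = -rho_charge * dx ** 2 / eps0
    b[0] -= v_left                       # fold the known boundary potentials into the RHS
    b[-1] -= v_right
    return np.linalg.solve(A, b)

# Illustrative use: a thin sheet of net positive charge (the flame front) in a 1 cm gap.
x = np.linspace(0.0, 0.01, 201)
rho = np.where(np.abs(x - 0.004) < 2e-4, 1e-4, 0.0)   # C/m^3, placeholder charge density
phi = solve_poisson_1d(rho, dx=x[1] - x[0], v_left=0.0, v_right=500.0)
```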

  6. Independent comparison study of six different electronic tongues applied for pharmaceutical analysis.

    PubMed

    Pein, Miriam; Kirsanov, Dmitry; Ciosek, Patrycja; del Valle, Manel; Yaroshenko, Irina; Wesoły, Małgorzata; Zabadaj, Marcin; Gonzalez-Calabuig, Andreu; Wróblewski, Wojciech; Legin, Andrey

    2015-10-10

    Electronic tongue technology based on arrays of cross-sensitive chemical sensors and chemometric data processing has attracted a lot of researchers' attention over the last years. Several previously reported applications dealing with pharmaceutics-related tasks employed different e-tongue systems to address different objectives. In this situation, it is hard to judge the benefits and drawbacks of particular e-tongue implementations for R&D in pharmaceutics. The objective of this study was to compare the performance of six different e-tongues applied to the same set of pharmaceutical samples. For this purpose, two commercially available systems (from Insent and AlphaMOS) and four laboratory prototype systems (two potentiometric systems from Warsaw operating in flow and static modes, one potentiometric system from St. Petersburg, one voltammetric system from Barcelona) were employed. The sample set addressed in the study comprised nine different formulations based on caffeine citrate, lactose monohydrate, maltodextrin, saccharin sodium and citric acid in various combinations. To provide for a fair and unbiased comparison, samples were evaluated under blind conditions and data processing from all the systems was performed in a uniform way. Different mathematical methods were applied to judge the similarity of the e-tongue responses to the samples. These were principal component analysis (PCA), RV' matrix correlation coefficients and Tucker's congruence coefficients.

  7. Applying different independent component analysis algorithms and support vector regression for IT chain store sales forecasting.

    PubMed

    Dai, Wensheng; Wu, Jui-Yu; Lu, Chi-Jie

    2014-01-01

    Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales since an IT chain store has many branches. Integrating a feature extraction method with a prediction tool, such as support vector regression (SVR), is a useful approach for constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to deal with various forecasting problems. But, up to now, only the basic ICA method (i.e., the temporal ICA model) has been applied to the sales forecasting problem. In this paper, we utilize three different ICA methods, namely spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA), to extract features from the sales data and compare their performance in sales forecasting for an IT chain store. Experimental results from real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data, and the extracted features can improve the prediction performance of SVR for sales forecasting.
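
    The forecasting scheme can be illustrated with scikit-learn, using FastICA as the basic (temporal) ICA model — the spatial and spatiotemporal variants studied in the paper essentially amount to applying ICA to the transposed or combined data matrix — and feeding the extracted components to an SVR. The synthetic branch sales and all hyperparameters below are placeholders:

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import SVR

rng = np.random.default_rng(7)
sales = rng.random((120, 10)).cumsum(axis=0)      # 120 periods x 10 branches (synthetic data)

# Feature extraction: ICA components of the branch-sales matrix (temporal ICA model).
components = FastICA(n_components=4, random_state=0).fit_transform(sales)

# Prediction: forecast next-period total sales from the current period's components with SVR.
X, y = components[:-1], sales.sum(axis=1)[1:]
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X[:100], y[:100])
pred = model.predict(X[100:])
mape = np.mean(np.abs((y[100:] - pred) / y[100:])) * 100
print(f"hold-out MAPE: {mape:.1f}%")
```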

  8. [Risk Analysis applied to food safety in Brazil: prospects and challenges].

    PubMed

    Figueiredo, Ana Virgínia de Almeida; Miranda, Maria Spínola

    2011-04-01

    The scope of this case study is to discuss the ideas of the Brazilian Codex Alimentarius Committee (CCAB) coordinated by National Institute of Metrology, Standardization and Industrial Quality (Inmetro), with respect to the Codex Alimentarius norm on Risk Analysis (RA) applied to Food Safety. The objectives of this investigation were to identify and analyze the opinion of CCAB members on RA and to register their proposals for the application of this norm in Brazil, highlighting the local limitations and potential detected. CCAB members were found to be in favor of the Codex Alimentarius initiative of instituting an RA norm to promote the health safety of foods that circulate on the international market. There was a consensus that the Brazilian government should incorporate RA as official policy to improve the country's system of food control and leverage Brazilian food exports. They acknowledge that Brazil has the technical-scientific capacity to apply this norm, though they stressed several political and institutional limitations. The members consider RA to be a valid initiative for tackling risks in food, due to its ability to improve food safety control measures adopted by the government.

  9. Applying Different Independent Component Analysis Algorithms and Support Vector Regression for IT Chain Store Sales Forecasting

    PubMed Central

    Dai, Wensheng

    2014-01-01

    Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales since an IT chain store has many branches. Integrating a feature extraction method with a prediction tool, such as support vector regression (SVR), is a useful approach for constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to deal with various forecasting problems. But, up to now, only the basic ICA method (i.e., the temporal ICA model) has been applied to the sales forecasting problem. In this paper, we utilize three different ICA methods, namely spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA), to extract features from the sales data and compare their performance in sales forecasting for an IT chain store. Experimental results from real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data, and the extracted features can improve the prediction performance of SVR for sales forecasting. PMID:25165740

  10. Raman spectroscopy and capillary electrophoresis applied to forensic colour inkjet printer inks analysis.

    PubMed

    Król, Małgorzata; Karoly, Agnes; Kościelniak, Paweł

    2014-09-01

    Forensic laboratories are increasingly engaged in the examination of fraudulent documents, and, importantly, in many cases these are inkjet-printed documents. That is why systematic approaches to inkjet printer ink comparison and identification have been developed using both non-destructive and destructive methods. In this study, micro-Raman spectroscopy and capillary electrophoresis (CE) were applied to the analysis of colour inkjet printer inks. Micro-Raman spectroscopy was used to study the chemical composition of colour inks in situ on the paper surface. It helps to characterize and differentiate inkjet inks, and can be used to create a spectral database of inks taken from different cartridge brands and cartridge numbers. Capillary electrophoresis in micellar electrokinetic capillary chromatography mode was applied to separate the coloured and colourless components of the inks, enabling group identification of those components present in sufficient concentration (i.e., giving intense peaks). Finally, on the basis of the obtained results, differentiation of the analysed inks was performed. Twenty-three samples of inkjet printer inks were examined, and the discriminating power (DP) values for both presented methods were established in the routine work of experts during the result interpretation step. DP was found to be 94.0% (Raman) and 95.6% (CE) when all the analysed ink samples were taken into account, and 96.7% (Raman) and 98.4% (CE) when only cartridges with different index numbers were considered.
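
    Discriminating power is simply the fraction of all sample pairs that a method distinguishes, so with 23 inks there are 253 pairs and a DP of 94.0% corresponds to roughly 15 undifferentiated pairs. A sketch of the computation (the boolean matrix is filled at random here purely to reproduce a DP of that order):

```python
import numpy as np

def discriminating_power(diff_matrix):
    """diff_matrix: square boolean array, True where a pair of ink samples was differentiated."""
    iu = np.triu_indices(diff_matrix.shape[0], k=1)   # count each unordered pair once
    return diff_matrix[iu].mean()

# Example: 23 ink samples, 253 pairs, 15 of them assumed undistinguished (illustrative only).
rng = np.random.default_rng(0)
m = np.zeros((23, 23), dtype=bool)
iu = np.triu_indices(23, k=1)
flags = np.ones(iu[0].size, dtype=bool)
flags[rng.choice(iu[0].size, size=15, replace=False)] = False
m[iu] = flags
print(f"DP = {discriminating_power(m):.1%}")          # ~94%, the order of the Raman result
```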

  11. Applying high resolution SyXRD analysis on sulfate attacked concrete field samples

    SciTech Connect

    Stroh, J.; Schlegel, M.-C.; Irassar, E.F.; Meng, B.; Emmerling, F.

    2014-12-15

    High resolution synchrotron X-ray diffraction (SyXRD) was applied for a microstructural profile analysis of concrete deterioration after sulfate attack. The cement matrices consist of ordinary Portland cement and different amounts of supplementary cementitious materials, such as fly ash, natural pozzolana and granulated blast furnace slag. The changes in phase composition were determined along the direction of sulfate ingress. This approach allows the identification of reaction fronts and zones of different phase composition, and supports conclusions about the mechanisms of sulfate attack. Two reaction fronts were localized in the first 4 mm from the sample surface. The mechanism of deterioration caused by exposure to the sulfate-bearing soil is discussed. SyXRD is shown to be a reliable method for the investigation of cementitious materials with aggregates embedded in natural environments.

  12. Applying machine learning techniques to DNA sequence analysis. Progress report, February 14, 1991--February 13, 1992

    SciTech Connect

    Shavlik, J.W.

    1992-04-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.

  13. The accident evolution and barrier function (AEB) model applied to incident analysis in the processing industries.

    PubMed

    Svenson, O

    1991-09-01

    This study develops a theoretical model for accident evolutions and how they can be arrested. The model describes the interaction between technical and human-organizational systems which may lead to an accident. The analytic tool provided by the model gives equal weight to both these types of systems and necessitates simultaneous and interactive accident analysis by engineers and human factors specialists. It can be used in predictive safety analyses as well as in post hoc incident analyses. To illustrate this, the AEB model is applied to an incident reported by the nuclear industry in Sweden. In general, application of the model will indicate where and how safety can be improved, and it also raises questions about issues such as the cost, feasibility, and effectiveness of different ways of increasing safety.

  14. Multivariate analysis applied to agglomerated macrobenthic data from an unpolluted estuary.

    PubMed

    Conde, Anxo; Novais, Júlio M; Domínguez, Jorge

    2013-01-01

    We agglomerated species into higher taxonomic aggregations and functional groups to analyse environmental gradients in an unpolluted estuary. We then applied non-metric Multidimensional Scaling and Redundancy Analysis (RDA) for ordination of the agglomerated data matrices. The correlation between the ordinations produced by the two methods was generally high. However, the performance of the RDA models depended on the data matrix used to fit the model; as a result, salinity and total nitrogen were found to be significant only when the aggregated data matrices, rather than the species data matrix, were used. We used the results to select an RDA model that explained a higher percentage of variance in the species data set than the parsimonious model. We conclude that the use of aggregated matrices may be considered complementary to the use of species data to obtain a broader insight into the distribution of macrobenthic assemblages in relation to environmental gradients. PMID:23684322

  15. Brief report: the theory of planned behaviour applied to physical activity in young people who smoke.

    PubMed

    Everson, Emma S; Daley, Amanda J; Ussher, Michael

    2007-04-01

    It has been hypothesised that physical activity may be useful as a smoking cessation intervention for young adults. In order to inform such interventions, this study evaluated the theory of planned behaviour (TPB) for understanding physical activity behaviour in young smokers. Regular smokers aged 16-19 years (N=124) self-reported physical activity and all TPB components. Physical activity behaviour was significantly explained by intention and perceived behavioural control (PBC), with both making significant contributions to the model. Intention was significantly explained by attitude, subjective norm (SN) and PBC, with all three making significant contributions to the model. The TPB may be a useful framework for guiding physical activity interventions among young smokers.

  16. A User-Based Response System for the Applied Research Needs of Comprehensive Outcomes Assessment Activities.

    ERIC Educational Resources Information Center

    Chatman, Steve; Sprengel, Archie

    The development of a computer-based decision support system for outcomes assessment at Southeast Missouri State University is described. A menu-driven, user-based retrieval, analysis, and reporting system was designed using the Statistical Analysis System (SAS). The decision support system incorporates two major components, research group…

  17. Optical Image Analysis Applied to Pore Network Quantification of Sandstones Under Experimental CO2 Injection

    NASA Astrophysics Data System (ADS)

    Berrezueta, E.; González, L.; Ordóñez, B.; Luquot, L.; Quintana, L.; Gallastegui, G.; Martínez, R.; Olaya, P.; Breitner, D.

    2015-12-01

    This research aims to propose a protocol for pore network quantification in sandstones applying the Optical Image Analysis (OIA) procedure, which guarantees the reproducibility and reliability of the measurements. Two geological formations of sandstone, located in Spain and potentially suitable for CO2 sequestration, were selected for this study: a) the Cretaceous Utrillas unit, at the base of the Cenozoic Duero Basin, and b) a Triassic unit at the base of the Cenozoic Guadalquivir Basin. Sandstone samples were studied before and after experimental CO2 injection using optical and scanning electron microscopy (SEM), while the quantification of petrographic changes was done with OIA. The first phase of the research consisted of a detailed mineralogical and petrographic study of the sandstones (before and after CO2 injection), for which we observed thin sections. Later, the methodological and experimental work focused on i) adjustment and calibration of the OIA tools; ii) a data acquisition protocol based on image capture under different polarization conditions (synchronized movement of the polarizers), using 7 images of the same mineral scene (6 under crossed polars and 1 under parallel polars); and iii) automated identification and segmentation of pores in 2D mineral images, implemented as executable macros. Finally, once the procedural protocols had been established, the compiled data were interpreted through an automated approach and the qualitative petrography was carried out. The quantification of changes in the pore network through OIA (porosity increase ≈ 2.5%) corroborates the descriptions obtained by SEM and optical microscopy, namely an increase in porosity after CO2 treatment. Automated image identification and quantification of minerals, pores and textures, together with petrographic analysis, can be applied to improve pore-system characterization in sedimentary rocks. This research offers numerical
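
    As an illustration of the kind of automated pore segmentation and porosity quantification described above, the sketch below thresholds a synthetic grey-level image with scikit-image; the image and the choice of Otsu thresholding are assumptions standing in for the paper's calibrated OIA macros.

```python
# Minimal sketch: segment pores in a 2D grey-level image and report porosity (%).
# The synthetic image and Otsu thresholding stand in for the paper's OIA protocol.
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(1)
image = rng.normal(loc=150, scale=20, size=(512, 512))        # bright mineral matrix
pores = rng.random((512, 512)) < 0.03                          # sparse dark pores
image[pores] = rng.normal(loc=40, scale=10, size=pores.sum())

t = threshold_otsu(image)
pore_mask = image < t                                          # pores are the dark phase
porosity = 100.0 * pore_mask.mean()
print(f"estimated porosity: {porosity:.2f} %")
```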

  18. Moments applied in the rotation of massive objects in Shuttle extravehicular activity

    NASA Technical Reports Server (NTRS)

    Cousins, D.; Akin, D. L.

    1989-01-01

    Experimentally derived applied moments are presented for Space Shuttle crew EVA mission rotations of objects more massive than the human body. These levels appear to be small fractions of physiological limits; horizontal and vertical shoulder strength limits greater than 50 Nm have been established for foot-restrained, pressure-suited subjects in simulated weightlessness. The reduced level in operational EVA may be due to unfamiliarity with manual control in true weightlessness.

  19. Applying Active Learning to Assertion Classification of Concepts in Clinical Text

    PubMed Central

    Chen, Yukun; Mani, Subramani; Xu, Hua

    2012-01-01

    Supervised machine learning methods for clinical natural language processing (NLP) research require a large number of annotated samples, which are very expensive to build because of the involvement of physicians. Active learning, an approach that actively samples from a large pool, provides an alternative solution. Its major goal in classification is to reduce the annotation effort while maintaining the quality of the predictive model. However, few studies have investigated its uses in clinical NLP. This paper reports an application of active learning to a clinical text classification task: to determine the assertion status of clinical concepts. The annotated corpus for the assertion classification task in the 2010 i2b2/VA Clinical NLP Challenge was used in this study. We implemented several existing and newly developed active learning algorithms and assessed their uses. The outcome is reported in the global ALC score, based on the Area under the average Learning Curve of the AUC (Area Under the Curve) score. Results showed that when the same number of annotated samples was used, active learning strategies could generate better classification models (best ALC – 0.7715) than the passive learning method (random sampling) (ALC – 0.7411). Moreover, to achieve the same classification performance, active learning strategies required fewer samples than the random sampling method. For example, to achieve an AUC of 0.79, the random sampling method used 32 samples, while our best active learning algorithm required only 12 samples, a reduction of 62.5% in manual annotation effort. PMID:22127105
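
    The paper's own algorithms are not reproduced here; the sketch below shows plain pool-based uncertainty sampling, one of the standard active learning strategies of the kind compared in the study, on synthetic data with a logistic-regression learner (both assumptions).

```python
# Pool-based active learning with uncertainty sampling (least-confident querying).
# Synthetic data and the logistic-regression learner are illustrative assumptions;
# the paper evaluates its own algorithms on the i2b2/VA assertion corpus.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
pool = list(range(1000))
test = slice(1000, 2000)
rng = np.random.default_rng(0)
labeled = list(rng.choice(pool, size=20, replace=False))   # small seed set
pool = [i for i in pool if i not in labeled]

for _ in range(20):
    clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
    probs = clf.predict_proba(X[pool])[:, 1]
    query = pool.pop(int(np.argmin(np.abs(probs - 0.5))))  # most uncertain instance
    labeled.append(query)

clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
auc = roc_auc_score(y[test], clf.predict_proba(X[test])[:, 1])
print(f"AUC with {len(labeled)} labels: {auc:.3f}")
```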

  20. Physical activity and sedentary behaviour: applying lessons to chronic obstructive pulmonary disease.

    PubMed

    Hill, K; Gardiner, P A; Cavalheri, V; Jenkins, S C; Healy, G N

    2015-05-01

    In health and disease, the benefits of regular participation in moderate to vigorous intensity physical activity are well documented. However, individuals with chronic conditions, such as those with chronic obstructive pulmonary disease (COPD), typically do very little activity at a moderate or vigorous intensity. Much of their day is instead spent in sedentary behaviour, such as sitting or reclining, which requires very little energy expenditure. This high level of time spent in sedentary behaviour can have serious health consequences, including increased risk of diabetes, cardiovascular disease and premature mortality. There is emerging evidence to suggest that participation in light intensity physical activities (e.g. standing or slow walking) may have benefits for cardio-metabolic health. Given the low aerobic capacity of individuals with moderate to severe COPD, increasing light intensity activity (through reducing sedentary time) may be a feasible additional strategy to improve health in this population, alongside traditional recommendations to increase the time spent in moderate to vigorous intensity physical activity. This review provides an overview of physical activity and sedentary behaviour, with a particular emphasis on these behaviours for people with COPD. It provides suggestions for the measurement of these behaviours within the clinical setting, as well as for interventions that may be effective at increasing physical activity and reducing sedentary behaviour in this population. PMID:25164319

  1. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis

    PubMed Central

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

    Introduction Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. Materials and Methods We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. Results We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was not a key success factor for situational awareness. Conclusions We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped interpret complex, mixed data on small-N cases and resulted in concrete and practically relevant findings. PMID:27182731

  2. Quantitative analysis of cardiovascular modulation in respiratory neural activity.

    PubMed

    Dick, Thomas E; Morris, Kendall F

    2004-05-01

    We propose the 'δ²-statistic' for assessing the magnitude and statistical significance of arterial pulse-modulated activity of single neurones and present the results of applying this tool to medullary respiratory-modulated units. This analytical tool is a modification of the η²-statistic and is consequently based on the analysis of variance. The η²-statistic reflects the consistency of respiratory-modulated activity on a cycle-by-cycle basis. However, directly applying this test to activity during the cardiac cycle proved ineffective because the subjects-by-treatments matrices did not contain enough 'information'. We increased the information by dividing the cardiac cycle into fewer bins, excluding cycles without activity and summing activity over multiple cycles. The analysed neuronal activity came from an existing data set examining the neural control of respiration and cough. Neurones were recorded in the nuclei of the solitary tracts and in the rostral and caudal ventral respiratory groups of decerebrate, neuromuscularly blocked, ventilated cats (n = 19). Two hundred of 246 spike trains were respiratory modulated; of these, 53% were inspiratory (I), 36.5% expiratory (E), 6% IE phase spanning and 4.5% EI phase spanning and responsive to airway stimulation. Nearly half (96/200) of the respiratory-modulated units were significantly pulse modulated and 13 were highly modulated, with δ² values exceeding 0.3. In 10 of these highly modulated units, η² values were greater than 0.3, and all 13 had at least a portion of their activity during expiration. We conclude that cardiorespiratory interaction is reciprocal; in addition to respiratory-modulated activity in a subset of neuronal activity patterns controlling the cardiovascular system, pulse-modulated activity exists in a subset of neuronal activity patterns controlling the respiratory system. Thus, cardio-ventilatory coupling apparent in respiratory motor output is evident and, perhaps, derived from the
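
    The η²-statistic referenced above is, in its standard ANOVA form, the ratio of between-bin to total variance of cycle-triggered activity. The sketch below computes that ratio for synthetic spike counts binned over cycles; it illustrates η² only and does not implement the authors' δ² pooling rules.

```python
# Eta-squared for cycle-triggered activity: between-bin variance / total variance.
# Rows = cycles, columns = phase bins of the (respiratory or cardiac) cycle.
# Synthetic counts are illustrative; the paper's delta^2 adds further pooling rules.
import numpy as np

rng = np.random.default_rng(2)
n_cycles, n_bins = 200, 10
base = 4.0 + 3.0 * np.sin(np.linspace(0, 2 * np.pi, n_bins))   # phase-locked firing rate
counts = rng.poisson(lam=base, size=(n_cycles, n_bins))

grand_mean = counts.mean()
ss_total = ((counts - grand_mean) ** 2).sum()
ss_between = n_cycles * ((counts.mean(axis=0) - grand_mean) ** 2).sum()
eta_sq = ss_between / ss_total
print(f"eta^2 = {eta_sq:.3f}")
```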

  3. Modulation by applied electric fields of Purkinje and stellate cell activity in the isolated turtle cerebellum.

    PubMed Central

    Chan, C Y; Nicholson, C

    1986-01-01

    Quasi steady-state electric fields were applied across the isolated turtle cerebellum to study the relationship between applied field, neuronal morphology and the modulation of the neuronal spike firing pattern. Spiking elements were identified electrophysiologically using extracellular recording methods and by subsequent horseradish peroxidase injection, which revealed their dendritic morphology and orientation. The electric field was precisely defined by measuring the voltage gradients induced in the cerebellum by 40 s constant-current pulses. The field was constant in the vertical (dorso-ventral) axis and zero in the horizontal plane, in agreement with theory. Neurones were modulated by applying a sinusoidal field at frequencies between 0.05 and 1.0 Hz. Modulated cells exhibited an increase in firing frequency and fell into one of four classes, depending on the direction of the field that produced the modulation. Thus neurones either were excited by ventricle-directed fields (V modulation), pia-directed fields (P modulation) or both (V/P modulation), or showed no consistent modulation (non-modulation). Most Purkinje somata and primary dendrites (nineteen out of twenty-eight) and most Purkinje dendrites (eighteen out of thirty) were V modulated, with maximum rate proportional to the peak field intensity. The dendrites of these cells were consistently oriented toward the pia. Among the stellate cells, the lower molecular layer stellates, with dendrites extending predominantly towards the pia, were mostly (nineteen out of thirty-two) V modulated. The mid-molecular layer stellates, which showed much variability in dendritic orientation, were distributed among all four of the modulation classes. The upper molecular layer stellates, with a mostly horizontal dendritic alignment, were mainly (nine out of sixteen) non-modulated. All groups of spiking elements showed a correlation between patterns of modulation by applied fields and dendritic orientation, which

  4. Adaptive Neuro-Fuzzy Inference System Applied QSAR with Quantum Chemical Descriptors for Predicting Radical Scavenging Activities of Carotenoids.

    PubMed

    Jhin, Changho; Hwang, Keum Taek

    2015-01-01

    One of the physiological characteristics of carotenoids is their radical scavenging activity. In this study, the relationship between radical scavenging activities and quantum chemical descriptors of carotenoids was determined. Adaptive neuro-fuzzy inference system (ANFIS) applied quantitative structure-activity relationship models (QSAR) were also developed for predicting and comparing radical scavenging activities of carotenoids. Semi-empirical PM6 and PM7 quantum chemical calculations were done by MOPAC. Ionisation energies of neutral and monovalent cationic carotenoids and the product of chemical potentials of neutral and monovalent cationic carotenoids were significantly correlated with the radical scavenging activities, and consequently these descriptors were used as independent variables for the QSAR study. The ANFIS applied QSAR models were developed with two triangular-shaped input membership functions made for each of the independent variables and optimised by a backpropagation method. High prediction efficiencies were achieved by the ANFIS applied QSAR. The R-square values of the developed QSAR models with the variables calculated by PM6 and PM7 methods were 0.921 and 0.902, respectively. The results of this study demonstrated reliabilities of the selected quantum chemical descriptors and the significance of QSAR models. PMID:26474167

  5. Adaptive Neuro-Fuzzy Inference System Applied QSAR with Quantum Chemical Descriptors for Predicting Radical Scavenging Activities of Carotenoids

    PubMed Central

    Jhin, Changho; Hwang, Keum Taek

    2015-01-01

    One of the physiological characteristics of carotenoids is their radical scavenging activity. In this study, the relationship between radical scavenging activities and quantum chemical descriptors of carotenoids was determined. Adaptive neuro-fuzzy inference system (ANFIS) applied quantitative structure-activity relationship models (QSAR) were also developed for predicting and comparing radical scavenging activities of carotenoids. Semi-empirical PM6 and PM7 quantum chemical calculations were done by MOPAC. Ionisation energies of neutral and monovalent cationic carotenoids and the product of chemical potentials of neutral and monovalent cationic carotenoids were significantly correlated with the radical scavenging activities, and consequently these descriptors were used as independent variables for the QSAR study. The ANFIS applied QSAR models were developed with two triangular-shaped input membership functions made for each of the independent variables and optimised by a backpropagation method. High prediction efficiencies were achieved by the ANFIS applied QSAR. The R-square values of the developed QSAR models with the variables calculated by PM6 and PM7 methods were 0.921 and 0.902, respectively. The results of this study demonstrated reliabilities of the selected quantum chemical descriptors and the significance of QSAR models. PMID:26474167

  6. The elastic modulus correction term in creep activation energies applied to an oxide dispersion strengthened superalloy

    NASA Technical Reports Server (NTRS)

    Malu, M.; Tien, J. K.

    1975-01-01

    The effect of the elastic modulus and its temperature dependence on creep activation energies for an oxide dispersion strengthened nickel-base superalloy is investigated. This superalloy, commercially known as Inconel Alloy MA 753, is strengthened both by gamma-prime precipitates and by yttria particles. It is shown that at intermediate temperatures, say below 1500 F, where the elastic modulus depends only weakly on temperature, the modulus correction term to the creep activation energy is small. Accordingly, modulus corrections are insignificant for the superalloy considered, which shows high apparent creep activation energies at these temperatures. By contrast, at very high temperatures the elastic modulus correction term can be significant, reducing the creep activation energy to that of vacancy self-diffusion. To obtain high-temperature creep resistance, a high elastic modulus with a weak dependence on temperature is required.
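
    A commonly used form of the modulus correction discussed above, written under the standard power-law creep assumption; the notation is generic rather than the paper's:

```latex
% Power-law creep with modulus-compensated stress (generic notation, assumed form):
\dot{\varepsilon} = A \left( \frac{\sigma}{E(T)} \right)^{n} \exp\!\left( -\frac{Q_c}{R T} \right)
% Apparent activation energy measured at constant stress:
Q_{\mathrm{app}} = -R \, \frac{\partial \ln \dot{\varepsilon}}{\partial (1/T)}
                 = Q_c - n R T^{2} \, \frac{1}{E} \frac{\mathrm{d}E}{\mathrm{d}T}
% Since dE/dT < 0, the modulus term is positive: negligible where E varies weakly with T,
% but large at very high temperature, where removing it brings Q_app down toward the
% self-diffusion value Q_c.
```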

  7. [Applying a creative art activity in care: report on an experience with a newly admitted resident].

    PubMed

    Wen, Hsin-Sing; Wu, Hung-Lan; Lee, Hsien-Ju

    2014-10-01

    Doing creative art has been shown to increase activity, reduce anxiety, promote confidence and self-esteem, increase sense of achievement, and facilitate participation in life activities in the elderly. This article describes a nursing experience that used creative art activities to help an elderly resident adjust to the environment and living conditions at a long-term care facility. During the care period from October 20th to December 16th, 2012, we evaluated the health problems of the resident, which included anxiety, loneliness, and low self-esteem. The creative art activity was a 30-minute intervention held 1~2 times per week for a total of 13 sessions. This article reports on the positive effects of this intervention on reducing the resident's negative emotions such as anxiety and loneliness and, in the long run, promoting self-esteem and sense of achievement.

  8. Applying Socioecological Model to Improve Women’s Physical Activity: A Randomized Control Trial

    PubMed Central

    Tehrani, Hadi; Majlessi, Fershteh; Shojaeizadeh, Davoud; Sadeghi, Roya; Hasani Kabootarkhani, Marzieh

    2016-01-01

    Background: A sedentary life without sufficient physical activity is recognized as a risk factor for various diseases and a major modifiable risk factor for noncommunicable diseases. This study was conducted to investigate the effect of an intervention based on the socioecological model in promoting women's physical activity in the city of Kerman, Iran. Materials and Methods: In this randomized, double-blinded, controlled study, 360 women were studied at health and medical centers of Kerman. The educational intervention was based on the socioecological model and was conducted on 4 levels: personal, social, organizational, and political. The data collection tool was a researcher-made questionnaire based on the constructs of the socioecological model and the international physical activity inventory. Results: Before the intervention, there were no significant differences between the two groups in perceived social, physical, and political support or in level of physical activity. After the intervention, however, independent t tests showed significant differences between the two groups in perceived social, physical, and political support and in level of physical activity (P < 0.001), with mean values of these measures increasing in the intervention group. Conclusions: According to the results, interventions based on the socioecological model can positively affect women's physical activity. PMID:27247781

  9. Applying a resources framework to analysis of the Force and Motion Conceptual Evaluation

    NASA Astrophysics Data System (ADS)

    Smith, Trevor I.; Wittmann, Michael C.

    2008-12-01

    We suggest one redefinition of common clusters of questions used to analyze student responses on the Force and Motion Conceptual Evaluation. Our goal is to propose a methodology that moves beyond an analysis of student learning defined by correct responses, either on the overall test or on clusters of questions defined solely by content. We use the resources framework theory of learning to define clusters within this experimental test that was designed without the resources framework in mind. We take special note of the contextual and representational dependence of questions with seemingly similar physics content. We analyze clusters in ways that allow the most common incorrect answers to give as much, or more, information as the correctness of responses in that cluster. We show that false positives can be found, especially on questions dealing with Newton’s third law. We apply our clustering to a small set of data to illustrate the value of comparing students’ incorrect responses which are otherwise identical on a correct or incorrect analysis. Our work provides a connection between theory and experiment in the area of survey design and the resources framework.

  10. Applying Geostatistical Analysis to Crime Data: Car-Related Thefts in the Baltic States.

    PubMed

    Kerry, Ruth; Goovaerts, Pierre; Haining, Robert P; Ceccato, Vania

    2010-01-01

    Geostatistical methods have rarely been applied to area-level offense data. This article demonstrates their potential for improving the interpretation and understanding of crime patterns using previously analyzed data about car-related thefts for Estonia, Latvia, and Lithuania in 2000. The variogram is used to inform about the scales of variation in offense, social, and economic data. Area-to-area and area-to-point Poisson kriging are used to filter the noise caused by the small number problem. The latter is also used to produce continuous maps of the estimated crime risk (expected number of crimes per 10,000 inhabitants), thereby reducing the visual bias of large spatial units. In seeking to detect the most likely crime clusters, the uncertainty attached to crime risk estimates is handled through a local cluster analysis using stochastic simulation. Factorial kriging analysis is used to estimate the local- and regional-scale spatial components of the crime risk and explanatory variables. Then regression modeling is used to determine which factors are associated with the risk of car-related theft at different scales.
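
    The variogram mentioned above summarizes how dissimilar the observed rate becomes with increasing separation distance. A minimal sketch of an empirical semivariogram on synthetic centroid data follows; coordinates, rates, and lag bins are assumptions, and a real analysis would go on to Poisson kriging.

```python
# Minimal sketch: empirical semivariogram of a rate observed at point locations.
# Synthetic coordinates/rates and the lag bins are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
xy = rng.uniform(0, 100, size=(300, 2))                  # area centroids (km)
rate = 50 + 0.3 * xy[:, 0] + rng.normal(0, 5, size=300)  # thefts per 10,000 inhabitants

# Pairwise distances and half squared differences (semivariance contributions)
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
g = 0.5 * (rate[:, None] - rate[None, :]) ** 2
iu = np.triu_indices(len(rate), k=1)
d, g = d[iu], g[iu]

bins = np.arange(0, 60, 10)
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (d >= lo) & (d < hi)
    print(f"lag {lo:>2}-{hi:<2} km: gamma = {g[mask].mean():7.2f}  (n = {mask.sum()})")
```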

  11. Restricted Acoustic Modal Analysis Applied to Internal Combustor Spectra and Cross-Spectra Measurements

    NASA Technical Reports Server (NTRS)

    Miles, Jeffrey Hilton

    2006-01-01

    A treatment of the modal decomposition of the pressure field in a combustor as determined by two Kulite pressure measurements is developed herein. It is applied to a Pratt & Whitney PW4098 engine combustor over a range of operating conditions. For modes other than the plane wave the new part of the treatment is the assumption that there are distinct frequency bands in which the individual modes, including the plane wave mode, overlap such that if circumferential mode m and circumferential mode m-1 are present than circumferential mode m 2 is not. Consequently, in the analysis used herein at frequencies above the first cut-off mode frequency, only pairs of circumferential modes are individually present at each frequency. Consequently, this is a restricted modal analysis. A new result is that the successful use of the same modal span frequencies over a range of operating conditions for this particular engine suggests that the temperature, T, and the velocity, v, of the flow at each operating condition are related by c(sup 2)-v(sup 2) = a constant where c is the speed of sound.

  12. Bayesian flux balance analysis applied to a skeletal muscle metabolic model.

    PubMed

    Heino, Jenni; Tunyan, Knarik; Calvetti, Daniela; Somersalo, Erkki

    2007-09-01

    In this article, the steady-state condition for multi-compartment models of cellular metabolism is considered. The problem is to estimate the reaction and transport fluxes, as well as the concentrations in venous blood, when the stoichiometry and bound constraints for the fluxes and the concentrations are given. The problem has been addressed previously by a number of authors, and optimization-based approaches as well as extreme pathway analysis have been proposed. These approaches are briefly discussed here. The main emphasis of this work is a Bayesian statistical approach to flux balance analysis (FBA). We show how the bound constraints and optimality conditions, such as maximizing the oxidative phosphorylation flux, can be incorporated into the model in the Bayesian framework by proper construction of the prior densities. We propose an effective Markov chain Monte Carlo (MCMC) scheme to explore the posterior densities, and compare the results with those obtained via the previously studied linear programming (LP) approach. The proposed methodology, which is applied here to a two-compartment model for skeletal muscle metabolism, can be extended to more complex models. PMID:17568615
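
    For contrast with the Bayesian treatment, the baseline optimization-based FBA the abstract refers to is a linear program: maximize a chosen flux subject to steady state S v = 0 and bound constraints. A toy sketch with scipy follows; the network is invented, not the skeletal muscle model.

```python
# Baseline (non-Bayesian) flux balance analysis on a toy network:
# maximize a target flux subject to steady state S @ v = 0 and bound constraints.
# The stoichiometric matrix and bounds are illustrative, not the muscle model.
import numpy as np
from scipy.optimize import linprog

# Metabolites (rows) x reactions (columns): uptake, internal conversion, target export
S = np.array([[ 1, -1,  0],
              [ 0,  1, -1]])
bounds = [(0, 10), (0, 10), (0, 10)]   # lower/upper flux bounds
c = np.array([0, 0, -1])               # linprog minimizes, so negate the target flux

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)        # expected: all fluxes driven to the upper bound
```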

  13. A study and evaluation of image analysis techniques applied to remotely sensed data

    NASA Technical Reports Server (NTRS)

    Atkinson, R. J.; Dasarathy, B. V.; Lybanon, M.; Ramapriyan, H. K.

    1976-01-01

    An analysis of phenomena causing nonlinearities in the transformation from Landsat multispectral scanner coordinates to ground coordinates is presented. Experimental results comparing rms errors at ground control points indicated a slight improvement when a nonlinear (8-parameter) transformation was used instead of an affine (6-parameter) transformation. Using a preliminary ground truth map of a test site in Alabama covering the Mobile Bay area and six Landsat images of the same scene, several classification methods were assessed. A methodology was developed for automatic change detection using classification/cluster maps. A coding scheme was employed for generation of change depiction maps indicating specific types of changes. Inter- and intraseasonal data of the Mobile Bay test area were compared to illustrate the method. A beginning was made in the study of data compression by applying a Karhunen-Loeve transform technique to a small section of the test data set. The second part of the report provides a formal documentation of the several programs developed for the analysis and assessments presented.

  14. High-density EEG coherence analysis using functional units applied to mental fatigue.

    PubMed

    Ten Caat, Michael; Lorist, Monicque M; Bezdan, Eniko; Roerdink, Jos B T M; Maurits, Natasha M

    2008-06-30

    Electroencephalography (EEG) coherence provides a quantitative measure of functional brain connectivity which is calculated between pairs of signals as a function of frequency. Without hypotheses, traditional coherence analysis would be cumbersome for high-density EEG which employs a large number of electrodes. One problem is to find the most relevant regions and coherences between those regions in individuals and groups. Therefore, we previously developed a data-driven approach for individual as well as group analyses of high-density EEG coherence. Its data-driven regions of interest (ROIs) are referred to as functional units (FUs) and are defined as spatially connected sets of electrodes that record pairwise significantly coherent signals. Here, we apply our data-driven approach to a case study of mental fatigue. We show that our approach overcomes the severe limitations of conventional hypothesis-driven methods which depend on previous investigations and leads to a selection of coherences of interest taking full advantage of the recordings under investigation. The presented visualization of (group) FU maps provides a very economical data summary of extensive experimental results, which otherwise would be very difficult and time-consuming to assess. Our approach leads to an FU selection which may serve as a basis for subsequent conventional quantitative analysis; thus it complements rather than replaces the hypothesis-driven approach.
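
    The pairwise quantity underlying the FU construction is magnitude-squared coherence between two channels as a function of frequency. A minimal sketch with scipy on synthetic signals; the sampling rate, Welch parameters, and shared 10 Hz component are assumptions.

```python
# Magnitude-squared coherence between two EEG channels as a function of frequency,
# the pairwise quantity on which the functional-unit (FU) construction is built.
# Synthetic signals and Welch parameters are illustrative assumptions.
import numpy as np
from scipy.signal import coherence

fs = 256.0                                    # sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
shared = np.sin(2 * np.pi * 10 * t)           # common 10 Hz (alpha-band) component
rng = np.random.default_rng(4)
ch1 = shared + rng.normal(0, 1, t.size)
ch2 = 0.8 * shared + rng.normal(0, 1, t.size)

f, Cxy = coherence(ch1, ch2, fs=fs, nperseg=512)
print(f"coherence at ~10 Hz: {Cxy[np.argmin(np.abs(f - 10))]:.2f}")
```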

  15. The Langmuir isotherm: a commonly applied but misleading approach for the analysis of protein adsorption behavior.

    PubMed

    Latour, Robert A

    2015-03-01

    The Langmuir adsorption isotherm provides one of the simplest and most direct methods to quantify an adsorption process. Because isotherm data from protein adsorption studies often appear to be fit well by the Langmuir isotherm model, estimates of protein binding affinity have often been made from its use, despite the fact that none of the conditions required for a Langmuir adsorption process may be satisfied for this type of application. The physical events that often cause protein adsorption isotherms to take a Langmuir-like shape can be explained by changes in adsorption-induced spreading, reorientation, clustering, and aggregation of the protein on a surface as a function of solution concentration, rather than by a dynamic equilibrium adsorption process, which is required for Langmuir adsorption. Unless the requirements of the Langmuir adsorption process can be confirmed, fitting the Langmuir model to protein adsorption isotherm data to obtain thermodynamic properties, such as the equilibrium constant for adsorption and the adsorption free energy, may provide erroneous values that have little to do with the actual protein adsorption process, and should be avoided. In this article, a detailed analysis of the Langmuir isotherm model is presented along with a quantitative analysis of the level of error that can arise in derived parameters when the Langmuir isotherm is inappropriately applied to characterize a protein adsorption process.
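
    For reference, the Langmuir model under discussion and a curve fit to synthetic isotherm data are sketched below; the abstract's point is precisely that a good fit of this shape does not by itself validate the model's assumptions. All parameter values are invented.

```python
# The Langmuir isotherm q(c) = q_max * K * c / (1 + K * c), fitted to synthetic data.
# A good fit of this shape does not by itself mean the adsorption process satisfies
# the Langmuir assumptions -- which is the abstract's central caution.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, K):
    return q_max * K * c / (1.0 + K * c)

c = np.array([0.05, 0.1, 0.2, 0.5, 1.0, 2.0, 5.0])        # solution concentration (mg/mL)
rng = np.random.default_rng(5)
q = langmuir(c, 2.0, 3.0) + rng.normal(0, 0.03, c.size)   # measured surface coverage

(q_max_fit, K_fit), _ = curve_fit(langmuir, c, q, p0=(1.0, 1.0))
print(f"fitted q_max = {q_max_fit:.2f}, K = {K_fit:.2f}")
```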

  16. Optimizing human activity patterns using global sensitivity analysis

    PubMed Central

    Hickmann, Kyle S.; Mniszewski, Susan M.; Del Valle, Sara Y.; Hyman, James M.

    2014-01-01

    Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule’s regularity for a population. We show how to tune an activity’s regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations. PMID:25580080
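
    A direct, unoptimized implementation of the sample entropy statistic that the abstract tunes; the template length m, tolerance r, and test sequences are assumed values, and production code would use a vectorized or dedicated implementation.

```python
# Direct sample entropy SampEn(m, r) of a 1-D sequence:
# -ln( count of (m+1)-length template matches / count of m-length template matches ),
# with matches judged by Chebyshev distance <= r and self-matches excluded.
# The parameters m, r and the test sequences are assumed values.
import numpy as np

def sampen(x, m=2, r=0.2):
    x = np.asarray(x, dtype=float)
    r *= x.std()

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(dist <= r) - 1          # exclude the self-match
        return count

    return -np.log(count_matches(m + 1) / count_matches(m))

rng = np.random.default_rng(6)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))    # highly regular schedule proxy
irregular = rng.normal(size=500)                     # highly irregular schedule proxy
print(f"SampEn regular:   {sampen(regular):.2f}")
print(f"SampEn irregular: {sampen(irregular):.2f}")
```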

  17. Optimizing human activity patterns using global sensitivity analysis

    SciTech Connect

    Fairchild, Geoffrey; Hickmann, Kyle S.; Mniszewski, Susan M.; Del Valle, Sara Y.; Hyman, James M.

    2013-12-10

    Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule’s regularity for a population. We show how to tune an activity’s regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. Here we use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Finally, though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.

  18. Optimizing human activity patterns using global sensitivity analysis

    DOE PAGES

    Fairchild, Geoffrey; Hickmann, Kyle S.; Mniszewski, Susan M.; Del Valle, Sara Y.; Hyman, James M.

    2013-12-10

    Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule’s regularity for a population. We show how to tune an activity’s regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. Here we use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Finally, though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.

  19. Transforming Teacher Education, An Activity Theory Analysis

    ERIC Educational Resources Information Center

    McNicholl, Jane; Blake, Allan

    2013-01-01

    This paper explores the work of teacher education in England and Scotland. It seeks to locate this work within conflicting sociocultural views of professional practice and academic work. Drawing on an activity theory framework that integrates the analysis of these seemingly contradictory discourses with a study of teacher educators' practical…

  20. Computer-automated neutron activation analysis system

    SciTech Connect

    Minor, M.M.; Garcia, S.R.

    1983-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references.

  1. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.

  2. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1988 through September 30, 1988.

  3. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1986 through September 30, 1986 is summarized.

  4. Active transporters as enzymes: an energetic framework applied to major facilitator superfamily and ABC importer systems.

    PubMed

    Shilton, Brian H

    2015-04-15

    Active membrane transporters are dynamic molecular machines that catalyse transport across a membrane by coupling solute movement to a source of energy such as ATP or a secondary ion gradient. A central question for many active transporters concerns the mechanism by which transport is coupled to a source of energy. The transport process and associated energetic coupling involve conformational changes in the transporter. For efficient transport, the conformational changes must be tightly regulated and they must link energy use to movement of the substrate across the membrane. The present review discusses active transport using the well-established energetic framework for enzyme-mediated catalysis. In particular, membrane transport systems can be viewed as ensembles consisting of low-energy and high-energy conformations. The transport process involves binding interactions that selectively stabilize the higher energy conformations, and in this way promote conformational changes in the system that are coupled to decreases in free energy and substrate translocation. The major facilitator superfamily of secondary active transporters is used to illustrate these ideas, which are then expanded to primary active transport mediated by ABC (ATP-binding cassette) import systems, with a focus on the well-studied maltose transporter.

  5. High-Throughput Analysis of Enzyme Activities

    SciTech Connect

    Lu, Guoxin

    2007-01-01

    High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of enzyme immobilized on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product does not need optical properties different from those of the substrate. UV absorption detection allows almost universal detection of organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening for local variations in the distribution of specific biomolecules in a tissue or in screening of multiple immobilized catalysts. A second high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of the enzyme microarray is focused onto a scientific CCD using an objective lens. By carefully choosing the detection wavelength, the generation of product on an enzyme spot can be seen by the CCD, and analyzing the change in light intensity over time on an enzyme spot gives information about the reaction rate. The same microarray can be used many times; thus, high-throughput kinetic studies of hundreds of catalytic reactions are made possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different

  6. Descriptor-Based Analysis Applied to HCN Synthesis from NH3 and CH4

    SciTech Connect

    Grabow, L

    2011-08-18

    The design of solid metal catalysts using theoretical methods has been a long-standing goal in heterogeneous catalysis. Recent developments in methodology and computer technology, together with the establishment of a descriptor-based approach for the analysis of reaction mechanisms and trends across the periodic table, allow fast screening for new catalytic materials and have led to the first examples of computational discoveries of new materials. The underlying principles of the descriptor-based approach are the relations between the surface electronic structure, adsorption energies and activation barriers, which result in volcano-shaped activity plots as functions of simple descriptors, such as atomic binding energies or the d-band center. Linear scaling relations have been established between the adsorption energies of hydrogen-containing molecules such as CHx, NHx, OHx and SHx and the C, N, O and S adsorption energies on transition-metal surfaces. Transition-state energies have also been shown to scale linearly with adsorption energies in a similar fashion. Recently, a single transition-state scaling relation has been identified for a large number of C-C, C-O, C-N, N-O, N-N, and O-O coupling reactions. The scaling relations provide a powerful tool for the investigation of reaction mechanisms and the prediction of potential energy surfaces. They limit the number of independent variables to a few, typically the adsorption energies of key atoms. Using this information as input to a microkinetic model provides an understanding of trends in catalytic activity across the transition metals. In most cases a volcano-shaped relation between activity and the key variables, the descriptors, is observed. In the present paper we provide an example of the approach outlined above and show how one can obtain an understanding of activity/selectivity trends for a reaction with just a few new calculations.
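
    The two kinds of relations the descriptor-based approach rests on can be written generically as follows; the coefficients are system-dependent fits, not values from the paper:

```latex
% Linear scaling between the adsorption energy of a hydrogenated species AH_x and that of
% its central atom A (gamma and xi are fitted, system-dependent constants):
\Delta E_{\mathrm{AH}_x} = \gamma(x) \, \Delta E_{\mathrm{A}} + \xi
% Broensted-Evans-Polanyi / transition-state scaling linking activation barriers to
% reaction (or dissociative adsorption) energies:
E_a = \alpha \, \Delta E_{\mathrm{rxn}} + \beta
```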

  7. Quadratic Time-Frequency Analysis of Hydroacoustic Signals as Applied to Acoustic Emissions of Large Whales

    NASA Astrophysics Data System (ADS)

    Le Bras, Ronan; Victor, Sucic; Damir, Malnar; Götz, Bokelmann

    2014-05-01

    In order to enrich the set of attributes in setting up a large database of whale signals, as envisioned in the Baleakanta project, we investigate methods of time-frequency analysis. The purpose of establishing the database is to increase and refine knowledge of the emitted signal and of its propagation characteristics, leading to a better understanding of the animal migrations in a non-invasive manner and to characterize acoustic propagation in oceanic media. The higher resolution for signal extraction and better separation from other signals and noise will be used for various purposes, including improved signal detection and individual animal identification. The quadratic class of time-frequency distributions (TFDs) is the most popular set of time-frequency tools for analysis and processing of non-stationary signals. The two best known and most studied members of this class are the spectrogram and the Wigner-Ville distribution. However, to be used efficiently, i.e. to have highly concentrated signal components while significantly suppressing interference and noise simultaneously, TFDs need to be optimized first. The optimization method used in this paper is based on the Cross-Wigner-Ville distribution and, unlike similar approaches, it does not require prior information on the analysed signal. The method is applied to whale signals, which, just like the majority of other real-life signals, can generally be classified as multicomponent non-stationary signals, and hence time-frequency techniques are a natural choice for their representation, analysis, and processing. We present processed data from a set containing hundreds of individual calls. The TFD optimization method results in a high-resolution time-frequency representation of the signals. It allows for a simple extraction of signal components from the TFD's dominant ridges. The local peaks of those ridges can then be used for estimating the instantaneous frequency of the signal components, which in turn can be used as
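
    The spectrogram named above is the simplest member of the quadratic TFD class. The sketch below computes one for a synthetic frequency-modulated "call" and reads an instantaneous-frequency estimate off the dominant ridge; all signal and STFT parameters are illustrative, and the paper's optimized Cross-Wigner-Ville distribution is not reproduced here.

```python
# Spectrogram (the simplest quadratic TFD) of a synthetic frequency-modulated "call",
# with an instantaneous-frequency estimate taken from the dominant ridge.
# Signal and STFT parameters are illustrative assumptions.
import numpy as np
from scipy.signal import chirp, spectrogram

fs = 2000.0                                   # sampling rate (Hz)
t = np.arange(0, 5, 1 / fs)
call = chirp(t, f0=300, f1=80, t1=5, method="logarithmic")   # downsweep, call-like
call += 0.2 * np.random.default_rng(7).normal(size=t.size)

f, tt, Sxx = spectrogram(call, fs=fs, nperseg=256, noverlap=192)
ridge = f[np.argmax(Sxx, axis=0)]             # frequency of the dominant ridge over time
print("ridge frequency (Hz), start/end:", ridge[0], ridge[-1])
```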

  8. Multiobjective optimization in a pseudometric objective space as applied to a general model of business activities

    NASA Astrophysics Data System (ADS)

    Khachaturov, R. V.

    2016-09-01

    It is shown that finding the equivalence set for solving multiobjective discrete optimization problems is advantageous over finding the set of Pareto optimal decisions. An example of a set of key parameters characterizing the economic efficiency of a commercial firm is proposed, and a mathematical model of its activities is constructed. In contrast to the classical problem of finding the maximum profit for any business, this study deals with a multiobjective optimization problem. A method for solving inverse multiobjective problems in a multidimensional pseudometric space is proposed for finding the best project for the firm's activities. The solution of a particular problem of this type is presented.
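
    For orientation, the Pareto-optimal set that the abstract argues is less useful than the equivalence set can be computed with a simple dominance filter; the toy objective vectors below (all to be maximized) are assumptions.

```python
# Minimal Pareto filter for a maximization problem: keep the decisions not dominated
# by any other decision. Toy objective vectors only; the abstract's equivalence-set
# method is not reproduced here.
import numpy as np

def pareto_front(points):
    points = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(points):
        # p is dominated if some row is >= p in every objective and > p in at least one
        dominated = np.any(np.all(points >= p, axis=1) & np.any(points > p, axis=1))
        if not dominated:
            keep.append(i)
    return keep

# columns: profit, market share, liquidity (all to be maximized)
decisions = [[5, 2, 7], [4, 4, 4], [6, 1, 7], [3, 3, 3], [6, 2, 7]]
print("Pareto-optimal decision indices:", pareto_front(decisions))
```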

  9. Applying a Wearable Voice-Activated Computer to Instructional Applications in Clean Room Environments

    NASA Technical Reports Server (NTRS)

    Graves, Corey A.; Lupisella, Mark L.

    2004-01-01

    The use of wearable computing technology in restrictive environments related to space applications offers promise in a number of domains. The clean room environment is one such domain in which hands-free, heads-up, wearable computing is particularly attractive for education and training because of the nature of clean room work. We have developed and tested a Wearable Voice-Activated Computing (WEVAC) system based on clean room applications. Results of this initial proof-of-concept work indicate that there is a strong potential for WEVAC to enhance clean room activities.

  10. Feed-drug interaction of orally applied butyrate and phenobarbital on hepatic cytochrome P450 activity in chickens.

    PubMed

    Mátis, G; Kulcsár, A; Petrilla, J; Hermándy-Berencz, K; Neogrády, Zs

    2016-08-01

    The expression of hepatic drug-metabolizing cytochrome P450 (CYP) enzymes may be affected by several nutrition-derived compounds, such as the commonly applied feed additive butyrate, possibly leading to feed-drug interactions. The aim of this study was to provide evidence on whether butyrate can alter the activity of hepatic CYPs in chickens exposed to CYP-inducing xenobiotics, monitoring the possibility of such an interaction for the first time. Ross 308 chickens in the grower phase were treated with daily intracoelomal phenobarbital (PB) injections (80 mg/kg BW), applied as a non-specific CYP inducer, simultaneously with two different doses of intra-ingluvial sodium butyrate boluses (0.25 and 1.25 g/kg BW) for 5 days. The activity of the CYP2H and CYP3A subfamilies was assessed by specific enzyme assays on isolated liver microsomes. According to our results, the lower dose of orally administered butyrate significantly attenuated the PB-triggered elevation of both hepatic CYP2H and CYP3A activities, which may be associated with the partly shared signalling pathways of butyrate and CYP-inducing drugs such as PB. Based on these data, butyrate may take part in pharmacoepigenetic interactions with simultaneously applied drugs or other CYP-inducing xenobiotics, with possible consequences for food safety and pharmacotherapy. Butyrate was found to be capable of maintaining physiological CYP activity by attenuating CYP induction, underlining the safety of butyrate application in poultry nutrition.

  11. Applying gap analysis and a comparison index to evaluate protected areas in Thailand.

    PubMed

    Trisurat, Yongyut

    2007-02-01

    Protected areas in Thailand were first established 40 years ago. Existing protected areas cover 18.2% of the country's land area, and the Class 1 Watershed, another form of protection, encompasses 18.1%. The government of Thailand intends to increase the protected area system to 25% of the country by 2006 and 30% by 2016. Questions always arise about how much protected area is enough to effectively protect biodiversity. The objective of this article is to assess the representation of ecosystems in the protected area network, and to recommend which underrepresented ecosystems should be added to fill the gaps in representativeness. The research applies a gap analysis and a comparison index to assess the representation of ecosystems within the protected area network. The spatial analyses measure three aspects of representativeness, namely forest type, altitude, and natural land system. The analyses indicate that the existing protected area system covers 24.4% of the country's land area, nearly meeting the 25% target proposed by the National Forest Policy, and that 83.8% of these areas are under forest cover. Most protected areas are situated at high altitudes, where biological diversity is lower than in lowlands. Mangrove forest and riparian floodplain are extremely underrepresented in the existing system, whereas peat swamp forest, dry dipterocarp forest, and beach forest are relatively well represented. In addition, these five ecosystems are threatened by human pressures and natural disasters; therefore, they should be high priorities in the selection of new reserves. Future research should incorporate aquatic and marine ecosystems, as well as animal distributions, which were not included in this research due to data unavailability. PMID:17106794

  12. Structural scene analysis and content-based image retrieval applied to bone age assessment

    NASA Astrophysics Data System (ADS)

    Fischer, Benedikt; Brosig, André; Deserno, Thomas M.; Ott, Bastian; Günther, Rolf W.

    2009-02-01

    Radiological bone age assessment is based on global or local image regions of interest (ROIs), such as epiphyseal regions or the area of the carpal bones. Usually, these regions are compared to a standardized reference and a score determining the skeletal maturity is calculated. For computer-assisted diagnosis, automatic ROI extraction has so far been done by heuristic approaches. In this work, we apply a high-level approach of scene analysis for knowledge-based ROI segmentation. Based on a set of 100 reference images from the IRMA database, a so-called structural prototype (SP) is trained. In this graph-based structure, the 14 phalanges and 5 metacarpal bones are represented by nodes, with associated location, shape, and texture parameters modeled by Gaussians. Accordingly, the Gaussians describing the relative positions, relative orientation, and other relative parameters between two nodes are associated with the edges. Thereafter, segmentation of a hand radiograph is done in several steps: (i) a multi-scale region merging scheme is applied to extract visually prominent regions; (ii) a graph/sub-graph matching to the SP robustly identifies a subset of the 19 bones; (iii) the SP is registered to the current image for complete scene reconstruction; (iv) the epiphyseal regions are extracted from the reconstructed scene. The evaluation is based on 137 images of Caucasian males from the USC hand atlas. Overall, an error rate of 32% is achieved; for the 6 middle distal and medial/distal epiphyses, 23% of all extractions need adjustments. On average, 9.58 of the 14 epiphyseal regions were extracted successfully per image. This is promising for further use in content-based image retrieval (CBIR) and CBIR-based automatic bone age assessment.

  13. Applied mechanics of the Puricelli osteotomy: a linear elastic analysis with the finite element method

    PubMed Central

    Puricelli, Edela; Fonseca, Jun Sérgio Ono; de Paris, Marcel Fasolo; Sant'Anna, Hervandil

    2007-01-01

    Background: Surgical orthopedic treatment of the mandible depends on the development of techniques resulting in adequate healing processes. In a new technical and conceptual alternative recently introduced by Puricelli, osteotomy is performed in a more distal region, next to the mental foramen. The method results in an increased area of bone contact, resulting in larger sliding rates among bone segments. This work aimed to investigate the mechanical stability of the Puricelli osteotomy design. Methods: Laboratory tests complied with an Applied Mechanics protocol, in which results from the Control group (without osteotomy) were compared with those from Test I (Obwegeser-Dal Pont osteotomy) and Test II (Puricelli osteotomy) groups. Mandible edentulous prototypes were scanned using computerized tomography, and digitalized images were used to build voxel-based finite element models. A new code was developed for solving the voxel-based finite element equations, using a preconditioned conjugate gradients iterative solver. The Magnitude of Displacement and von Mises equivalent stress fields were compared among the three groups. Results: In Test Group I, maximum stress was seen in the region of the rigid internal fixation plate, with a value greater than those of the Test II and Control groups. In Test Group II, maximum stress was in the same region as in the Control group, but was lower. The results of this comparative study using finite element analysis suggest that the Puricelli osteotomy presents better mechanical stability than the original Obwegeser-Dal Pont technique. The increased area of the proximal segment and the consequent decrease in the size of the lever arm applied to the mandible in the modified technique yielded lower stress values, and consequently greater stability of the bone segments. Conclusion: This work showed that Puricelli osteotomy of the mandible results in greater mechanical stability when compared to the original technique introduced by Obwegeser-Dal Pont. The

  14. Life Science Start-up Activities at the Universities of Applied Sciences (UAS).

    PubMed

    Huber, Gerda

    2014-12-01

    The universities of applied sciences (UAS) provide several values for the society and economy of a country. Besides educating high-level professionals, transferring knowledge from research into industrial applications or new start-up companies is an important task. This is done in different ways in the various disciplines. In Life Sciences, a key industry branch in Switzerland, innovation is a competitive success factor, and research findings from UAS/Life Sciences contribute to the valorization of new technologies into products, services and business performance. In order to foster awareness of the innovation needs of industry, UAS establish processes and support for the transfer of research and technology results into marketable applications. Furthermore, they may facilitate contacts between researchers, students and entrepreneurs in order to encourage start-up founding as a genuine alternative to employment. Access to coaching and entrepreneurial training completes the essential basis. PMID:26508606

  15. Differentiating semantic categories during the acquisition of novel words: correspondence analysis applied to event-related potentials.

    PubMed

    Fargier, Raphaël; Ploux, Sabine; Cheylus, Anne; Reboul, Anne; Paulignan, Yves; Nazir, Tatjana A

    2014-11-01

    Growing evidence suggests that semantic knowledge is represented in distributed neural networks that include modality-specific structures. Here, we examined the processes underlying the acquisition of words from different semantic categories to determine whether the emergence of visual- and action-based categories could be tracked back to their acquisition. For this, we applied correspondence analysis (CA) to ERPs recorded at various moments during acquisition. CA is a multivariate statistical technique typically used to reveal distance relationships between words of a corpus. Applied to ERPs, it allows isolating the factors that best explain variations in the data across time and electrodes. Participants were asked to learn new action and visual words by associating novel pseudowords with the execution of hand movements or the observation of visual images. Words were probed before and after training on two consecutive days. To capture processes that unfold during lexical access, CA was applied on the 100-400 msec post-word onset interval. CA isolated two factors that organized the data as a function of test sessions and word categories. Conventional ERP analyses further revealed a category-specific increase in the negativity of the ERPs to action and visual words at the frontal and occipital electrodes, respectively. The distinct neural processes underlying action and visual words can thus be tracked back to the acquisition of word-referent relationships and may have their origin in association learning. Given current evidence for the flexibility of language-induced sensory-motor activity, we argue that these associative links may serve functions beyond word understanding, that is, the elaboration of situation models.
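
    Correspondence analysis itself reduces to a singular value decomposition of the standardized residuals of a nonnegative data matrix, with rows and columns projected onto the leading factors. The sketch below is a generic CA in Python; how the ERP amplitudes were arranged and rendered nonnegative in the study is not reproduced here, and the toy matrix is random.

        import numpy as np

        def correspondence_analysis(N, n_factors=2):
            """Classical CA of a nonnegative matrix N (rows x columns).

            Returns row and column coordinates on the leading factors. This is the
            generic CA computation only, not the study's ERP preprocessing.
            """
            P = N / N.sum()                      # correspondence matrix
            r = P.sum(axis=1)                    # row masses
            c = P.sum(axis=0)                    # column masses
            S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
            U, sv, Vt = np.linalg.svd(S, full_matrices=False)
            row_coords = (U[:, :n_factors] * sv[:n_factors]) / np.sqrt(r)[:, None]
            col_coords = (Vt.T[:, :n_factors] * sv[:n_factors]) / np.sqrt(c)[:, None]
            return row_coords, col_coords, sv

        # Toy example: 5 electrodes x 4 session/category cells (nonnegative values)
        N = np.abs(np.random.default_rng(0).normal(size=(5, 4))) + 1.0
        rows, cols, singular_values = correspondence_analysis(N)
        print(singular_values[:2])   # leading factors, as in the two-factor solution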

  16. Sunlight-driven copper-catalyst activation applied to photolatent click chemistry.

    PubMed

    Beniazza, Rédouane; Lambert, Romain; Harmand, Lydie; Molton, Florian; Duboc, Carole; Denisov, Sergey; Jonusauskas, Gedeminas; McClenaghan, Nathan D; Lastécouères, Dominique; Vincent, Jean-Marc

    2014-10-01

    The synthesis, full characterization, photoreduction properties, and catalytic activity for the copper(I)-catalyzed alkyne-azide cycloaddition (CuAAC) reaction of a copper(II)-DMEDA (N,N'-dimethylethylenediamine) complex is reported. Spectroscopic studies (UV/Vis, EPR) demonstrated that under daylight illumination highly effective copper(II) to copper(I) reduction occurs in this complex. These findings are in agreement with a high photoreduction quantum yield value of 0.22 in MeOH, and a value approaching unity as determined in THF. The reduction process, which can also be conducted by irradiation at 365 nm by using a standard TLC (thin layer chromatography) lamp, is ascribed to a highly efficient photoinduced electron transfer (PET) process mediated by the benzophenone photosensitizer present in the carboxylate counterion. After deaeration of the reaction mixture, the photogenerated copper(I) species proved to be highly active for the CuAAC reaction, as demonstrated by reactions conducted with low catalyst loading (0.5 mol %) on a range of clickable protected and non-protected mono- and disaccharides. Once initiated, the reaction can be stopped at any time on introducing air into the reaction medium. Deoxygenation followed by irradiation restores the activity, making the copper(II)-DMEDA complex a switchable catalyst of practical value. PMID:25171758

  17. Neutron activation analysis of Etruscan pottery

    SciTech Connect

    Whitehead, J.; Silverman, A.; Ouellet, C.G.; Clark, D.D.; Hossain, T.Z

    1992-07-01

    Neutron activation analysis (NAA) has been widely used in archaeology for compositional analysis of pottery samples taken from sites of archaeological importance. Elemental profiles can determine the place of manufacture. At Cornell, samples from an Etruscan site near Siena, Italy, are being studied. The goal of this study is to compile a trace element concentration profile for a large number of samples. These profiles will be matched with an existing data bank in an attempt to understand the place of origin for these samples. The 500 kW TRIGA reactor at the Ward Laboratory is used to collect NAA data for these samples. Experiments were done to set a procedure for the neutron activation analysis with respect to sample preparation, selection of irradiation container, definition of activation and counting parameters and data reduction. Currently, we are able to analyze some 27 elements in samples of mass 500 mg with a single irradiation of 4 hours and two sequences of counting. Our sensitivity for many of the trace elements is better than 1 ppm by weight under the conditions chosen. In this talk, details of our procedure, including quality assurance as measured by NIST standard reference materials, will be discussed. In addition, preliminary results from data treatment using cluster analysis will be presented. (author)

  18. Integrated Geophysical Methods Applied to Geotechnical and Geohazard Engineering: From Qualitative to Quantitative Analysis and Interpretation

    NASA Astrophysics Data System (ADS)

    Hayashi, K.

    2014-12-01

    The near-surface is a region of day-to-day human activity on the Earth. It is exposed to the natural phenomena which sometimes cause disasters. This presentation covers a broad spectrum of geotechnical and geohazard ways of mitigating disaster and conserving the natural environment using geophysical methods, and emphasizes the contribution of geophysics to such issues. The presentation focuses on the usefulness of geophysical surveys in providing information to mitigate disasters, rather than on the theoretical details of a particular technique. Several techniques are introduced at the level of concept and application. Topics include various geohazard and geoenvironmental applications, such as earthquake disaster mitigation, preventing floods triggered by torrential rain, environmental conservation, and studying the effect of global warming. Among the geophysical techniques, the active and passive surface wave, refraction and resistivity methods are mainly highlighted. Together with the geophysical techniques, several related issues, such as performance-based design, standardization or regularization, internet access and databases, are also discussed. The presentation discusses the application of geophysical methods to engineering investigations from a non-uniqueness point of view and introduces the concepts of "integrated" and "quantitative" interpretation. Most geophysical analyses are essentially non-unique and it is very difficult to obtain unique and reliable engineering solutions from only one geophysical method (Fig. 1). The only practical way to improve the reliability of an investigation is the joint use of several geophysical and geotechnical investigation methods, an integrated approach to geophysics. The result of a geophysical method is generally vague: "here is a high-velocity layer, it may be bedrock"; "this low-resistivity section may contain clayey soils". Such vague, qualitative and subjective interpretation is not worthwhile in general engineering design works

  19. Activity Analysis and Cost Analysis in Medical Schools.

    ERIC Educational Resources Information Center

    Koehler, John E.; Slighton, Robert L.

    There is no unique answer to the question of what an ongoing program costs in medical schools. The estimates of program costs generated by classical methods of cost accounting are unsatisfactory because such accounting cannot deal with the joint production or joint cost problem. Activity analysis models aim at calculating the impact of alternative…

  20. Multivariate analysis applied to monthly rainfall over Rio de Janeiro state, Brazil

    NASA Astrophysics Data System (ADS)

    Brito, Thábata T.; Oliveira-Júnior, José F.; Lyra, Gustavo B.; Gois, Givanildo; Zeri, Marcelo

    2016-10-01

    Spatial and temporal patterns of rainfall were identified over the state of Rio de Janeiro, southeast Brazil. The proximity to the coast and the complex topography create great diversity of rainfall over space and time. The dataset consisted of time series (1967-2013) of monthly rainfall over 100 meteorological stations. Clustering analysis made it possible to divide the stations into six groups (G1, G2, G3, G4, G5 and G6) with similar rainfall spatio-temporal patterns. A linear regression model was applied to each station's time series and a reference series. The reference series was calculated from the average rainfall within a group, using nearby stations with higher correlation (Pearson). Based on a t-test (p < 0.05), all stations had a linear spatiotemporal trend. According to the clustering analysis, the first group (G1) contains stations located over the coastal lowlands and also over the ocean-facing area of Serra do Mar (Sea Ridge), a 1500 km long mountain range along coastal southeastern Brazil. The second group (G2) contains stations over all the state, from Serra da Mantiqueira (Mantiqueira Mountains) and Costa Verde (Green Coast), to the south, up to stations in the northern parts of the state. Group 3 (G3) contains stations in the highlands of the state (Serrana region), while group 4 (G4) has stations over the northern areas and the continent-facing side of Serra do Mar. The last two groups were formed with stations around the Paraíba River (G5) and the metropolitan area of the city of Rio de Janeiro (G6). The driest months in all regions were June, July and August, while November, December and January were the rainiest months. Sharp transitions occurred when considering monthly accumulated rainfall: from January to February, and from February to March, likely associated with episodes of "veranicos", i.e., periods of 4-15 days with no rainfall.
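
    A hedged sketch of the two analysis steps described above: grouping stations by their seasonal rainfall pattern, then regressing a station against a reference series built from well-correlated neighbours within its group. The synthetic data, the use of k-means, the five-neighbour rule, and the thresholds are assumptions for illustration, not the paper's method details.

        import numpy as np
        from sklearn.cluster import KMeans
        from scipy.stats import linregress

        rng = np.random.default_rng(1)
        # Hypothetical data: 100 stations x 12 mean monthly rainfall values (mm)
        climatology = rng.gamma(shape=2.0, scale=60.0, size=(100, 12))

        # Step 1: group stations into six clusters with similar seasonal patterns
        groups = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(climatology)

        # Step 2: for one station, build a reference series from the best-correlated
        # stations of its group and test the linear relation (t-test on the slope)
        monthly = rng.gamma(shape=2.0, scale=60.0, size=(100, 12 * 47))  # 1967-2013 months
        station = 0
        same_group = np.where(groups == groups[station])[0]
        corr = [np.corrcoef(monthly[station], monthly[j])[0, 1] for j in same_group]
        best = same_group[np.argsort(corr)[-6:-1]]   # five closest neighbours, not itself
        reference = monthly[best].mean(axis=0)
        fit = linregress(reference, monthly[station])
        print(f"slope = {fit.slope:.2f}, p-value = {fit.pvalue:.3g}")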

  1. Methods for applying accurate digital PCR analysis on low copy DNA samples.

    PubMed

    Whale, Alexandra S; Cowen, Simon; Foy, Carole A; Huggett, Jim F

    2013-01-01

    Digital PCR (dPCR) is a highly accurate molecular approach, capable of precise measurements, offering a number of unique opportunities. However, in its current format dPCR can be limited by the amount of sample that can be analysed and consequently additional considerations such as performing multiplex reactions or pre-amplification can be considered. This study investigated the impact of duplexing and pre-amplification on dPCR analysis by using three different assays targeting a model template (a portion of the Arabidopsis thaliana alcohol dehydrogenase gene). We also investigated the impact of different template types (linearised plasmid clone and more complex genomic DNA) on measurement precision using dPCR. We were able to demonstrate that duplex dPCR can provide a more precise measurement than uniplex dPCR, while applying pre-amplification or varying template type can significantly decrease the precision of dPCR. Furthermore, we also demonstrate that the pre-amplification step can introduce measurement bias that is not consistent between experiments for a sample or assay and so could not be compensated for during the analysis of this data set. We also describe a model for estimating the prevalence of molecular dropout and identify this as a source of dPCR imprecision. Our data have demonstrated that the precision afforded by dPCR at low sample concentration can exceed that of the same template post pre-amplification thereby negating the need for this additional step. Our findings also highlight the technical differences between different templates types containing the same sequence that must be considered if plasmid DNA is to be used to assess or control for more complex templates like genomic DNA.
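
    The quantification underlying dPCR rests on a Poisson correction of the positive-partition fraction, which is also where measurement precision enters. The sketch below shows that arithmetic in Python; the partition counts and partition volume are invented values, and this is not the study's analysis pipeline.

        import math

        def dpcr_copies(positive, total_partitions, partition_volume_nl=0.85):
            """Estimate template concentration from digital PCR partition counts.

            Poisson correction: mean copies per partition is lambda = -ln(1 - p),
            where p is the fraction of positive partitions. Values are illustrative.
            """
            p = positive / total_partitions
            lam = -math.log(1.0 - p)                      # copies per partition
            copies_per_ul = lam / (partition_volume_nl * 1e-3)
            return lam * total_partitions, copies_per_ul

        total_copies, conc = dpcr_copies(positive=4100, total_partitions=20000)
        print(f"estimated copies in reaction: {total_copies:.0f}, "
              f"concentration: {conc:.1f} copies/uL")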

  2. Moving Beyond Active-Site Detection: MixMD Applied to Allosteric Systems.

    PubMed

    Ghanakota, Phani; Carlson, Heather A

    2016-08-25

    Mixed-solvent molecular dynamics (MixMD) is a hotspot-mapping technique that relies on molecular dynamics simulations of proteins in binary solvent mixtures. Previous work on MixMD has established the technique's effectiveness in capturing binding sites of small organic compounds. In this work, we show that MixMD can identify both competitive and allosteric sites on proteins. The MixMD approach embraces full protein flexibility and allows competition between solvent probes and water. Sites preferentially mapped by probe molecules are more likely to be binding hotspots. There are two important requirements for the identification of ligand-binding hotspots: (1) hotspots must be mapped at very high signal-to-noise ratio and (2) the hotspots must be mapped by multiple probe types. We have developed our mapping protocol around acetonitrile, isopropanol, and pyrimidine as probe solvents because they allowed us to capture hydrophilic, hydrophobic, hydrogen-bonding, and aromatic interactions. Charged probes were needed for mapping one target, and we introduce them in this work. In order to demonstrate the robust nature and wide applicability of the technique, a combined total of 5 μs of MixMD was applied across several protein targets known to exhibit allosteric modulation. Most notably, all the protein crystal structures used to initiate our simulations had no allosteric ligands bound, so there was no preorganization of the sites to predispose the simulations to find the allosteric hotspots. The protein test cases were ABL Kinase, Androgen Receptor, CHK1 Kinase, Glucokinase, PDK1 Kinase, Farnesyl Pyrophosphate Synthase, and Protein-Tyrosine Phosphatase 1B. The success of the technique is demonstrated by the fact that the top-four sites solely map the competitive and allosteric sites. Lower-ranked sites consistently map other biologically relevant sites, multimerization interfaces, or crystal-packing interfaces. Lastly, we highlight the importance of including protein

  3. Actively Spreading Plate Boundaries and UNCLOS: the Difficulties of Applying Article 76

    NASA Astrophysics Data System (ADS)

    Evans, A.; Carleton, C.; Parson, L.

    2005-12-01

    The process by which the delineation of a legal continental shelf according to the United Nations Convention on the Law of the Sea (UNCLOS) is made for an oceanic island territory beyond 200 nautical miles causes significant confusion for both researchers on the issue and coastal states' representatives trying to apply the provisions within Article 76. The interpretation of the statutory provisions set out in UNCLOS and the Technical and Scientific guidelines laid out by the Commission on the Limits of the Continental Shelf (CLCS) remains, for the most part, difficult to implement. This global summary of oceanic islands and their geological context with respect to the Law of the Sea Convention, and in particular regarding mid-ocean ridge systems, brings to light some of the problems responsible for this confusion and attempts to resolve the uncertainty associated with ridge issues with a practical and equitable delimitation proposal. Paragraph 6 of Article 76 requires a constraint/limit of 350M from the territorial sea baselines on submarine ridges (one assumes, to ensure that no coastal state claims the entire mid-ocean ridge!). However, the criteria set in paragraph 4a(i) or 4a(ii) of Article 76, which provide the means to extend a legal continental shelf beyond 200M based on the identification of a morphological foot of slope, cannot be applied to oceanic island territories because of difficulties in determining a consistent and practical location for it. Accordingly, the advantages and disadvantages of several geophysical data types have been evaluated for their use under paragraph 4(b) of Article 76 (definition of the foot of the slope on "evidence to the contrary"). Our proposed method of using the central, zero-age locus of the mid-oceanic ridge, whether this may be the ridge line, bathymetric maxima or spreading centre, as a representation of the foot of slope (based on evidence to the contrary), results in a simple, fair and consistent way of developing an area

  4. Uncertainty Optimization Applied to the Monte Carlo Analysis of Planetary Entry Trajectories

    NASA Technical Reports Server (NTRS)

    Olds, John; Way, David

    2001-01-01

    Recently, strong evidence of liquid water under the surface of Mars and a meteorite that might contain ancient microbes have renewed interest in Mars exploration. With this renewed interest, NASA plans to send spacecraft to Mars approx. every 26 months. These future spacecraft will return higher-resolution images, make precision landings, engage in longer-ranging surface maneuvers, and even return Martian soil and rock samples to Earth. Future robotic missions and any human missions to Mars will require precise entries to ensure safe landings near science objectives and pre-deployed assets. Potential sources of water and other interesting geographic features are often located near hazards, such as within craters or along canyon walls. In order for more accurate landings to be made, spacecraft entering the Martian atmosphere need to use lift to actively control the entry. This active guidance results in much smaller landing footprints. Planning for these missions will depend heavily on Monte Carlo analysis. Monte Carlo trajectory simulations have been used with a high degree of success in recent planetary exploration missions. These analyses ascertain the impact of off-nominal conditions during a flight and account for uncertainty. Uncertainties generally stem from limitations in manufacturing tolerances, measurement capabilities, analysis accuracies, and environmental unknowns. Thousands of off-nominal trajectories are simulated by randomly dispersing uncertainty variables and collecting statistics on forecast variables. The dependability of Monte Carlo forecasts, however, is limited by the accuracy and completeness of the assumed uncertainties. This is because Monte Carlo analysis is a forward driven problem, beginning with the input uncertainties and proceeding to the forecast outputs. It lacks a mechanism to affect or alter the uncertainties based on the forecast results. If the results are unacceptable, the current practice is to use an iterative, trial
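
    A minimal sketch of the forward Monte Carlo dispersion idea described above: uncertain inputs are sampled from assumed distributions, each sample is run through the trajectory model, and statistics are collected on the forecast variables. The "trajectory model" here is a toy algebraic surrogate and all distributions are invented, so this only illustrates the mechanics, not the mission analysis.

        import numpy as np

        rng = np.random.default_rng(42)
        n_runs = 2000

        # Dispersed inputs (illustrative distributions, not mission values)
        atm_density_scale = rng.normal(1.0, 0.05, n_runs)    # atmosphere uncertainty
        lift_to_drag      = rng.normal(0.24, 0.02, n_runs)   # aerodynamic uncertainty
        entry_fpa_deg     = rng.normal(-14.5, 0.15, n_runs)  # entry flight-path angle

        def downrange_km(rho_scale, ld, fpa):
            # Toy surrogate for a trajectory simulation: shallower entry and higher
            # L/D stretch the downrange; a denser atmosphere shortens it.
            return (800.0 + 900.0 * (ld - 0.24)
                    - 60.0 * (fpa + 14.5)
                    - 300.0 * (rho_scale - 1.0))

        downrange = downrange_km(atm_density_scale, lift_to_drag, entry_fpa_deg)
        mean, p99 = downrange.mean(), np.percentile(downrange, 99)
        print(f"mean downrange {mean:.1f} km, 99th percentile {p99:.1f} km "
              f"(footprint statistics from {n_runs} dispersed runs)")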

  5. Atomic force microscopy analysis of synthetic membranes applied in release studies

    NASA Astrophysics Data System (ADS)

    Olejnik, Anna; Nowak, Izabela

    2015-11-01

    Synthetic membranes are commonly used in drug release studies and are applied mostly in quality control. They contain pores through which the drug can diffuse directly into the receptor fluid. Investigation of synthetic membranes permits determination of their structure and characterization of their properties. We suggest that preliminary characterization of the membranes can be relevant to the interpretation of the release results. The aim of this study was to compare eight synthetic membranes by using atomic force microscopy in order to predict and understand their behavior in release experiments. The results proved that the polytetrafluoroethylene membrane was not suitable for the release study of the tetrapeptide due to its hydrophobic nature, thickness and specific structure with high trapezoid-shaped blocks. The additional substructures in the pores of mixed cellulose ester and nylon membranes detected by AFM influenced the diffusion rate of the active compound. These findings indicate that the selection of the membrane for release studies should be performed cautiously, by taking into consideration the membrane properties and by analyzing them prior to the experiment.

  6. Applying Chemical Imaging Analysis to Improve Our Understanding of Cold Cloud Formation

    NASA Astrophysics Data System (ADS)

    Laskin, A.; Knopf, D. A.; Wang, B.; Alpert, P. A.; Roedel, T.; Gilles, M. K.; Moffet, R.; Tivanski, A.

    2012-12-01

    The impact that atmospheric ice nucleation has on the global radiation budget is one of the least understood problems in atmospheric sciences. This is in part due to the incomplete understanding of the various ice nucleation pathways that lead to ice crystal formation from pre-existing aerosol particles. Studies investigating the ice nucleation propensity of laboratory-generated particles indicate that individual particle types are highly selective in their ice nucleating efficiency. This description of heterogeneous ice nucleation presents a challenge when applied to the atmosphere, which contains a complex mixture of particles. Here, we employ a combination of micro-spectroscopic and optical single-particle analytical methods to relate particle physical and chemical properties with observed water uptake and ice nucleation. Field-collected particles from urban environments impacted by anthropogenic and marine emissions and aging processes are investigated. Single-particle characterization is provided by computer-controlled scanning electron microscopy with energy dispersive analysis of X-rays (CCSEM/EDX) and scanning transmission X-ray microscopy with near edge X-ray absorption fine structure spectroscopy (STXM/NEXAFS). A particle-on-substrate approach coupled to a vapor-controlled cooling stage and a microscope system is applied to determine the onsets of water uptake and ice nucleation, including immersion freezing and deposition ice nucleation, as a function of temperature (T) as low as 200 K and relative humidity (RH) up to water saturation. We observe for urban aerosol particles that for T > 230 K the oxidation level affects initial water uptake and that subsequent immersion freezing depends on particle mixing state, e.g. by the presence of insoluble particles. For T < 230 K the particles initiate deposition ice nucleation well below the homogeneous freezing limit. Particles collected throughout one day for similar meteorological conditions show very similar

  7. From Nose to Brain: Un-Sensed Electrical Currents Applied in the Nose Alter Activity in Deep Brain Structures

    PubMed Central

    Weiss, Tali; Shushan, Sagit; Ravia, Aharon; Hahamy, Avital; Secundo, Lavi; Weissbrod, Aharon; Ben-Yakov, Aya; Holtzman, Yael; Cohen-Atsmoni, Smadar; Roth, Yehudah; Sobel, Noam

    2016-01-01

    Rules linking patterns of olfactory receptor neuron activation in the nose to activity patterns in the brain and ensuing odor perception remain poorly understood. Artificially stimulating olfactory neurons with electrical currents and measuring ensuing perception may uncover these rules. We therefore inserted an electrode into the nose of 50 human volunteers and applied various currents for about an hour in each case. This induced assorted non-olfactory sensations but never once the perception of odor. To validate contact with the olfactory path, we used functional magnetic resonance imaging to measure resting-state brain activity in 18 subjects before and after un-sensed stimulation. We observed stimulation-induced neural decorrelation specifically in primary olfactory cortex, implying contact with the olfactory path. These results suggest that indiscriminate olfactory activation does not equate with odor perception. Moreover, this effort serendipitously uncovered a novel path for minimally invasive brain stimulation through the nose. PMID:27591145

  8. NADPH oxidases promote apoptosis by activating ZNRF1 ubiquitin ligase in neurons treated with an exogenously applied oxidant

    PubMed Central

    Wakatsuki, Shuji; Araki, Toshiyuki

    2016-01-01

    ABSTRACT Reactive oxygen species (ROS) play an important role in causing neuronal death in a number of neurological disorders. We recently reported that ROS serve as a signal to activate neuronal apoptosis and axonal degeneration by activating ZNRF1 (zinc- and RING-finger 1), a ubiquitin ligase that targets AKT for proteasomal degradation in neurons. In the present study, we showed that the NADPH oxidase family of molecules is required for ZNRF1 activation by epidermal growth factor receptor (EGFR)-dependent phosphorylation in response to axonal injury. We herein demonstrate that NADPH oxidases promote apoptosis by activating ZNRF1, even in neurons treated with an exogenously applied oxidant. These results suggest an important role for NADPH oxidase in the initiation/promotion of neuronal degeneration by increasing ROS in close proximity to protein machineries, including those for ZNRF1 and EGFR, thereby promoting neuronal degeneration. PMID:27195063

  9. Confirmatory factor analysis of different versions of the Body Shape Questionnaire applied to Brazilian university students.

    PubMed

    da Silva, Wanderson Roberto; Dias, Juliana Chioda Ribeiro; Maroco, João; Campos, Juliana Alvares Duarte Bonini

    2014-09-01

    This study aimed at evaluating the validity, reliability, and factorial invariance of the complete (34-item) and shortened (8-item and 16-item) versions of the Body Shape Questionnaire (BSQ) when applied to Brazilian university students. A total of 739 female students with a mean age of 20.44 (standard deviation = 2.45) years participated. Confirmatory factor analysis was conducted to verify the degree to which the one-factor structure satisfies the proposal for the BSQ's expected structure. Two items of the 34-item version were excluded because they had factor weights (λ) < .40. All models had adequate convergent validity (average variance extracted = .43-.58; composite reliability = .85-.97) and internal consistency (α = .85-.97). The 8-item B version was considered the best shortened BSQ version (Akaike information criterion = 84.07, Bayes information criterion = 157.75, Browne-Cudeck criterion = 84.46), with strong invariance for independent samples (Δχ²λ(7) = 5.06, Δχ²Cov(8) = 5.11, Δχ²Res(16) = 19.30). PMID:25010930
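
    The convergent-validity statistics quoted above (average variance extracted and composite reliability) are simple functions of the standardized factor loadings, as sketched below in Python; the loadings in the example are hypothetical and are not the BSQ estimates.

        import numpy as np

        def convergent_validity(loadings):
            """AVE and composite reliability from standardized factor loadings.

            AVE = mean of squared loadings; CR = (sum of loadings)^2 divided by
            (sum of loadings)^2 + sum of error variances (1 - loading^2).
            """
            lam = np.asarray(loadings, dtype=float)
            ave = np.mean(lam ** 2)
            cr = lam.sum() ** 2 / (lam.sum() ** 2 + np.sum(1.0 - lam ** 2))
            return ave, cr

        # Hypothetical loadings for an 8-item one-factor model (not the BSQ estimates)
        ave, cr = convergent_validity([0.72, 0.68, 0.75, 0.70, 0.66, 0.74, 0.69, 0.71])
        print(f"AVE = {ave:.2f}, composite reliability = {cr:.2f}")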

  10. Point pattern analysis with spatially varying covariate effects, applied to the study of cerebrovascular deaths.

    PubMed

    Pinto Junior, Jony Arrais; Gamerman, Dani; Paez, Marina Silva; Fonseca Alves, Regina Helena

    2015-03-30

    This article proposes a modeling approach for handling the spatial heterogeneity present in the study of the geographical pattern of deaths due to cerebrovascular disease. The framework involves a point pattern analysis with components exhibiting spatial variation. Preliminary studies indicate that mortality from this disease and the effect of relevant covariates do not exhibit a uniform geographic distribution. Our model extends a previously proposed model in the literature that uses spatial and non-spatial variables by allowing for spatial variation of the effect of non-spatial covariates. A number of relative risk indicators are derived by comparing different covariate levels, different geographic locations, or both. The methodology is applied to the study of the geographical pattern of cerebrovascular deaths in the city of Rio de Janeiro. The results compare well against existing alternatives, including fixed covariate effects. Our model is able to capture and highlight important data information that would not be noticed otherwise, providing information that is required for appropriate health decision-making.

  11. Optical microsystem for analysis of diffuse reflectance and fluorescence signals applied to early gastrointestinal cancer detection.

    PubMed

    Pimenta, Sara; Castanheira, Elisabete M S; Minas, Graça

    2015-01-30

    The detection of cancer at its earliest stage is crucial in order to increase the probability of a successful treatment. Optical techniques, specifically diffuse reflectance and fluorescence, may considerably improve the ability to detect pre-cancerous lesions. These techniques have high sensitivity to some biomarkers present in the tissues, providing morphological and biochemical information on normal and diseased tissue. The development of a chip-sized spectroscopy microsystem based on these techniques will greatly improve the early diagnosis of gastrointestinal cancers. The main innovation is the detection of the spectroscopic signals using only a few, but representative, spectral bands, allowing for miniaturization. This paper presents the mathematical models, their validation, and the analysis for retrieving data from the measured spectroscopic signals. These models were applied to a set of phantoms clearly representative of gastrointestinal tissues, leading to a more accurate diagnosis by a pathologist. Moreover, it was demonstrated that the models can work with spectroscopic signals reconstructed from only those specific spectral bands. As a result, the viability of the spectroscopy microsystem implementation was proved.

  12. Enhancing DInSAR capabilities for landslide monitoring by applying GIS-based multicriteria filtering analysis

    NASA Astrophysics Data System (ADS)

    Beyene, F.; Knospe, S.; Busch, W.

    2015-04-01

    Landslide detection and monitoring remain difficult with conventional differential radar interferometry (DInSAR) because most pixels of radar interferograms around landslides are affected by different error sources. These are mainly related to the nature of high radar viewing angles and related spatial distortions (such as layover and shadow), temporal decorrelation owing to vegetation cover, and the speed and direction of the sliding masses. On the other hand, GIS can be used to integrate spatial datasets obtained from many sources (including radar and non-radar sources). In this paper, a GRID data model is proposed to integrate deformation data derived from DInSAR processing with other radar-derived data (coherence, layover and shadow, slope and aspect, local incidence angle) and external datasets collected from field study of landslide sites and other sources (geology, geomorphology, hydrology). After coordinate transformation and merging of the data, candidate landslide-representing pixels with high-quality radar signals were identified by applying a GIS-based multicriteria filtering analysis (GIS-MCFA), which excludes grid points in areas of shadow and layover, low coherence, non-detectable and non-landslide deformations, and other possible sources of error in the DInSAR data processing. Finally, the results obtained from GIS-MCFA have been verified using the external datasets (existing landslide sites collected from fieldwork, geological and geomorphologic maps, rainfall data, etc.).
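
    The filtering step described above amounts to combining per-pixel boolean criteria into a single mask over the GRID model. The sketch below shows one plausible formulation in Python with NumPy; the thresholds, look azimuth, and synthetic rasters are assumptions for illustration and are not the paper's criteria.

        import numpy as np

        rng = np.random.default_rng(3)
        shape = (200, 200)                               # grid of candidate points

        coherence       = rng.uniform(0.0, 1.0, shape)   # from InSAR processing
        layover_shadow  = rng.random(shape) < 0.08       # True where geometry is unusable
        slope_deg       = rng.uniform(0.0, 60.0, shape)  # from a DEM
        aspect_deg      = rng.uniform(0.0, 360.0, shape) # downslope direction
        los_deformation = rng.normal(0.0, 5.0, shape)    # mm, DInSAR line-of-sight motion

        # Multicriteria filter (illustrative thresholds, not the paper's values)
        look_azimuth = 100.0
        facing_sensor = np.abs(((aspect_deg - look_azimuth + 180) % 360) - 180) <= 90
        mask = (
            (coherence >= 0.4)                  # keep only reliable phase
            & ~layover_shadow                   # drop layover/shadow pixels
            & (slope_deg >= 5.0)                # landslides need some slope
            & facing_sensor                     # slope visible to the sensor
            & (np.abs(los_deformation) >= 2.0)  # motion above the noise floor
        )
        print(f"{mask.sum()} of {mask.size} grid points kept as landslide candidates")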

  13. Spatial analysis techniques applied to uranium prospecting in Chihuahua State, Mexico

    NASA Astrophysics Data System (ADS)

    Hinojosa de la Garza, Octavio R.; Montero Cabrera, María Elena; Sanín, Luz H.; Reyes Cortés, Manuel; Martínez Meyer, Enrique

    2014-07-01

    To estimate the distribution of uranium minerals in Chihuahua, the advanced statistical model "Maximum Entropy Method" (MaxEnt) was applied. A distinguishing feature of this method is that it can fit relatively complex models to small datasets (x and y data), such as the locations of uranium ores in the State of Chihuahua. For georeferencing the uranium ores, a database from the United States Geological Survey and a workgroup of experts in Mexico was used. The main contribution of this paper is the proposal of maximum entropy techniques to obtain the mineral's potential distribution. Twenty-four environmental layers were used for this model, including topography, gravimetry, climate (WorldClim), soil properties and others, to project the distribution of uranium across the study area. For validation of the places predicted by the model, comparisons were made with other research by the Mexican Service of Geological Survey, with direct exploration of specific areas, and through conversations with former exploration workers of the enterprise "Uranio de Mexico". Results. New uranium areas predicted by the model were validated, finding some relationship between the model predictions and geological faults. Conclusions. Modeling by spatial analysis provides additional information to the energy and mineral resources sectors.
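
    A hedged sketch of presence/background modeling in the spirit described above: known occurrence points and random background points are sampled over environmental layers, a model is fitted, and suitability is predicted over a grid. Penalized logistic regression is used here as a simple stand-in for MaxEnt (the two are related but not identical), and all layer values are synthetic.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(7)

        # Synthetic environmental layers sampled at known occurrences (presence)
        # and at random background locations; 24 layers as in the study, values invented.
        n_layers = 24
        presence   = rng.normal(0.5, 1.0, size=(40, n_layers))    # few known ore points
        background = rng.normal(0.0, 1.0, size=(5000, n_layers))  # random background

        X = np.vstack([presence, background])
        y = np.concatenate([np.ones(len(presence)), np.zeros(len(background))])

        # Penalized logistic regression as a simple stand-in for MaxEnt
        model = LogisticRegression(penalty="l2", C=0.5, max_iter=1000).fit(X, y)

        # Relative suitability over a hypothetical grid of new locations
        grid = rng.normal(0.0, 1.0, size=(1000, n_layers))
        suitability = model.predict_proba(grid)[:, 1]
        print(f"top candidate cell score: {suitability.max():.3f}")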

  14. THE EVOLUTION OF THE JOURNAL OF APPLIED ORAL SCIENCE: A BIBLIOMETRIC ANALYSIS

    PubMed Central

    Ferraz, Valéria Cristina Trindade; Amadei, José Roberto Plácido; Santos, Carlos Ferreira

    2008-01-01

    The purpose of this study was to make a brief diagnosis of the evolution of the Journal of Applied Oral Science (JAOS) between 2005 and 2007, by reviewing quantitative and qualitative aspects of the articles published in the JAOS within this period. All articles published in the JAOS in the time span established for this survey were analyzed retrospectively and a discussion was undertaken on the data referring to the main bibliometric indexes of production, authorship, bibliographic sources of the published articles, and the most frequently cited scientific journals in the main dental research fields. A total of 247 papers authored and coauthored by 1,139 contributors were reviewed, most of them being original research articles. The number of authors per article was 4.61 on average. Regarding the geographic distribution, the authors represented almost all of the Brazilian States. Most published articles belonged to the following dental research fields: Endodontics, Restorative Dentistry, Dental Materials and Prosthodontics. The ranking of the most frequently cited scientific journals included the most reputable publications in these dental research fields. In conclusion, between 2005 and 2007, the JAOS either maintained or improved considerably its bibliometric indexes. The analysis of the data retrieved in this study allowed us to evaluate the journal's current management strategies and to identify important issues that will help outline the future directions for the internationalization of this journal. PMID:19082402

  15. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Hanson, J. M.; Beard, B. B.

    2010-01-01

    This Technical Publication (TP) is meant to address a number of topics related to the application of Monte Carlo simulation to launch vehicle design and requirements analysis. Although the focus is on a launch vehicle application, the methods may be applied to other complex systems as well. The TP is organized so that all the important topics are covered in the main text, and detailed derivations are in the appendices. The TP first introduces Monte Carlo simulation and the major topics to be discussed, including discussion of the input distributions for Monte Carlo runs, testing the simulation, how many runs are necessary for verification of requirements, what to do if results are desired for events that happen only rarely, and postprocessing, including analyzing any failed runs, examples of useful output products, and statistical information for generating desired results from the output data. Topics in the appendices include some tables for requirements verification, derivation of the number of runs required and generation of output probabilistic data with consumer risk included, derivation of launch vehicle models to include possible variations of assembled vehicles, minimization of a consumable to achieve a two-dimensional statistical result, recontact probability during staging, ensuring duplicated Monte Carlo random variations, and importance sampling.
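
    One of the appendix topics mentioned above, the number of runs required for requirements verification with consumer risk included, can be illustrated with the standard binomial argument: choose the smallest number of runs such that, if the true success probability only just met the requirement, the chance of observing the allowed number of failures or fewer would not exceed the consumer risk. The sketch below uses that generic formulation in Python; the requirement and risk values are examples only, and the TP's own derivation may include additional terms.

        import math

        def runs_required(p_requirement, consumer_risk, failures_allowed=0):
            """Smallest n such that observing <= failures_allowed failures in n runs
            demonstrates success probability >= p_requirement at the stated risk.

            Found by searching n until the binomial tail falls below the risk.
            """
            n = failures_allowed + 1
            while True:
                # Probability of seeing <= failures_allowed failures if the true
                # success probability were exactly p_requirement (worst compliant case)
                tail = sum(
                    math.comb(n, k) * (1 - p_requirement) ** k * p_requirement ** (n - k)
                    for k in range(failures_allowed + 1)
                )
                if tail <= consumer_risk:
                    return n
                n += 1

        # Example: verify a 99.7% success requirement at 10% consumer risk
        print(runs_required(0.997, 0.10, failures_allowed=0))  # about 767 zero-failure runs
        print(runs_required(0.997, 0.10, failures_allowed=1))  # more runs if one failure is seen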

  16. The evolution of the Journal of Applied Oral Science: a bibliometric analysis.

    PubMed

    Ferraz, Valéria Cristina Trindade; Amadei, José Roberto Plácido; Santos, Carlos Ferreira

    2008-01-01

    The purpose of this study was to make a brief diagnosis of the evolution of the Journal of Applied Oral Science (JAOS) between 2005 and 2007, by reviewing quantitative and qualitative aspects of the articles published in the JAOS within this period. All articles published in the JAOS in the time span established for this survey were analyzed retrospectively and a discussion was undertaken on the data referring to the main bibliometric indexes of production, authorship, bibliographic sources of the published articles, and the most frequently cited scientific journals in the main dental research fields. A total of 247 papers authored and co-authored by 1,139 contributors were reviewed, most of them being original research articles. The number of authors per article was 4.61 on average. Regarding the geographic distribution, the authors represented almost all of the Brazilian States. Most published articles belonged to the following dental research fields: Endodontics, Restorative Dentistry, Dental Materials and Prosthodontics. The ranking of the most frequently cited scientific journals included the most reputable publications in these dental research fields. In conclusion, between 2005 and 2007, the JAOS either maintained or improved considerably its bibliometric indexes. The analysis of the data retrieved in this study allowed us to evaluate the journal's current management strategies and to identify important issues that will help outline the future directions for the internationalization of this journal. PMID:19082402

  17. The Subarray MVDR Beamformer: A Space-Time Adaptive Processor Applied to Active Sonar

    NASA Astrophysics Data System (ADS)

    Bezanson, Leverett Guidroz

    The research for this thesis was mainly performed at the NATO Underwater Research Center, now named the Center for Maritime Research and Experimentation (CMRE). The purpose of the research was to improve the detection of underwater targets in the littoral ocean when using active sonar. Currently these detections are being made by towed line arrays using a delay-and-sum beamformer for bearing measurements and noise suppression. This method of beamforming can suffer from the reverberation that is commonly present in the littoral environment. A proposed solution is to use an adaptive beamformer, which can attenuate reverberation and increase the bearing resolution. Adaptive beamforming algorithms have existed for a long time but are typically not used in the active case due to the limited amount of observable data available for adaptation. This deficiency is caused by the conflicting requirements of high Doppler resolution for target detection and small time windows for building up full-rank covariance estimates. The algorithms are also sensitive to bearing estimate errors, which commonly occur in active sonar systems. Recently it has been proposed to overcome these limitations through the use of reduced beamspace adaptive beamforming. The Subarray MVDR beamformer is analyzed, both against simulated data and against experimental data collected by CMRE during the GLINT/NGAS11 experiment in 2011. Simulation results indicate that the Subarray MVDR beamformer rejects interfering signals that are not effectively attenuated by conventional beamforming. The application of the Subarray MVDR beamformer to the experimental data shows that the Doppler spread of the reverberation ridge is reduced and the bearing resolution improved. The signal-to-noise ratio is calculated at the target location and also shows improvement. These calculated and observed performance metrics indicate an improvement of detection in reverberation noise.
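
    The subarray MVDR weights follow the Capon formula w = R^-1 d / (d^H R^-1 d), usually with diagonal loading when the snapshot-limited covariance estimate is poorly conditioned. The sketch below is a minimal, generic illustration in Python; the array geometry, loading level, and data are invented and are not the processing chain used at CMRE.

        import numpy as np

        def mvdr_weights(snapshots, steering, loading=1e-2):
            """MVDR/Capon weights w = R^-1 d / (d^H R^-1 d) for one look direction.

            snapshots: complex array (n_sensors, n_snapshots) from one subarray.
            loading: diagonal loading to stabilise the covariance estimate when
            few snapshots are available (the usual limitation in active sonar).
            """
            n = snapshots.shape[0]
            R = snapshots @ snapshots.conj().T / snapshots.shape[1]
            R += loading * np.trace(R).real / n * np.eye(n)
            Rinv_d = np.linalg.solve(R, steering)
            return Rinv_d / (steering.conj() @ Rinv_d)

        # Toy example: 8-element half-wavelength subarray steered to 20 degrees
        n_sensors, n_snap = 8, 32
        angle = np.deg2rad(20.0)
        d = np.exp(1j * np.pi * np.arange(n_sensors) * np.sin(angle))
        x = (np.random.default_rng(5).normal(size=(n_sensors, n_snap))
             + 1j * np.random.default_rng(6).normal(size=(n_sensors, n_snap)))
        w = mvdr_weights(x, d)
        print(np.abs(w.conj() @ d))   # unity (distortionless) response in the look direction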

  18. Geometric Nonlinear Finite Element Analysis of Active Fibre Composite Bimorphs

    NASA Astrophysics Data System (ADS)

    Kernaghan, Robert

    Active fibre composite-actuated bimorphic actuators were studied in order to measure deflection performance. The deflection of the actuators was a function of the actuating electric potential applied to the active material as well as the magnitude of the axial preload applied to the bimorphic structure. This problem required the use of geometric nonlinear modeling techniques. Geometric nonlinear finite element analysis was undertaken to determine the deflection performance of Macro Fibre Composite (MFC)- and Hollow Active Fibre (HAFC)-actuated bimorphic structures. A physical prototype MFC-actuated bimorphic structure was manufactured in order to verify the results obtained by the finite element analysis. These analyses determined that the bimorphic actuators were capable of significant deflection. The analyses also determined that the axial preload of the bimorphic actuators significantly amplified their deflection performance. The deflection performance of the bimorphic actuators suggests that they could be candidates to act as actuators for the morphing wing of a micro unmanned air vehicle.

  19. Active vibration-suppression systems applied to twin-tail buffeting

    NASA Astrophysics Data System (ADS)

    Hopkins, Mark A.; Henderson, Douglas A.; Moses, Robert W.; Ryall, Thomas G.; Zimcik, David G.; Spangler, Ronald L., Jr.

    1998-06-01

    Buffeting is an aeroelastic phenomenon that plagues high performance aircraft, especially those with twin vertical tails. Unsteady vortices emanate from wing/fuselage leading-edge extensions when these aircraft maneuver at high angles of attack. These aircraft are designed such that the vortices shed while maneuvering at high angles of attack and improve the lift-to-drag ratio of the aircraft. With proper placement and sizing of the vertical tails, this improvement may be maintained without adverse effects on the tails. However, there are tail locations and angles of attack where these vortices burst and immerse the vertical tails in their wake, inducing severe structural vibrations. The resulting buffet loads and severe vertical tail response become an airframe life and maintenance concern as life-cycle costs increase. Several passive methods have been investigated to reduce the buffeting of these vertical tails, with limited success. As demonstrated through analyses, wind-tunnel investigations, and full-scale ground tests, active control systems offer a promising solution to alleviate buffet-induced strain and increase the fatigue life of vertical tails. A collaborative research project including the US, Canada, and Australia is in place to demonstrate active buffet load alleviation systems on military aircraft. The present paper provides details on this collaborative project and other research efforts to reduce the buffeting response of vertical tails in fighter aircraft.

  20. Overview of electromagnetic methods applied in active volcanic areas of western United States

    NASA Astrophysics Data System (ADS)

    Skokan, Catherine K.

    1993-06-01

    A better understanding of active volcanic areas in the United States through electromagnetic geophysical studies received its foundation from the many surveys done for geothermal exploration in the 1970s. Investigations by governmental, industrial, and academic agencies include (but are not limited to) mapping of the Cascades, the Long Valley/Mono area, the Jemez volcanic field, Yellowstone Park, and an area in Colorado. For one example, Mt. Konocti in the Mayacamas Mountains, California, gravity, magnetic, and seismic, as well as electromagnetic methods have all been used in an attempt to gain a better understanding of the subsurface structure. In each of these volcanic regions, anomalous zones were mapped. When conductive, these anomalies were interpreted to be correlated with hydrothermal activity and not to represent a magma chamber. Electrical and electromagnetic geophysical methods can offer valuable information in the understanding of volcanoes because they are the methods most sensitive to changes in temperature and, therefore, can best map the heat budget and hydrological character to aid in the prediction of eruptions.

  1. Decision Analysis Science Modeling for Application and Fielding Selection Applied to Metal Decontamination Technologies

    SciTech Connect

    Lagos, L.E.; Ebadian, M.A.

    1998-01-01

    During the decontamination and decommissioning (D and D) activities being conducted by the U.S. Department of Energy (DOE), approximately 550,000 metric tons of contaminated metal will be generated by the disposition of contaminated buildings. The majority of the structural steel is considered to be radiologically contaminated. The D and D activities require the treatment of the structural steel to reduce occupational and environmental radiological exposures during dismantlement. Treatment technologies may also be required for possible recycling. Many proven commercial treatment technologies are available. These treatment processes vary in aggressiveness, safety requirements, secondary waste generation, necessary capital, and operation and maintenance costs. Choosing the appropriate technology to meet the decontamination objectives for structural steel is a difficult process. A single information source comparing innovative, nuclear, and non-nuclear technologies in the areas of safety, cost and effectiveness is not currently commercially available to support a detailed analysis. This study presents comparable data on the operation and maintenance, cost, and health and safety aspects of three readily available technologies and one innovative technology for nuclear decontamination. The technologies include the Advance Recyclable Media System (ARMS{trademark}), NELCO Porta Shot Blast{trademark} (JHJ-2000), Pegasus Coating Removal System 7 (PCRS-7) and the innovative laser ablation technology called the Yag Eraser{trademark}.

  2. Applying Antonio Gramsci's philosophy to postcolonial feminist social and political activism in nursing.

    PubMed

    Racine, Louise

    2009-07-01

    Through its social and political activism goals, postcolonial feminist theoretical approaches not only focus on individual issues that affect health but encompass the examination of the complex interplay between neocolonialism, neoliberalism, and globalization, in mediating the health of non-Western immigrants and refugees. Postcolonial feminism holds the promise to influence nursing research and practice in the 21st century where health remains a goal to achieve and a commitment for humanity. This is especially relevant for nurses, who act as global citizens and as voices for the voiceless. The commitment of nursing to social justice must be further strengthened by relying on postcolonial theories to address issues of health inequities that arise from marginalization and racialization. In using postcolonial feminist theories, nurse researchers locate the inquiry process within a Gramscian philosophy of praxis that represents knowledge in action. PMID:19527439

  3. Osteogenic Activity of Locally Applied Small Molecule Drugs in a Rat Femur Defect Model

    PubMed Central

    Cottrell, Jessica A.; Vales, Francis M.; Schachter, Deborah; Wadsworth, Scott; Gundlapalli, Rama; Kapadia, Rasesh; O'Connor, J. Patrick

    2010-01-01

    The long-term success of arthroplastic joints is dependent on the stabilization of the implant within the skeletal site. Movement of the arthroplastic implant within the bone can stimulate osteolysis, and therefore methods which promote rigid fixation or bone growth are expected to enhance implant stability and the long-term success of joint arthroplasty. In the present study, we used a simple bilateral bone defect model to analyze the osteogenic activity of three small-molecule drug implants via microcomputerized tomography (micro-CT) and histomorphometry. In this study, we show that local delivery of alendronate, but not lovastatin or omeprazole, led to significant new bone formation at the defect site. Since alendronate impedes osteoclast development, it is theorized that alendronate treatment results in a net increase in bone formation by preventing osteoclast-mediated remodeling of the newly formed bone and upregulating osteoblasts. PMID:20625499

  4. Intelligent error correction method applied on an active pixel sensor based star tracker

    NASA Astrophysics Data System (ADS)

    Schmidt, Uwe

    2005-10-01

    Star trackers are opto-electronic sensors used on board satellites for autonomous inertial attitude determination. During the last years star trackers became more and more important in the field of attitude and orbit control system (AOCS) sensors. To date, high-performance star trackers have been based on charge-coupled device (CCD) optical camera heads. The active pixel sensor (APS) technology, introduced in the early 1990s, now allows the beneficial replacement of CCD detectors by APS detectors with respect to performance, reliability, power, mass and cost. The company's heritage in star tracker design started in the early 1980s with the launch of the world's first fully autonomous star tracker system, ASTRO1, to the Russian MIR space station. Jena-Optronik recently developed an active pixel sensor based autonomous star tracker, "ASTRO APS", as successor of the CCD-based star tracker product series ASTRO1, ASTRO5, ASTRO10 and ASTRO15. Key features of the APS detector technology are a true xy-address random access, multiple-windowing read out and on-chip signal processing including the analogue-to-digital conversion. These features can be used for robust star tracking at high slew rates and under adverse conditions such as stray light and solar-flare-induced single event upsets. A special algorithm has been developed to manage the typical APS detector error contributors such as fixed pattern noise (FPN), dark signal non-uniformity (DSNU) and white spots. The algorithm works fully autonomously and adapts, for example, to increasing DSNU and newly appearing white spots automatically, without ground maintenance or re-calibration. In contrast to conventional correction methods, the described algorithm does not need calibration data memory such as full-image-sized calibration data sets. The application of the presented algorithm managing the typical APS detector error contributors is a key element for the design of star trackers for long term satellite applications like

  5. Solar load ratio method applied to commercial building active solar system sizing

    SciTech Connect

    Schnurr, N.M.; Hunn, B.D.; Williamson, K.D. III

    1981-01-01

    The hourly simulation procedure is the DOE-2 building energy analysis computer program. It is capable of calculating the loads and of simulating various control strategies in detail for both residential and commercial buildings, and yet is computationally efficient enough to be used for extensive parametric studies. In addition to a Building Service Hot Water (BSHW) system and a combined space heating and hot water system using liquid collectors for a commercial building, both analyzed previously, a space heating system using an air collector is analyzed. A series of runs is made for systems using evacuated tube collectors for comparison to flat-plate collectors, and the effects of additional system design parameters are investigated. Also, the generic collector types are characterized by standard efficiency curves, rather than by detailed collector specifications. (MHR)

  6. Lessons learned from applied field research activities in Africa during the malaria eradication era

    PubMed Central

    Bruce-Chwatt, Leonard J.

    1984-01-01

    The Malaria Conference in Equatorial Africa, convened by the World Health Organization in 1950 in Kampala, Uganda, was a milestone in the history of modern malaria control activities on the continent of Africa. It presented and assessed the available international information on epidemiological aspects of this disease and attempted to coordinate the various methods of research and control of malaria. Its two main recommendations were that malaria should be controlled by all available methods, irrespective of the degree of endemicity of the disease, and that the benefits that malaria control might bring to the indigenous population should be evaluated. The first period of field research and pilot control projects in Africa was between 1950 and 1964. A large number of studies in several African countries showed that the use of residual insecticides such as DDT and HCH might decrease, at times considerably, the amount of malaria transmission, but interruption of transmission could not be achieved, except in two relatively small projects in the forest areas of Cameroon and Liberia. During the second period, from 1965 to 1974, the difficulties of malaria eradication and control in Africa became more evident because of the development of resistance of Anopheles gambiae to DDT, HCH, and dieldrin; moreover administrative, logistic, and financial problems had emerged. It became clear that the prospects for malaria control (let alone those for eradication) were related to the availability of a network of basic health services. A number of “pre-eradication” programmes were set up in order to develop better methods of malaria control and to improve the rural health infrastructures. Much field research on the chemotherapy of malaria was carried out and the value of collective or selective administration of antimalarial drugs was fully recognized, although it became obvious that this could not play an important part in the decrease of transmission of malaria in Africa. The

  7. Single Phase Passive Rectification Versus Active Rectification Applied to High Power Stirling Engines

    NASA Technical Reports Server (NTRS)

    Santiago, Walter; Birchenough, Arthur G.

    2006-01-01

    Stirling engine converters are being considered as potential candidates for high power energy conversion systems required by future NASA exploration missions. These engines typically contain two major moving parts, the displacer and the piston, with a linear alternator attached to the piston to produce a single-phase sinusoidal waveform at a specific electrical frequency. Since all Stirling engines operate at low electrical frequencies (less than or equal to 100 Hz), space exploration missions that employ these engines will be required to use a DC power management and distribution (PMAD) system instead of an AC PMAD system to save space and weight. Therefore, to supply such DC power, an AC to DC converter is connected to the Stirling engine. There are two types of AC to DC converters that can be employed: a passive full-bridge diode rectifier and an active switching full-bridge rectifier. Due to the inherent line inductance of the Stirling Engine-Linear Alternator (SE-LA), its sinusoidal voltage and current will be phase shifted, producing a power factor below 1. In order to keep the power factor close to unity, both AC to DC converter topologies implement power factor correction. This paper discusses these power factor correction methods as well as their impact on overall mass for exploration applications. Simulation results for both AC to DC converter topologies with power factor correction, as a function of output power and SE-LA line inductance impedance, are presented and compared.
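
    The phase shift described above follows directly from the alternator's series line inductance. The sketch below is a hedged illustration of that relationship for a sinusoidal source driving an equivalent series R-L load; the resistance, inductance and frequency values are illustrative, not taken from the paper.

```python
# Displacement power factor of a series R-L load: PF = cos(arctan(w*L/R)).
# Numbers are illustrative only.
import math

def displacement_power_factor(R_load, L_line, f_electrical):
    """Power factor seen by the alternator at electrical frequency f (Hz)."""
    omega = 2.0 * math.pi * f_electrical
    return math.cos(math.atan(omega * L_line / R_load))

pf = displacement_power_factor(R_load=5.0, L_line=10e-3, f_electrical=100.0)
print(f"power factor without correction: {pf:.3f}")  # ~0.62 for these values
```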

  8. Activity-based cost management. Part II: Applied to a respiratory protection program.

    PubMed

    Brandt, M T; Levine, S P; Smith, D G; Ettinger, H J; Gallimore, B F

    1998-05-01

    To demonstrate the relevance of activity-based cost management (ABCM) for the occupational and environmental health community, the investigators used data generated by an ABCM model of a respiratory protection program (RPP) to develop options for solving a business problem. The RPP manager in this hypothetical but realistic business scenario is faced with a 25% budget cut and a 10% increase in demand for RPP services. The manager's dilemma is to maintain the integrity of the RPP while absorbing a significant budget cut. Various cost savings options are developed, and the assumptions under which these options operate are presented. It is emphasized that the RPP manager's primary responsibility is to assure worker health and safety by first understanding the technical issues, merits, and implications of any cost-cutting option that may be considered. It is argued that only then should the manager consider the financial merits of the possible solutions to this business problem. In this way worker health and safety, and environmental protection goals, can continue to be achieved in an economic climate of cost cutting and downsizing. PMID:9622907

  9. Experimental Studies of Active and Passive Flow Control Techniques Applied in a Twin Air-Intake

    PubMed Central

    Joshi, Shrey; Jindal, Aman; Maurya, Shivam P.; Jain, Anuj

    2013-01-01

    The flow control in twin air-intakes is necessary to improve the performance characteristics, since the flow traveling through curved and diffused paths becomes complex, especially after merging. The paper presents a comparison between two well-known techniques of flow control, active and passive. It presents an effective design of a vortex generator jet (VGJ) and a vane-type passive vortex generator (VG) and uses them in a twin air-intake duct in different combinations to establish their effectiveness in improving the performance characteristics. The VGJ is designed to inject flow from the side wall at pitch angles of 90 degrees and 45 degrees. Corotating (parallel) and counter-rotating (V-shape) configurations of the vane-type VG are used. It is observed that the VGJ has the potential to change the flow pattern drastically as compared to the vane-type VG. When the VGJ is directed perpendicular to the side walls of the air-intake at a pitch angle of 90 degrees, static pressure recovery is increased by 7.8% and total pressure loss is reduced by 40.7%, the best among all VGJ cases tested. For the bigger-sized VG attached to the side walls of the air-intake, static pressure recovery is increased by 5.3%, but total pressure loss is reduced by only 4.5% as compared to all other VG cases. PMID:23935422

  10. Applying quantitative structure-activity relationship approaches to nanotoxicology: current status and future potential.

    PubMed

    Winkler, David A; Mombelli, Enrico; Pietroiusti, Antonio; Tran, Lang; Worth, Andrew; Fadeel, Bengt; McCall, Maxine J

    2013-11-01

    The potential (eco)toxicological hazard posed by engineered nanoparticles is a major scientific and societal concern since several industrial sectors (e.g. electronics, biomedicine, and cosmetics) are exploiting the innovative properties of nanostructures resulting in their large-scale production. Many consumer products contain nanomaterials and, given their complex life-cycle, it is essential to anticipate their (eco)toxicological properties in a fast and inexpensive way in order to mitigate adverse effects on human health and the environment. In this context, the application of the structure-toxicity paradigm to nanomaterials represents a promising approach. Indeed, according to this paradigm, it is possible to predict toxicological effects induced by chemicals on the basis of their structural similarity with chemicals for which toxicological endpoints have been previously measured. These structure-toxicity relationships can be quantitative or qualitative in nature and they can predict toxicological effects directly from the physicochemical properties of the entities (e.g. nanoparticles) of interest. Therefore, this approach can aid in prioritizing resources in toxicological investigations while reducing the ethical and monetary costs that are related to animal testing. The purpose of this review is to provide a summary of recent key advances in the field of QSAR modelling of nanomaterial toxicity, to identify the major gaps in research required to accelerate the use of quantitative structure-activity relationship (QSAR) methods, and to provide a roadmap for future research needed to achieve QSAR models useful for regulatory purposes. PMID:23165187

  11. Experimental studies of active and passive flow control techniques applied in a twin air-intake.

    PubMed

    Paul, Akshoy Ranjan; Joshi, Shrey; Jindal, Aman; Maurya, Shivam P; Jain, Anuj

    2013-01-01

    The flow control in twin air-intakes is necessary to improve the performance characteristics, since the flow traveling through curved and diffused paths becomes complex, especially after merging. The paper presents a comparison between two well-known techniques of flow control, active and passive. It presents an effective design of a vortex generator jet (VGJ) and a vane-type passive vortex generator (VG) and uses them in a twin air-intake duct in different combinations to establish their effectiveness in improving the performance characteristics. The VGJ is designed to inject flow from the side wall at pitch angles of 90 degrees and 45 degrees. Corotating (parallel) and counter-rotating (V-shape) configurations of the vane-type VG are used. It is observed that the VGJ has the potential to change the flow pattern drastically as compared to the vane-type VG. When the VGJ is directed perpendicular to the side walls of the air-intake at a pitch angle of 90 degrees, static pressure recovery is increased by 7.8% and total pressure loss is reduced by 40.7%, the best among all VGJ cases tested. For the bigger-sized VG attached to the side walls of the air-intake, static pressure recovery is increased by 5.3%, but total pressure loss is reduced by only 4.5% as compared to all other VG cases.

  12. Applying quantitative structure-activity relationship approaches to nanotoxicology: current status and future potential.

    PubMed

    Winkler, David A; Mombelli, Enrico; Pietroiusti, Antonio; Tran, Lang; Worth, Andrew; Fadeel, Bengt; McCall, Maxine J

    2013-11-01

    The potential (eco)toxicological hazard posed by engineered nanoparticles is a major scientific and societal concern since several industrial sectors (e.g. electronics, biomedicine, and cosmetics) are exploiting the innovative properties of nanostructures resulting in their large-scale production. Many consumer products contain nanomaterials and, given their complex life-cycle, it is essential to anticipate their (eco)toxicological properties in a fast and inexpensive way in order to mitigate adverse effects on human health and the environment. In this context, the application of the structure-toxicity paradigm to nanomaterials represents a promising approach. Indeed, according to this paradigm, it is possible to predict toxicological effects induced by chemicals on the basis of their structural similarity with chemicals for which toxicological endpoints have been previously measured. These structure-toxicity relationships can be quantitative or qualitative in nature and they can predict toxicological effects directly from the physicochemical properties of the entities (e.g. nanoparticles) of interest. Therefore, this approach can aid in prioritizing resources in toxicological investigations while reducing the ethical and monetary costs that are related to animal testing. The purpose of this review is to provide a summary of recent key advances in the field of QSAR modelling of nanomaterial toxicity, to identify the major gaps in research required to accelerate the use of quantitative structure-activity relationship (QSAR) methods, and to provide a roadmap for future research needed to achieve QSAR models useful for regulatory purposes.

  13. Probabilistic latent semantic analysis applied to whole bacterial genomes identifies common genomic features.

    PubMed

    Rusakovica, Julija; Hallinan, Jennifer; Wipat, Anil; Zuliani, Paolo

    2014-01-01

    The spread of drug resistance amongst clinically-important bacteria is a serious, and growing, problem [1]. However, the analysis of entire genomes requires considerable computational effort, usually including the assembly of the genome and subsequent identification of genes known to be important in pathology. An alternative approach is to use computational algorithms to identify genomic differences between pathogenic and non-pathogenic bacteria, even without knowing the biological meaning of those differences. To overcome this problem, a range of techniques for dimensionality reduction have been developed. One such approach is known as latent-variable models [2]. In latent-variable models dimensionality reduction is achieved by representing high-dimensional data by a few hidden or latent variables, which are not directly observed but inferred from the observed variables present in the model. Probabilistic Latent Semantic Analysis (PLSA) is an extension of LSA [3]. PLSA is based on a mixture decomposition derived from a latent class model. The main objective of the algorithm, as in LSA, is to represent high-dimensional co-occurrence information in a lower-dimensional way in order to discover the hidden semantic structure of the data using a probabilistic framework. In this work we applied the PLSA approach to analyse the common genomic features in methicillin-resistant Staphylococcus aureus, using tokens derived from amino acid sequences rather than DNA. We characterised genome-scale amino acid sequences in terms of their components, and then investigated the relationships between genomes and tokens and the phenotypes they generated. As a control we used the non-pathogenic model Gram-positive bacterium Bacillus subtilis. PMID:24980693
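
    PLSA factorizes a document-by-token (here, genome-by-token) co-occurrence matrix as P(w|d) = sum_z P(w|z) P(z|d) and is usually fitted by expectation-maximization. The following is a minimal numpy sketch of those EM updates on a synthetic count matrix; it is a generic illustration, not the authors' pipeline or data.

```python
# Minimal PLSA fitted by EM on a synthetic genome-by-token count matrix.
import numpy as np

rng = np.random.default_rng(0)
n_docs, n_tokens, n_topics = 6, 40, 3
counts = rng.poisson(1.0, size=(n_docs, n_tokens)).astype(float)   # n(d, w)

p_w_given_z = rng.dirichlet(np.ones(n_tokens), size=n_topics)       # P(w|z)
p_z_given_d = rng.dirichlet(np.ones(n_topics), size=n_docs)         # P(z|d)

for _ in range(100):
    # E-step: responsibilities P(z|d,w) proportional to P(z|d) * P(w|z)
    joint = p_z_given_d[:, :, None] * p_w_given_z[None, :, :]       # shape (d, z, w)
    joint /= joint.sum(axis=1, keepdims=True) + 1e-12
    # M-step: re-estimate both conditional distributions
    weighted = counts[:, None, :] * joint                            # n(d,w) * P(z|d,w)
    p_w_given_z = weighted.sum(axis=0)
    p_w_given_z /= p_w_given_z.sum(axis=1, keepdims=True)
    p_z_given_d = weighted.sum(axis=2)
    p_z_given_d /= p_z_given_d.sum(axis=1, keepdims=True)

print("latent-class mixture of genome 0:", np.round(p_z_given_d[0], 3))
```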

  14. Quasi-dynamic Material Flow Analysis applied to the Austrian Phosphorus cycle

    NASA Astrophysics Data System (ADS)

    Zoboli, Ottavia; Rechberger, Helmut

    2013-04-01

    Phosphorus (P) is one of the key elements that sustain life on earth and that allow achieving the current high levels of food production worldwide. It is a non-renewable resource, without any existing substitute. Because of its current dissipative use by mankind and its very slow geochemical cycle, this resource is rapidly depleting, and it is strongly connected to the problem of ensuring food security. Moreover, P is also associated with important environmental problems. Its extraction often generates hazardous wastes, while its accumulation in water bodies can lead to eutrophication, with consequent severe ecological damage. It is therefore necessary to analyze and understand in detail the system of P, with regard to its use and management, to identify the processes that should be targeted in order to reduce the overall consumption of this resource. This work aims at establishing a generic quasi-dynamic model, which describes the Austrian P-budget and which allows investigating trends in P use in the past, as well as selected future scenarios. Given the importance of P throughout the whole anthropogenic metabolism, the model is based on a comprehensive system that encompasses several economic sectors, from agriculture and animal husbandry to industry, consumption, and waste and wastewater treatment. Furthermore, it includes the hydrosphere, to assess the losses of P into water bodies, due to the importance of eutrophication problems. The methodology applied is Material Flow Analysis (MFA), a systemic approach to assess and balance the stocks and flows of a material within a system defined in space and time. Moreover, the model is implemented in the software STAN, a freeware tool tailor-made for MFA. Particular attention is paid to the characteristics and the quality of the data, in order to include data uncertainty and error propagation in the dynamic balance.
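
    At its core, MFA enforces a mass balance on every process: the stock change equals inflows minus outflows, and data uncertainty propagates through that balance. The sketch below illustrates this bookkeeping for a single hypothetical process; the flow names, values and the assumption of independent Gaussian errors are placeholders, not Austrian P-budget data or the STAN reconciliation algorithm.

```python
# Single-process material flow balance with simple Gaussian error propagation.
# Flows are (mean, standard deviation) in t P/yr; values are illustrative.
import math

def balance(inflows, outflows):
    """Return (stock change, std) assuming independent flow uncertainties."""
    mean = sum(m for m, _ in inflows.values()) - sum(m for m, _ in outflows.values())
    std = math.sqrt(sum(s ** 2 for _, s in inflows.values()) +
                    sum(s ** 2 for _, s in outflows.values()))
    return mean, std

stock_change, sigma = balance(
    inflows={"mineral_fertiliser": (30.0, 3.0), "feed_imports": (12.0, 2.0)},
    outflows={"crop_harvest": (25.0, 2.5), "losses_to_hydrosphere": (4.0, 1.0)},
)
print(f"stock change: {stock_change:.1f} +/- {sigma:.1f} t P/yr")
```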

  15. Probabilistic latent semantic analysis applied to whole bacterial genomes identifies common genomic features.

    PubMed

    Rusakovica, Julija; Hallinan, Jennifer; Wipat, Anil; Zuliani, Paolo

    2014-06-30

    The spread of drug resistance amongst clinically-important bacteria is a serious, and growing, problem [1]. However, the analysis of entire genomes requires considerable computational effort, usually including the assembly of the genome and subsequent identification of genes known to be important in pathology. An alternative approach is to use computational algorithms to identify genomic differences between pathogenic and non-pathogenic bacteria, even without knowing the biological meaning of those differences. To overcome this problem, a range of techniques for dimensionality reduction have been developed. One such approach is known as latent-variable models [2]. In latent-variable models dimensionality reduction is achieved by representing high-dimensional data by a few hidden or latent variables, which are not directly observed but inferred from the observed variables present in the model. Probabilistic Latent Semantic Analysis (PLSA) is an extension of LSA [3]. PLSA is based on a mixture decomposition derived from a latent class model. The main objective of the algorithm, as in LSA, is to represent high-dimensional co-occurrence information in a lower-dimensional way in order to discover the hidden semantic structure of the data using a probabilistic framework. In this work we applied the PLSA approach to analyse the common genomic features in methicillin-resistant Staphylococcus aureus, using tokens derived from amino acid sequences rather than DNA. We characterised genome-scale amino acid sequences in terms of their components, and then investigated the relationships between genomes and tokens and the phenotypes they generated. As a control we used the non-pathogenic model Gram-positive bacterium Bacillus subtilis.

  16. Uncertainty propagation analysis applied to volcanic ash dispersal at Mt. Etna by using a Lagrangian model

    NASA Astrophysics Data System (ADS)

    de'Michieli Vitturi, Mattia; Pardini, Federica; Spanu, Antonio; Neri, Augusto; Vittoria Salvetti, Maria

    2015-04-01

    Volcanic ash clouds represent a major hazard for populations living near volcanic centers, posing a risk to humans and a potential threat to crops, ground infrastructure, and aviation traffic. Lagrangian particle dispersal models are commonly used for tracking ash particles emitted from volcanic plumes and transported under the action of atmospheric wind fields. In this work, we present the results of an uncertainty propagation analysis applied to volcanic ash dispersal from weak plumes, with specific focus on the uncertainties related to the grain-size distribution of the mixture. To this aim, the Eulerian fully compressible mesoscale non-hydrostatic model WRF was used to generate the driving wind, representative of the atmospheric conditions occurring during the event of November 24, 2006 at Mt. Etna. Then, the Lagrangian particle model LPAC (de' Michieli Vitturi et al., JGR 2010) was used to simulate the transport of mass particles under the action of atmospheric conditions. The particle motion equations were derived by expressing the Lagrangian particle acceleration as the sum of the forces acting along its trajectory, with drag forces calculated as a function of particle diameter, density, shape and Reynolds number. The simulations were representative of weak plume events of Mt. Etna and aimed to quantify the effect on the dispersal process of the uncertainty in the particle sphericity and in the mean and variance of a log-normal distribution function describing the grain-size of ash particles released from the eruptive column. In order to analyze the sensitivity of particle dispersal to these uncertain parameters with a reasonable number of simulations, and therefore with affordable computational costs, response surfaces in the parameter space were built by using the generalized polynomial chaos technique. The uncertainty analysis allowed us to quantify the most probable values, as well as their pdf, of the number of particles as well as of the mean and
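
    The response-surface step described above can be illustrated with a much simpler surrogate: fit a low-order polynomial to a handful of model evaluations over the uncertain parameters, then sample the cheap surrogate to obtain output statistics. The sketch below uses plain monomials and a toy stand-in for the dispersal model; real generalized polynomial chaos uses orthogonal polynomial bases, and the parameter ranges and distributions are invented.

```python
# Quadratic response surface fitted to a toy "dispersal" output as a function
# of two uncertain inputs (particle sphericity, mean grain size in phi units),
# then cheap Monte Carlo on the surrogate to estimate the output pdf.
import numpy as np

rng = np.random.default_rng(1)

def expensive_model(sphericity, mean_phi):
    """Stand-in for a Lagrangian dispersal run (e.g., mass transported beyond 50 km)."""
    return 10.0 * (1.2 - sphericity) + 2.0 * mean_phi + 1.5 * sphericity * mean_phi

# Small design of experiments over the uncertain parameters
sph = rng.uniform(0.6, 1.0, size=30)
phi = rng.uniform(1.0, 4.0, size=30)
y = expensive_model(sph, phi)

# Least-squares fit of the quadratic surrogate
X = np.column_stack([np.ones_like(sph), sph, phi, sph**2, sph * phi, phi**2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

# Propagate input uncertainty through the surrogate
s = rng.uniform(0.6, 1.0, 100_000)
p = rng.normal(2.5, 0.5, 100_000)
Xs = np.column_stack([np.ones_like(s), s, p, s**2, s * p, p**2])
samples = Xs @ coeffs
print(f"surrogate mean = {samples.mean():.2f}, 95th percentile = {np.percentile(samples, 95):.2f}")
```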

  17. Valuation of OSA process and folic acid addition as excess sludge minimization alternatives applied in the activated sludge process.

    PubMed

    Martins, C L; Velho, V F; Ramos, S R A; Pires, A S C D; Duarte, E C N F A; Costa, R H R

    2016-01-01

    The aim of this study was to investigate the ability of the oxic-settling-anaerobic (OSA) process and of folic acid addition, applied in the activated sludge process, to reduce excess sludge production. The study was monitored during two distinct periods: activated sludge system with OSA-process, and activated sludge system with folic acid addition. The observed sludge yields (Yobs) were 0.30 and 0.08 kg TSS kg(-1) chemical oxygen demand (COD) for the control phase and the OSA-process, respectively (period 1), and 0.33 and 0.18 kg TSS kg(-1) COD for the control phase and the folic acid addition, respectively (period 2). The Yobs decreased by 73 and 45% in the phases with the OSA-process and folic acid addition, respectively, compared with the control phases. Both sludge minimization alternatives resulted in a decrease in excess sludge production without negatively affecting the performance of the effluent treatment. PMID:26901714
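
    The reported reductions follow directly from the observed yields; a quick arithmetic check:

```python
# Relative reduction in observed sludge yield: (Y_control - Y_treatment) / Y_control.
def yield_reduction(y_control, y_treatment):
    return 100.0 * (y_control - y_treatment) / y_control

print(f"OSA-process:         {yield_reduction(0.30, 0.08):.0f}% reduction")   # ~73%
print(f"folic acid addition: {yield_reduction(0.33, 0.18):.0f}% reduction")   # ~45%
```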

  18. [Effects of applying different kinds of fertilizers on enzyme activities related to carbon, nitrogen, and phosphorus cycles in reddish paddy soil].

    PubMed

    Xu, Li-Li; Wang, Qiu-Bing; Zhang, Xin-Yu; Sun, Xiao-Min; Dai, Xiao-Qin; Yang, Feng-Ting; Bu, Jin-Feng; Wang, Hui-min

    2013-04-01

    Based on the long-term fixed-position experiment started in 1998 at the Qianyanzhou Ecological Experiment Station, Chinese Academy of Sciences, this paper analyzed the effects of applying different kinds of fertilizers (straw, ST; pig manure, OM; and chemical fertilizer, NPK) on the nutrient (C, N, and P) status and the activities of related enzymes (beta-1,4-glucosidase, betaG; beta-1,4-N-acetylglucosaminidase, NAG; L-leucine aminopeptidase, LAP; and acid phosphatase, AP) in reddish paddy soil. With the application of OM, the activities of soil betaG, NAG, and LAP increased significantly compared with the other treatments, and were 1.4, 2.6, and 1.9 times higher than the control (CK), respectively. Applying OM also improved the ratio of soil organic carbon to total nitrogen (C/N), but decreased the soil betaG/(NAG+LAP) ratio, suggesting that pig manure could benefit the degradation of soil cellulose and the accumulation of soil organic carbon. Applying NPK increased the activities of soil betaG, NAG, and LAP, but decreased the AP activity, with a decrement of 34% compared with CK. Under the application of NPK, the soil betaG/AP and (NAG+LAP)/AP ratios increased, but the ratios of soil organic carbon to total phosphorus (C/P) and of soil total nitrogen to total phosphorus (N/P) decreased, indicating that chemical fertilizers could induce the accumulation of soil inorganic phosphorus, and inhibit the microbial functions of degrading polysaccharides and phosphate phospholipids.

  19. Research on long-range laser active imaging system applied in adverse weather conditions

    NASA Astrophysics Data System (ADS)

    Gai, Zhi-gang; Liu, Meng-de; Yang, Li; Kabanov, V. V.; Shi, Lei; Zhao, Jie; Chu, Shi-bo; Yang, Jun-xian; Zhou, Yang

    2013-09-01

    Low-light-level night vision devices and thermal infrared imagers, which are passive imaging systems, are generally used for daily target detection and identification. But in adverse conditions of darkness, poor atmospheric transmission or strong backscattering (fog, dust, rain, snow, etc.), even the most sensitive low-light-level night vision device cannot provide enough image resolution for detecting and identifying targets, and the thermal infrared imager is also limited by low temperature contrast. A long-range laser active imaging system, combining high-power semiconductor pulsed lasers with collimation technology, a receiving objective lens of large diameter, long focal length and narrow viewing angle, a high-gain image intensifier CCD (ICCD) camera, and range-gated synchronization control technology, was developed for long-distance target detection and high-resolution imaging in adverse weather conditions. The system composition and operating principle are introduced. The extremely powerful and efficient illuminators with collimation technology are able to deliver uniform beams, which are essential for illuminating targets at a distance and generating high-quality images. The particular receiving objective lens, ICCD camera and range-gated synchronization control technology reduce the strong backscattering signal and improve the imaging signal-to-noise ratio. Laboratory and field experiments were carried out to validate the imaging effect and imaging quality. The results show that the minimum resolution is about 3-5 cm, 10 cm, and greater than 20 cm for targets at 1100 m, 4700 m, and 6700 m, respectively, in darkness. Furthermore, in rainy conditions the minimum resolution reaches 10 cm and 20 cm for targets at 2500 m and 4800 m, respectively, and the image is too blurred to accurately identify a target at 7200 m.
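
    The range-gated synchronization mentioned above works by opening the camera gate only during the interval in which photons returning from the target range arrive, so backscatter from nearer rain or fog is rejected. A minimal sketch of that timing, with illustrative numbers:

```python
# Gate delay and width for range-gated imaging: round-trip time of light to the
# start of the range window and across its depth. Values are illustrative.
C = 299_792_458.0  # speed of light, m/s

def gate_timing(target_range_m, gate_depth_m):
    """Return (gate delay, gate width) in seconds for a given range window."""
    delay = 2.0 * target_range_m / C          # round trip to the window start
    width = 2.0 * gate_depth_m / C            # round trip across the window depth
    return delay, width

delay, width = gate_timing(target_range_m=4700.0, gate_depth_m=30.0)
print(f"gate delay = {delay * 1e6:.2f} us, gate width = {width * 1e9:.0f} ns")
```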

  20. A Pilot Study for Applying an Extravehicular Activity Exercise Prebreathe Protocol to the International Space Station

    NASA Technical Reports Server (NTRS)

    Woodruff, Kristin K.; Johnson, Anyika N.; Lee, Stuart M. C.; Gernhardt, Michael; Schneider, Suzanne M.; Foster, Philip P.

    2000-01-01

    Decompression sickness (DCS) is a serious risk to astronauts performing extravehicular activity (EVA). To reduce this risk, the addition of ten minutes of moderate exercise (75% VO2pk) during prebreathe has been shown to decrease the total prebreathe time from 4 to 2 hours and to decrease the incidence of DCS. The overall purpose of this pilot study was to develop an exercise protocol using flight hardware and an in-flight physical fitness cycle test to perform prebreathe exercise before an EVA. Eleven subjects volunteered to participate in this study. The first objective of this study was to compare the steady-state heart rate (HR) and oxygen consumption (VO2) from a submaximal arm and leg exercise (ALE) session with those predicted from a maximal ALE test. The second objective was to compare the steady-state HR and VO2 from a submaximal elastic tube and leg exercise (TLE) session with those predicted from the maximal ALE test. The third objective involved a comparison of the maximal ALE test with a maximal leg-only (LE) test to conform to the in-flight fitness assessment test. The 75% VO2pk target HR from the LE test was significantly less than the target HR from the ALE test. Prescribing exercise using data from the maximal ALE test resulted in measured submaximal VO2 and HR values being higher than predicted. The results of this pilot study suggest that elastic tubing is valid during EVA prebreathe as a method of arm exercise with the flight leg ergometer, and it is recommended that the prebreathe countermeasure exercise protocol incorporate this method.
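
    One plausible way to prescribe the 75% VO2pk intensity from a maximal test is to fit the approximately linear HR-VO2 relation across the graded stages and read off the heart rate at 75% of VO2pk. The sketch below illustrates this with invented stage data; it is not the prescription procedure or data from the pilot study.

```python
# Target HR at 75% VO2pk from a linear HR-VO2 fit of maximal-test stage data.
# Stage values are invented for illustration.
import numpy as np

vo2 = np.array([0.8, 1.4, 2.0, 2.6, 3.2])   # L/min at successive stages
hr = np.array([95, 118, 140, 161, 182])     # bpm at the same stages

slope, intercept = np.polyfit(vo2, hr, 1)   # HR = slope * VO2 + intercept
vo2_pk = vo2[-1]
target_hr = slope * (0.75 * vo2_pk) + intercept
print(f"target HR at 75% VO2pk ~= {target_hr:.0f} bpm")
```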

  1. Further Insight and Additional Inference Methods for Polynomial Regression Applied to the Analysis of Congruence

    ERIC Educational Resources Information Center

    Cohen, Ayala; Nahum-Shani, Inbal; Doveh, Etti

    2010-01-01

    In their seminal paper, Edwards and Parry (1993) presented the polynomial regression as a better alternative to applying difference score in the study of congruence. Although this method is increasingly applied in congruence research, its complexity relative to other methods for assessing congruence (e.g., difference score methods) was one of the…
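
    The polynomial regression approach referenced here models an outcome as a quadratic function of the two congruence components rather than of their difference score. A minimal sketch on synthetic data (illustrative only):

```python
# Quadratic polynomial regression for congruence research:
# Z = b0 + b1*X + b2*Y + b3*X^2 + b4*X*Y + b5*Y^2, fitted by least squares.
import numpy as np

rng = np.random.default_rng(42)
n = 200
X = rng.normal(size=n)                        # e.g., person attribute
Y = rng.normal(size=n)                        # e.g., environment attribute
Z = 2.0 - 0.8 * (X - Y) ** 2 + rng.normal(scale=0.3, size=n)  # congruence effect

design = np.column_stack([np.ones(n), X, Y, X**2, X * Y, Y**2])
b, *_ = np.linalg.lstsq(design, Z, rcond=None)
print("b0..b5 =", np.round(b, 2))
# A pure squared-difference effect shows up here as b3 ~= b5 ~= -b4/2.
```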

  2. An Analysis of Applied Mechanics Contest for Senior High School Students in Taiwan

    ERIC Educational Resources Information Center

    Chen, Nelson Cheng-Chih; Chen, Ching-Hao; Lin, Ming-chun

    2008-01-01

    The applied mechanics education contest hosted by STAM (Society of Theoretical Applied Mechanics) has been held in Taiwan for several years. The contest pattern has been changed from a simple written test to an experiment-oriented test after the NSTM (National Science and Technology Museum) proceeded to hold the competition in 2005. The major…

  3. Towards an Analysis of Review Article in Applied Linguistics: Its Classes, Purposes and Characteristics

    ERIC Educational Resources Information Center

    Azar, Ali Sorayyaei; Hashim, Azirah

    2014-01-01

    The classes, purposes and characteristics associated with the review article in the field of applied linguistics were analyzed. The data were collected from a randomly selected corpus of thirty two review articles from a discipline-related key journal in applied linguistics. The findings revealed that different sub-genres can be identified within…

  4. Multiscaling Analysis applied to field Water Content through Distributed Fiber Optic Temperature sensing measurements

    NASA Astrophysics Data System (ADS)

    Benitez Buelga, Javier; Rodriguez-Sinobas, Leonor; Sanchez, Raul; Gil, Maria; Tarquis, Ana M.

    2014-05-01

    signal variation, or to see at which scales signals are most correlated. This can give us an insight into the dominant processes. An alternative to both of the above methods has been described recently. Relative entropy and increments in relative entropy have been applied to soil images (Bird et al., 2006) and to soil transect data (Tarquis et al., 2008) to study scale effects localized in scale and provide information that is complementary to the information about scale dependencies found across a range of scales. We will use them in this work to describe the spatial scaling properties of a set of field water content data measured in a corn field, in a plot of 500 m2 with a spatial resolution of 25 cm. These measurements are based on a fiber optic cable (BruggSteal) buried in a zig-zag deployment at 30 cm depth. References Bird, N., M.C. Díaz, A. Saa, and A.M. Tarquis. 2006. A review of fractal and multifractal analysis of soil pore-scale images. J. Hydrol. 322:211-219. Kravchenko, A.N., R. Omonode, G.A. Bollero, and D.G. Bullock. 2002. Quantitative mapping of soil drainage classes using topographical data and soil electrical conductivity. Soil Sci. Soc. Am. J. 66:235-243. Lark, R.M., A.E. Milne, T.M. Addiscott, K.W.T. Goulding, C.P. Webster, and S. O'Flaherty. 2004. Scale- and location-dependent correlation of nitrous oxide emissions with soil properties: An analysis using wavelets. Eur. J. Soil Sci. 55:611-627. Lark, R.M., S.R. Kaffka, and D.L. Corwin. 2003. Multiresolution analysis of data on electrical conductivity of soil using wavelets. J. Hydrol. 272:276-290. Lark, R. M. and Webster, R. 1999. Analysis and elucidation of soil variation using wavelets. European J. of Soil Science, 50(2): 185-206. Mandelbrot, B.B. 1982. The fractal geometry of nature. W.H. Freeman, New York. Percival, D.B., and A.T. Walden. 2000. Wavelet methods for time series analysis. Cambridge Univ. Press, Cambridge, UK. Tarquis, A.M., N.R. Bird, A.P. Whitmore, M.C. Cartagena, and

  5. Multiscaling Analysis applied to field Water Content through Distributed Fiber Optic Temperature sensing measurements

    NASA Astrophysics Data System (ADS)

    Benitez Buelga, Javier; Rodriguez-Sinobas, Leonor; Sanchez, Raul; Gil, Maria; Tarquis, Ana M.

    2014-05-01

    signal variation, or to see at which scales signals are most correlated. This can give us an insight into the dominant processes. An alternative to both of the above methods has been described recently. Relative entropy and increments in relative entropy have been applied to soil images (Bird et al., 2006) and to soil transect data (Tarquis et al., 2008) to study scale effects localized in scale and provide information that is complementary to the information about scale dependencies found across a range of scales. We will use them in this work to describe the spatial scaling properties of a set of field water content data measured in a corn field, in a plot of 500 m2 with a spatial resolution of 25 cm. These measurements are based on a fiber optic cable (BruggSteal) buried in a zig-zag deployment at 30 cm depth. References Bird, N., M.C. Díaz, A. Saa, and A.M. Tarquis. 2006. A review of fractal and multifractal analysis of soil pore-scale images. J. Hydrol. 322:211-219. Kravchenko, A.N., R. Omonode, G.A. Bollero, and D.G. Bullock. 2002. Quantitative mapping of soil drainage classes using topographical data and soil electrical conductivity. Soil Sci. Soc. Am. J. 66:235-243. Lark, R.M., A.E. Milne, T.M. Addiscott, K.W.T. Goulding, C.P. Webster, and S. O'Flaherty. 2004. Scale- and location-dependent correlation of nitrous oxide emissions with soil properties: An analysis using wavelets. Eur. J. Soil Sci. 55:611-627. Lark, R.M., S.R. Kaffka, and D.L. Corwin. 2003. Multiresolution analysis of data on electrical conductivity of soil using wavelets. J. Hydrol. 272:276-290. Lark, R. M. and Webster, R. 1999. Analysis and elucidation of soil variation using wavelets. European J. of Soil Science, 50(2): 185-206. Mandelbrot, B.B. 1982. The fractal geometry of nature. W.H. Freeman, New York. Percival, D.B., and A.T. Walden. 2000. Wavelet methods for time series analysis. Cambridge Univ. Press, Cambridge, UK. Tarquis, A.M., N.R. Bird, A.P. Whitmore, M.C. Cartagena, and

  6. Uncertainty optimization applied to the Monte Carlo analysis of planetary entry trajectories

    NASA Astrophysics Data System (ADS)

    Way, David Wesley

    2001-10-01

    Future robotic missions to Mars, as well as any human missions, will require precise entries to ensure safe landings near science objectives and pre-deployed assets. Planning for these missions will depend heavily on Monte Carlo analyses to evaluate active guidance algorithms, assess the impact of off-nominal conditions, and account for uncertainty. The dependability of Monte Carlo forecasts, however, is limited by the accuracy and completeness of the assumed uncertainties, because Monte Carlo analysis is a forward-driven problem: it begins with the input uncertainties and proceeds to the forecast output statistics. An improvement to the Monte Carlo analysis is needed that will allow the problem to be worked in reverse, so that the largest allowable dispersions that still achieve the required mission objectives can be determined quantitatively. This thesis proposes a methodology to optimize the uncertainties in the Monte Carlo analysis of spacecraft landing footprints. A metamodel is first used to write polynomial expressions for the size of the landing footprint as functions of the independent uncertainty extrema. The coefficients of the metamodel are determined by performing experiments. The metamodel is then used in a constrained optimization procedure to minimize a cost-tolerance function. First, a two-dimensional proof-of-concept problem was used to evaluate the feasibility of this optimization method. Next, the optimization method was further demonstrated on the Mars Surveyor Program 2001 Lander. The purpose of this example was to demonstrate that the methodology developed during the proof-of-concept could be scaled to solve larger, more complicated, "real world" problems. This research has shown that it is possible to control the size of the landing footprint and establish tolerances for mission uncertainties. A simplified metamodel was developed, which is enabling for realistic problems with more than just a few uncertainties. A confidence interval on
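
    The reverse workflow described above can be illustrated with a small surrogate-plus-optimizer loop: fit a quadratic metamodel of footprint size versus two uncertainty extrema, then find the loosest tolerances that still satisfy a footprint requirement. Everything below (the footprint function, cost weights and the 5 km requirement) is invented for illustration; it is not the thesis' metamodel or cost-tolerance function.

```python
# Quadratic metamodel of footprint size vs. two uncertainty extrema, then a
# constrained search for the largest allowable dispersions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def footprint_km(x):
    """Stand-in for a full Monte Carlo campaign evaluated at uncertainty extrema x."""
    density_pct, wind_mps = x
    return 1.0 + 0.15 * density_pct + 0.08 * wind_mps + 0.004 * density_pct * wind_mps

def basis(x):
    return np.array([1.0, x[0], x[1], x[0] * x[1], x[0] ** 2, x[1] ** 2])

# Fit the metamodel from a small set of "experiments"
doe = rng.uniform([0.0, 0.0], [30.0, 40.0], size=(25, 2))
y = np.array([footprint_km(x) for x in doe])
coef, *_ = np.linalg.lstsq(np.array([basis(x) for x in doe]), y, rcond=None)
meta = lambda x: float(basis(x) @ coef)

# Cost-tolerance function: tighter tolerances (smaller allowable extrema) cost more
cost = lambda x: 100.0 / (x[0] + 1.0) + 60.0 / (x[1] + 1.0)

res = minimize(cost, x0=[10.0, 10.0], bounds=[(0.0, 30.0), (0.0, 40.0)],
               constraints={"type": "ineq", "fun": lambda x: 5.0 - meta(x)})  # footprint <= 5 km
print("allowable extrema:", np.round(res.x, 2), "-> footprint", round(meta(res.x), 2), "km")
```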

  7. Comparative Sensitivity Analysis of Muscle Activation Dynamics.

    PubMed

    Rockenfeller, Robert; Günther, Michael; Schmitt, Syn; Götz, Thomas

    2015-01-01

    We mathematically compared two models of mammalian striated muscle activation dynamics proposed by Hatze and Zajac. Both models are representative of a broad variety of biomechanical models formulated as ordinary differential equations (ODEs). These models incorporate parameters that directly represent known physiological properties. Other parameters have been introduced to reproduce empirical observations. We used sensitivity analysis to investigate the influence of model parameters on the ODE solutions. In addition, we expanded an existing approach to treating initial conditions as parameters and to calculating second-order sensitivities. Furthermore, we used a global sensitivity analysis approach to include finite ranges of parameter values. Hence, a theoretician striving for model reduction could use the method for identifying particularly low sensitivities to detect superfluous parameters. An experimenter could use it for identifying particularly high sensitivities to improve parameter estimation. Hatze's nonlinear model incorporates some parameters to which activation dynamics is clearly more sensitive than to any parameter in Zajac's linear model. Unlike Zajac's model, however, Hatze's model can reproduce measured shifts in optimal muscle length with varied muscle activity. Accordingly, we extracted a specific parameter set for Hatze's model that combines best with a particular muscle force-length relation. PMID:26417379
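
    As a hedged illustration of the kind of sensitivity computation discussed here, the sketch below uses a simplified first-order (Zajac-type) activation ODE and estimates the sensitivity of the final activation to each time constant by central finite differences. The model form and all numbers are illustrative, not the actual Hatze or Zajac formulations analysed in the paper.

```python
# Local parameter sensitivity of a first-order activation ODE da/dt = (u - a)/tau,
# with tau switching between activation and deactivation time constants.
import numpy as np

def simulate(u_fn, tau_act, tau_deact, a0=0.01, dt=1e-3, t_end=0.3):
    """Explicit-Euler solution of the activation ODE for excitation u(t)."""
    t = np.arange(0.0, t_end, dt)
    a = np.empty_like(t)
    a[0] = a0
    for k in range(1, len(t)):
        u = u_fn(t[k - 1])
        tau = tau_act if u > a[k - 1] else tau_deact
        a[k] = a[k - 1] + dt * (u - a[k - 1]) / tau
    return a

def sensitivity(param, base, rel_step=0.01):
    """Central-difference sensitivity of a(t_end) to one scalar parameter."""
    h = rel_step * base[param]
    a_hi = simulate(**{**base, param: base[param] + h})
    a_lo = simulate(**{**base, param: base[param] - h})
    return (a_hi[-1] - a_lo[-1]) / (2.0 * h)

pulse = lambda t: 1.0 if 0.05 <= t < 0.20 else 0.0       # neural excitation u(t)
base = {"u_fn": pulse, "tau_act": 0.015, "tau_deact": 0.050}
for p in ("tau_act", "tau_deact"):
    print(f"d a(t_end) / d {p} ~= {sensitivity(p, base):.2f}")
```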

  8. Neutron activation analysis in archaeological chemistry

    SciTech Connect

    Harbottle, G.

    1987-01-01

    Neutron activation analysis has proven to be a convenient way of performing the chemical analysis of archaeologically-excavated artifacts and materials. It is fast and does not require tedious laboratory operations. It is multielement, sensitive, and can be made nondestructive. Neutron activation analysis in its instrumental form, i.e., involving no chemical separation, is ideally suited to automation and conveniently takes the first step in data flow patterns that are appropriate for many taxonomic and statistical operations. The future will doubtless see improvements in the practice of NAA in general, but in connection with archaeological science the greatest change will be the filling, interchange and widespread use of data banks based on compilations of analytical data. Since provenience-oriented data banks deal with materials (obsidian, ceramics, metals, semiprecious stones, building materials and sculptural media) that participated in trade networks, the analytical data is certain to be of interest to a rather broad group of archaeologists. It is to meet the needs of the whole archaeological community that archaeological chemistry must now turn.

  9. Genetic algorithm applied to a Soil-Vegetation-Atmosphere system: Sensitivity and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Schneider, Sébastien; Jacques, Diederik; Mallants, Dirk

    2010-05-01

    For the inversion procedure, a genetic algorithm (GA) was used. Specific features such as elitism, a roulette-wheel selection operator and island theory were implemented. Optimization was based on the water content measurements recorded at several depths. Ten scenarios were elaborated and applied to the two lysimeters in order to investigate the impact of the conceptual model, in terms of process description (mechanistic or compartmental) and geometry (number of horizons in the profile description), on the calibration accuracy. Calibration leads to good agreement with the measured water contents. The most critical parameters for improving the goodness of fit are the number of horizons and the type of process description. The best fit is found for a mechanistic model with 5 horizons, resulting in absolute differences between observed and simulated water contents of less than 0.02 cm3 cm-3 on average. Parameter estimate analysis shows that layer thicknesses are poorly constrained, whereas hydraulic parameters are much better defined.
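
    For illustration, the sketch below implements a plain genetic algorithm with the two features named above, elitism and roulette-wheel (fitness-proportionate) selection, on a toy least-squares objective. Population size, rates and the objective are placeholders, not the soil-hydraulic inversion set-up of the study.

```python
# Simple GA with elitism and roulette-wheel selection on a toy objective.
import numpy as np

rng = np.random.default_rng(3)
target = np.array([0.08, 0.45, 1.6])               # "true" parameters of a toy model
fitness = lambda p: 1.0 / (1.0 + np.sum((p - target) ** 2))

pop_size, n_genes, n_elite, n_gen = 40, 3, 2, 200
pop = rng.uniform(0.0, 2.0, size=(pop_size, n_genes))

for _ in range(n_gen):
    fit = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(fit)[-n_elite:]]         # elitism: keep the best individuals
    probs = fit / fit.sum()                         # roulette wheel: selection prob ~ fitness
    children = []
    while len(children) < pop_size - n_elite:
        i, j = rng.choice(pop_size, size=2, p=probs)
        cut = rng.integers(1, n_genes)              # one-point crossover
        child = np.concatenate([pop[i][:cut], pop[j][cut:]])
        child += rng.normal(0.0, 0.02, size=n_genes) * (rng.random(n_genes) < 0.2)  # mutation
        children.append(child)
    pop = np.vstack([elite, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best parameters:", np.round(best, 3))
```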

  10. Multiple response optimization applied to the development of a capillary electrophoretic method for pharmaceutical analysis.

    PubMed

    Candioti, Luciana Vera; Robles, Juan C; Mantovani, Víctor E; Goicoechea, Héctor C

    2006-03-15

    Simultaneous optimization of multiple responses using the desirability function was applied to develop a capillary electrophoresis method for the simultaneous determination of four active ingredients in pharmaceutical preparations: vitamins B(6) and B(12), dexamethasone and lidocaine hydrochloride. Five responses were simultaneously optimized: the three resolutions, the analysis time and the capillary current. This latter response was taken into account in order to improve the quality of the separations. The separation was carried out by using capillary zone electrophoresis (CZE) with a silica capillary and UV detection (240 nm). The optimum conditions were: 57.0 mmol l(-1) sodium phosphate buffer solution, pH 7.0 and a voltage of 17.2 kV. Good results concerning precision (CV lower than 2%), accuracy (recoveries ranged between 98.5 and 102.6%) and selectivity were obtained in the concentration range studied for the four compounds. These results are comparable to those provided by the reference high performance liquid chromatography (HPLC) technique.
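
    The desirability approach maps each response onto a [0, 1] desirability scale and maximizes their geometric mean. The sketch below illustrates this with simple one-sided linear desirability ramps and invented responses and limits; it is not the optimization actually run in the study.

```python
# Overall desirability D as the geometric mean of individual desirabilities d_i.
import numpy as np

def d_larger_is_better(y, low, high):
    """d = 0 below `low`, 1 above `high`, linear in between (e.g., resolution)."""
    return np.clip((y - low) / (high - low), 0.0, 1.0)

def d_smaller_is_better(y, low, high):
    """d = 1 below `low`, 0 above `high` (e.g., analysis time, capillary current)."""
    return np.clip((high - y) / (high - low), 0.0, 1.0)

def overall_desirability(ds):
    return float(np.prod(ds) ** (1.0 / len(ds)))

ds = [
    d_larger_is_better(2.1, low=1.5, high=2.5),     # resolution pair 1
    d_larger_is_better(1.8, low=1.5, high=2.5),     # resolution pair 2
    d_larger_is_better(2.4, low=1.5, high=2.5),     # resolution pair 3
    d_smaller_is_better(9.0, low=6.0, high=15.0),   # analysis time (min)
    d_smaller_is_better(55.0, low=40.0, high=90.0), # capillary current (uA)
]
print(f"overall desirability D = {overall_desirability(ds):.2f}")
```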

  11. Transition path theory analysis of c-Src kinase activation.

    PubMed

    Meng, Yilin; Shukla, Diwakar; Pande, Vijay S; Roux, Benoît

    2016-08-16

    Nonreceptor tyrosine kinases of the Src family are large multidomain allosteric proteins that are crucial to cellular signaling pathways. In a previous study, we generated a Markov state model (MSM) to simulate the activation of c-Src catalytic domain, used as a prototypical tyrosine kinase. The long-time kinetics of transition predicted by the MSM was in agreement with experimental observations. In the present study, we apply the framework of transition path theory (TPT) to the previously constructed MSM to characterize the main features of the activation pathway. The analysis indicates that the activating transition, in which the activation loop first opens up followed by an inward rotation of the αC-helix, takes place via a dense set of intermediate microstates distributed within a fairly broad "transition tube" in a multidimensional conformational subspace connecting the two end-point conformations. Multiple microstates with negligible equilibrium probabilities carry a large transition flux associated with the activating transition, which explains why extensive conformational sampling is necessary to accurately determine the kinetics of activation. Our results suggest that the combination of MSM with TPT provides an effective framework to represent conformational transitions in complex biomolecular systems. PMID:27482115
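
    One central TPT quantity that can be computed directly from an MSM transition matrix is the forward committor, the probability of reaching the active-state set before the inactive set from each microstate. The sketch below solves the standard committor linear system for a toy five-state chain; the transition matrix is invented and is not the actual c-Src MSM.

```python
# Forward committor q+ of a discrete Markov state model: q = 0 on set A,
# q = 1 on set B, and (I - T_II) q_I = T_IB * 1 on intermediate states I.
import numpy as np

T = np.array([
    [0.90, 0.10, 0.00, 0.00, 0.00],
    [0.15, 0.70, 0.15, 0.00, 0.00],
    [0.00, 0.20, 0.60, 0.20, 0.00],
    [0.00, 0.00, 0.15, 0.70, 0.15],
    [0.00, 0.00, 0.00, 0.10, 0.90],
])
A, B = [0], [4]                                   # inactive-like and active-like sets
inter = [i for i in range(len(T)) if i not in A + B]

M = np.eye(len(inter)) - T[np.ix_(inter, inter)]
rhs = T[np.ix_(inter, B)].sum(axis=1)
q = np.zeros(len(T))
q[B] = 1.0
q[inter] = np.linalg.solve(M, rhs)
print("forward committor:", np.round(q, 3))
```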

  12. Different spectrophotometric methods applied for the analysis of binary mixture of flucloxacillin and amoxicillin: A comparative study

    NASA Astrophysics Data System (ADS)

    Attia, Khalid A. M.; Nassar, Mohammed W. I.; El-Zeiny, Mohamed B.; Serag, Ahmed

    2016-05-01

    Three different spectrophotometric methods were applied for the quantitative analysis of flucloxacillin and amoxicillin in their binary mixture, namely, ratio subtraction, absorbance subtraction and amplitude modulation. A comparative study was done listing the advantages and the disadvantages of each method. All the methods were validated according to the ICH guidelines and the obtained accuracy, precision and repeatability were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory prepared mixtures and assessed by applying the standard addition technique. So, they can be used for the routine analysis of flucloxacillin and amoxicillin in their binary mixtures.

  13. Different spectrophotometric methods applied for the analysis of binary mixture of flucloxacillin and amoxicillin: A comparative study.

    PubMed

    Attia, Khalid A M; Nassar, Mohammed W I; El-Zeiny, Mohamed B; Serag, Ahmed

    2016-05-15

    Three different spectrophotometric methods were applied for the quantitative analysis of flucloxacillin and amoxicillin in their binary mixture, namely, ratio subtraction, absorbance subtraction and amplitude modulation. A comparative study was done listing the advantages and the disadvantages of each method. All the methods were validated according to the ICH guidelines and the obtained accuracy, precision and repeatability were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory prepared mixtures and assessed by applying the standard addition technique. So, they can be used for the routine analysis of flucloxacillin and amoxicillin in their binary mixtures.

  14. Analysis of DOE international environmental management activities

    SciTech Connect

    Ragaini, R.C.

    1995-09-01

    The Department of Energy's (DOE) Strategic Plan (April 1994) states that DOE's long-term vision includes world leadership in environmental restoration and waste management activities. The activities of the DOE Office of Environmental Management (EM) can play a key role in DOE's goals of maintaining U.S. global competitiveness and ensuring the continuation of a world class science and technology community. DOE's interest in attaining these goals stems partly from its participation in organizations like the Trade Policy Coordinating Committee (TPCC), with its National Environmental Export Promotion Strategy, which seeks to strengthen U.S. competitiveness and the building of public-private partnerships as part of U.S. industrial policy. The International Interactions Field Office task will build a communication network which will facilitate efficient and effective communication between DOE Headquarters, Field Offices, and contractors. Under this network, Headquarters will provide the Field Offices with information on the Administration's policies and activities (such as the DOE Strategic Plan) and interagency activities, as well as relevant information from other field offices. Lawrence Livermore National Laboratory (LLNL) will, in turn, provide Headquarters with information on various international activities which, when appropriate, will be included in reports to groups like the TPCC and the EM Focus Areas. This task provides for the collection, review, and analysis of information on the more significant international environmental restoration and waste management initiatives and activities which have been used or are being considered at LLNL. Information gathering will focus on efforts and accomplishments in meeting the challenges of providing timely and cost-effective cleanup of its environmentally damaged sites and facilities, especially through international technical exchanges and/or the implementation of foreign-developed technologies.

  15. Toward a Technology of Derived Stimulus Relations: An Analysis of Articles Published in the "Journal of Applied Behavior Analysis," 1992-2009

    ERIC Educational Resources Information Center

    Rehfeldt, Ruth Anne

    2011-01-01

    Every article on stimulus equivalence or derived stimulus relations published in the "Journal of Applied Behavior Analysis" was evaluated in terms of characteristics that are relevant to the development of applied technologies: the type of participants, settings, procedure (automated vs. tabletop), stimuli, and stimulus sensory modality; types of…

  16. Method developments approaches in supercritical fluid chromatography applied to the analysis of cosmetics.

    PubMed

    Lesellier, E; Mith, D; Dubrulle, I

    2015-12-01

    Analyses of complex samples of cosmetics, such as creams or lotions, are generally achieved by HPLC. These analyses are often multistep gradients, due to the presence of compounds with a large range of polarity. For instance, the bioactive compounds may be polar, while the matrix contains lipid components that are rather non-polar; thus cosmetic formulations are usually oil-water emulsions. Supercritical fluid chromatography (SFC) uses mobile phases composed of carbon dioxide and organic co-solvents, allowing for good solubility of both the active compounds and the matrix excipients. Moreover, the classical and well-known properties of these mobile phases yield fast analyses and ensure rapid method development. However, due to the large number of stationary phases available for SFC and to the varied additional parameters acting both on retention and separation factors (co-solvent nature and percentage, temperature, backpressure, flow rate, column dimensions and particle size), a simplified approach can be followed to ensure a fast method development. First, suitable stationary phases should be carefully selected for an initial screening, and then the other operating parameters can be limited to the co-solvent nature and percentage, maintaining the oven temperature and back-pressure constant. To describe simple method development guidelines in SFC, three sample applications are discussed in this paper: UV-filters (sunscreens) in sunscreen cream, glyceryl caprylate in eye liner and caffeine in eye serum. Firstly, five stationary phases (ACQUITY UPC(2)) are screened with isocratic elution conditions (10% methanol in carbon dioxide). Complementarity of the stationary phases is assessed based on our spider diagram classification, which compares a large number of stationary phases based on five molecular interactions. Secondly, the one or two best stationary phases are retained for further optimization of mobile phase composition, with isocratic elution conditions or, when

  17. Method developments approaches in supercritical fluid chromatography applied to the analysis of cosmetics.

    PubMed

    Lesellier, E; Mith, D; Dubrulle, I

    2015-12-01

    Analyses of complex samples of cosmetics, such as creams or lotions, are generally achieved by HPLC. These analyses are often multistep gradients, due to the presence of compounds with a large range of polarity. For instance, the bioactive compounds may be polar, while the matrix contains lipid components that are rather non-polar; thus cosmetic formulations are usually oil-water emulsions. Supercritical fluid chromatography (SFC) uses mobile phases composed of carbon dioxide and organic co-solvents, allowing for good solubility of both the active compounds and the matrix excipients. Moreover, the classical and well-known properties of these mobile phases yield fast analyses and ensure rapid method development. However, due to the large number of stationary phases available for SFC and to the varied additional parameters acting both on retention and separation factors (co-solvent nature and percentage, temperature, backpressure, flow rate, column dimensions and particle size), a simplified approach can be followed to ensure a fast method development. First, suitable stationary phases should be carefully selected for an initial screening, and then the other operating parameters can be limited to the co-solvent nature and percentage, maintaining the oven temperature and back-pressure constant. To describe simple method development guidelines in SFC, three sample applications are discussed in this paper: UV-filters (sunscreens) in sunscreen cream, glyceryl caprylate in eye liner and caffeine in eye serum. Firstly, five stationary phases (ACQUITY UPC(2)) are screened with isocratic elution conditions (10% methanol in carbon dioxide). Complementarity of the stationary phases is assessed based on our spider diagram classification, which compares a large number of stationary phases based on five molecular interactions. Secondly, the one or two best stationary phases are retained for further optimization of mobile phase composition, with isocratic elution conditions or, when

  18. An analysis of the New Technical Strategy Flowsheet applied to the watch list tanks

    SciTech Connect

    Booker, C.P.

    1994-09-01

    The New Technical Strategy Flowsheet is currently chosen for cleanup of the tanks on the Hanford Reservation. In this study, it is applied to a set of nine single shell tanks on the site. These tanks are considered to have a high potential for uncontrolled releases and have been placed on a watch list. Accordingly, it appears that any waste remediation strategy such as the New Technical Strategy Flowsheet might first be applied to these tanks.

  19. Scalable histopathological image analysis via active learning.

    PubMed

    Zhu, Yan; Zhang, Shaoting; Liu, Wei; Metaxas, Dimitris N

    2014-01-01

    Training an effective and scalable system for medical image analysis usually requires a large amount of labeled data, which incurs a tremendous annotation burden for pathologists. Recent progress in active learning can alleviate this issue, leading to a great reduction in the labeling cost without sacrificing too much predictive accuracy. However, most existing active learning methods disregard the "structured information" that may exist in medical images (e.g., data from individual patients), and make the simplifying assumption that unlabeled data are independently and identically distributed. Neither may be suitable for real-world medical images. In this paper, we propose a novel batch-mode active learning method which explores and leverages such structured information in annotations of medical images to enforce diversity among the selected data, therefore maximizing the information gain. We formulate the active learning problem as an adaptive submodular function maximization problem subject to a partition matroid constraint, and further present an efficient greedy algorithm to achieve a good solution with a theoretically proven bound. We demonstrate the efficacy of our algorithm on thousands of histopathological images of breast microscopic tissues. PMID:25320821
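
    As a hedged illustration of the batch selection idea, the sketch below greedily picks images that combine high predictive uncertainty with a facility-location-style diversity gain, while a partition constraint allows at most one image per patient. This is a generic stand-in, not the adaptive submodular objective or the histopathology features used by the authors.

```python
# Greedy batch selection under a partition (one-image-per-patient) constraint.
import numpy as np

rng = np.random.default_rng(7)
n = 60
features = rng.normal(size=(n, 16))                     # image embeddings (synthetic)
uncertainty = rng.random(n)                             # e.g., 1 - max class probability
patient_id = rng.integers(0, 20, size=n)                # partition blocks: patients
sim = features @ features.T / 16.0                      # similarity kernel

def greedy_batch(budget):
    selected, used_patients = [], set()
    for _ in range(budget):
        best, best_gain = None, -np.inf
        for i in range(n):
            if i in selected or patient_id[i] in used_patients:
                continue                                # partition matroid constraint
            # facility-location gain: how much image i improves coverage of all points
            cover_old = sim[:, selected].max(axis=1) if selected else np.zeros(n)
            coverage_gain = np.maximum(sim[:, i] - cover_old, 0.0).sum()
            gain = uncertainty[i] + 0.1 * coverage_gain
            if gain > best_gain:
                best, best_gain = i, gain
        selected.append(best)
        used_patients.add(patient_id[best])
    return selected

print("selected for annotation:", greedy_batch(budget=8))
```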

  20. A response-restriction analysis of stereotypy in adolescents with mental retardation: implications for applied behavior analysis.

    PubMed Central

    McEntee, J E; Saunders, R R

    1997-01-01

    The behavior of 4 adolescents with severe or profound mental retardation was evaluated in the presence of four sets of materials during periods of unstructured leisure activity. Functional engagement with the materials, stereotypic engagement with the materials, stereotypy without interaction with the materials, and other aberrant behaviors were recorded. Across a series of experimental conditions, the number of sets of materials was reduced from four to one by eliminating the set most frequently manipulated in each preceding condition. In the final condition, four sets of materials were again made available for manipulation. The procedures replicated Green and Striefel's (1988) response-restriction analysis of the activity preferences and play behaviors of children with autism. In general, the results of the present experiment replicate those of Green and Striefel in that reallocation of responding was idiosyncratic and unpredictable as sets of materials were removed. Nevertheless, the results provided insight into how responding might be reallocated if it were restricted through behavioral interventions rather than by restriction of access. Thus, the results are discussed with respect to how response-restriction analyses may be useful in identifying topographies of behavior that could be included in differential reinforcement contingencies that are designed to affect stereotypic behavior and in the selection and arrangement of environmental stimuli to minimize the presence of evokers of stereotypy. PMID:9316261

  1. Comparative analysis of several sediment transport formulations applied to dam-break flows over erodible beds

    NASA Astrophysics Data System (ADS)

    Cea, Luis; Bladé, Ernest; Corestein, Georgina; Fraga, Ignacio; Espinal, Marc; Puertas, Jerónimo

    2014-05-01

    Transitory flows generated by dam failures have a great sediment transport capacity, which induces important morphological changes in the river topography. Several studies have been published regarding the coupling between the sediment transport and hydrodynamic equations in dam-break applications, in order to correctly model their mutual interaction. Most of these models solve the depth-averaged shallow water equations to compute the water depth and velocity. On the other hand, a wide variety of sediment transport formulations have been arbitrarily used to compute the topography evolution. These are based on semi-empirical equations which have been calibrated under stationary and uniform conditions very different from those achieved in dam-break flows. Soares-Frazao et al. (2012) proposed a benchmark test consisting of a dam-break over a mobile bed, in which several teams of modellers participated using different numerical models, and concluded that the key issue which still needs to be investigated in morphological modelling of dam-break flows is the link between the solid transport and the hydrodynamic variables. This paper presents a comparative analysis of different sediment transport formulations applied to dam-break flows over mobile beds. All the formulations analysed are commonly used in morphological studies in rivers, and include the formulas of Meyer-Peter & Müller (1948), Wong-Parker (2003), Einstein-Brown (1950), van Rijn (1984), Engelund-Hansen (1967), Ackers-White (1973), Yang (1973), and a Meyer-Peter & Müller type formula with ad-hoc coefficients. The relevance of corrections to the sediment flux direction and magnitude due to the bed slope and the non-equilibrium hypothesis is also analysed. All the formulations have been implemented in the numerical model Iber (Bladé et al. (2014)), which solves the depth-averaged shallow water equations coupled to the Exner equation to evaluate the bed evolution. Two different test cases have been
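
    Of the formulas listed above, Meyer-Peter & Müller (1948) has a particularly compact dimensionless form, qb* = 8 (theta - theta_c)^1.5 for Shields stress theta above the critical value. The sketch below evaluates it for an illustrative bed shear stress and grain size; it is not the Iber implementation.

```python
# Meyer-Peter & Müller (1948) bedload transport rate per unit width.
import math

RHO_W, RHO_S, G = 1000.0, 2650.0, 9.81   # water density, sediment density, gravity

def mpm_bedload(tau_b, d50, theta_c=0.047):
    """Volumetric bedload rate per unit width (m^2/s) for bed shear stress tau_b (Pa)."""
    theta = tau_b / ((RHO_S - RHO_W) * G * d50)          # Shields parameter
    if theta <= theta_c:
        return 0.0
    qb_star = 8.0 * (theta - theta_c) ** 1.5             # dimensionless transport rate
    return qb_star * math.sqrt((RHO_S / RHO_W - 1.0) * G * d50 ** 3)

qb = mpm_bedload(tau_b=25.0, d50=2e-3)                   # 25 Pa acting on 2 mm gravel
print(f"bedload rate ~= {qb:.2e} m^2/s per unit width")
```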

  2. Activities of the Japan Society of Applied Physics Committee for Diversity Promotion in Science and Technology (abstract)

    NASA Astrophysics Data System (ADS)

    Nishitani-Gamo, Mikka

    2009-04-01

    Since 2001, the Japan Society of Applied Physics (JSAP) Committee for Diversity Promotion in Science and Technology has worked to promote gender equality, both within and between academic societies, and in society as a whole. Main activities of the Committee are: (1) organizing symposia and informal meetings during domestic JSAP conferences to stimulate discussion and raise awareness; (2) encouraging young researchers in pursuit of their careers through the newly designed "career-explorer mark;" (3) offering childcare at biannual JSAP conferences; and (4) helping future scientists and engineers prepare to lead the fields of science and technology on a global level with the creation of an educational roadmap. In this presentation, recent activities of the JSAP Committee are introduced and reviewed.

  3. Electrodermal activity analysis during affective haptic elicitation.

    PubMed

    Greco, Alberto; Valenza, Gaetano; Nardelli, Mimma; Bianchi, Matteo; Lanata, Antonio; Scilingo, Enzo Pasquale

    2015-08-01

    This paper investigates how autonomic nervous system dynamics, quantified through the analysis of electrodermal activity (EDA), are modulated by affective haptic stimuli. Specifically, a haptic display able to convey caress-like stimuli was presented to 32 healthy subjects (16 female). Each stimulus was varied according to six combinations of three velocity and two force levels of two motors stretching a strip of fabric. Subjects were also asked to score each stimulus in terms of arousal (high/low activation) and valence (pleasant/unpleasant), in agreement with the circumplex model of affect. EDA was processed using a deconvolution method, separating tonic and phasic components. A statistical analysis was performed to identify significant differences in EDA features among force and velocity levels, as well as in their valence and arousal scores. Results show that the simulated caress induced by the haptic display significantly affects the EDA. In particular, the phasic component appears to be inversely related to the valence score. This finding is new and promising, since it can be used, e.g., as an additional cue for haptics design. PMID:26737605
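
    The abstract reports that EDA was decomposed into tonic and phasic components with a deconvolution method. The sketch below is not that method; it is a much simpler low-pass/residual split on a synthetic signal, included only to illustrate what the tonic (slow level) and phasic (fast response) components refer to. The sampling rate, cutoff, and synthetic responses are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def split_eda(eda, fs, cutoff_hz=0.05):
    """Crude tonic/phasic split: low-pass the signal for the tonic level and keep the
    residual as a phasic proxy. Not the deconvolution method used in the paper."""
    b, a = butter(2, cutoff_hz / (fs / 2.0), btype="low")
    tonic = filtfilt(b, a, eda)
    phasic = eda - tonic
    return tonic, phasic

# Synthetic example: slow drift plus a few skin-conductance-response-like bumps
fs = 4.0                                      # Hz, a common EDA sampling rate
t = np.arange(0, 120, 1 / fs)
eda = 2.0 + 0.3 * np.sin(2 * np.pi * t / 90)  # slowly drifting tonic level
for onset in (20, 55, 80):                    # phasic responses with exponential decay
    dt = np.clip(t - onset, 0.0, None)
    eda += 0.4 * np.exp(-dt / 4.0) * (t >= onset)

tonic, phasic = split_eda(eda, fs)
print(round(tonic.mean(), 3), round(phasic.max(), 3))
```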

  4. The Linguistic and the Contextual in Applied Genre Analysis: The Case of the Company Audit Report

    ERIC Educational Resources Information Center

    Flowerdew, John; Wan, Alina

    2010-01-01

    By means of an analysis of the genre of the audit report, this study highlights the respective roles of linguistic and contextual analysis in genre analysis, if the results are to be of maximum use in ESP course design. On the one hand, based on a corpus of current and authentic written auditors' reports produced in a large international Hong Kong…

  5. Applying the Active Heating Pulse DFOT Method to Drip Irrigation. Characterization of a wetting bulb in drip emitter

    NASA Astrophysics Data System (ADS)

    Benitez-Buelga, J.; Rodriguez-Sinobas, L.; María Gil-Rodríguez, M.; Sayde, C.; Selker, J. S.

    2011-12-01

    The use of the Distributed Fiber Optic Temperature Measurement (DFOT) method for estimating temperature variation along a fiber optic cable has been widely reported in environmental applications. Recently, its use has been combined with an active heating pulse technique, in which the temperature increase is measured while a voltage is applied to the stainless steel jacket surrounding the fiber optic cable, in order to estimate soil water content in field and laboratory conditions with great accuracy. This creates a methodology potentially capable of monitoring spatial variability and accurately estimating soil water content. This study presents a direct application of the active heating DFOT method for measuring the soil water distribution and wetting bulb of a single drip emitter. Three concentric helices of fiber optic cable were placed in a hexagonal Plexiglas column with a 0.5 m base radius and 0.6 m height. After the column was filled with air-dried loamy soil of controlled bulk density, a pressure-compensating drip emitter with a 2 L/h discharge was placed on top of the soil column. Over an irrigation time of 5 hours and 40 minutes, 21 heating pulses of 2 minutes at 20 W/m were applied. Soil samples were also collected after each heat pulse. Results showed the potential of this method for monitoring soil water behavior during irrigation and its capability to estimate soil water content accurately.
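
    The abstract does not specify how the heat-pulse temperature traces are reduced to a soil-moisture proxy. One common choice in active-heating distributed temperature sensing work is the cumulative temperature increase during the pulse, integrated above the pre-pulse baseline at each cable location. The sketch below shows that bookkeeping on hypothetical data; the pulse timing, sampling, and temperature rises are made up, and a site-specific calibration curve would still be needed to convert the result to volumetric water content.

```python
import numpy as np

def cumulative_temperature_rise(temps, times, t_on, t_off):
    """Integrate the temperature increase above the pre-pulse baseline over the heat
    pulse window [t_on, t_off] at each cable location (trapezoidal rule).
    temps: array (n_times, n_locations); times: array of seconds."""
    baseline = temps[times < t_on].mean(axis=0)     # pre-pulse temperature per location
    mask = (times >= t_on) & (times <= t_off)
    dT = temps[mask] - baseline
    return np.trapz(dT, times[mask], axis=0)        # units: K*s per location

# Hypothetical example: 3 cable locations, 1 Hz sampling, 2-minute pulse starting at t = 60 s
times = np.arange(0, 300.0)
temps = 20.0 + np.zeros((times.size, 3))
pulse = (times >= 60) & (times <= 180)
temps[pulse] += np.array([1.5, 3.0, 4.5])           # drier soil -> larger rise (illustrative only)
print(cumulative_temperature_rise(temps, times, 60.0, 180.0))
```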

  6. Multi-Level Discourse Analysis in a Physics Teaching Methods Course from the Psychological Perspective of Activity Theory

    ERIC Educational Resources Information Center

    Vieira, Rodrigo Drumond; Kelly, Gregory J.

    2014-01-01

    In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective,…

  7. Nonlinear analysis of brain activity in magnetic influenced Parkinson patients.

    PubMed

    Anninos, P A; Adamopoulos, A V; Kotini, A; Tsagas, N

    2000-01-01

    Magnetoencephalogram (MEG) recordings were obtained from the brains of patients suffering from Parkinson's disease (PD) using a superconducting quantum interference device (SQUID). For each patient the magnetic activity was recorded from a total of 64 points on the skull (32 points from each temporal lobe), as defined by a recording reference system based on the 10-20 Electrode Placement System. Some of the recorded points were observed to exhibit abnormal rhythmic activity, characterized by high amplitudes and low frequencies. External magnetic stimulation (EMS) with an intensity of 1-7.5 pT and a frequency matching the patient's alpha rhythm (8-13 Hz) was applied over the left and right temporal, frontal, occipital, and vertex regions (2 minutes over each region), and the brain magnetic activity was recorded again. The application of the EMS resulted in rapid attenuation of the MEG activity of PD patients. Furthermore, methods from chaotic dynamics were used to estimate the correlation dimension D of the reconstructed phase spaces. The estimated values of D, in conjunction with the results derived from the other data analysis methods, strongly support the existence of low-dimensional chaotic structures in the dynamics of cortical activity of PD patients. In addition, the increased values of D of the MEG after the application of EMS, compared with the corresponding values obtained prior to EMS, suggest that the neural dynamics are strongly influenced by the application of EMS. PMID:11154103
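
    The abstract does not give the embedding parameters or the exact algorithm used to estimate the correlation dimension D. The sketch below is a generic Grassberger-Procaccia-style estimate: delay-embed the series, compute the correlation sum over a range of radii, and take the slope of log C(r) versus log r. The embedding dimension, delay, and radius range here are assumptions; in practice they would be chosen with, e.g., mutual information and false-nearest-neighbor criteria.

```python
import numpy as np
from scipy.spatial.distance import pdist

def delay_embed(x, dim, tau):
    """Time-delay embedding of a 1-D series into dim-dimensional vectors."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def correlation_dimension(x, dim=5, tau=4, n_radii=12):
    """Grassberger-Procaccia-style estimate: slope of log C(r) versus log r."""
    emb = delay_embed(np.asarray(x, float), dim, tau)
    d = pdist(emb)                                   # all pairwise distances
    radii = np.logspace(np.log10(np.percentile(d, 5)),
                        np.log10(np.percentile(d, 50)), n_radii)
    corr_sum = np.array([(d < r).mean() for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(corr_sum), 1)
    return slope

# Sanity check on a low-dimensional signal: a noisy sine should give a slope near 1
t = np.linspace(0, 200, 4000)
print(correlation_dimension(np.sin(t) + 0.01 * np.random.randn(t.size)))
```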

  8. Prediction of Radical Scavenging Activities of Anthocyanins Applying Adaptive Neuro-Fuzzy Inference System (ANFIS) with Quantum Chemical Descriptors

    PubMed Central

    Jhin, Changho; Hwang, Keum Taek

    2014-01-01

    The radical scavenging activity of anthocyanins is well known, but only a few studies have been conducted using a quantum chemical approach. The adaptive neuro-fuzzy inference system (ANFIS) is an effective technique for solving problems with uncertainty. The purpose of this study was to construct and evaluate quantitative structure-activity relationship (QSAR) models for predicting the radical scavenging activities of anthocyanins with good prediction efficiency. ANFIS-applied QSAR models were developed using quantum chemical descriptors of anthocyanins calculated by the semi-empirical PM6 and PM7 methods. The electron affinity (A) and electronegativity (χ) of the flavylium cation, and the ionization potential (I) of the quinoidal base, were significantly correlated with the radical scavenging activities of anthocyanins. These descriptors were used as independent variables for the QSAR models. ANFIS models with two triangular-shaped input fuzzy functions for each independent variable were constructed and optimized over 100 learning epochs. The constructed models using descriptors calculated by PM6 and PM7 both had good prediction efficiency, with Q-squared values of 0.82 and 0.86, respectively. PMID:25153627
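
    ANFIS combines a Takagi-Sugeno fuzzy rule base with parameter learning. The sketch below shows only the forward pass for a two-input, four-rule system with two triangular membership functions per input, mirroring the membership-function shape described in the abstract; the paper uses three descriptors and fits the parameters over 100 epochs, whereas the membership and consequent parameters here are arbitrary placeholders, as are the descriptor ranges.

```python
import numpy as np

def tri_mf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def anfis_forward(x1, x2, mf1, mf2, consequents):
    """First-order Sugeno inference with 2 triangular MFs per input (4 rules).
    mf1, mf2: lists of two (a, b, c) triples; consequents: 4 rows of (p, q, r)
    so that rule k outputs p*x1 + q*x2 + r."""
    w = []
    for i in range(2):
        for j in range(2):
            w.append(tri_mf(x1, *mf1[i]) * tri_mf(x2, *mf2[j]))    # rule firing strengths
    w = np.array(w)
    f = np.array([p * x1 + q * x2 + r for p, q, r in consequents])  # rule outputs
    return float(np.sum(w * f) / (np.sum(w) + 1e-12))               # weighted average

# Hypothetical, untrained parameters purely for illustration
mf_A   = [(-0.5, 0.0, 0.5), (0.0, 0.5, 1.0)]     # membership functions for descriptor 1
mf_chi = [(3.0, 4.0, 5.0), (4.0, 5.0, 6.0)]      # membership functions for descriptor 2
consequents = [(0.2, -0.1, 1.0), (0.1, 0.0, 0.5),
               (-0.3, 0.2, 0.8), (0.0, 0.1, 0.3)]
print(anfis_forward(0.3, 4.5, mf_A, mf_chi, consequents))
```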

  9. Prediction of radical scavenging activities of anthocyanins applying adaptive neuro-fuzzy inference system (ANFIS) with quantum chemical descriptors.

    PubMed

    Jhin, Changho; Hwang, Keum Taek

    2014-01-01

    The radical scavenging activity of anthocyanins is well known, but only a few studies have been conducted using a quantum chemical approach. The adaptive neuro-fuzzy inference system (ANFIS) is an effective technique for solving problems with uncertainty. The purpose of this study was to construct and evaluate quantitative structure-activity relationship (QSAR) models for predicting the radical scavenging activities of anthocyanins with good prediction efficiency. ANFIS-applied QSAR models were developed using quantum chemical descriptors of anthocyanins calculated by the semi-empirical PM6 and PM7 methods. The electron affinity (A) and electronegativity (χ) of the flavylium cation, and the ionization potential (I) of the quinoidal base, were significantly correlated with the radical scavenging activities of anthocyanins. These descriptors were used as independent variables for the QSAR models. ANFIS models with two triangular-shaped input fuzzy functions for each independent variable were constructed and optimized over 100 learning epochs. The constructed models using descriptors calculated by PM6 and PM7 both had good prediction efficiency, with Q-squared values of 0.82 and 0.86, respectively. PMID:25153627

  10. Addressing Challenging Behaviour in Children with Down Syndrome: The Use of Applied Behaviour Analysis for Assessment and Intervention

    ERIC Educational Resources Information Center

    Feeley, Kathlee M.; Jones, Emily A.

    2006-01-01

    Children with Down syndrome are at an increased risk for engaging in challenging behaviour that may be part of a behavioural phenotype characteristic of Down syndrome. The methodology of applied behaviour analysis has been demonstrated effective with a wide range of challenging behaviours, across various disabilities. Applications to children with…

  11. Preparing Research Leaders for Special Education: A Ph.D. Program with an Emphasis in Applied Behavior Analysis.

    ERIC Educational Resources Information Center

    Heward, William L.; And Others

    1995-01-01

    An Ohio State University doctoral program for special education leadership personnel emphasizing applied behavior analysis is described. The program features intensive formal coursework, special topic seminars, participation in three research studies prior to dissertation, supervised college teaching and advising experiences, and a 10-week summer…

  12. Continuous Recording and Interobserver Agreement Algorithms Reported in the "Journal of Applied Behavior Analysis" (1995-2005)

    ERIC Educational Resources Information Center

    Mudford, Oliver C.; Taylor, Sarah Ann; Martin, Neil T.

    2009-01-01

    We reviewed all research articles in 10 recent volumes of the "Journal of Applied Behavior Analysis (JABA)": Vol. 28(3), 1995, through Vol. 38(2), 2005. Continuous recording was used in the majority (55%) of the 168 articles reporting data on free-operant human behaviors. Three methods for reporting interobserver agreement (exact agreement,…
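
    Of the interobserver agreement algorithms named in the (truncated) abstract, exact count-per-interval agreement is the most straightforward to compute: the percentage of observation intervals in which both observers recorded exactly the same count. A minimal sketch with hypothetical interval counts:

```python
def exact_agreement(obs1, obs2):
    """Exact count-per-interval agreement: percentage of intervals in which
    both observers recorded exactly the same number of responses."""
    if len(obs1) != len(obs2):
        raise ValueError("observers must score the same number of intervals")
    matches = sum(a == b for a, b in zip(obs1, obs2))
    return 100.0 * matches / len(obs1)

# Hypothetical 10-s interval counts from two observers over one session
observer_a = [0, 2, 1, 3, 0, 0, 4, 2]
observer_b = [0, 2, 1, 2, 0, 0, 4, 3]
print(exact_agreement(observer_a, observer_b))   # 75.0
```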

  13. Parents' Experiences of Applied Behaviour Analysis (ABA)-Based Interventions for Children Diagnosed with Autistic Spectrum Disorder

    ERIC Educational Resources Information Center

    McPhilemy, Catherine; Dillenburger, Karola

    2013-01-01

    Applied behaviour analysis (ABA)-based programmes are endorsed as the gold standard for treatment of children with autistic spectrum disorder (ASD) in most of North America. This is not the case in most of Europe, where instead a non-specified "eclectic" approach is adopted. We explored the social validity of ABA-based interventions with…

  14. The Impact of Size and Specialisation on Universities' Department Performance: A DEA Analysis Applied to Austrian Universities

    ERIC Educational Resources Information Center

    Leitner, Karl-Heinz; Prikoszovits, Julia; Schaffhauser-Linzatti, Michaela; Stowasser, Rainer; Wagner, Karin

    2007-01-01

    This paper explores the performance efficiency of natural and technical science departments at Austrian universities using Data Envelopment Analysis (DEA). We present DEA as an alternative tool for benchmarking and ranking the assignment of decision-making units (organisations and organisational units). The method applies a multiple input and…
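
    The abstract does not state which DEA model (orientation, returns to scale, inputs and outputs) the authors used. For orientation, the sketch below solves the standard input-oriented, constant-returns-to-scale (CCR) envelopment LP for each decision-making unit with scipy; the department inputs and outputs are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency scores via the envelopment LP.
    X: (n_inputs, n_dmus), Y: (n_outputs, n_dmus). Returns one theta per DMU."""
    n_in, n = X.shape
    n_out = Y.shape[0]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1 .. lambda_n]
        c = np.r_[1.0, np.zeros(n)]                    # minimize theta
        A_in = np.hstack([-X[:, [o]], X])              # sum_j lambda_j x_ij <= theta * x_io
        A_out = np.hstack([np.zeros((n_out, 1)), -Y])  # sum_j lambda_j y_rj >= y_ro
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(n_in), -Y[:, o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
        scores.append(res.fun)
    return np.array(scores)

# Hypothetical departments: inputs = (staff, budget in M EUR), output = (publications,)
X = np.array([[20.0, 35.0, 50.0, 25.0],
              [1.0,  2.0,  3.5,  1.2]])
Y = np.array([[60.0, 90.0, 120.0, 80.0]])
print(dea_ccr_input(X, Y).round(3))    # a score of 1.0 marks an efficient department
```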

  15. An Analysis of Diagram Modification and Construction in Students' Solutions to Applied Calculus Problems

    ERIC Educational Resources Information Center

    Bremigan, Elizabeth George

    2005-01-01

    In the study reported here, I examined the diagrams that mathematically capable high school students produced in solving applied calculus problems in which a diagram was provided in the problem statement. Analyses of the diagrams contained in written solutions to selected free-response problems from the 1996 BC level Advanced Placement Calculus…

  16. Publication Bias in Studies of an Applied Behavior-Analytic Intervention: An Initial Analysis

    ERIC Educational Resources Information Center

    Sham, Elyssa; Smith, Tristram

    2014-01-01

    Publication bias arises when studies with favorable results are more likely to be reported than are studies with null findings. If this bias occurs in studies with single-subject experimental designs (SSEDs) on applied behavior-analytic (ABA) interventions, it could lead to exaggerated estimates of intervention effects. Therefore, we conducted an…

  17. An Analysis of Oppositional Culture Theory Applied to One Suburban Midwestern High School

    ERIC Educational Resources Information Center

    Blackard, Tricia; Puchner, Laurel; Reeves, Alison

    2014-01-01

    This study explored whether and to what extent Ogbu and Fordham's Oppositional Culture Theory applied to African American high school students at one Midwestern suburban high school. Based on multiple interviews with six African American students, the study found support for some aspects of the theory but not for others.

  18. Applying the Bootstrap to Taxometric Analysis: Generating Empirical Sampling Distributions to Help Interpret Results

    ERIC Educational Resources Information Center

    Ruscio, John; Ruscio, Ayelet Meron; Meron, Mati

    2007-01-01

    Meehl's taxometric method was developed to distinguish categorical and continuous constructs. However, taxometric output can be difficult to interpret because expected results for realistic data conditions and differing procedural implementations have not been derived analytically or studied through rigorous simulations. By applying bootstrap…
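
    The taxometric comparison-data approach simulates sampling distributions against which observed curves or fit indices are judged; the specifics of that procedure go beyond what the abstract describes. The sketch below shows only the generic bootstrap mechanics of building an empirical sampling distribution for a statistic and locating an observed value within it; the data and statistic are placeholders.

```python
import numpy as np

def bootstrap_distribution(data, statistic, n_boot=2000, seed=0):
    """Empirical sampling distribution of `statistic` under resampling with replacement."""
    rng = np.random.default_rng(seed)
    n = len(data)
    return np.array([statistic(rng.choice(data, size=n, replace=True))
                     for _ in range(n_boot)])

# Hypothetical: where would an observed fit index fall under a comparison distribution?
sample = np.random.default_rng(1).normal(loc=0.5, scale=0.1, size=300)
boot = bootstrap_distribution(sample, np.mean)
observed = 0.52
print(np.percentile(boot, [2.5, 97.5]), (boot >= observed).mean())
```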

  19. Computational analysis of fluid flow within a device for applying biaxial strain to cultured cells.

    PubMed

    Lee, Jason; Baker, Aaron B

    2015-05-01

    In vitro systems for applying mechanical strain to cultured cells are commonly used to investigate cellular mechanotransduction pathways in a variety of cell types. These systems often apply mechanical forces to a flexible membrane on which cells are cultured. A consequence of the motion of the membrane in these systems is the generation of flow and the unintended application of shear stress to the cells. We recently described a flexible system for applying mechanical strain to cultured cells, which uses a linear motor to drive a piston array to create biaxial strain within multiwell culture plates. To better understand the fluidic stresses generated by this system and other systems of this type, we created a computational fluid dynamics model to simulate the flow during the mechanical loading cycle. Alterations in the frequency or maximal strain magnitude led to a linear increase in the average fluid velocity within the well and a nonlinear increase in the shear stress at the culture surface over the ranges tested (0.5-2.0 Hz and 1-10% maximal strain). For all cases, the applied shear stresses were relatively low, on the order of millipascals, with a dynamic waveform having primary and secondary peaks in shear stress over a single mechanical strain cycle. These findings should be considered when interpreting experimental results using these devices, particularly when the cell type used is sensitive to low-magnitude, oscillatory shear stresses. PMID:25611013

  20. Improving Data Analysis in Second Language Acquisition by Utilizing Modern Developments in Applied Statistics

    ERIC Educational Resources Information Center

    Larson-Hall, Jenifer; Herrington, Richard

    2010-01-01

    In this article we introduce language acquisition researchers to two broad areas of applied statistics that can improve the way data are analyzed. First we argue that visual summaries of information are as vital as numerical ones, and suggest ways to improve them. Specifically, we recommend choosing boxplots over barplots and adding locally…
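
    As a quick illustration of the boxplot-over-barplot recommendation, the sketch below plots the same two synthetic, skewed groups both ways; the barplot of means hides the spread and skew that the boxplot makes visible. The group names and distributions are made up.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Hypothetical reaction-time-like scores (ms) for two learner groups: skewed, unequal spread
groups = {"L1": rng.lognormal(6.0, 0.3, 60), "L2": rng.lognormal(6.2, 0.5, 60)}

fig, (ax_bar, ax_box) = plt.subplots(1, 2, figsize=(8, 3))
ax_bar.bar(list(groups), [v.mean() for v in groups.values()])
ax_bar.set_title("Barplot of means (hides spread and skew)")
ax_box.boxplot(list(groups.values()))
ax_box.set_xticks([1, 2], labels=list(groups))
ax_box.set_title("Boxplot (shows median, quartiles, outliers)")
plt.tight_layout()
plt.show()
```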